“Social media has been cleverly exploited by anti-scientific forces.” That is how bluntly Herbert Holden Thorp, editor-in-chief of the prestigious journal ‘Science’, puts it in a harsh editorial criticizing how Facebook has fueled a spiral of misinformation about the coronavirus and conspiracy theories against vaccines.
Although the toxicity of social networks’ business model is nothing new, what stands out is the open criticism from the respected weekly, which has already taken Trump’s statements on covid to task, of these platforms for enabling the spread of all kinds of hoaxes. Thorp refers to a report in the ‘New York Times’ describing how “Facebook fills its coffers by exploiting the viral spread of disinformation while trying to convince everyone of its noble mission to connect the world.”
The editorial points out how the very design of social networks such as Facebook, Twitter or YouTube directly damages scientific communication. These platforms’ algorithms seek to keep the user engaged for as long as possible, and to achieve that goal they promote the content that generates the most reactions, even when it consists of lies or incendiary hate speech.
This situation has meant that communicating the findings of science to the world “has reached what seems to be an all-time low.” “Communicating about research in real time is difficult because science is always a work in progress, with precautions and answers that aren’t always definitive. That doesn’t translate well to social media or the Facebook algorithms that determine which posts should be promoted,” he notes.
Proliferation of antiscientific hoaxes
That same design has allowed the proliferation of radical content and messages that, without any foundation, deny the coronavirus or criminalize vaccines, a reality exploited by groups ranging from the extreme right to opportunists who have found a whole business in disinformation. “The antiscientific opposition doesn’t care about precautions,” says Thorp.
The disinformation impact of the world’s largest social network, he points out, has two strands: one is outright fake news, which Facebook’s algorithms detect and label with warnings “that most people ignore”; the other is the misinformation that arises from conversations among users in the community, which is more dangerous and harder to detect. “The result is that both types of misinformation tend to rise to the top of Facebook news feeds because they get more engagement than posts about recent research results published in scientific papers or even in the mainstream press,” he laments.
That is why the journal ‘Science’ urges the scientific community not to delete their social media accounts but, on the contrary, to be more assertive on the networks in communicating their findings. “Refusing to play hardball on the field of social media does not serve science or society,” he remarks. “Since the end of WWII, scientists have clung to the idea that if they stick to the facts and explain the science, the rest of the world will follow. As climate change rages and the pandemic drags on, it is time to face the fact that this old notion is naive.”