When you receive a dose of a traditional vaccine, including some of those made against the new coronavirus, the attenuated or inactivated viral particles in the vaccine trigger an immune response that trains your body to fight off the disease.
The same logic may hold psychologically in the face of another current "epidemic": that of disinformation, information manipulation and the spread of fake news.
Researchers at the University of Cambridge, in the UK, are testing how small, preventive and "weakened" doses of disinformation techniques can inoculate us against the environment of false or distorted information on the internet, especially in the age of Covid-19.
"[The goal] is to create a kind of psychological resistance to persuasion, so that in the future, when you are exposed to disinformation, it will be less convincing, because you will have 'antibodies'," Jon Roozenbeek, a researcher at the Social Decision-Making Laboratory in the University of Cambridge's Department of Psychology, told BBC News Brazil.
"In other words, if you know the techniques and tricks used to deceive or persuade people, you are less likely to fall for them."
One of the vaccines being tested is an online game called Go Viral! ("Go viral", in free translation, also available in Portuguese), which takes just over five minutes to play.
In it, the player takes on the role of someone who wants to go viral on the internet at any cost, putting into practice the tactics most commonly used to spread disinformation and fake news. These tactics can be summed up in three points:
1) Exploit the reader's emotions. Fake news is often written or manipulated to provoke anger, indignation, fear or anguish, and therefore the impulse to share the content quickly. This is how such non-informative content gains reach.
2) Invent experts to back up claims, whatever they are, lending them false weight or a false aura of authority.
3) Foster conspiracy theories, which offer their followers coherent (if false) explanations and convenient scapegoats for complex global issues. Being alluring, these theories tend to generate a lot of engagement online.
"Especially in times of crisis, people are looking for a cohesive story behind the madness. Your followers are counting on you to provide it. Maybe it's time to come up with your own theory," Go Viral! suggests in the game's final "immunization" phase.
In a study published in May in the journal Big Data & Society, Roozenbeek and colleagues gave Go Viral! players questionnaires and found that, in general, players became better at perceiving what is and is not manipulation in news about the Covid-19 pandemic.
Players also gained confidence in their ability to identify manipulative content, and as a result many stopped sharing such fake news with others.
Now, researchers want to understand how long this inoculation lasts, that is, how long the understanding of these manipulation techniques remains “fresh” in the minds of players.
"The minimum (inoculation) time appears to be at least a week, and the maximum is not known. But we do know that the effects wane over time unless people are given an extra (inoculation) dose, just as with vaccines against Covid-19," Roozenbeek continues.
Other studies on media literacy in general also point out that constant actions to combat disinformation are more likely to be successful than simple one-off interventions.
One hypothesis is that, as with any other knowledge, putting it into practice keeps it alive. According to the Cambridge study, simply asking participants to rate social media posts by their apparent reliability was enough to keep their "psychological immunity" at higher levels.
“Herd immunity”?
The preventive inoculation strategy is also called "prebunking", a word derived from "debunking", meaning to expose falsehoods. The idea is that, in addition to fact-checking content and politicians' speeches (an equally important task), it is possible to teach internet users to preempt lies.
So instead of focusing on one particular post or conspiracy theory, prebunking focuses on how the strategies behind them work, which in theory helps the inoculation to scale.
But is there collective psychological immunity?
“Theoretically yes. In practice, I have doubts”, answers Roozenbeek.
"Physical herd immunity reduces the chances of a virus circulating to next to nothing. Herd immunity to measles means no one has measles. Against disinformation, it wouldn't work that way; it would be impossible with any intervention," he says.
"But what is theoretically achievable, and we have modeled it, is reducing susceptibility to disinformation, not necessarily at the individual level, but at the network level. Assuming a certain percentage of people in a network are inoculated, the amount of disinformation that is consumed and shared goes down."
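The network-level effect Roozenbeek describes can be illustrated with a toy simulation. This is purely an illustrative sketch, not the Cambridge team's actual model: one person starts sharing a piece of disinformation through random contacts, and "inoculated" people see it but do not pass it on. All parameter names and values here are assumptions for illustration.

```python
import random

def simulate_spread(n_people=1000, contacts_per_sharer=2,
                    inoculated_frac=0.0, seed=42):
    """Toy model: person 0 starts sharing a piece of disinformation;
    everyone it reaches shares it onward, except 'inoculated' people,
    who see it but do not pass it on. Returns total people exposed."""
    rng = random.Random(seed)
    inoculated = set(rng.sample(range(n_people),
                                int(n_people * inoculated_frac)))
    exposed = {0}                              # person 0 is the source
    sharing = [] if 0 in inoculated else [0]   # queue of active sharers
    while sharing:
        person = sharing.pop()
        for _ in range(contacts_per_sharer):
            contact = rng.randrange(n_people)
            if contact not in exposed:
                exposed.add(contact)
                if contact not in inoculated:
                    sharing.append(contact)
    return len(exposed)

# With a larger share of the network inoculated, total exposure shrinks,
# even though no individual's beliefs are modeled at all.
reach_without = simulate_spread(inoculated_frac=0.0)
reach_with = simulate_spread(inoculated_frac=0.6)
print(reach_without, reach_with)
```

The design choice mirrors the quote: inoculation here does not make any individual immune to belief, it only stops onward sharing, yet that alone pushes the spread below the critical threshold once enough of the network is covered.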
It remains to be seen whether this modeling can be replicated in real life, the researcher notes, particularly given effects such as "internet bubbles" (groups of people with similar worldviews, whose information consumption reinforces that worldview), like those of anti-vaccine communities.
"These groups are less likely to take part and change their minds. It is unrealistic to think a game will change that," says Roozenbeek. "Instead, we want to prevent some people from starting this kind of unwanted behavior."
In addition to Go Viral!, other games, videos and interventions are being tested by Roozenbeek and his colleagues. He offers a caveat: "We don't want an intervention that says 'Believe only the truth, not lies.' That wouldn't be realistic. We also don't want to say 'I, a university researcher, hold the truth and you have to believe me.' That would be wrong and very likely wouldn't work. So our goal with these games is to show people how they can be manipulated on the internet, and it is not up to us (to determine) what they will do with that information. What we hope is that they become more precise when it comes to differentiating information from disinformation."
Misinformation in the pandemic
The WHO (World Health Organization) has described the Covid-19 pandemic as "the first in history in which technology and social networks are being used on a large scale", helping both to bring scientific data and preventive measures to the general population and to disseminate an unprecedented volume of false or scientifically questionable health information.
In Brazil, one example comes from President Jair Bolsonaro himself (no party), who has consistently defended so-called "early treatment" in government actions and live internet broadcasts, even though health agencies and successive scientific studies have concluded that the drugs in this kit have no power to prevent Covid-19.
"Misinformation costs lives," the WHO said last September in a joint statement with other UN agencies on the fake news crisis in the pandemic. "Without the right trust and the right information, diagnostic tests go unused, vaccination campaigns (or campaigns to promote effective vaccines) will not meet their targets, and the virus will continue to thrive."
Roozenbeek, of Cambridge, agrees about the damaging effect of disinformation. At the same time, he takes a cautious view of the problem's magnitude.
"Not everyone believes everything," he argues. "We clearly see the negative effect of the spread of disinformation; we know that people don't get vaccinated because they believe untruths (...), we know these beliefs have the power to affect behavior. But it is also true that exposure to disinformation and belief in it are asymmetric: the vast majority of those exposed do not believe it."
"For me, the important thing is to learn more about the drivers of susceptibility to disinformation, how it is distributed geographically, across social network groups, and so on, and to design prevention with that in mind."
Finally, the researcher says, there are aspects of this phenomenon that go beyond psychology and the behavioral sciences, because they are not individual but structural: linked, for example, to the algorithms that govern social networks and that can stimulate the consumption of fake news as a way of increasing user engagement and the time users spend online.
So "it is beneficial to approach the problem holistically," argues Roozenbeek. "A scientific discussion about this should not focus exclusively on psychology or any other single issue."