It is difficult to hazard a figure for the number of false news stories that circulated during the pandemic. If we believe the little data that Facebook executives revealed after President Joe Biden accused the platform of "killing people" with disinformation, it removed more than 18 million pieces of misinformation about covid-19, while labeling and reducing the visibility of more than 167 million discredited posts about the coronavirus.
One of the first projects he ventured into was an experiment within his own university community, in which he found that 72 percent of students blindly trusted links that appeared to come from friends, even to the point of entering personal login information on phishing (data-theft) sites.
That finding led him to design a simple experiment. He created a web page with false information and enabled advertising on it. "At the end of the month I received a check in the mail with the earnings from the ads. That was my test: fake news can make money by polluting the internet with falsehoods," he would later recount.
"Over this last decade we have seen that some of the platforms try to control the abuses that occur at different levels, but we have also seen that the manipulation strategies are becoming more sophisticated. There is a market for buying fake likes, followers and apps that put users' accounts at the service of disinformation campaigns. The bots are becoming more sophisticated and delete their own content," says Menczer.
His research has led him to classify the biases that entangle us in this 'mess' into three categories.
We have seen that some platforms try to control the abuses that occur at different levels, but we have also seen that the manipulation strategies are becoming more sophisticated
Another group of biases stems from the way our society works: social prejudices generally guide how we choose friends and role models. In other words, the 'bubbles' on social media are also a reflection of the 'bubbles' we create when we form groups.
Finally, there are the biases in the algorithms that govern the networks. On Twitter alone, more than 500 million messages are generated every day; that is about 6,000 tweets per second. Who chooses what each user sees? An algorithm designed by humans, who are themselves full of mental biases and social prejudices. The result? In other research, Menczer and colleagues showed that social media platforms expose users to a less diverse set of sources than non-social sites like Wikipedia: "We call it homogeneity bias," he says. "Another important ingredient of social media is the information that is trending on the platform, based on what gets the most clicks. We call this popularity bias," he explains.
And a study published in 2018 and led by Soroush Vosoughi of the Massachusetts Institute of Technology, which analyzed nearly 126,000 verified true and false news stories shared by more than three million people on Twitter between 2006 and 2017, concluded that false information "spread significantly farther, faster, deeper and more broadly than the truth in all categories of information." One figure says it all: the top 1 percent of false news cascades spread to between 1,000 and 100,000 people, while the truth rarely reached more than 1,000.
Perhaps one of the most pessimistic conclusions of the work of Menczer and his group is that we have created a communication system, social networks, in which quality is not the criterion that determines how far information spreads. What goes viral is simply a statistical consequence of the proliferation of information in a social network of users with limited attention.
Another important ingredient of social networks is the information that is trending on the platform, based on what gets the most clicks
"The only way I see to combat this problem is to reduce the volume of information. Add friction. Since the internet began, there has been a strong tendency to lower the cost of generating content," says Menczer.
One problem with wanting to "add friction" to the system is the risk of creating models that lead to censorship: the remedy could turn out to be worse than the disease, as so often happens. "I think there is a lot to think about regarding the friction we should introduce," Menczer reflects. "The truth is that neither the technology we have nor strategies based on humans verifying data work at the scale and with the precision that are needed."
He is cautious when drawing conclusions about the infodemic: "Yes, it is true that the proliferation of information can devalue information and lead us astray. But having more information can also allow us to think new things, connect new ideas and even find new ways of understanding old problems."
He offers a similar warning about mental biases and social prejudices. As with everything in biology, there is a flip side: "Most of our mental biases are there for a reason. We tend to believe what our families tell us, we tend to believe what our group tells us, we tend to believe people who have helped us in the past."
The proliferation of information can devalue it and lead us astray. But it can also allow us to think new things, connect ideas, and even new ways of understanding old problems.
In 2013 it was estimated that more than 50 percent of the US population believed in at least one such theory. And the consequences of the circulation of these theories are not trivial, Hills warns: they can lead to lower vaccination rates, as well as rejection of environmental protection policies, reduced protection against sexually transmitted diseases, generalized mistrust, political alienation and even justifications for violence.
"We still don't understand how much information is needed to influence people's behavior," Hills says. That is precisely the question he dreams of answering. "The question is how much it matters and how much it takes to make a difference. If a single lie can destabilize a country, then we have a problem," Hills insists.
Hills agrees that the solution lies in education: “Education is always the answer. If we can’t teach people how to better understand and interact with their world, trying to control information will be like trying to control illegal drugs.”
While we find a way out of the problem of misinformation, or at least tools to contain it, we must not forget what Jesus of Nazareth said: “Do not believe everything you read on the internet.”