On Facebook, information spreads faster than sources can be checked or data verified. A recent study of 2.3 million users examined how people interact with information from different sources to find out why misinformation goes viral. People believe plenty of strange and false things, but the study pinpointed which users are most likely to pass along unfounded claims. The published findings suggest those Facebook users are not stupid; they are simply suspicious of sources that deal in factual claims. And when they stray from reliable sources, they become more likely to believe satirical stories from sites like The Onion, along with dangerous medical falsehoods.
There is a website dedicated to posting instances of people taking the parody site The Onion far too seriously. Each post shows Facebook users reacting in earnest to satirical, absurd headlines. “McDonald’s Now Offering Bereavement Prices” is met with questions about how employees verify that a customer is properly mourning. Another Onion article, claiming the Department of Health and Human Services had released “Report: Leading Cause Of Death In U.S. Is God Needing Another Angel,” drew 7,237 “likes,” with one commenter expressing concern that Satan, not God, was responsible for the most deaths.
The Onion is clear about being “a satirical weekly publication,” and a glance at the website should clear up any confusion. The real danger begins when content built on scare tactics is passed along by people who never check the validity of its claims: claims, for instance, that a new diet can prevent cancer, or that the government actively suppresses natural cancer remedies, both unfounded stories that have gone viral.
Posts trying to scare people into a cause can easily exploit an area the public knows little about. One example, easily debunked with a Google search, claims that the drug Molly is a combination of cocaine, crack, ecstasy, meth, and bath salts. The same post reports that this imaginative combination can slow the heart rate to “10x the normal limit.” Any quick search would show that the term Molly predominantly refers to a chemically pure sample of MDMA, an empathogen that has been used in doctor-supervised therapy to treat post-traumatic stress disorder (PTSD).
The people believing such irrational claims, the study found, were mostly those who did not follow credible news sources. Physicist Delia Mocanu, with four colleagues at Northeastern University’s Laboratory for the Modeling of Biological and Socio-Technical Systems, recently published “Collective Attention in the Age of (Mis)information.” Drawing on a sample of 2.3 million people, they examined Facebook pages, posts, and discussions relevant to the 2013 Italian electoral competition. The pages were organized into three categories: alternative sources covering topics that “are neglected by science and main stream media,” political activism, and mainstream media itself.
“This social ecosystem” was then studied as each group responded to “the injection of 2788 false information posts.” Despite the difference in the quality of information the three categories provide, they perform in similar ways: conspiracy theories get passed around as much as verified truth does. The explanation is not simple ignorance, or even stupidity; the study found that some Facebook users are so suspicious of mainstream media that they place their trust in these theories instead. The published findings showed that the people most critical of mainstream media are also the least critical of the information they consume and share from alternative sources carrying untrue content.
The study looked at how users interacted with caricature-style political activism and with alternative news stories that had the “peculiarity to include false information.” It revealed that mistrust of the mainstream media was fed in part by the fact that mainstream outlets sometimes fell for the misinformation themselves. In one example, a report claimed that the Italian Senate had voted on a law proposed by a Senator Cirenga, passing it with 257 votes in favor and 165 abstentions. The law would supposedly set aside 134 billion euros to help policy-makers find jobs if they were defeated by their political competition.
It would have taken little searching to discover the holes in this story. Senator Cirenga does not exist; the Senate of the Republic has only about 320 senators, fewer than the 422 votes the story tallies; and the proposed sum is roughly 10 percent of Italy’s GDP. Yet the story not only went viral during the election, it was reposted by multiple mainstream political organizations and invoked at protests in several Italian cities.
Those most likely to share such stories and other false content were users who regularly got their news from the alternative sources. Of the 1,279 users identified as interacting with those sources, 56 percent accepted the false claims, compared with only 26 percent and 18 percent in the other two groups.
Nearly every Facebook user has witnessed the phenomenon firsthand; at least now users can understand why these stories end up in their news feeds. Sometimes the mistakes are humorous, as when friends fall for an Onion headline like “Report: Ocean Levels Could Rise Foot Or More If Lots Of People Go Swimming.” But when people act on misinformation, the consequences can be dangerous. The study’s findings are a good reminder that those annoying Facebook friends are not stupid, just suspicious of the wrong news sources.
By Whitney Hudson