Facebook Emotional Manipulation Study Not Scientific

The social media giant is once again under fire, this time for an experiment that tested whether tailoring users’ news feeds could sway their moods. While Facebook’s emotional manipulation study may be legal under its terms of service, it may not be entirely scientific. Doubt has been cast on both the methodology and the significance of its outcome. Perhaps all the fear and anxiety have been wasted on a flawed study.

The Atlantic explained that in the research, published in the prestigious Proceedings of the National Academy of Sciences, Facebook tweaked hundreds of thousands of individual feeds. Some users saw more items with positive, happy wording, while others were shown more upsetting terms than average. The adjustment lasted a week, after which Facebook compared the users’ statuses. Perhaps unsurprisingly, those with more negative feeds were indeed posting more negative updates, while those shown positive items reflected that as well. The change was very small, though statistically significant.

Despite all the anger and criticism, the study is most likely legal. According to The Atlantic, the social network’s terms and conditions reserve the right to analyze data and perform tests and research on its users. Yet even if the Facebook emotional manipulation study is not wrong in the eyes of the law, it may not be valid in the eyes of science.

John Grohol, an author, researcher and mental health expert who writes for the popular PsychCentral website, criticized the method used. To measure the mood changes, Facebook employed an automated tool called Linguistic Inquiry and Word Count (LIWC), designed mainly for analyzing large bodies of text (such as books or essays), not short blurbs like status updates. Because the tool tallies positive and negative words one at a time, without regard for context or negation, Grohol gives examples where the analysis of a simple update (“I am not having a great day”) could produce inaccurate results. The moods were thus loosely inferred, never directly assessed.
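To illustrate the weakness Grohol points to, consider a minimal word-counting sketch in the spirit of LIWC. This is illustrative Python, not the actual LIWC tool, and the word lists are invented stand-ins for its dictionaries:

```python
# A minimal sketch of LIWC-style word counting (not the real LIWC tool).
# The word lists below are invented stand-ins for LIWC's dictionaries.
POSITIVE_WORDS = {"great", "happy", "wonderful"}
NEGATIVE_WORDS = {"sad", "awful", "terrible"}

def count_sentiment(text: str) -> dict:
    """Tally positive and negative words one at a time, ignoring context."""
    words = [w.strip(".,!?") for w in text.lower().split()]
    return {
        "positive": sum(w in POSITIVE_WORDS for w in words),
        "negative": sum(w in NEGATIVE_WORDS for w in words),
    }

# "great" is tallied as positive even though "not" negates it, so this
# clearly unhappy status registers as a positive one.
print(count_sentiment("I am not having a great day"))
# {'positive': 1, 'negative': 0}
```

A tool that scores this unhappy status as positive shows why raw word counts are a weak proxy for mood in short, conversational posts.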

Furthermore, The Washington Post reported that the research was not pre-approved by Cornell University’s ethics board. The implied user permission to perform the study is also questionable. Informed consent means telling participants about the study and giving them the option to opt out; neither was the case here. Simply put, Facebook did not strictly adhere to the standards imposed by the government and professional associations for conducting psychological experiments.

Aside from these significant methodological problems, Grohol also criticized the tiny effect reported by the study. The research found a 0.07 percent decrease in negative words in people’s statuses following a more positive feed. At that rate, a user would have to write hundreds, if not thousands, of words before a change caused by the feed adjustment actually manifested. He called it a “statistical blip” rather than an actual effect.
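A quick back-of-the-envelope calculation, a sketch using only the 0.07 percent figure quoted above, shows why Grohol dismisses the number:

```python
# Back-of-the-envelope arithmetic on the reported effect size.
# The 0.07% figure comes from the study as quoted in this article;
# the calculation itself is purely illustrative.
effect = 0.0007  # 0.07% fewer negative words per word written

# Average number of words a user would need to write before the feed
# adjustment accounts for even one fewer negative word.
words_needed = 1 / effect
print(f"about {words_needed:.0f} words")  # about 1429 words
```

In other words, for a typical short status update the predicted difference is a small fraction of a single word, which is hard to distinguish from noise.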

It is no secret that Facebook tailors the main page feed for each individual based on their likes, preferences and even browsing history. The goal has always been to provide a more engaging list of items as well as more targeted advertising for its partners. However, using the feed to deliberately influence moods and opinions could serve far scarier, Orwellian purposes, especially in the age of the Snowden revelations and NSA privacy concerns. Even if the emotional manipulation study performed by Facebook was not entirely scientific, it encourages further such experiments and sets a strong precedent that tinkering with user moods is acceptable. Media curation and censorship already have a significant impact on people’s opinions and emotions, so it is worrisome to see such power seep into a massive social network that has no obligation to be objective or unbiased.

By Jakub Kasztalski

Sources
The Atlantic
The Washington Post
PsychCentral
