Facebook Apologizes for Manipulating Feeds


Facebook has apologized to its users for manipulating their news feeds. The controversy began when it emerged that the social media giant had performed a psychological experiment on users without their knowledge. The experiment was not approved in advance by Cornell University's ethics board, raising the question of whether the company was right to carry it out.

The experiment took place in 2012 and lasted only one week. The news feeds of just under 690,000 users were manipulated for the experiment. Facebook wanted to see whether the type of posts appearing in a news feed affected the way people felt. It turned out that users who saw more negative posts were more likely to feel negative, and their status updates reflected that, while more positive posts led to more positive feelings.

The results of the psychological study were published in the scientific journal PNAS. The company has since been forced to apologize amid an international outcry, with people complaining once again that Facebook shows only what it wants them to see. This has been an issue for a while, as the social media giant continually tinkers with news feeds and inserts sponsored posts that people have no interest in. There is also the issue of the most commented-on posts appearing at the top of feeds instead of the most recent ones, which confuses and annoys users who just want to see the latest updates from their friends.

Facebook has apologized for the anxiety caused by manipulating the feeds, but it has defended its actions by saying the research was needed. Researchers at the tech giant wanted to know whether positive posts from friends left users feeling left out or upset. The study actually found that fewer positive posts led to negative feelings. More negative posts did not necessarily produce negative feelings; the effect depended on the number of positive posts in the feed at the same time.

Posts were de-prioritized based on the words they contained. It was a very basic algorithm, so the research certainly had limitations; a rough sketch of this kind of word-based filtering appears below. The findings were also based on the status updates those users posted afterwards, and not every user leaves an update.
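As a rough illustration only, a word-based de-prioritization of this kind might look like the following sketch. The word lists, omission probability, and function names here are hypothetical, not Facebook's actual code; the published study relied on an automated word-counting dictionary rather than toy lists like these.

```python
import random

# Hypothetical word lists for illustration; the real study used a
# large automated word-classification dictionary, not a handful of words.
POSITIVE_WORDS = {"happy", "great", "love", "wonderful", "excited"}
NEGATIVE_WORDS = {"sad", "angry", "terrible", "hate", "awful"}

def contains_any(text, words):
    """Return True if any word from `words` appears in the post text."""
    tokens = set(text.lower().split())
    return bool(tokens & words)

def filter_feed(posts, suppress, omit_probability=0.5, seed=None):
    """Return one load of a news feed with some emotional posts omitted.

    `suppress` is either POSITIVE_WORDS or NEGATIVE_WORDS, depending on
    which experimental condition the user is in. Omitted posts are not
    deleted; they may still appear on a later feed load, which mirrors
    the description that posts were never hidden but "did not always
    show up."
    """
    rng = random.Random(seed)
    feed = []
    for post in posts:
        if contains_any(post, suppress) and rng.random() < omit_probability:
            continue  # de-prioritize: skip this post on this feed load
        feed.append(post)
    return feed

posts = [
    "Had a wonderful day at the beach",
    "Traffic was terrible this morning",
    "Just made dinner",
    "So excited for the weekend",
]

# A user in a "reduced positive content" condition might see:
print(filter_feed(posts, POSITIVE_WORDS, omit_probability=0.9, seed=1))
```

Even this toy version shows the limitation the researchers faced: matching on individual words cannot tell sarcasm, negation ("not happy"), or context, so some posts would inevitably be misclassified.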

Adam D.I. Kramer, a data scientist at Facebook, defended the decision, stating that the company cares about users' well-being and that the research was needed to make sure users' mental health was being looked after. The posts were never actually hidden; they simply did not always show up on a given feed load. In the end, only around 0.04 percent of users were affected during the week-long test.

There was a report that Cornell University's Institutional Review Board (IRB) had approved the use of "pre-existing data." However, it turns out that the IRB was not consulted until after the research had been carried out; the process of collecting the data was never approved.

Users have questioned whether manipulating feeds for research is even legal. It turns out that the fine print in the site's terms of service does allow the company to do something like this. Even so, Facebook has apologized for the distress and anxiety caused by manipulating the feeds for that one week in 2012.

By Alexandria Ingham

Sources:

The Daily Mail

Washington Post

BBC
