Social media mogul Facebook has even been tracking what its users don’t type. It’s a pretty sure bet that nearly every Facebook user has typed out a comment or a status update, had second thoughts, and deleted it before hitting Enter or Post. Well, it turns out that Facebook knows about it and has been tracking it.
Employees at Facebook have dubbed this “self-censorship,” and two researchers have just conducted a study of this personal filtering method: Sauvik Das, a Facebook summer software engineer and Carnegie Mellon Ph.D. student, and Adam Kramer, a Facebook data scientist.
The two project leaders studied data collected in July 2012 from 5 million English-speaking users over a 17-day period. The study found that 71 percent of the users censored their own posts. The users who held back did so on an average of 4.52 status updates and 3.2 comments.
The censored data is collected as follows: when a user enters text, Facebook sends code to the user’s web browser. That code analyzes what was typed and sends metadata back to Facebook. Experts have compared this method to the way an email program saves a draft of a message, with one key difference: a user knows that an email draft is being created, but is unaware of Facebook’s tracking.
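To make the mechanism concrete, here is a minimal sketch of how such client-side tracking could work. Everything in it (the `DraftTracker` name, the five-character threshold, the metadata fields) is an illustrative assumption, not Facebook’s actual code; the key idea it demonstrates is that only metadata about a draft, never its content, would be sent back to the server.

```typescript
// Illustrative sketch only -- names and thresholds are assumptions,
// not Facebook's real implementation.

interface DraftMetadata {
  charsTyped: number;    // length of the longest draft observed
  posted: boolean;       // whether the draft was ever submitted
  selfCensored: boolean; // typed past the threshold, then never posted
}

class DraftTracker {
  private maxLength = 0;
  private posted = false;

  // Drafts shorter than minChars are considered too trivial to count
  // (a hypothetical cutoff, chosen here for illustration).
  constructor(private minChars: number = 5) {}

  // Called on every input event; only the length is recorded,
  // never the text itself.
  onInput(text: string): void {
    this.maxLength = Math.max(this.maxLength, text.length);
  }

  onPost(): void {
    this.posted = true;
  }

  // Called when the user clears the box or leaves the page.
  // Returns only metadata about the abandoned draft.
  report(): DraftMetadata {
    return {
      charsTyped: this.maxLength,
      posted: this.posted,
      selfCensored: !this.posted && this.maxLength >= this.minChars,
    };
  }
}

// Example: a user types a comment, deletes it, and never posts.
const tracker = new DraftTracker();
tracker.onInput("I can't believe you said th");
tracker.onInput(""); // the user deletes everything
console.log(tracker.report()); // selfCensored: true, content never transmitted
```

In a real page, `onInput` would be wired to the comment box’s `input` event and `report` would fire when the user navigates away; the point of the design is that the server learns *that* a draft was abandoned, not *what* it said.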
If a user has a Facebook account, it means they have agreed to Facebook’s Data Use Policy. Most users are aware that agreeing to the policy means Facebook collects information that users choose to share. However, the policy goes even deeper: it also covers tracking what users choose to type but do not share.
The study noted that Facebook was not recording the text of withheld posts, only whether users censored a comment or status update. However, Das and Kramer state in the study’s conclusion that they hope to expand the research to examine what is actually being censored. Facebook says that the more it knows about why and how its users self-censor their posts, the more it can help to minimize it.
The study found five common reasons why people were possibly filtering their posts.
- Users censor posts to stop an argument or to avoid instigating one.
- Users think twice about posting a comment if they fear it may offend someone.
- Users worry they will bore their friends with the comment.
- Users fear being inconsistent with their own self-representation.
- Users are stopped by technical reasons.
The study also gathered demographics from the Facebook users, along with some behavioral features. In addition, Facebook tracked each user’s number of friends and their political ideology in relation to their friends’ beliefs. This information was cross-referenced against data gathered from their friends along three dimensions:
- The user’s political stance compared to their friends’ views.
- The user’s political ideology compared to how homogeneous their friends are.
- How gender-diverse their network is, in relation to their own gender.
Facebook has long been interested in how its users think and how they interact with its site. There may be plans in the future to track users’ mouse movements over the Facebook page to further understand their habits. So it should really come as no surprise that Facebook is even tracking what users don’t type.
By Brent Matsalla