
Facebook manipulated 600,000 users’ NewsFeeds to test emotional responses

Facebook tweaked the NewsFeeds of over 600,000 users to test moods. Photo Credit: Mark Mainz/Getty Images

If you feel like your Facebook friends were particularly negative in 2012, you might have been part of an experiment without your knowledge.

In a test to see how people are affected by the posts in their feeds, Facebook researchers manipulated the NewsFeeds of over 600,000 users without their knowledge, according to a report first published by New Scientist.

Facebook’s right to do this is apparently covered by its Privacy Policy, but that didn’t stop some of the social network’s users from feeling betrayed. Privacy activist Lauren Weinstein tweeted on Saturday: “I wonder if Facebook KILLED anyone with their emotion manipulation stunt. At their scale and with depressed people out there, it’s possible.”

By Sunday afternoon, Adam I. Kramer, the Facebook data scientist who co-authored the study, posted a “brief public explanation” on his Facebook page.

“The goal of all of our research at Facebook is to learn how to provide a better service,” Kramer wrote. He insisted that the experiment only used “a small percentage of content in News Feed (about 0.04% of users, or 1 in 2,500) for a short period (one week, in early 2012)” and that “nobody’s posts were ‘hidden.’”

To be exact, for one week in January 2012 Facebook altered the number of positive and negative posts shown to 689,003 randomly selected users to see how changes in the tone of the NewsFeed affect users’ own moods.

As for the play, Mrs. Lincoln? The study found that people who saw more positive posts felt more positive, and those who saw more negative posts felt more negative. For example, people were more likely to use positive words in their own posts if they had seen fewer negative posts in their NewsFeed, and vice versa.