According to a paper published in last month’s edition of the Proceedings of the National Academy of Sciences (PNAS), social media giant Facebook conducted a psychological experiment on hundreds of thousands of its users back in January 2012, manipulating their emotions by delivering negative and/or positive items into those users’ news feeds to see whether emotional states are contagious, all without notifying those users of what was going on.
The report states that the study affected 689,003 English-speaking Facebook users, randomly selected for exposure to one of two parallel experiments that reduced the number of positive or negative updates in each user’s news feed.
The authors of the paper, who include researchers from Facebook, Cornell University, and the University of California, had this to say regarding the experiment:
“When positive expressions were reduced, people produced fewer positive posts and more negative posts; when negative expressions were reduced, the opposite pattern occurred. These results indicate that emotions expressed by others on Facebook influence our own emotions, constituting experimental evidence for massive-scale contagion via social networks. We also observed a withdrawal effect: People who were exposed to fewer emotional posts (of either valence) in their News Feed were less expressive overall on the following days.”
The study is the first of its kind, and the authors argue it provides experimental evidence that moods expressed via social networks are contagious, a phenomenon they call “emotional contagion” that influences the moods of others:
“These results suggest that the emotions expressed by friends, via online social networks, influence our own moods, constituting, to our knowledge, the first experimental evidence for massive-scale emotional contagion via social networks, and providing support for previously contested claims that emotions spread via contagion through a network.”
The Facebook users were not notified of the experiment. Legally, however, Facebook’s terms of service and Data Use Policy state that users’ data may be used “for internal operations, including troubleshooting, data analysis, testing, research and service improvement,” so affected users have little to no recourse other than leaving the service.
Adam Kramer, the Facebook researcher who designed the experiment, later responded to the criticism:

“The reason we did this research is because we care about the emotional impact of Facebook and the people that use our product. We felt that it was important to investigate the common worry that seeing friends post positive content leads to people feeling negative or left out. At the same time, we were concerned that exposure to friends’ negativity might lead people to avoid visiting Facebook. We didn’t clearly state our motivations in the paper.”
“The goal of all of our research at Facebook is to learn how to provide a better service. Having written and designed this experiment myself, I can tell you that our goal was never to upset anyone. I can understand why some people have concerns about it, and my coauthors and I are very sorry for the way the paper described the research and any anxiety it caused. In hindsight, the research benefits of the paper may not have justified all of this anxiety,” Kramer added.
“While we’ve always considered what research we do carefully, we (not just me, several other researchers at Facebook) have been working on improving our internal review practices. The experiment in question was run in early 2012, and we have come a long way since then. Those review practices will also incorporate what we’ve learned from the reaction to this paper.”
The report was co-authored by Cornell’s Jeff Hancock and Jamie Guillory.
In a weak defense of Facebook’s antics, it is also well known that Google and Amazon collect user data, a practice referred to as “data mining,” and use it to their advantage.