
A recently published study led by Facebook is causing a wave of concern among both the public and the media this week, as it details a secret mood manipulation experiment conducted on almost 700,000 users without their knowledge.

The study, published in the journal PNAS, was a joint effort between researchers at Facebook, the University of California, San Francisco and Cornell University. Its aim was to find out whether it was possible to affect people’s moods by influencing the number of positive or negative posts they were exposed to in their news feeds.

Their conclusion was: “When positive expressions were reduced, people produced fewer positive posts and more negative posts; when negative expressions were reduced, the opposite pattern occurred. These results indicate that emotions expressed by others on Facebook influence our own emotions, constituting experimental evidence for massive-scale contagion via social networks.”

The results have been met with a mixed reaction. It is an important study in today’s online-focused landscape, and the first of its kind to show that emotional transfer can happen without direct, real-life contact. However, it has been heavily criticised on several fronts: the way the data was collected, Facebook’s motives for exploring “massive-scale emotional contagion”, and the real-world implications of being able to change how people feel without their awareness.

The study states that the data collection was “consistent with Facebook’s Data Use Policy, to which all users agree prior to creating an account on Facebook, constituting informed consent for this research.” But this has raised eyebrows among academics, including Susan Fiske, the Princeton professor who edited the study, and James Grimmelmann, professor of law at the University of Maryland. Both agreed that while the experiment might comply with Facebook’s internal policy, its methods go against the laws and norms of academic research, where it is imperative that people be given the option of whether or not to participate.

Rights activists and politicians feel that this new information could have sinister implications amounting to a form of mind control. Jim Sheridan, a British MP, has called for a parliamentary inquiry into the matter, stating: “They are manipulating material from people’s personal lives and I am worried about the ability of Facebook and others to manipulate people’s thoughts in politics or other areas. If people are being thought-controlled in this kind of way there needs to be protection and they at least need to know about it.”

Both Facebook and Adam Kramer, the study’s lead researcher and a member of Facebook’s Core Data Science Team, have issued statements to quell fears about privacy breaches and the nature of the research.

A Facebook spokesperson said: “We do research to improve our services and to make the content people see on Facebook as relevant and engaging as possible.” Mr Kramer, meanwhile, issued a public statement on his Facebook page apologising for upsetting people and for not making the researchers’ motives clear: “We felt that it was important to investigate the common worry that seeing friends post positive content leads to people feeling negative or left out. At the same time, we were concerned that exposure to friends’ negativity might lead people to avoid visiting Facebook. We didn’t clearly state our motivations in the paper.”

Image: Flickr/zeevveez