Facebook’s controversial study that manipulated users’ newsfeeds was not pre-approved by Cornell University’s ethics board, and Facebook may not have had “implied” user permission to conduct the study as researchers previously claimed.
In the study, researchers at Facebook tweaked what hundreds of thousands of users saw in their news feeds, skewing content to be more positive or negative than normal in an attempt to manipulate their mood. Then they checked users’ status updates to see if the content affected what they wrote. They found that, yes, Facebook users’ moods are affected by what they see in their news feeds. Users who saw more negative posts would write more negative things on their own walls, and likewise for positive posts.
(For a refresher on the controversy, check out The Washington Post’s story from Monday).
Ethics board consulted after the fact
As reported by The Post and other news outlets, Princeton University psychology professor Susan Fiske told The Atlantic that an independent ethics committee, Cornell University’s Institutional Review Board (IRB), had approved use of Facebook’s “pre-existing data set” in the experiment. Fiske edited the study, which was published in the June 17 issue of Proceedings of the National Academy of Sciences.
A statement issued Monday by Cornell University clarified that the experiment was conducted before the IRB was consulted. A Cornell professor, Jeffrey Hancock, and doctoral student Jamie Guillory worked with Facebook on the study, but the university made a point of distancing itself from the research. Its statement said:
Professor Hancock and Dr. Guillory did not participate in data collection and did not have access to user data. Their work was limited to initial discussions, analyzing the research results and working with colleagues from Facebook to prepare the peer-reviewed paper “Experimental Evidence of Massive-Scale Emotional Contagion through Social Networks,” published online June 2 in Proceedings of the National Academy of Science-Social Science.
Because the research was conducted independently by Facebook and Professor Hancock had access only to results – and not to any data at any time – Cornell University’s Institutional Review Board concluded that he was not directly engaged in human research and that no review by the Cornell Human Research Protection Program was required.
User consent called into question
Facebook researchers claimed the fine print users agreed to when they signed up was tantamount to “informed consent” to participate in the study. Facebook’s current data use policy says user information can be used for “internal operations” including “research.” However, that’s not what it said in 2012 when the study was conducted. According to Forbes:
In January 2012, the policy did not say anything about users potentially being guinea pigs made to have a crappy day for science, nor that ‘research’ is something that might happen on the platform.
Four months after the study, in May 2012, Facebook made changes to its data use policy, and that’s when it introduced this line about how it might use your information: ‘For internal operations, including troubleshooting, data analysis, testing, research and service improvement.’ Facebook helpfully posted a ‘red-line’ version of the new policy, contrasting it with the prior version from September 2011 — which did not mention anything about user information being used in ‘research.’
This revelation will likely further rile critics already angered that Facebook fell short of the standards imposed by the government and professional associations for informed consent in studies conducted on humans. Informed consent involves disclosing information about the study before it takes place and giving subjects a chance to opt out – and Facebook did neither. Since Facebook is a private company, it isn’t held to those standards, according to legal experts interviewed by the International Business Times, but that hasn’t stopped some from feeling violated and angry.
If international headlines are an accurate gauge of public opinion, people worldwide are angry at Facebook. Here’s a sampling, translated from their original languages:
Manipulated News Feeds: Facebook, the Permanent Psychological Experiment
Does Facebook take us for lab rats? Experiment provokes strong reactions
Opinion: Facebook’s big problem? Ethical blindness
Facebook responds to massive criticism
Facebook experimented on users’ emotions, and now they’re angry
Does Facebook treat its users as lab rats? Public outrage
Facebook “admits” to manipulating users’ moods
Facebook was playing with your emotions and tampering with your feed!
Is Facebook controlling your emotions?
Facebook users “manipulated” for a psychological experiment