Ethics board consulted after the fact
As reported by The Post and other news outlets, Princeton University psychology professor Susan Fiske told The Atlantic that an independent ethics committee, Cornell University’s Institutional Review Board (IRB), had approved use of Facebook’s “pre-existing data set” in the experiment. Fiske edited the study, which was published in the June 17 issue of Proceedings of the National Academy of Sciences.
A statement issued Monday by Cornell University clarified that the experiment was conducted before the IRB was consulted. A Cornell professor, Jeffrey Hancock, and doctoral student Jamie Guillory worked with Facebook on the study, but the university made a point of distancing itself from the research. Its statement said:
Professor Hancock and Dr. Guillory did not participate in data collection and did not have access to user data. Their work was limited to initial discussions, analyzing the research results and working with colleagues from Facebook to prepare the peer-reviewed paper “Experimental Evidence of Massive-Scale Emotional Contagion through Social Networks,” published online June 2 in Proceedings of the National Academy of Science-Social Science.
Because the research was conducted independently by Facebook and Professor Hancock had access only to results – and not to any data at any time – Cornell University’s Institutional Review Board concluded that he was not directly engaged in human research and that no review by the Cornell Human Research Protection Program was required.
User consent called into question
Facebook researchers claimed that the fine print users agreed to when they signed up was tantamount to “informed consent” to participate in the study. Facebook’s current data use policy says user information can be used for “internal operations,” including “research.” However, that’s not what the policy said in January 2012, when the study was conducted. According to Forbes:
In January 2012, the policy did not say anything about users potentially being guinea pigs made to have a crappy day for science, nor that ‘research’ is something that might happen on the platform.
Four months after the study, in May 2012, Facebook made changes to its data use policy, and that’s when it introduced this line about how it might use your information: ‘For internal operations, including troubleshooting, data analysis, testing, research and service improvement.’ Facebook helpfully posted a ‘red-line’ version of the new policy, contrasting it with the prior version from September 2011 — which did not mention anything about user information being used in ‘research.’
If international headlines are an accurate gauge of public opinion, people worldwide are angry at Facebook. Here’s a sampling, translated from their original languages:
Manipulated news feeds: Facebook, the permanent psycho-experiment
Does Facebook take us for lab rats? Experiment provokes strong reactions
Opinion: Facebook’s big problem? Ethical blindness
Facebook responds to massive criticism
Facebook experimented on users’ emotions, and now they’re angry
Does Facebook treat its users as lab rats? Public indignation
Facebook “admits” to manipulating users’ moods
Facebook was playing with your emotions and tampering with your feed!
Is Facebook controlling your emotions?
Facebook users “manipulated” for a psychological experiment