The Washington Post

Facebook responds to criticism of its experiment on users

UPDATE: Facebook’s mood manipulation study was not pre-approved by Cornell University’s ethics board and the company may not have had user permission as previously claimed. See story for details: “Cornell ethics board did not pre-approve Facebook mood manipulation study.”

Facebook is unapologetic about the “emotional contagion” experiment it conducted on its users.

Reports questioning the ethics of the study (see the Atlantic) have been kicking up emotion, and it’s getting contagious.

Here’s what happened. For a research project and ultimately a paper, researchers at Facebook tweaked what hundreds of thousands of users saw in their news feeds, skewing the content to be more positive or negative than normal. Then they checked users’ status updates to see if the content affected what they wrote. They found that, yes, Facebook users’ moods are affected by what they see in their news feeds. Users who saw more negative posts would write more negative things on their own walls, and likewise for positive posts.

Facebook did not tell users they were being experimented on, or, as the New York Times put it, were being used as “lab rats.”

The results were published in the June 17 issue of the Proceedings of the National Academy of Sciences under the title “Experimental evidence of massive-scale emotional contagion through social networks.” Here’s how the paper summarizes the finding:

We show, via a massive (N = 689,003) experiment on Facebook, that emotional states can be transferred to others via emotional contagion, leading people to experience the same emotions without their awareness. We provide experimental evidence that emotional contagion occurs without direct interaction between people (exposure to a friend expressing an emotion is sufficient), and in the complete absence of nonverbal cues.

The experiment was controversial in part because the participants never explicitly agreed to be part of it. They didn’t know about the tweaking. Even Princeton University psychology professor Susan Fiske, who edited the article, told Business Insider she was “a little creeped out” by the study.

In a formal statement issued to the Atlantic, a Facebook spokesman defended the research. “We do research to improve our services and make the content people see on Facebook as relevant and engaging as possible…. We carefully consider what research we do and have a strong internal review process.”

On Sunday, one of the Facebook researchers behind the study was somewhat more contrite.

In a Facebook post, Adam D.I. Kramer, a Facebook employee and one of the study’s authors, wrote: “Our goal was never to upset anyone. I can understand why some people have concerns about it, and my coauthors and I are very sorry for the way the paper described the research and any anxiety it caused. In hindsight, the research benefits of the paper may not have justified all of this anxiety.”

He defended the goal of the experiment: “We felt that it was important to investigate the common worry that seeing friends post positive content leads to people feeling negative or left out. At the same time, we were concerned that exposure to friends’ negativity might lead people to avoid visiting Facebook. We didn’t clearly state our motivations in the paper.”

In their paper, the researchers said the fine print Facebook users agreed to when they signed up for the site was tantamount to informed consent. Facebook’s data use policy does say user information can be used for “internal operations,” including research, though it is unclear whether that covers experiments such as this one, which was conducted in collaboration with outside experts from Cornell University and the University of California.

Fiske told the Atlantic that an independent ethics board, Cornell University’s institutional review board (IRB), had approved use of Facebook’s “pre-existing data set” in the experiment. The “pre-existing” bit matters — it gets at the difference between observing Facebook data, which plenty of studies have done, and manipulating it. “Under IRB regulations, pre-existing dataset would have been approved previously and someone is just analyzing data already collected, often by someone else,” Fiske explained to the Atlantic. She added she wasn’t second-guessing the decision but called the experiment an “open ethical question.”

University of Ottawa law professor Gordon DuVal, who chairs the research ethics board at the National Research Council Canada, told the Globe and Mail that the experiment doesn’t comply with current standards, which require informed consent from subjects. Even in experiments where deception is required, subjects need to be informed, debriefed and notified afterward, he said.

Jeffrey Sherman, a psychology professor at the University of California at Davis, agreed. “This study attempted to manipulate participants’ emotional experience. No IRB I have ever worked with would waive consent or debriefing for such an intervention,” he told the Globe and Mail.

In his Facebook post, Kramer emphasized that the study affected only 0.04 percent of users and only for a one-week period. He added that Facebook’s internal review practices have “come a long way” since the experiment was done.

H/t the Atlantic

Gail Sullivan covers business for the Morning Mix blog.


