The Washington Post

Sheryl Sandberg not sorry for Facebook mood manipulation study

Sheryl Sandberg. (Bloomberg)

On Wednesday, Facebook’s second-in-command, Sheryl Sandberg, expressed regret over how the company communicated its 2012 mood manipulation study of 700,000 unwitting users, but she did not apologize for conducting the controversial experiment. It’s just what companies do, she said.

“This was part of ongoing research companies do to test different products, and that was what it was; it was poorly communicated,” Sandberg, Facebook’s chief operating officer, told the Wall Street Journal while traveling in New Delhi. “And for that communication we apologize. We never meant to upset you.”

Sandberg’s statement was the first public comment by a Facebook executive on the controversy since it erupted over the weekend, prompting anger from many Facebook users and criticism from some academics who said it was unethical to manipulate users’ emotions without informed consent.

In the study, researchers at Facebook tweaked what hundreds of thousands of users saw in their news feeds, skewing content to be more positive or negative than normal in an attempt to manipulate their moods. Then they checked users’ status updates to see if the content affected what they wrote. They found that, yes, Facebook users’ moods are affected by what they see in their news feeds. Users who saw more negative posts would write more negative things on their own walls, and likewise for positive posts.

Sandberg’s apology is not likely to appease some, such as Robert Klitzman, a psychiatrist and ethics professor critical of the study, who said in a column for CNN that “the problem is not only how the study was described, but how it was conducted.”

It seems that until now, Facebook data scientists have been pretty much free to do as they please. “There’s no review process, per se,” Andrew Ledvina, who worked at Facebook as a data scientist from 2012 to 2013, told the Journal. “Anyone on that team could run a test,” he said. “They’re always trying to alter people’s behavior.” Ledvina told the Journal that tests were so frequent that some data scientists worried that the same users might be used in different studies, tainting the results.

Facebook has since implemented stricter guidelines, the Journal reported. Research other than routine product testing is reviewed by a panel of 50 internal experts in fields such as privacy and data security. Company research intended for publication in academic journals goes through a second round of review, again by in-house experts.

The upset over Facebook’s mood study is “a glimpse into a wide-ranging practice,” Kate Crawford, a visiting professor at the Massachusetts Institute of Technology’s Center for Civic Media and a principal researcher at Microsoft Research, told the Journal. Companies “really do see users as a willing experimental test bed” to be used at the companies’ discretion.

Plenty of companies may do this sort of testing. But Facebook is different, John Gapper argued in the Financial Times. Here’s why, he said:

  • “Facebook holds more intimate information about its users than other internet companies.”
  • Unlike testing products to see what appeals to users, which many companies do, with Facebook, “we are the product” being tested.
  • “Facebook wields incredible power over the behavior of users. This is partly because of its size.” He points to another Facebook study of 235 million users, noting that its sample size was four times the population of France.
  • Facebook “focuses its judgments on personal material,” unlike Google, which uses algorithms to analyze material across the Web. “An algorithm that selects from thousands of links about, say, Buckingham Palace feels like a service; one that weeds out the posts of friends and family feels like a moral guardian.”
  • “Facebook has demonstrated that it can alter behavior,” he writes, citing one study showing that users who see more status updates will write more themselves, and another that encouraged users to become organ donors by allowing existing donors to display that status.

Related: Cornell ethics board did not pre-approve Facebook mood manipulation study

Related: Facebook responds to criticism of its experiment on users

Related: Angry mood manipulation subjects seize on interview with Facebook researcher Adam Kramer

Gail Sullivan covers business for the Morning Mix blog.
