(Photo by Eric Thayer/Reuters)

Facebook is changing the way it conducts research using its site, following a summer uproar over a mood study that the social network's researchers published in the Proceedings of the National Academy of Sciences.

For the study, researchers altered nearly 700,000 users' News Feeds to show only happy or only sad posts from friends, and found that the tone of friends' posts had a corresponding effect on users' moods. Once the article was published, many Facebook users complained that the social network had no right to manipulate their feelings -- and certainly not without explicitly informing them that they were part of a study.

Facebook chief technology officer Mike Schroepfer said in a blog post Thursday that the company was caught off guard by the reaction to the study. "It is clear now that there are things we should have done differently," he said. "For example, we should have considered other non-experimental ways to do this research. The research would also have benefited from more extensive review by a wider and more senior group of people. Last, in releasing the study, we failed to communicate clearly why and how we did it."

Several academics have questioned whether the study complied with ethics rules. PNAS took the unusual step of issuing a statement saying that while the study did not break any research guidelines, it "may have involved practices that were not fully consistent with the principles of obtaining informed consent and allowing participants to opt out."

Facebook says that while it will continue doing research -- both to test changes on its Web site and for academic purposes -- it will be more thoughtful about it in the future. "We're committed to doing research to make Facebook better, but we want to do it in the most responsible way," Schroepfer said.

It's not clear whether any of the changes to Facebook's internal review process will include provisions to inform users that they're part of a particular study. Facebook did not immediately respond to a request for comment. The guidelines do, however, call for more extensive review in cases where a study touches on sensitive content, such as emotions. Facebook will also expand the number of people who review its research, provide additional training to researchers and create a central site to share all of its published academic research.

Here's a full rundown of the changes, from Schroepfer's blog post:

• Guidelines: we’ve given researchers clearer guidelines. If proposed work is focused on studying particular groups or populations (such as people of a certain age) or if it relates to content that may be considered deeply personal (such as emotions) it will go through an enhanced review process before research can begin. The guidelines also require further review if the work involves a collaboration with someone in the academic community.
• Review: we've created a panel including our most senior subject area researchers, along with people from our engineering, research, legal, privacy and policy teams, that will review projects falling within these guidelines. This is in addition to our existing privacy cross-functional review for products and research.
• Training: we’ve incorporated education on our research practices into Facebook's six-week training program, called bootcamp, that new engineers go through, as well as training for others doing research. We'll also include a section on research in the annual privacy and security training that is required of everyone at Facebook.
• Research website: our published academic research is now available at a single location and will be updated regularly.

Update: James Grimmelmann, a University of Maryland law professor who's challenged Facebook's methods, says the modifications don't go far enough.

"This is a positive development on oversight, but Facebook still hasn't given an inch on informed consent," said Grimmelmann. "As long as this panel says OK, Facebook will still feel free to manipulate its users in the name of science. I hope that other companies emulate Facebook's commitment to better training and consistent policies, but there's still a long way to go."

Facebook has said users consent to have their information used when they agree to the company's terms of service.


Have more to say about this topic? Join us today for our weekly Friday live chat, Switchback. We'll kick things off at 11 a.m. Eastern time. You can submit your questions now, right here.