One message sent to 60 million Facebook users on Election Day 2010 prompted at least 300,000 additional people to vote, according to a study published Wednesday in the journal Nature.
The message showed recipients which of their friends said they had already voted, making the recipients more likely to go out and vote themselves. In addition, for every user the message persuaded to vote, four friends voted as well.
But if no friends were shown as having voted, the message had little impact.
The leader of the research said that 300,000 was a conservative estimate. “The actual number is probably closer to a million people,” said James Fowler, a professor of medical genetics and political science at the University of California at San Diego.
Fowler said he got the idea for the project when he saw that Facebook displayed messages urging its users to vote. In collaboration with Cameron Marlow, who manages the data science team at Facebook, he persuaded the company to leave out a small portion of users, enabling a randomized study.
On Nov. 2, 2010, every U.S. resident 18 and older who accessed Facebook became part of a huge trial.
More than 60 million people received what Fowler calls a “social message” on top of Facebook’s News Feed. It encouraged voting and showed a clickable button reading “I Voted,” a counter of how many Facebook users had reported voting and up to six random profile pictures of friends who had clicked the “I Voted” button.
One control group of about 600,000 people received an “informational message” that lacked the profile pictures; a second group of the same size got no message at all.
The data of 6.3 million users were matched anonymously to voter records.
Users who received the social message were 2 percentage points more likely to click the “I Voted” button than those who got the informational message, and 0.4 percentage points more likely to vote than recipients of the informational message or no message at all.
“Given the modest treatment, it is surprising that they found any effect at all,” said Gerald Kane, an associate professor of information systems at Boston College, who was not involved with the study.
The scientists concluded that the informational message was ineffective. The social message, by contrast, was powerful, probably because seeing that friends had reported voting put social pressure on users to vote themselves, Fowler said.
The scientists found that for every recipient persuaded by the message to vote, four close Facebook friends — relationships identified by a high number of Facebook interactions — ended up voting as a result.
“We don’t know how this works; it could be that on Facebook, you see that a friend has voted or that he simply takes you with him to vote,” Fowler said.
The scientists could not detect a difference between Republicans and Democrats in how strongly the message activated them. But Fowler said the study’s method for distinguishing the two groups was primitive: it relied on the political preference users wrote on their profiles, and only 1 percent of users provided one. For future studies, the team is developing more intricate analyses that infer political orientation from the comments users write and where they place their “likes.”
Many social scientists are eager to use Facebook’s data for their experiments. “This is the most exciting research of my lifetime,” Fowler said.
“This kind of approach should revolutionize our understanding of human behavior,” said David Lazer of Northeastern University, a social network expert who is not associated with the study but is pursuing similar work.
In a conference call with reporters, Facebook’s Marlow said he could not comment directly on whether Facebook was planning something similar for this year’s election, but added that the company is committed to supporting the democratic process.