Wonkblog

How Facebook could be a force for good


Facebook CEO Mark Zuckerberg appears Tuesday for a Senate hearing. (Matt McClain/The Washington Post)

Facebook CEO Mark Zuckerberg recently said he wants his children to believe “what their father built was good for the world.” It hasn’t been a great stretch for that pledge, with Zuckerberg’s massive social network being blamed for facilitating the spread of fake news and allowing millions of users’ private profiles to fall into the hands of political manipulators.

But while “Facebook as a force for good” seems far-fetched, studies suggest that — with a few tweaks — the social network could make people happier, healthier and even more civically engaged.

In one study of 689,003 active Facebook users, scientists at Facebook and Cornell University explored whether the content in a person’s news feed could be altered to improve well-being. Users were randomly assigned to experimental groups for one week to measure the effect of reducing the amount of negative content they saw. Specifically, a software program hunted for negative words (such as “bad” and “sad”) in posts that would normally appear in a user’s news feed. For some groups, only a few of the negative stories were filtered out; other groups had up to 90 percent of the negative content stripped from their feeds.

Three million posts’ worth of analysis later, the researchers found that people with filtered feeds were slightly but consistently more positive in their own posts than a control group whose feeds were randomly filtered by the same amount.

Interestingly enough, the happy feed groups also posted slightly fewer words than the control group. Make of that what you will.

The study was compelling but also controversial, as the researchers also suppressed positive content for other Facebook users and found it made them more negative. The ethics of this have been debated at length. But of course, Facebook tweaks the algorithm that filters our news feeds all the time — it just doesn't typically share its findings openly.

This research offers a window into the importance of the news feed algorithm, confirming that the way Facebook curates posts affects happiness. Why not use this power transparently and for social good? For instance, Facebook could let users elect to have negative content throttled from their feeds, either all the time or whenever they’d like a mood boost. Providing these kinds of options is one way Facebook could make the world a little sunnier.

Another recent study suggests an additional and arguably more important avenue through which Facebook could be a force for good.

During a recent round of U.S. congressional elections, Facebook facilitated a 61-million-person experiment. All users of legal voting age who visited the social networking site on Election Day were randomly assigned to different groups. One group saw a social message at the top of their news feed encouraging them to vote, inviting them to click a button reporting that they had voted, and displaying up to six profile pictures of friends who had reported voting. Another group saw all the same information, except that the faces of friends who had reported voting were omitted. The researchers wanted to know who actually voted in the election, and whether the news feed display influenced this. They hypothesized that learning that up to six of your Facebook friends had voted might propel you to vote, too, since people tend to follow the actions of others in their social networks.

To measure voting, the researchers collected the actual, publicly available voting records of millions of users. Sharing the photos of up to six friends who reported voting increased voter turnout by 0.39 percent, an effect far larger than the margin of victory in many elections. There were also contagion effects: if a close friend of yours saw the social message (containing pictures of other voters) but you didn’t, you were still 0.22 percent more likely to vote yourself.

The results of this study suggest that social messaging on Facebook can be used to increase voter turnout, which is certainly valuable, but they also imply that social messages on Facebook may be useful for encouraging valuable behaviors in general. For instance, it’s reasonable to expect that similar messages could be used to promote smoking cessation, physical activity, reduced truancy and many other positive actions. By partnering with governments to disseminate public service messages in a social way, Facebook may have another opportunity to improve the world.

Amid Facebook’s other struggles, many will probably find this cold comfort, or even disquieting: it is unsettling that a small handful of people wield such power over the perceptions of so many. Indeed, some might argue that the best thing Zuckerberg could do for the world would be to pull the plug.

But given Facebook’s prevalence — and given that there’s no sign it’s going anywhere anytime soon — it’s nice to know that if Facebook wanted to, it has easy avenues to help the people who use it. I, for one, hope Zuckerberg will follow through on his commitment to try.

Katherine L. Milkman is a professor at the Wharton School of the University of Pennsylvania who researches behavioral economics.