Facebook's News Feed has long prioritized the posts that users' friends interact with, but it also has displayed other content that Facebook expects users will find interesting. This will still be true, but Facebook will weight material that comes from within users' social networks even more heavily than it has in the past.
Unsaid, but rather obvious, is that social networks tend to comprise like-minded individuals, who tend to post items that fellow members agree with. The feelings of fulfillment and well-being that Facebook users experience could result from being told, over and over, “You're right. You're right. You're right.”
“Facebook wants people to feel positive, rather than negative, after visiting,” the New York Times's Mike Isaac reported this week, after interviewing chief executive Mark Zuckerberg. Isaac added that Facebook reprogrammed its News Feed, having “closely studied what kinds of posts had stressed or harmed users.”
Facebook hasn't published its research, but political disagreement is a well-known stressor. In a Pew Research Center poll in July, 59 percent of Americans said it is “stressful and frustrating” to talk politics with people who have different opinions of President Trump, compared with 35 percent who described the experience as “interesting and informative.”
One way to reduce stress could be to do what Facebook is doing: Feed users more content from friends — content that is more likely to reinforce existing views than to challenge them.
Of course, some people travel in ideologically diverse circles, and many Facebook posts — vacation photos, cat videos, unsolicited food diaries — have nothing to do with politics. It also is possible that further emphasizing content shared and liked by users' friends could help stanch the flow of fake news — unless you have friends who regularly spread hoaxes, that is. Facebook told the Times that its News Feed research and updates were, in part, responses to public criticism that it too readily allowed false information to circulate, unchecked.
But Facebook's changes seem to make it easier than ever to create filter bubbles that block out opinions that don't match your own.
Campbell Brown, Facebook's head of news partnerships, made clear at a Poynter Institute event in March 2017 that the company does not feel a duty to push users out of their comfort zones. Here's an excerpt from Poynter Managing Editor Benjamin Mullin's account of an onstage interview conducted by Poynter Vice President Kelly McBride:
McBride pressed the issue, noting that Facebook has an incentive not to challenge the ideological perspectives of its users: If they feel more comfortable with their News Feeds, they're going to spend more time scrolling through them. And if they spend more time scrolling through them, Facebook gets to show them more ads.

“You want to keep people on your platform,” McBride said. “After two hours, I don't feel like I've chosen. I feel like I've been sucked in.”

Brown pointed out that Facebook's News Feed algorithm responds to signals from users.

“It's not that mysterious,” she said. “What shows up in your News Feed is based on the things that you like. Things you share. People you're friends with and that you follow.”

“Isn't that a filter bubble?” McBride countered.

“I'm telling you, that world existed long before Facebook,” Brown said, and recommended that users who want to be challenged cultivate a diverse range of ideological perspectives on their feeds.
Got that, Facebook users? The platform thinks it's up to you to make friends with people who don't think like you, so that your News Feed will reflect a range of views. (You could look for news beyond Facebook, too.)
From a business perspective, Facebook's approach makes a lot of sense. And Facebook is obviously entitled to make its platform a low-stress escape.
Users ought to understand, however, that the news they get there might not provide a full picture.