The Washington Post

It’s not just feelings. Facebook also has the potential to manipulate your worldview.

(AP Photo/Jeff Chiu)

Considering that nearly half of all Facebook users get their news from the site, a recent report on the newsfeed’s growing partisanship was a bit disconcerting.

News sites and Facebook users have known for years that the social network carefully curates what you see in your newsfeed, as it attempts to serve up the most interesting, relevant content. In August 2013, a major retooling of the newsfeed tipped the scales in favor of serious news sites, surfacing them more frequently than Viral Nova, Upworthy, et al. But apparently, per a new report by Mashable, all news sites are not created equal: Highly partisan sites, like the progressive Mother Jones or the conservative Breitbart, have seen a greater surge in traffic since August relative to everyone else.

In other words, since Facebook redesigned its algorithm, partisan political sites have seen way more likes, shares, comments and clicks — and they’re probably showing up in your newsfeed far more than you’re used to.

Before we go any further, a quick caveat here: No one’s suggesting that Facebook intentionally or directly boosted partisan sites. (It’s sort of tempting to imagine some bespectacled engineer, throttling the news feed algorithm and cackling over the downfall of America — but nope, not the case.) Instead, it seems more likely that a) Facebook is ranking these sites as highly as more traditional news sources, b) these sites are very good at gaming the Facebook system, and c) people, for whatever reason, really enjoy posting partisan things.

That would seem to spread responsibility out among several different parties, and it does. But at the end of the day, they all share the common, mysterious denominator of the newsfeed. When you post something from, let’s say, the Blaze, the newsfeed algorithm accounts for things like how many likes the post has and how your friends interact with the Blaze before determining which of your friends see it. When the Blaze posts “Famed Rapper Discusses Recent Conversion to Christianity,” the algorithm likewise determines how many of the site’s fans will see, and thus share, it. (As of this writing: 1,126.) Nor is the algorithm editorially agnostic: While we know virtually nothing about what goes into it, Facebook has been quite open about favoring certain types of content.
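To make the dynamic concrete: here is a deliberately toy sketch of engagement-weighted ranking. Facebook’s actual algorithm is proprietary, so every factor, weight, and number below is hypothetical; the sketch only uses the signals the article names (likes, shares, comments, clicks, and your history with the posting page) to show why heavily shared partisan content can outrank everything else.

```python
# Purely illustrative -- NOT Facebook's real formula. Weights and the
# "affinity" signal are invented for the example.

def rank_story(likes, shares, comments, clicks, affinity):
    """Return a toy relevance score for one story.

    affinity: a 0-to-1 guess at how often this viewer interacts
    with the posting page (e.g. the Blaze).
    """
    # Weight active engagement (shares, comments) above passive signals.
    engagement = likes + 2 * shares + 2 * comments + clicks
    return engagement * affinity

# Two stories from sources the viewer likes equally: the one that
# provokes heavy sharing and commenting wins the newsfeed slot.
partisan = rank_story(likes=500, shares=300, comments=200, clicks=400, affinity=0.5)
neutral = rank_story(likes=500, shares=50, comments=40, clicks=400, affinity=0.5)
print(partisan > neutral)  # the partisan story scores higher
```

Under a scheme anything like this, content that reliably triggers reactions compounds its own reach: more engagement raises the score, which raises distribution, which generates more engagement.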

So in short, even though there are many, many factors that go into whether a news site performs well on Facebook, the newsfeed algorithm is always, in some indirect way, responsible.

And that’s a pretty big deal, right? You don’t have to work in news or social media to feel the reverberations. Facebook has 1.28 billion monthly active users, 47 percent of whom (roughly 600 million people) say they get news on the site. Changes to the algorithm, even minor ones, have the potential to impact the information diets and worldviews of millions upon millions of people.

Theoretically, since last August, we’ve all seen far more polarized news in our Facebook feeds than at any time in the past. Could that have subtly convinced us that the political climate is more extreme than it actually is? Could it have reinforced social media’s infamous echo chamber — inundating readers only with articles that reinforce their preexisting points of view? Is it something Facebook, in its Big Brotherly wisdom, should change?

The prevailing sentiment right now would probably be “no” — after all, the social network is only just beginning to distance itself from another newsfeed manipulation controversy, one involving a study that altered users’ feeds to measure changes in their moods.

In both cases, it’s not necessarily the specific instances themselves that should concern us. After all, Facebook makes small tweaks to the newsfeed — with intended and unintended consequences — more or less constantly. They should, however, both be viewed as powerful illustrations of Facebook’s potential to shape the national mood, almost without us realizing it. When you operate on a scale like Facebook’s, there’s really no such thing as “small change.”