
Facebook’s 1.44 billion users rely on the site for lots of things: keeping in touch, sharing photos, casual stalking.

But if you get your political news through Facebook, as more than 60 percent of millennials do, please browse with extreme caution: The site doesn’t show you everything, and may subtly skew your point of view.


This is not, of course, a new fear; moral panic over “echo chambers” and “filter bubbles” is as old as the social Web itself. But a new survey from the Pew Research Center, released on Monday, suggests there may be some new urgency here. Per that survey, a majority of American Internet users now get political news from Facebook — and the 2016 elections, as we know, are just over a year away.

That’s important to understand, because Facebook is quite unlike traditional conduits of news. (Think: your local ABC affiliate, your gossipy neighbor, this page, what have you.) Like those more traditional sources, Facebook gives you a great deal of control over which sources you follow and what you choose to read. Unlike them, however, Facebook also selectively hides many stories. According to a recent Washington Post experiment, as much as 72 percent of the new material your friends and subscribed pages post never actually shows up in your News Feed.


Which might be fine when we’re talking about your ex-co-worker’s baby pictures — but what if we’re talking about a political scandal?

“A longer-term question that arises from this data [about Facebook as a political news source],” the Pew report says, “is what younger Americans’ reliance on social media for news might mean for the political system.”

Facebook doesn’t show you everything

By now, it should be common knowledge that the News Feed does not show you every post your friends put on Facebook. Unfortunately, a major misconception persists about how the News Feed works: In a recent study from the University of Illinois, 62.5 percent of participants had no idea Facebook screened out any posts.

Facebook has a good reason for doing this, mind you: If you saw every post, you’d be overwhelmed. There’d simply be too many to read. (This is a problem Twitter’s having, incidentally.) So instead, Facebook does a little math and, based on a range of engagement factors, tries to predict the posts you’re most interested in, and only places those in your News Feed. The math behind the News Feed changes constantly, and Facebook regularly rejiggers it to meet user needs. (You may recall a certain outcry over “manipulative” algorithmic changes in July 2014.)
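Facebook’s actual ranking model is proprietary, and the signals and weights below are invented purely for illustration. But the basic mechanic described above (score each candidate post on engagement signals, then surface only the top-ranked ones) can be sketched in a few lines:

```python
# Toy sketch of engagement-based feed ranking. The signal names and
# weights are illustrative assumptions, not Facebook's actual model.

def score(post):
    # Combine a few hypothetical engagement signals into one number.
    return (2.0 * post["likes"]
            + 3.0 * post["comments"]
            + 1.5 * post["shares"]
            + 4.0 * post["from_close_friend"])

def build_feed(posts, limit=10):
    # Rank all candidate posts and keep only the top `limit`;
    # everything below the cutoff is silently dropped.
    return sorted(posts, key=score, reverse=True)[:limit]

posts = [
    {"id": 1, "likes": 10, "comments": 2, "shares": 0, "from_close_friend": 1},
    {"id": 2, "likes": 1,  "comments": 0, "shares": 0, "from_close_friend": 0},
    {"id": 3, "likes": 5,  "comments": 8, "shares": 3, "from_close_friend": 0},
]
feed = build_feed(posts, limit=2)
```

In this toy run, the low-engagement post (id 2) never reaches the feed at all, even though it was posted — which is exactly the behavior most users in the Illinois study didn’t know was happening.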


Anyway, none of this is inherently bad or nefarious. In fact, for the casual, social Facebook user, it’s probably really good. The problem is that more and more people are using Facebook for more and more important things — like informing how they vote — without entirely understanding how it works.

“It’s kind of [like] waking up in ‘The Matrix’ in a way,” said one newly enlightened participant in that University of Illinois study. “I mean you have what you think as your reality” — but it’s actually filtered, moderated.

In fact, Facebook kind of flatters your politics

Facebook has, unsurprisingly, worked very hard to shed the impression that this filtering may somehow hurt its users or their media literacy. In May, researchers working for the site published a very high-profile, very controversial paper on the “ideological diversity” of information in the News Feed. The communications scholar Christian Sandvig called it the “‘it’s not our fault’ study”: a deliberate attempt to prove that, even if filter bubbles do exist, Facebook’s algorithm isn’t to blame.

As Sandvig and others have pointed out, though, that isn’t actually what Facebook’s data shows. For one thing, the study wasn’t conclusive: It only looked at a small and highly non-representative user sample. On top of that, the study shows that the algorithm does tweak political news in three important, if modest, ways. To wit:

  1. The algorithm determines the order of the posts in your News Feed, and hard news stories that the algorithm relegates to the bottom are read far less frequently.
  2. In the average conservative’s News Feed, the algorithm cuts out 5 percent of liberal-leaning articles.
  3. In the average liberal’s News Feed, the rate is a little higher: 8 percent of conservative-leaning articles get cut.
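Those percentages can be made concrete with a toy calculation. The 5 and 8 percent suppression rates come from the study; the even 50/50 starting mix of agreeable versus opposing stories is an assumption for illustration:

```python
# Toy illustration of points 2 and 3 above. The suppression rates (5% and
# 8%) come from the study; the 50/50 starting mix is an assumption.

def surviving_mix(cross_cutting_share, suppression_rate):
    # Fraction of cross-cutting (opposing-view) stories remaining in the
    # feed after the algorithm filters them at the given rate.
    kept_cross = cross_cutting_share * (1 - suppression_rate)
    kept_agreeable = 1 - cross_cutting_share
    return kept_cross / (kept_cross + kept_agreeable)

# Starting from an even mix of agreeable and opposing stories:
conservative = surviving_mix(0.5, 0.05)  # liberal stories cut 5%
liberal = surviving_mix(0.5, 0.08)       # conservative stories cut 8%
```

Under these assumptions, opposing-view stories drop from 50 percent of the feed to roughly 48.7 percent for conservatives and 47.9 percent for liberals — small shifts on their own, but applied across billions of feed loads.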

Again, none of this is necessarily nefarious or shocking. (“What else would a good filter algorithm be doing other than filtering for what it thinks you will like?” Sandvig wrote.) But it does mean that, when you use Facebook as a source for political news, that news is modestly more likely to flatter your existing point of view.

“Selectivity and polarization are happening on Facebook,” sums up Sandvig, “and the news feed curation algorithm acts to modestly accelerate” both of those things.

It’s no wonder, really, that highly partisan news sources tend to do very well on Facebook — they speak to the biases of both users and the network.

Here’s why that’s a problem


It might be hard to see the big so-what in all of this. After all, if you’re just one user cruising along through your News Feed, you want a service that’s pleasant and comfortable for you. You don’t want to be bombarded with rage-inducing partisan news in between your memes and your mom’s stories and your pet videos.

But consider, for a minute, that more than half of all American adults use Facebook — enough people, some scholars theorize, to swing a national election. As more of those people use Facebook for news, we risk “accelerating” polarization for a large slice of the U.S. population.

And that’s too bad, really, both because political polarization can be blamed for a host of ills, and because social networks could really be a force for good here. One analysis of Twitter found, for instance, that ideologically diverse networks tend to yield more moderate people — proof positive, its author wrote, that social media has “rich potential … to transform the political process.”

There are, at least in theory, technical ways around this problem. The sociologist Zeynep Tufekci has called for Facebook to hand more filtering control directly over to its users: “At a personal level,” she wrote, “I’d love to have the choice to set my newsfeed algorithm to ‘please show more content I’d likely disagree with.’” (Balancer, a browser extension/research project that took this approach to news-site reading, is the only known tool to increase “diverse exposure” clearly and measurably.)

Meanwhile, Jonathan Zittrain — a law professor at Harvard — has called for Facebook to declare itself an “information fiduciary,” much like lawyers and doctors do already. In exchange for, say, a tax break, the site would promise to offer a depersonalized, unfiltered News Feed experience, among other things.

Unfortunately, these solutions seem unlikely for now — and in the meantime, the available fixes are both very individual and far less ambitious. If you use Facebook to access political news, consider toggling from “top stories” to “most recent.” (That option’s in the left-hand rail, under “Favorites.”) Alternatively, think about supplementing your Facebook diet with news from somewhere else. Pew also has a new analysis of the news sources that liberals and conservatives like best: Take a gander at some stories from your ideological opposite.
