In Theory | Opinion
July 14, 2016 at 3:12 PM
Last month, three scholars confirmed what we already knew about social media — or at least had suspected. In a draft paper called "Echo Chambers on Facebook," social scientists Walter Quattrociocchi, Antonio Scala and Cass Sunstein found quantitative evidence of how users tend to promote their favorite narratives, form polarized groups and resist information that doesn't conform to their beliefs.
The study focused on how Facebook users interacted with two narratives involving conspiracy theories and science. Users belonging to different communities tended not to interact and tended to be connected only with "like-minded" friends, creating closed, non-interacting communities centered around different narratives — what the researchers called "echo chambers." Confirmation bias accounted for users' decisions to share certain content, creating informational cascades within their communities.
Users tended to seek out information that strengthened their preferred narratives and to reject information that undermined them. Alarmingly, when deliberately false information was introduced into these echo chambers, it was absorbed and viewed as credible as long as it conformed with the primary narrative. And even when more truthful information was introduced to correct or "debunk" falsehoods, either it was ignored or it reinforced the users' false beliefs.
While the findings are cause for concern, they don't come as much of a surprise — confirmation bias is nothing new, and conspiracy theories have become an increasingly visible part of our political discussion. The question is whether there is anything a responsible media can or should do differently to make it easier for facts to penetrate these echo chambers, and whether news organizations are willing to make the necessary changes.
A first step would be for outlets to resist pandering to established constituencies, despite the temptations of a guaranteed audience and trend-driven traffic. Supreme Court Justice Ruth Bader Ginsburg is referred to as "the Notorious RBG" by young, light-hearted supporters — it's a meme that plays well with a very particular audience. But will headlining real news with a partisan epithet, as the Associated Press did Wednesday in a tweet about Ginsburg's feud with Donald Trump, serve to ensure key facts are heard, or make potentially important news more likely to be dismissed by those who don't agree with the narrative that the nickname implies?
Another step would be for news organizations to spend less time debunking false information, which the newly published study shows is more likely to solidify readers in their beliefs than change minds. While it's certainly fulfilling for reporters frustrated with repeated falsehoods to declare that a politician's "pants are on fire" or that the politician has received a risible "Four Pinocchios" (The Post's descriptor of choice), doing so is unlikely to make a dent in the belief system of anyone who doesn't already agree. Corrections and clarifications can be written to draw ire or to inform — while the former might get more clicks, the latter will do the most good.
In truth, both of these suggestions fall under a broader, fairly obvious directive that would probably go against the major tenets of marketing and moneymaking but could be better for everyone in the long run. Rather than attempting to break into particular subgroups, the media should aim to provide the truth, plainly stated, to as broad an audience as possible — and then let readers do what they will.
Consider today's political landscape, where 62 percent of American adults get news on social media, and where Facebook likes and shares are the language of debate for voters. With such a close presidential election at stake, isn't it more crucial than ever that voters carefully evaluate all the information presented to them, rather than just reflections of their own beliefs?
The tendency to promote one's favored narrative is natural, but too much confirmation distances us from other perspectives and makes us unable to see the truth when it's finally presented — what the "Echo Chambers" researchers referred to as "a kind of cognitive inoculation." And in the end, a constant us-vs.-them mentality depersonalizes the holders of alternative views. While it's easy to blame readers and sharers for not shrewdly evaluating sources and weighing evidence, the media is responsible for presenting information in a way that would make doing so more likely.
If news outlets spend less time segmenting and more time presenting the news as straightforwardly as they can, it's possible that things could improve — or at least not get worse.