Myth No. 1
Most Americans dwell in online echo chambers.
In his farewell address, President Barack Obama warned against the tendency “to retreat into our own bubbles … especially our social media feeds, surrounded by people who look like us and share the same political outlook and never challenge our assumptions.” Under the headline “Your Filter Bubble Is Destroying Democracy,” software executive Mostafa M. El-Bermawy wrote in Wired that “the global village that was once the internet has been replaced by digital islands of isolation that are drifting further apart each day.”
In reality, the average person’s sources of online information are relatively balanced and diverse. People’s habits do incline somewhat toward their preferred political positions, but a study of Web browsing, survey and consumer data from 2004 to 2009 found that people’s online media diets were modestly divided by ideology yet far more diverse than, for instance, the networks of people with whom they talked about politics in person. This finding of limited information polarization has been repeatedly replicated. Most recently, a study found that mobile news consumption is even less segregated by ideology than the desktop and laptop data used in previous research.
The bubble theory overgeneralizes from a small subset of extremely online people who have skewed information diets and consume a tremendous amount of news. One study finds, for example, that approximately 25 percent of all online political news traffic from Republicans comes from the 8 percent of people with the most conservative news diets.
Myth No. 2
Consumption of news from dubious websites is widespread.
“Fake news” — originating on questionable sites and spread through social media — is often portrayed as ubiquitous and uniquely dangerous. The panic started after the 2016 election. “Of all the challenges we face in 2017 and beyond, ‘fake news’ has certainly made its way to the top of the list,” argued a TechCrunch writer. Data seemed to back these claims up. One widely covered study from Oxford University found that “junk news” was more commonly shared on Twitter than legitimate news.
In fact, Web browsing data I collected with two academic co-authors shows that exposure to untrustworthy websites is relatively rare. For instance, only 6 percent of the websites that Americans visited in the weeks before the 2016 election had been identified in prior research as failing to credibly ensure accurate information. Moreover, these visits made up a negligible share of most people’s information diets (especially when you consider how much television news they watched).
Exposure to these sites was concentrated in a small portion of the population. In 2016, we found that more than 6 in 10 visits to untrustworthy websites came from the 20 percent of Americans with the most conservative news diets. For that group, dubious sites made up about a fifth of their online news.
Consumption of bogus information on Twitter is even more skewed: One academic study found that 1 percent of users were exposed to 80 percent of untrustworthy content during the 2016 campaign. (Similar information is not available for Facebook.)
Myth No. 3
‘Fake news’ led to Donald Trump’s election.
In November 2016, a writer for New York magazine blamed misinformation spread on social media — Facebook, in particular — for Trump’s win: “The most obvious way in which Facebook enabled a Trump victory has been its inability (or refusal) to address the problem of hoax or fake news.” That view seemed to find support in an academic inquiry two years later: “A new study suggests fake news might have won Donald Trump the 2016 election,” reported The Washington Post.
That claim is not credible. The study in question found, among self-reported 2012 Obama supporters, an association between endorsing false claims about Hillary Clinton in a post-election survey in 2016 and saying they voted for Trump. But people who supported Trump were obviously more likely to believe negative claims about Clinton; the study provided no evidence that participants saw those claims before the election or that exposure to them changed anyone’s vote.
To date, no convincing evidence has emerged that fake news changed the election outcome in 2016. It is impossible to rule out elaborate counterfactuals in such a close contest, but we should be deeply skeptical of these claims, given the low rates of exposure to misinformation and the well-known difficulty of changing people’s preferences in a highly contentious election. When several co-authors and I conducted an experiment in which we exposed people to a false news article during the 2018 elections, for instance, we found it affected their factual beliefs but had no effect on their political opinions.
Myth No. 4
Fact-checks usually backfire.
“Fact-checking doesn’t work very well,” New York Times columnist Nicholas Kristof wrote two years ago, because “when people are presented with factual corrections that contradict their beliefs, they may cling to mistaken beliefs more strongly than ever.” Likewise, USA Today wrote in 2019, “When you encounter facts that don’t support your idea, your belief in that idea actually grows stronger.”
This one is personal. In 2010, I co-authored a study documenting the possibility of “backfire effects,” in which exposure to corrective information perversely increased belief in a misperception. The finding was a narrow one: We found such effects in two of five studies, and only among people for whom the false claim was ideologically congenial. As the study became better known, however, commentators wrongly generalized it, suggesting that corrections always backfire.
Thankfully, evidence has accumulated since then across numerous studies, including our own, that exposure to fact-checks and other corrective information tends to increase the accuracy of people’s beliefs (though corrections seldom affect how people feel toward candidates). Backfire effects may seem intuitive — everyone can think of examples of people doubling down on false claims — but they appear to be extremely rare.
Myth No. 5
We’re now in a post-truth era.
In reality, lies and misinformation have always been part of politics. Consider, for instance, the Red Scare of the 1950s or the persistence of anti-Semitic conspiracy theories over centuries. We shouldn’t romanticize the past.
We also shouldn’t overstate the point. Most people can still distinguish truth from distortion. For instance, a Washington Post poll in 2018 found that relatively few people believed the false claims that Trump so frequently makes. More recently, only 27 percent said in a Pew poll that Trump and his administration get the facts right about the coronavirus pandemic almost all or most of the time; the figure was 57 percent for the Centers for Disease Control and Prevention and other public health organizations.
These judgments about accuracy do not appear to influence people’s political views very much. Trump’s approval ratings, for instance, have barely budged even as he has piled up more than 22,000 false statements in office. The challenge is not that most people don’t see the truth — it’s that partisanship undermines accountability. Americans are all too willing to forgive political falsehoods from partisans on their side of the aisle.