Facebook chief Mark Zuckerberg at the F8 Facebook Developer Conference in San Francisco in 2016. (Eric Risberg/AP)

State-sponsored propaganda like the recently unmasked @TEN_GOP Twitter account is of very real concern for our democracy. But we should not allow the debate over Russian interference to crowd out concerns about homegrown misinformation, which was vastly more prevalent during and after the 2016 election.

Why is misinformation so prevalent and widely believed in U.S. politics?

One explanation for the growth of misinformation is the way people are exposed to — and consume — news today. In particular, concerns have grown about “echo chambers.” According to this theory, people are, intentionally or unintentionally, surrounding themselves with news from like-minded sources. In such environments, people may tend to uncritically believe news content from outlets they trust while dismissing or ignoring information from sources they dislike. If this is true, politicians and commentators may be able to effectively mislead the public by promoting misinformation through allied news outlets.

But when one of us (Horiuchi) and his Dartmouth undergraduate co-authors tested this hypothesis in a recent study, they found that the source of the misinformation they showed to study participants (an incorrect news excerpt about the Affordable Care Act) didn’t matter very much. Regardless of the respondents’ party identification or ideology, attributing the article to Fox or CNN had relatively little effect on the news article’s perceived accuracy.

The problem instead was that people were surprisingly vulnerable to believing the misinformation even when it came from an uncongenial source. Respondents who read an article making the false claim (that, under proposed legislation, people would lose coverage under their parents' insurance plans when they turned 18) were far more likely to believe it than those who had not read such an article. In other words, they swallowed the news story without carefully considering whether it was true.

In this sense, concerns about echo chambers may be overstated — a finding that is consistent with other evidence. The problem isn't that we're only willing to listen to sources that share our political viewpoint; it's that we're too vulnerable as human beings to misinformation of all sorts. Given the limitations of human knowledge and judgment, it is not clear how best to protect people from believing false claims.

So how can society protect itself from misinformation?

We might instead focus on the supply side of political debate and try to deter politicians from spreading so much misinformation. But how can we do so?

One promising approach is summary fact-checking — an increasingly popular format that presents an overview of fact-checking ratings for a politician. This is distinct from focusing on whether a single statement is true or false; rather, it evaluates a group of such statements, assessing a speaker’s overall truthfulness and reliability as a source. Though the statements in question are of course not randomly chosen, the format may be an effective way to increase the costs of repeatedly making false statements.

One of us (Nyhan) investigated the effect of this format in three experimental studies conducted in 2016 and 2017 in collaboration with different undergraduate co-authors at Dartmouth. Compared with respondents who saw fact-checks of individual statements by politicians, participants in the studies who instead saw summary fact-check ratings viewed the legislators in question less favorably and rated their statements as less accurate.

Summary fact-checking won’t persuade everyone, of course. But if we can make politicians fear the political costs of a pattern of false claims a little bit more, there may be less misinformation to report in the first place.

Brendan Nyhan is a professor of government at Dartmouth College. Follow him on Twitter @BrendanNyhan.

Yusaku Horiuchi is a professor of government at Dartmouth College. Follow him on Twitter @YusakuHoriuchi.
