Wonkblog | Analysis
June 15, 2018 at 11:25 AM
People believe the craziest things.
A poll taken after the 2016 presidential election found, for instance, that more than half of people who voted for Donald Trump incorrectly believed that President Barack Obama was born in Kenya. Ten years earlier, a Scripps Howard/Ohio University survey found that just over half of Democrats wouldn't rule out the possibility that “people in the federal government either assisted in the 9/11 attacks or took no action to stop the attacks because they wanted [the] United States to go to war in the Middle East.”
How do you combat misperceptions like these? That's the subject of a new paper published in the Journal of Elections, Public Opinion and Parties by Brendan Nyhan of Dartmouth College and Jason Reifler of the University of Exeter.
Through survey experiments, Nyhan and Reifler arrived at a surprising answer: charts. “We find that providing participants with graphical information significantly decreases false and unsupported factual beliefs.” Crucially, they show that data presented in graphs and illustrations does a better job of fighting misperceptions than the same information presented in text form.
As the researchers tell it, there are two main reasons why misperceptions, particularly those linked to political views, are widespread. First, people may not know the correct information simply because they haven't encountered it. For example, if you haven't been following economic news, you may not be aware that the current unemployment rate is 3.8 percent, the lowest number in 18 years.
But a lack of information can metastasize into misinformation when people encounter facts that they perceive as threatening to their sense of identity. “On high-profile issues, many of the misinformed are likely to have already encountered and rejected correct information that was discomforting to their self-concept or worldview,” the researchers write.
Take, for instance, the “question” of Obama's birthplace. If you're a hard-right partisan, a good chunk of your political identity may be built around the notion that Obama was, somehow, an illegitimate president. “Conceding the validity of Obama’s birth in the U.S. would require accepting the president’s legitimacy, which would be threatening to so-called birthers,” Nyhan and Reifler explain.
Countering these misperceptions is a challenge. We already know that when people receive new information that's threatening to their worldview, they often either reject it completely or go to great lengths to interpret it in a way that conforms with their existing beliefs. Nyhan and Reifler's previous research demonstrated that simple textual corrections of misinformation often fail.
But what about visual information? A number of studies have shown, for instance, that people have an easier time understanding data when it's presented in a chart-based format instead of, say, a table of text. This seems to be particularly true for “structured” (as opposed to random) data that conveys information related to real-world phenomena.
The authors set up survey experiments to test whether visual information, in the form of charts and graphs, did a better job of correcting misperceptions than textual information. In 2008, they asked people whether insurgent attacks on U.S. and coalition forces in Iraq fell after the George W. Bush administration implemented a troop surge. Previous polling had shown that people opposed to the war generally believed that the surge had little effect or was making the situation worse.
Some in the sample of 1,000 individuals were shown a chart of the number of attacks on coalition forces before and after the surge, while others were not.
This chart had little effect on people who supported the war and opposed withdrawing U.S. troops — these people already believed, correctly, that attacks on U.S. and coalition forces fell after the surge.
But among those who were ambivalent about the war or who opposed it, the effect of the chart was dramatic. About 56 percent of war opponents who weren't shown the chart incorrectly believed that attacks increased or stayed the same after the surge. But only about one-third of war opponents who were shown the chart still held that incorrect view.
A 23-percentage-point drop in the misperception rate among war opponents is huge, and it speaks to the power of visual data to counter misinformation. The researchers' second experiment essentially replicated those findings, but this time in an area where conservative misperceptions were particularly strong: the performance of the job market under Obama from 2010 to 2011.
Among people who disapproved of Obama's performance on economic matters near the close of his first term, more than 80 percent falsely believed that the number of people with jobs fell or stayed the same from 2010 to 2011. But that belief was sharply curtailed among Obama disapprovers who were shown a chart of the federal government's jobs data for the 2010-2011 period.
Both experiments demonstrate that a chart can reduce misperceptions by significant amounts. But is a chart any better than text? A final experiment, about belief in global warming among Republicans, strongly suggests that's the case.
For that experiment, Republican respondents were asked whether they believed that average global temperatures were increasing, decreasing or staying the same over the past 30 years. One subset of those surveyed was given a textual summary of the scientific consensus on global temperatures, while another group got a chart showing temperature change since 1940 as measured by four climate institutions. A control group received no information before answering.
Among Republicans who didn't strongly identify with the GOP, people who had received the textual summary were less likely to hold incorrect beliefs about global temperature than those who received nothing. But the reductions in misperception were greatest among those who saw the chart.
Furthermore, something interesting happened among people who identified strongly with the Republican Party: Textual information slightly increased their likelihood of holding incorrect beliefs about global temperatures. But the chart, again, sharply reduced the likelihood of misperceptions.
“We find that a graphical correction decreases misperceptions about temperature change more than an equivalent text correction, which is consistent with the observed contrast between Studies 1 and 2 … and previous studies that found corrective text about controversial issues to often be ineffective,” Nyhan and Reifler concluded. If you want to change someone's mind, in other words, show them a chart, not a wall of text.
Now, there are a number of limitations to keep in mind here. This is just one paper, so it will be important to see whether other researchers are able to replicate the effects. It also bears pointing out that not all misperceptions can be fixed with a chart: How, for instance, would you chart Obama's birth certificate?
But these results strongly suggest that, at least where numbers are concerned, a good chart has the power to change hearts and minds above and beyond what a paragraph of text can do.