Monday, 11 a.m. ET

Science: Department of Human Behavior

Shankar Vedantam
Washington Post Staff Writer
Monday, September 15, 2008; 11:00 AM

Washington Post staff writer Shankar Vedantam and Jason Reifler, a political scientist at Georgia State University, were online at 11 a.m. ET on Monday, Sept. 15 to discuss research that shows that fighting misinformation in political campaigns is almost impossible.

Read about the study in this week's Department of Human Behavior column: The Power of Political Misinformation.

The transcript follows.

___________________

Shankar Vedantam: Welcome to this online chat to discuss the Department of Human Behavior column today, The Power of Political Misinformation. Do you see examples on the campaign trail where misinformation is rampant? Have you noticed whether attempts to debunk the misinformation have been successful? Several experiments seem to suggest that misinformation is not only difficult to debunk, but that some refutations can actually prompt MORE people to believe the misinformation. I am delighted to be joined by Jason Reifler at Georgia State University, who helped conduct experiments showing this "backfire effect."

_______________________

Arlington, Va.: Without a doubt Vedantam and Reifler's research demonstrates that lies and deliberate falsehoods injected into a political campaign have a desired viral effect. The subject matter is contaminated, and the candidates remain even more vulnerable to spurious charges. A modest suggestion to remedy the problem might be to remind the spinmeisters that actions have consequences. Independently funded 'truth squads,' the Federal Election Commission, and the Courts need to be brought fully into the mix to sort truth from lies, and to impose penalties and sanctions for deliberate violations of 'truth in advertising.' If the electorate doesn't demand honesty and integrity from the process then the only thing we can be assured of is that future campaigns will begin in the gutter and descend from there.

Jason Reifler: This question raises a couple of important points. First, pressure can potentially be brought to bear on campaigns that traffic in falsehoods. I am not a lawyer, but my understanding is that pursuing legal strategies (like libel or slander) would be extremely difficult. Second, we do see campaigns try to get TV stations not to air certain ads, although in a federal election this can raise potentially difficult legal issues if stations refuse to run one campaign's ads.

I think the main point relevant to our research is that "truth" is somewhat in the eye of the beholder. Political predispositions greatly affect what we view as true. Objective standards for what is "true" in terms of political events may be more difficult to establish than we realize.

Shankar Vedantam: A good example of what Jason Reifler is talking about comes from one of the experiments I wrote about in the column today. Conservatives told about the 2004 Duelfer report -- which said Iraq did not have weapons of mass destruction before the 2003 U.S. invasion -- were MORE likely to believe Iraq did have the weapons than conservatives who did not hear the refutation. Even if it were possible to set up truly independent watchdog groups or agencies (and I don't really see how this could be done) the research seems to suggest that people largely believe what they want to believe. Is there an antidote for that?

_______________________

Silver Spring, Md.: Here's a different sort of question concerning misinformation, science and the political campaigns: After 8 years of Mr. Bush suppressing scientific findings (by the EPA or the NIH, principally), shouldn't Sarah Palin be questioned about her episode of sitting on the findings of her own scientists regarding the polar bear? According to the New York Times, an outside scientist was stonewalled as she misrepresented her staff's findings. When he finally got hold of their internal discussions, it turned out that Palin's public statement was a lie:

Once Elected, Palin Hired Friends and Lashed Foes (New York Times, September 13, 2008)

Shankar Vedantam: Thanks, Silver Spring. It's an interesting question and a good one. My sense is that the media are trying to ask a lot of questions, but to some extent are being locked out by the McCain-Palin campaign, which is keeping a tight lid on interviews. That said, I have seen many media accounts that attempt to correct misinformation -- the question we are dealing with today is whether these accounts actually change people's minds and attitudes.

Jason Reifler: The topic of political manipulation of scientific data is beyond my area of expertise. Chris Mooney has written extensively on this question, and I highly recommend his work.

There are lots of questions about Palin (as there are with all candidates). Campaigns certainly do shield candidates from the press -- this is not something invented by the McCain campaign.

_______________________

Bethesda, Md.: When I was growing up in the south in the 50s and 60s, I thought white racism was unmovable. Yet opinions changed dramatically in less than a decade. Given that people seek confirmation of their beliefs and consider even debunked falsehoods to be confirmation, how is such a dramatic opinion change possible?

Jason Reifler: This question raises a number of interesting points. First, some scholars would say that racism has not abated as much as it first appears. Rather, "racist" people have learned how to answer survey questions correctly. (That is not my view, but it is a position held by some.)

Second, there may be a difference between our experiments, which challenge a view a single time, and an overwhelming message environment that runs against previously held beliefs and continues for years. This is one of the areas in which our research is continuing.

Shankar Vedantam: Thanks for the question, Bethesda. I think you are raising a larger point that is relevant: Over time, it does seem that people's views are changing, and that inaccurate views tend to go out of fashion, or go extinct. But two questions arise from this: Is the change because you are studying different groups of people at different points in time? Does comparing racial attitudes in 1950s America with the country today tell us that people have changed their minds or that many of the people we are talking to today were not around in the 1950s? The other point is that it appears that some shifts in people's attitudes track large-scale political changes. As the Bush Administration has itself stepped away from explicit claims that Iraq had weapons of mass destruction before the 2003 invasion, for example, I am guessing fewer conservatives subscribe to that view today. But is that because misinformation was corrected, or because the political winds changed?

_______________________

Huntingtown, Md.: One of the studies referenced in the article concludes that among conservatives, some refutations strengthen rather than mitigate misinformation. They claim it's because conservatives are more rigid in their views. Since the authors are both Democrats and presumably liberal, why should I take their claim as science? I would offer the refutation that it's not rigidity but a belief that like the laws of physics, there are moral laws that are universal, unchanging and not subject to the whims and cultural preferences of mankind. Liberals claim they do not believe in absolute truth, even though I would argue that they too have their absolutes, such as their contention that there ARE no absolutes. Studies like this would be more useful without making value judgments as to why people behave as they do.

Shankar Vedantam: I will pass that along to Jason so he can answer for himself. For my own part, I wrote about these studies because they were CONTROLLED experiments. Many of the experiments I wrote about in the column today suggest biases among liberals -- Democrats were more likely to think worse of then-Supreme Court nominee John Roberts after hearing misinformation about him, even after the misinformation was corrected. Unless you are suggesting the researchers falsified their results to paint both Republicans and Democrats in a bad light -- which I have to say does not seem credible to me -- it seems that it is their data that are speaking to us, not their personal political beliefs. Parenthetically, I find it disturbing how often people dismiss data that clash with their pre-existing views -- isn't that an example of shooting the messenger?

Jason Reifler: I want to thank the participant for the question. In a study about how political predispositions drive the way we interpret the world, it is important to ask how our own predispositions might affect the experiments we create and how we interpret the data.

First, these were controlled experiments, as Shankar points out. We actually were not expecting the results we got. Our work started with the idea that corrections could be more effective, and that effective corrections have to be what the psych literature calls "causal" (corrections that explain how people arrived at their mistaken view are more effective). We did not find that. So the results were not something we engineered as a "gotcha" aimed at conservatives/Republicans.

Second, we also test liberal misperceptions. And we also find that corrections are ineffective. That is, a correction of a "liberal" misperception does not bring liberals any closer to the truth. The difference is that we have yet to see a backfire effect among liberals. Our research continues in that area to see what we can find. (If you have suggestions for good liberal misperceptions, please feel free to share. This is one area where we actively seek input from people with political preferences different from our own.)

_______________________

Takoma Park, Md.: Any correlation between the backfire effect and level of education/IQ/ other metrics that might be predictors of good analytical skills or the ability to retain information accurately?

Jason Reifler: We do find that those who do better on a simple battery of truly objective fact questions (e.g., who is the chief justice of the Supreme Court) tend to be less misinformed. There is suggestive evidence from our statistical analyses that knowledge plays an important moderating role, but our data are not conclusive on that point.

These are great questions! Keep 'em coming!

Shankar Vedantam: I must say I would be extremely cautious with the suggestion that education is always an effective safeguard against misinformation. If you go back and read many of the columns I have written -- the archive is at washingtonpost.com/behavior -- there are numerous examples that show people who are the best informed sometimes have the worst and most rigid biases.

See for example, this piece about the "hostile media effect":

Two Views of the Same News Find Opposite Biases (Washington Post, July 24, 2006)

_______________________

Falls Church, Va.: So is there any way we can effectively hold campaigns accountable for misrepresenting the facts, or outright lying?

Jason Reifler: I wish I had the silver bullet that would slay the villain of falsehoods in politics. We are still figuring out the very basics of exactly when there will be a backfire effect. The next step is strategies to mitigate the power of falsehoods. Sadly, we are a long way away. Brendan and I are testing something right now in a lab setting. But even if that works, it will be hard to migrate it from the lab to the real world. We believe that this is an important problem. I really wish I had a good answer that said "here's all we need to do, and the problems will be solved."

Shankar Vedantam: It seems to me the solution for these biases -- in so far as there really is a solution -- is less about what is out there and more about how we ourselves think. Many of us are quick to spot the biases in our political opponents, but are completely certain that our own views and information are accurate. To our political opponents, our certainty about things is also a glaring example of bias.

I don't want to imply that the lies propagated by different sides are all equivalent, and therefore cancel each other out. Clearly, some politicians and some campaigns are more honorable than others, and some are better at mass deception. From the point of view of the audience -- us -- caveat emptor!

_______________________

Washington, D.C.: I'm having difficulty inferring the reason for the backfire effect, especially of the magnitude of the instances you describe. If the refutation is itself unrefuted, doesn't redoubling one's original belief sort of amount to insanity?

Jason Reifler: Our speculation is that being given disconfirming evidence of what one believes threatens one's sense of self, and that the brain responds to this threat by counter-arguing the evidence. By successfully counter-arguing against it, one comes to hold the initial "mistaken" belief even more strongly.

Shankar Vedantam: Dear Washington, surely you are not implying that people are rational creatures, who think about the world in deliberate ways and reach balanced conclusions about what is right and wrong??!!?

_______________________

Liberal misperception ideas: How about unfounded prejudices about regular churchgoers (liberals more likely to already believe churchgoers are dumb) or about people who believe that Jesus Christ was literally the son of God (slightly different from regular churchgoer, but just as likely to trip "dumb" hot buttons in many liberals)?

Jason Reifler: We'll look into finding a way to incorporate these. They are slightly different from what we have done to date because there is not a specific fact to try to debunk (at least for the first). As for the second (is Christ literally the son of God?), I'm simply going to steal an Obama line -- that is above my pay grade.

Shankar Vedantam: Good suggestions, both! Stereotypes always produce caricatures. To paraphrase both McCain and Obama, Americans have many more things that unite them than divide them. A political campaign, of course, reminds us primarily about differences, and not about areas of consensus or similarity.

_______________________

Arlington, Va.: Gentlemen, I think you dodged my comment. The operative phrase is that actions have consequences. I agree that 'proving' the lie will be a challenge, but as your own research indicates, once the fabrication (and the fabricators) have scored their point, the damage is done. 'All's fair in love and war (and politics)' is neither in the best interests of this country nor an example of how the United States is somehow more 'virtuous' than other societies. Expensive and punitive sanctions, including enjoining an ad campaign or a group, may seem draconian, but dammit, if the electorate doesn't insist on integrity in the process, who will?

Jason Reifler: I don't think that I was necessarily dodging the question (which is of course self-serving to believe).

My response is 1. as I understand the legal system, that is extremely difficult to do, and 2. our research shows that the idea of "facts" is slippery. Suppose a case comes to the FEC, FCC, or a judge. It will be up to the nebulous "them" to decide the facts and whether something really was such an egregious abuse of truth that draconian punishment is necessary. I think an implication of our research is that whether one sees the transgression as sufficiently egregious to warrant stiff punishment will depend on the political predispositions of the person adjudicating the case. Harsher punishment may be helpful, but it may also be difficult to actually implement.

Shankar Vedantam: Also, I strongly believe that a lot of misinformation propagated by political campaigns is completely sincere. People are not trying to mislead and deceive -- they sincerely believe many of the things they are saying. I am not sure a punitive system would do very much to help fight misinformation.

_______________________

Harrisburg, Pa.: There are several self-designated "fact checker" organizations. Which in your opinion are some of the better fact checkers?

Jason Reifler: I think factcheck.org does a good job. I will admit that doing this research has led me to change how I read fact checking items. I am much more aware of not letting counter-arguing affect what I take away.

By the way, Shankar wrote a great column about a year ago that focused on different ways of correcting "myths" (though if I recall correctly those tended to focus on health issues).

Shankar Vedantam: The earlier piece Jason referred to is here:

Persistence of Myths Could Alter Public Policy Approach (Washington Post, Sept. 4, 2007)

The Washington Post has an excellent fact-checker in the person of my colleague Michael Dobbs. Check out his pieces on The Trail blog. Michael is the best fact-checker around, and my saying so has absolutely nothing to do with the fact I work for the same organization!

In the end, the best fact-checker may be that person you see in the mirror ...

_______________________

Bethesda, Md.: Since political campaigns are relatively short, winner-take-all events, unlike the steady drumbeat on civil rights, how is there any hope for truth in political campaigns? The truth only reinforces the lie.

Jason Reifler: First, that is a great line. Can we use it?

Second, this is our long-term interest -- how can we create a better-informed citizenry? As I said before, I wish we had the answer figured out. As of now, we don't. But I think that identifying this problem lets us know that we need to work hard to find a solution.

_______________________

Princeton, N.J.: There is a difference now. There is YouTube. I think that people who use faith-based reasoning (both religious and otherwise) will never be convinced because, after all, faith is the ability to hold beliefs not supported by or even in contradiction with facts. But an open-minded person (if one exists) will probably be more influenced by a tape than a printed version.

Shankar Vedantam: I agree that technology can provide us with tools to fight misinformation. But I think it is an open (and empirical) question as to whether the advent of the internet has made misinformation more or less common. It is true that anyone can post a video showing a political candidate contradicting himself or herself, and this ought to increase accountability, but it is also true that rumors and misinformation now spread rapidly through the internet. The splintering of the media (another technological development) has caused people of different political persuasions to tune in to different channels or read different publications -- meaning people are less likely than before to encounter points of view that clash with their own, or hear refutations of damaging rumors. The experiments I wrote about today all involved compelling refutations, but still proved ineffective -- it seems safe to say refutations in the real world are even less likely to be useful than the ones described in the experiments!

_______________________

Baltimore, Md.: Obligatory LBJ anecdote:

Back in 1948, during his first race for the U.S. Senate, Lyndon Johnson was running about ten points behind, with only nine days to go. He was sunk in despair. He was desperate. And it was just before noon on a Monday, they say, when he called his equally depressed campaign manager and instructed him to call a press conference for just before lunch on a slow news day and accuse his high-riding opponent, a pig farmer, of having routine carnal knowledge of his barnyard sows, despite the pleas of his wife and children.

His campaign manager was shocked. "We can't say that, Lyndon," he supposedly said. "You know it's not true."

"Of course it's not true!" Johnson barked at him. "But let's make the bastard deny it!"

Shankar Vedantam: Thanks for the anecdote, Baltimore. Since we are talking about misinformation and refutations, it might be useful for you to let us know the source of that anecdote!

On the larger issue of politicians misrepresenting the truth, I have written columns previously that discuss how the most "effective" liars are the ones that actually believe their own lies.

_______________________

Takoma Park, Md.: OK, another question. Is there a metric that assays people's own standards of honesty? Like how they behave themselves on a day to day basis? And do you think that people who score lower for honesty would be more likely to experience the backfire effect? There is a notion out there that honest people can detect lies while liars are themselves fooled.

Jason Reifler: I have no idea, but this is an excellent question/suggestion. Thanks!

Shankar Vedantam: Do take a look at a previous column I wrote on the subject of self-deception in politics. It describes research in evolutionary psychology that argues that self-deception evolved because it made deceiving others more effective.

When Seeing is Disbelieving (Washington Post, April 30, 2007)

_______________________

Falls Church, Va.: The Duelfer report may be a difficult example for you, because that report indicated that Iraq was intending to resume WMD production once sanctions were lifted, so it was a bit of a mixed message.

Jason Reifler: It may be. My reading of the report is that Iraq did not have stockpiles of WMD at the time of the invasion, which is what our question focuses on. Iraq may have had desires to reconstitute its weapons program. That being said, evidence of no stockpiles of WMD should not lead one to more strongly believe that there were stockpiles immediately prior to the invasion.

Shankar Vedantam: The question volunteers were asked after being provided with Bush Administration claims and the Duelfer report was whether they agreed with the following statement:

"Immediately before the U.S. invasion, Iraq had an active weapons of mass destruction program, the ability to produce these weapons, and large stockpiles of WMD, but Saddam Hussein was able to hide or destroy these weapons right before U.S. forces arrived."

_______________________

Baltimore, Md.: I got the anecdote from Hunter S. Thompson, so make of it what you will...

Shankar Vedantam: Thanks, Baltimore.

_______________________

Chicago: Do you have any insights as to how one would find a way around this "backfire" effect? How can we point out to someone with firm beliefs that what they think is factually inaccurate without causing cognitive dissonance?

Jason Reifler: If our speculation is correct that receiving difficult information leads one to counter-argue to protect a sense of self, then remedies probably lie in affirming one's sense of self before attempting to correct misperceptions. This approach is what Brendan and I are starting to test in a lab setting.

Shankar Vedantam: To put it another way, is it possible that fighting misinformation might be less about the facts and more about reassuring people that they are not dumb or stupid if they change their minds? If the process by which people believe bad information is partly psychological, is it possible the solution is partly psychological, too? Of course, this is very different from the conventional view of how to fight misinformation -- which argues that the antidote to bad information is merely to provide people with good information.

_______________________

Buffalo, N.Y.: I sincerely believe that citizens should have to take a civics exam to qualify to vote; it should be administered in an oral and written form, so it's not a matter of literal literacy but civic literacy. Why on earth should someone be allowed to vote if they don't know how many branches of government we have? Your thoughts?

Shankar Vedantam: Hmmm. There is a long (and rather disturbing) history of what happens when societies decide that some citizens are better capable of electing leaders than other citizens. I take nothing away from your point that voting should involve educating yourself about the candidates and issues, but I think most experiments that have tried to limit voting to one group or another have ended badly. (It may be worth noting that people who want to ban other people from voting usually think the people who ought to be kept out of voting booths are those who are most likely to vote for their political opponents!)

_______________________

More about reassuring people that they are not dumb or stupid if they change their minds: This is definitely what I do to my husband before I spring a big grievance. It seems to work.

Shankar Vedantam: On that funny and excellent note, we must bring this chat to a close. Thanks to all for excellent questions and a robust discussion. Thanks especially to Jason Reifler for giving us an hour of his time. Have a good day everyone -- and remember to be skeptical about the things you feel most certain about!

_______________________

Editor's Note: washingtonpost.com moderators retain editorial control over Discussions and choose the most relevant questions for guests and hosts; guests and hosts can decline to answer questions. washingtonpost.com is not responsible for any content posted by third parties.


© 2008 The Washington Post Company
