And then something odd happened. Some rejected Kessler's analysis, often pointing to this Post article from the week after the attacks, which mentioned that "law enforcement authorities [in Jersey City] detained and questioned a number of people who were allegedly seen celebrating the attacks and holding tailgate-style parties on rooftops while they watched the devastation on the other side of the river." Celebrations by some people on rooftops are pretty far from thousands of people being televised celebrating — and those rooftop celebrations were never confirmed as having actually happened. The article, in other words, in no way proves that Trump was telling the truth.
It was reminiscent of the controversy over Ben Carson's story about winning an honesty award while a student at Yale. The Wall Street Journal found no evidence that the story had happened, prompting Carson to offer evidence that objectively didn't prove the point. To an outside observer, Carson noting that there was a class called "Perception" at Yale in 2002 doesn't prove that the candidate took a similarly named class while studying there in 1973 — much less that he was given $10 for his honesty. But a number of Carson's supporters accepted it as exculpatory.
It's hard to prove a negative — that Trump didn't see what he says he saw, or that Carson didn't get that $10 — as Kessler noted in his analysis. But it's not hard to recognize that evidence intended to back your point of view actually does no such thing.
But that's not the way humans work.
"We all think of ourselves as being these rational people. We hear evidence, and we process it," said Peter Ditto, professor of social psychology at the University of California at Irvine, when we spoke by phone this week. "What's clear from decades of social psychological research is that people's emotions get involved in their reasoning, their motivations, their intuitions. Those shape and bias the way we process information."
"It's not that people believe anything they want to believe. People still think and need rationale," Ditto said. "But the things that we feel change what we count as evidence."
You're probably familiar with the concept of "motivated reasoning." That term refers to the tendency of people to rationalize on behalf of outcomes they want to see. Maybe you're thinking about making a leftover sandwich from your Thanksgiving turkey but are on a diet. If you don't eat the turkey, it will spoil, you might think, offering a reason to do what you want despite any number of arguments that could be made contrary to that impulse.
Ditto talks about something similar — motivated skepticism.
"People tend to be a lot more skeptical of information they don't want to believe than information they do want to believe," he said. He suggested that most users of Facebook would be familiar with this. Someone with an opposing political position on an issue might share an image that you can immediately see is false or misleading — but you're more motivated to be skeptical than the other person. (Think Donald Trump and the murder rate tweet.) "People tend to just sort of scoop up information they want to believe and uncritically analyze it," Ditto said, "and then are much more skeptical and allocate their skepticism in a biased way."
That compounds over time: people compile evidence that supports their view and critically dismiss evidence that doesn't, until the evidence on their side eventually seems overwhelming.
Those of us who write about politics with some regularity will notice a multiplier effect: emotion. The candidates who prompt the most complaints and the most pushback on articles tend to be those with the most energized bases of support: Trump, Carson and Bernie Sanders. That the Republican base is unusually angry at government is a large part of what has given Trump and Carson room to breathe in the primary.
That overlap doesn't surprise Ditto. "The more passionate people are, the more morally convinced they are about the issue, the more they care about that in various other ways, the more biased they're likely to be," he said.
But there's no indication from his research that conservatives or liberals are more prone to bias in selecting evidence. He has been conducting a meta-analysis of past studies on this question. "Both sides show a clear bias," he said. "They're more likely to accept the same information as valid if it supports their political views than if it doesn't, and the magnitude of that effect is exactly the same" between political sides.
Whether close scrutiny should be applied at all also differs based on motivation. Trump's story is "close enough to the truth" for his supporters, Ditto said. Defenders of Carson's life story offer a similar argument: "'What [he's] saying is basically true, so don't quibble with me about the details,'" Ditto said. But many of those supporters "just wouldn't accept that from the other side. Those same people that are saying 'why quibble on these words' have a history of going after President Obama's words and picking them apart very carefully."
The end result is that data or evidence is often a bad way to try to convince someone of an argument — which is precisely why, despite being armed with a surfeit of how-to guides for changing your family's politics over the holidays, you didn't actually change anyone's mind.
"What you guys do," Ditto said of members of the news media, "is present facts and try to say it louder and louder. That's very unlikely to work. I always kind of compare it to [people] who go to another country and don't understand the language, so they just say it louder and louder."
All of which suggests that, unless you are already skeptical of Trump's story, this will not convince you:
Alleged but unconfirmed contemporaneous stories of New Jersey arrests do not prove Trump right!
If it did, I suspect Ditto would like to speak with you.