
During the 2016 campaign — and through the first month of the Trump presidency — I got asked one question more than any other: Why don't you guys fact-check Donald Trump?

The answer I always offered was: We do! Lots. And I would point them to the great work of The Post's Glenn Kessler and Michelle Ye Hee Lee during the campaign. That duo fact-checked 92 Trump claims — two-thirds of which they found to be totally false.

It is indisputable, then, that The Post — and lots of other media organizations — fact-checked Trump. A lot. What people were really asking was: “Why don't the fact-checks of Donald Trump change people's minds?” As in: If two-thirds of what Trump says just isn't true, how could he have won the White House?

A piece by the New Yorker's Elizabeth Kolbert answers that question better than I ever could, detailing, through a series of fascinating social experiments, why people believe what they believe — and why facts have very little to do with it.

You should read the whole piece. But these two passages really jumped out at me.

1. “People believe that they know way more than they actually do. What allows us to persist in this belief is other people.”

2. “If your position on, say, the Affordable Care Act is baseless and I rely on it, then my opinion is also baseless. When I talk to Tom and he decides he agrees with me, his opinion is also baseless, but now that the three of us concur we feel that much more smug about our views.”

Opinions about everything — including politics — are not made more convincing simply because they are backed by a steady helping of facts. Most often, opinions are created and strengthened by affirmation from other people. And once you and someone else believe something — whether it's true or not — you have not just double the assurance that you're right but exponentially more.

A 2014 survey testing how people thought the U.S. should respond to the Russian incursion into Crimea illustrates the point:

Respondents were asked how they thought the U.S. should react, and also whether they could identify Ukraine on a map. The farther off base they were about the geography, the more likely they were to favor military intervention. (Respondents were so unsure of Ukraine’s location that the median guess was wrong by eighteen hundred miles, roughly the distance from Kiev to Madrid.)

A lack of knowledge about the region or its politics wasn't seen as any sort of hurdle to committing military forces. Quite the opposite. Why? Because people's views were divorced entirely from any need for facts. They and the people around them believed something, and that belief alone turned it into fact.

This tendency to convince ourselves that we are right because we know people who agree with us — even if the facts don't — is absolutely central to understanding the so-called fake news allegations flying back and forth these days.

Many of President Trump's supporters call anything they disagree with “fake news” because, to them, it is. They don't know people who agree with what the mainstream media is reporting, so it must be fake. Of course, news isn't fake simply because you disagree with it. News is fake only if it isn't true in light of all the known facts — not because it doesn't “feel” true to you.

Former vice president Joe Biden used to like to say, “You're entitled to your own opinions but not your own facts.” Increasingly — and unfortunately — that's not true.