A piece of graffiti in Adelaide, Australia. (Michael Coghlan/Flickr)

Typically, we use this space to debunk the various hoaxes, charades and conspiracy theories that afflict social media each week. But this week, I can’t do it. I must abstain. Because someone’s done a study on debunk efforts like this one, and bottom line? They’re all in vain.

To reach this heart-rending conclusion, Walter Quattrociocchi — the head of the Laboratory of Computational Social Science at IMT Lucca — and a team of seven (!) other researchers studied how two groups of U.S. Facebook users interacted with news on the site. One group was made up of people who interact with reputable science pages. (Those are the ones who presumably have some level of news literacy.) The other group was made up of people who like far-out conspiracy pages — anti-vaxxers, Illuminati-watchers, that kind of thing.

[If you use Facebook to get your news, please — for the love of democracy — read this first]

They quickly came to two conclusions about the conspiracy and non-conspiracy groups. First off: They didn’t overlap at all, which means the misinformed, as we’ll politely call them, were unlikely to ever see the truth. And second: When the conspiracy group did encounter “debunking” information, it didn’t change their minds. In fact, it just made them more resolute: After encountering a post that challenged a conspiracy theory, theorists tended to like and comment on pages about that theory even more.


Do debunk efforts change people’s minds? Well — not really. The orange line shows the rate at which people stop engaging with conspiracy posts if they HAVE seen debunks. The green line is the same rate, but if they haven’t: It’s faster. (Quattrociocchi et al)

That counterintuitive effect, Quattrociocchi writes, has something to do with the conspiracy echo chamber: Because social environments like Facebook allow users to mold them to their own tastes, conspiracy theorists are only ever exposed to people and information “that conforms with their beliefs.” (More research will be needed, Quattrociocchi has said, to determine whether Facebook’s algorithms exaggerate that tendency; a controversial study, published last May, suggested that they do — albeit modestly.)

On top of that, the identities of many conspiracy groups are built around rejecting the mainstream narrative. Even when confronted with eyewitness accounts or statistics or studies or other pretty solid evidence, their sense of self requires they reject it.

[A YouTube video claims ‘Back to the Future’ predicted 9/11 — and that isn’t even the weird part]

This is … depressing, frankly! And not merely because I’ve sunk 76 almost-consecutive weeks into this column. It’s also depressing because, with debunks or without them, Quattrociocchi expects hoaxes and conspiracy theories to grow more common. When you break them down into their component parts, these hoaxes are just some mix of bias, functional illiteracy and institutional mistrust — not exactly quick-fix problems.

They’re only heightened by the sheer amount of information on the Internet, and the pace at which new information comes out: Simply put, no one has the time or cognitive capacity to reason out all the apparent nuances and discrepancies.

“Facebook and Twitter have created a direct path of content from producers to consumers, changing the way users become informed, debate ideas, and shape their worldviews,” Quattrociocchi et al. write. “This scenario might foster confusion about causations regarding global and social issues and thus encourage paranoia based on false rumors.”

Can confirm: That is indeed happening! And apparently, debunking them doesn’t do a darn thing.

“What was fake” will return in its conventional form next week, even though its futility pains me. If you see any more Internet shenanigans before then, please e-mail caitlin.dewey@washpost.com.
