When a guy with an assault rifle walks into a pizza joint to “self-investigate” the made-up conspiracy theory he found on the Internet about a nonexistent child-prostitution ring, there’s no doubt we’ve got a problem.

And regular folks are reasonably alarmed.

A new Pew Research Center study finds that 2 in 3 U.S. adults say that fabricated news stories cause “a great deal of confusion about the basic facts of current issues and events.” This sense is shared widely across incomes, education levels, political affiliations and most other demographic characteristics, according to the study.

Pope Francis agreed, memorably comparing the consumption of fake news to the eating of excrement. (A much-shared fake story said he had endorsed Donald Trump for president.) President Obama has chimed in on the dangers, too: “When there’s so much active misinformation and it’s packaged very well,” he said, it poisons political discourse.

Facebook, initially reluctant to step into the fray, announced Thursday that it would take some first steps.

“We’ve focused our efforts on the worst of the worst,” wrote Adam Mosseri, a Facebook vice president. Those efforts include testing ways for users to report what they suspect is fake news; working with the Poynter Institute’s International Fact-Checking Network to provide users with verified information on “disputed” stories; and reducing the financial incentives to spammers.

The idea is to slow the spread of fake news without turning Facebook into a worldwide censor.

It’s a promising start, given Facebook’s outsize role as a purveyor of fake news to its nearly 2 billion users.

And it certainly beats one of the ideas that surfaced in the Pew survey — that the government or politicians should act to stop the spread of fake news. (Asked who should tackle the problem, respondents gave about equal weight to government, tech companies such as Facebook and Google, and the public.)

Government involvement is a seriously bad idea. It could put the question of what constitutes real news and what constitutes fake news in the hands of those who may be most affected by it.

And given the ascendancy of Trump, who traffics in falsehoods on a regular basis — and has been clear about wanting to limit long-established press rights — it’s an even worse notion.

“I’m not suggesting we throw up our hands and say there’s nothing to be done, but we need to be very rigorous in defining the problem and thinking through the implications,” said Jameel Jaffer, director of the Knight First Amendment Institute at Columbia University.

For one thing, he says, “there’s a very narrow category of content that everybody would agree upon.”

The term “fake news” is fuzzy. It can refer to a multitude of problems, including disinformation, propaganda, conspiracy-mongering or what Jaffer calls “very biased takes on public affairs.”

“I don’t think we want the government — or, for that matter Facebook — to be the arbiter of what’s true and what’s false,” Jaffer told me.

So what else can be done?

Eli Pariser, the founder of the viral-news site Upworthy, has set up an online clearinghouse for potential solutions. One of these: Verified news-media pages. A news organization would have to apply to be verified as a credible news source, after which its stories would be published with a “verified” mark, similar to Twitter’s check mark.

Another is adding a “fake news” flag to questionable articles. The flag could be user-generated or crowdsourced; readers would at least see a warning box before clicking through, potentially slowing the spread. As the Guardian noted in a recent survey of such proposals, this approach would be vulnerable to gaming: “Users could spam real articles with fake tags.”

All of these ideas are open to claims of bias.

In a world increasingly plagued by social-media filter bubbles and partisan echo chambers, it’s tough to get agreement even on the color of the sky — much less the role of Russian cyber-intrusion into the American presidential election.

Some in the Trump camp protest that there are no facts anymore. But that way lies anarchy.

Amy Mitchell, Pew’s director of journalism research, told me that the new survey reinforces earlier research findings: “Americans have a hard time agreeing on the facts.” But the Pew survey — mostly done before the gunman walked into the aforementioned D.C. pizza shop, Comet Ping Pong — makes clear that people find the proliferation of fake/false news confusing and want action.

Facebook and other tech giants need to keep moving on this, while being ever-mindful of legitimate free-speech concerns. That’s a very tricky balance, with hazards everywhere.

The answers don’t lie in government oversight, which can quickly turn to censorship.

Perhaps most important: We all must get smarter about what we’re reading and viewing.

Schools should be redoubling their efforts to teach news literacy, civics and history. News literacy organizations deserve more support than ever. Fact-checking, and good judgment, informed by radical skepticism, matter most. And yes, a slower trigger finger on the share buttons.

Truth may indeed be hard to pin down. But facts do exist — and underground tunnels at Comet Ping Pong don’t.

For more by Margaret Sullivan visit