The Washington Post | Democracy Dies in Darkness

YouTube joins Silicon Valley peers in preelection QAnon clampdown

The move, while stopping short of a full ban, puts the Google-owned platform in line with Facebook and Twitter.

(Eric Piermont/AFP/Getty Images)

Google-owned YouTube said Thursday it is taking broad action against the QAnon conspiracy theory, better aligning its policies with Twitter and Facebook in the critical weeks before the presidential election.

YouTube said that it would update its hate and harassment policies to “prohibit content that targets an individual or group with conspiracy theories that have been used to justify real-world violence.”

The QAnon conspiracy theory, which has spread widely under a banner of protecting children from pedophilia, has been associated with violent incidents, including in 2018, when an armed man touting the ideology was arrested after a standoff at the Hoover Dam.

YouTube is the latest social media company to institute a clampdown on the most virulent aspects of the right-wing conspiracy theory, which has entered the mainstream in large part because of its viral growth on social media. Facebook and Twitter initiated broad crackdowns this summer that closed or limited the reach of more than 20,000 QAnon-connected accounts and pages.

YouTube did not say how many videos and channels it expected would be affected by its modified policies, but a number of popular accounts were removed shortly after the announcement.

The company appeared not to go as far as Facebook and Twitter, which have effectively suppressed communities devoted to the conspiracy theory, noting that it would still permit content discussing QAnon “without targeting individuals” or groups protected under its hate speech policies. That has been a difficult line to walk for Silicon Valley. Facebook in August said it would ban online forums focused on QAnon only when the discussion involved potential violence, a qualification it eschewed two months later in imposing a more rigid ban.

“Today, we are taking another step in our efforts to curb hate and harassment by removing more conspiracy theory content used to justify real-world violence,” the company said in a blog post.

What started as a mysterious message on the online forum 4chan in 2017 has grown into one of the world’s most virulent conspiracy theories, spanning numerous countries and, by some estimates, tens of thousands of adherents. At the core of QAnon are baseless allegations that Democratic officials and Hollywood celebrities have engaged in unconscionable crimes, including raping and eating children, while seeking to subvert the Constitution. President Trump, the conspiracy theory holds, is quietly battling these evils.


Trump, when given the opportunity, has declined to distance himself from the deluded worldview. At a White House briefing in August, he said he appreciated the support of QAnon adherents, calling them “people that love our country.” When a reporter outlined the erroneous claims underlying the theory — that “you are secretly saving the world from this cult of pedophiles and cannibals” — Trump seemed to embrace that role for himself.

“I haven’t heard that. Is that supposed to be a bad thing or a good thing?” the president said. “If I can help save the world from problems, I am willing to do it. I’m willing to put myself out there. And we are actually, we’re saving the world.”

YouTube has acted against QAnon in the past. In 2019, the company removed tens of thousands of QAnon videos and hundreds of related channels when it updated its hate speech policies to ban denials of major violent events, such as the Holocaust, and conspiracy theories targeting protected groups.


Updated policies curtailing misinformation about covid-19 also suppressed aspects of the online QAnon movement, which has been a major vector for falsehoods about the pandemic. The restrictions came up short, however, when a video called “Plandemic” gained traction on YouTube and other social media with false claims that shadowy elites were engineering the coronavirus and a possible vaccine to enrich themselves.

While originally posted on YouTube and other platforms, the video appeared to take off because of links shared within major Facebook groups devoted to QAnon, communities that grew, before Facebook’s enforcement actions, in part because of the platform’s own recommendation algorithm. According to YouTube, changes to its recommendation system have cut views of major QAnon-related channels coming from non-subscribed recommendations by more than 80 percent since January 2019.