“QAnon messaging changes very quickly and we see networks of supporters build an audience with one message and then quickly pivot to another. We aim to combat this more effectively with this update that strengthens and expands our enforcement against the conspiracy theory movement,” the company said in its blog post.
The ban encompasses all Facebook pages and groups devoted to QAnon, as well as Instagram accounts whose names represent the conspiracy theory. It does not reach individual Facebook profiles or posts, meaning conversation about QAnon will be far from forbidden on the platform.
This action comes after more than two years of mounting evidence that the QAnon conspiracy theory is rife with violent, hateful themes that regularly violate policies across Silicon Valley and have inspired numerous real-world crimes.
"Ultimately the real test will be whether Facebook actually takes measures to enforce these new policies — we’ve seen in a myriad of other contexts, including with respect to right-wing militias like the Boogaloos, that Facebook has repeatedly failed to consistently enforce its existing policies,” said Sen. Mark R. Warner (D-Va.), who has been pushing Facebook for more action against QAnon.
At the core of QAnon, which took root on anonymous message boards in October 2017, are baseless allegations that Democratic officials and Hollywood celebrities engaged in unconscionable crimes, including raping and eating children, while seeking to subvert the Constitution. President Trump, the conspiracy theory holds, is quietly battling these evils.
The “Q” of QAnon is supposedly a high-level government official privy to these secrets because of a top-secret security clearance. The shadowy figure speaks only on the site 8kun, a successor to the now-closed 8chan, but the information for years spread almost instantly across mainstream social media platforms, powered by those analyzing Q’s pronouncements.
The conspiracy theory has grown particularly popular on the political right, with more than 70 Republican candidates for office embracing at least some elements of QAnon this year, according to tracking by liberal research group Media Matters. One adherent, Marjorie Taylor Greene, is virtually guaranteed to win a seat in Congress in November’s election.
QAnon this year has played a key role in spreading disinformation related to Covid-19 and the vaccines that might help remedy it, as the conspiracy theory has expanded to take on new themes, such as the supposed dangers of 5G cellular technology.
Facebook’s August action left many critics frustrated that the company hadn’t gone further to curb a conspiracy theory that, while active on fringe platforms, used the amplification power of mainstream ones to reach many more people. In the first month of that policy, Facebook said it removed more than 1,500 pages and groups that discussed potential violence and more than 6,500 pages and groups tied to hundreds of “militarized social movements."
But, company officials concluded, they had not done enough.
“It’s really important that they pushed beyond that to see that this conspiracy was being pushed in other places,” said Vanita Gupta, president of the Leadership Conference on Civil & Human Rights, a Washington-based umbrella group, after Tuesday’s action. “So it’s a significant development.”
Travis View, who co-hosts a podcast called “QAnon Anonymous,” said of Tuesday’s action, “It will probably hinder the growth of QAnon and decrease the reach of misinformation generally. But there is a risk that this may cause some to seek out the QAnon community on more extreme platforms. There’s a possibility that the QAnon community will be smaller than it might have been, but more volatile.”
Facebook moved quickly on Tuesday to scrub QAnon content that had been widespread on the platform. A page with 130,000 followers, called “Q Pin” and devoted to “all things Q,” remained active on the platform six days after The Washington Post had raised questions about violent language appearing on its posts. It was removed within an hour of the announcement of the new policy.
Three times last month the page shared an “Army for Trump” website seeking to recruit volunteers to stand watch at the polls, among other responsibilities. On one post, a user commented to call Democrats “dead ducks in the water.” Another user falsely suggested Democrats had promised “Riots and Murders” surrounding the election and asked how Republicans would respond.
The language illustrates how difficult it was for the platform to remove only QAnon content involving potential violence — Facebook’s previous standard but a fine line for a movement that envisions the mass arrest of Democrats and celebrities.
Groups that came down on Tuesday had such names as “WhereWeGo1WeGoAll” — the motto of the movement recited on camera by the likes of Michael Flynn, Trump’s first national security adviser — and “17- TRUTHERS UNITE !,” a reference to the 17th letter in the alphabet, Q.
Facebook’s action drew praise on Tuesday from those who had been calling on the platforms to do more to combat QAnon.
“I’m pleased to see Facebook taking this action. And I hope we see other social media platforms follow their lead swiftly. But the growth of QAnon online is not just the fault of the social media platforms. We need political leadership — leadership from all elected officials, and from all sides of the political spectrum — to denounce QAnon and similar groups,” said Daniel J. Jones, a former FBI analyst and Senate investigator who led the review of the CIA’s torture program and is now president of Advance Democracy.
The group found similar problems on Twitter following its enforcement action in July. Steps taken by the company led to QAnon content dropping by roughly half, while a significant amount of it remained on the platform.
A Twitter spokesman did not immediately respond to a request for comment about whether the social networking platform would step up its enforcement in connection with Facebook’s move.