The company declined to say how many Facebook users would be affected by the changes.
The baseless and often mutating QAnon philosophy, which has been identified by the FBI as a potential domestic terrorism threat, has gained prominent backers, including a slew of congressional candidates. One of them, Marjorie Taylor Greene, is almost certainly headed to Congress after winning a Republican primary in a deep-red Georgia district last week. Trump, as well as top White House aides and members of his campaign, has repeatedly elevated the movement’s iconography and rallying cries — in an apparent embrace of the conspiracy theory, whose nucleus is fealty to the president.
Facebook’s move reflects the growing pressure facing Silicon Valley to take action against groups that incite violence or threaten others during a nationwide racial reckoning and a tumultuous election year.
The action also addresses militia organizations and groups planning or supporting violence related to recent protests. Facebook said the militia groups, like QAnon, “have demonstrated significant risks to public safety but do not meet the rigorous criteria to be designated as a dangerous organization and banned from having any presence on our platform.”
“While we will allow people to post content that supports these movements and groups, so long as they do not otherwise violate our content policies, we will restrict their ability to organize on our platform,” the company said in a blog post.
Facebook’s move Wednesday is less severe than recent action by Twitter, which last month removed or limited the reach of roughly 150,000 QAnon accounts, citing the online movement’s association with violence and real-world harm. Twitter also removed related URLs from tweets and said it would prevent the conspiracy theory from appearing in recommendations and trending topics.
Facebook said that it would no longer recommend groups or pages devoted to QAnon to its users and that content from these communities would soon be down-ranked in the News Feed. Pages affiliated with the conspiracy theory will no longer be allowed to purchase ads or use the platform’s commercial listings.
QAnon took root on Internet message boards in the fall of 2017, with posts from a self-proclaimed government insider identified as “Q.” The mysterious online figure left cryptic — and fallacious — clues about Trump’s impending conquest over the “deep state,” spawning an elaborate far-right worldview that came to absorb many other debunked ideas.
Soon, believers began flocking to Trump rallies as the philosophy gained currency in more mainstream online communities. It has also been central to numerous violent acts, according to law enforcement, including two killings, a kidnapping, vandalism of a church and a heavily armed standoff near the Hoover Dam.
Recently, QAnon has been a major force in propelling misinformation about the coronavirus, including by aligning with Facebook groups that oppose vaccination, and in pushing falsehoods about Sen. Kamala D. Harris (D-Calif.), the Democratic vice-presidential nominee.
In the face of a growing public backlash as well as internal pressure, Facebook and other tech giants have started to take stronger actions against Trump and some of his most extreme supporters.
Earlier this month, Facebook removed from Trump’s account a clip from a Fox News interview in which he said falsely that children are “almost immune” to covid-19. Twitter required his campaign’s account to delete a post that included the same video, temporarily preventing the account from tweeting. Both companies have applied labels to some of his posts about mail-in voting, seeking to direct users to more-reliable information.