The result is an ever-widening conspiracy-theory movement that, despite efforts by Silicon Valley to clean up its services, remains a primary fount of disinformation in the closing days of the hotly contested presidential election.
QAnon accounts have been particularly aggressive recently in spreading unverified claims about the supposed business dealings of Hunter Biden, the son of Democratic presidential nominee Joe Biden. The accounts also have sought to undermine confidence in mail-in voting, warning, without evidence, that the election will be “stolen” by Democrats.
“Despite the actions of social media platforms to curtail the growth of QAnon, QAnon conspiracy theories and misinformation continue to gain traction online at an alarming rate,” said Daniel J. Jones, a former FBI analyst and Senate investigator who now is president of Advance Democracy, which studies disinformation. “This growth has real-world consequences. We’re going to see faith in our institutions continue to erode and that’s going to make it increasingly difficult to govern.”
Websites touting QAnon’s baseless claims, meanwhile, have rapidly adapted to efforts to knock them offline.
The anonymous message board 8kun, which is where the mysterious figure “Q” posts cryptic clues known as “drops,” this month began using a Russian company to protect itself from online attacks after a company based in Oregon cut ties with the site, according to research by the SITE Intelligence Group, which tracks online extremism.
It also found that other QAnon websites have been getting similar protection from the San Francisco-based Cloudflare. Cloudflare’s head of public policy, Alissa Starzak, said in a statement, “We do not discuss users without their permission.”
Shortly after QAnon’s birth in October 2017 on the anonymous message board 4chan, it spread to mainstream platforms such as Facebook, Twitter, Reddit and YouTube, which propelled its bogus claims about satanic pedophiles running the Democratic Party to much larger audiences. The platforms eventually cracked down — Reddit in 2018, and Facebook, Twitter and YouTube in the past few months — because of numerous violations of policies against hate speech and growing publicity about the movement’s tendency to fuel real-world violence.
But every time a company has acted against QAnon, the conspiracy theorists have found new venues for their messages while also learning to camouflage themselves on the old ones — using coded language or tweaking hashtags to avoid automated efforts to close accounts and remove content.
This adaptability caps a banner year for QAnon, which enjoyed explosive growth by spreading misinformation about the origins and nature of the coronavirus this past spring, portraying it as a hoax engineered by Microsoft co-founder Bill Gates and a “Satan-worshipping cabal” to confuse and control an unwitting public.
Disinformation about the safety of vaccines, already prominent among online conspiracy theorists, emerged this year as a key QAnon narrative, as did misinformation about the racial justice protests that emerged in the aftermath of George Floyd’s killing by Minneapolis police. QAnon groups on Facebook were still growing sharply after the company’s initial effort to control it in August, with a sample of 85 accounts showing an average increase of more than 100 members, according to an Oct. 14 report by SITE Intelligence Group.
“Fortunately, mainstream social media companies are now aggressively moving to keep QAnon off their platforms,” said Rita Katz, executive director of SITE. “But it will require more than just well-known companies to mitigate QAnon’s threat. QAnon already has a wealth of options — if not an entire infrastructure — to fall back on.”
In the swing states of Florida and Texas, one out of every 25 tweets about the election came from accounts affiliated with QAnon, and at least 7 percent of the responses or other engagements with Trump’s tweets came from accounts affiliated with QAnon, according to research from Advance Democracy.
The research, relying on data from digital intelligence firm Zignal Labs, also found that between January and mid-October, Twitter accounts affiliated with QAnon played a leading role in promoting content from far-right websites, from Breitbart to Gateway Pundit. The most active account posting on politics in Florida, meanwhile, had QAnon references in its name and profile information before being suspended earlier this month, the analysis found.
“We’re committed to protecting the public conversation and promoting healthy conversation on Twitter,” spokeswoman Lauren Alexander said in an email. “To date, through our work to deamplify content and accounts associated with QAnon, we have reduced impressions on QAnon-related tweets by more than 50%, meaning our users are seeing less unhealthy content on their feeds.”
Facebook spokeswoman Sarah Pollack said that since expanding its policies on conspiracy theories that spur violence in August, the company has “removed about 1,700 Pages, 5,600 Groups and about 18,700 Instagram accounts representing QAnon.”
Even as QAnon content remains popular on mainstream services, the specter of removal has pushed adherents to more hands-off platforms. Telegram, an encrypted messaging app based in Dubai that SITE Intelligence Group says “houses the majority of the international QAnon community,” has announced no enforcement against QAnon. The company did not respond to a request for comment.
QAnon also is thriving on Parler, a social media site popular with conservatives that has taken no action against the conspiracy theory, and Gab, whose founder, Andrew Torba, wrote that he “welcomes QAnon.” Parler did not respond to a request for comment. Gab did not offer a responsive comment to a Washington Post query.
T-shirt sales thriving
As QAnon messages continue to spread on various platforms, business ventures related to the conspiracy theory are also expanding — often using enforcement actions on mainstream social networking services as a selling point. Consumer products, from T-shirts to coffee to mugs, bind adherents to the conspiracy theory just as powerfully as do memes and online catchphrases, said Lisa Kaplan, CEO and founder of Alethea Group, a counter-disinformation consultancy.
“There’s truly an economy that’s now associated with QAnon,” Kaplan said. “We talk about it in the context of elections — for the first time there are QAnon candidates likely to be elected to Congress — but we haven’t talked enough about the individuals who are paying their mortgages and feeding their families based on QAnon merchandise sales.”
On 8kun, the sprawling online site that’s home to the QAnon conspiracy theory, diatribes against the so-called deep state and cryptic clues about Trump’s impending conquest over satanic forces are interspersed with more upbeat messages.
“Stickers,” one reads, “Stickers.” Another advises: “Get Yours!” A third, similarly rendered in eye-catching yellow and black, advertises “8kun coffee.”
The ads promote an e-commerce site operated by Jim Watkins, a U.S. Army veteran and businessman who runs 8kun from the Philippines. Denizens of the no-frills, no-rules forum find the ads intrusive. “They are tracking us,” one wrote recently about the posts.
The ads are also a reminder, said experts tracking the conspiracy theory, that QAnon is a moneymaking opportunity for Watkins — and for other people and groups operating platforms where the deluded philosophy is propounded. Watkins is the registered owner of the e-commerce domain linked to the ads, called Is It Wet Yet. The site is connected to a political action committee he founded in February called Disarm the Deep State, which enjoys a presence on both Facebook and Twitter and describes itself as the “first Super PAC dedicated to removing shadow government actors.”
“It’s obvious self-dealing,” said Fredrick Brennan, a software developer who founded 8kun’s precursor, called 8chan, in 2013 but has since called for it to be shut down, directing criticism at the site’s new owner that recently led Watkins to bring a “cyberlibel” case against him. “He knows he’s in total control of Q, whether or not he writes the posts himself, so he’s been able to turn it into an industry,” Brennan said.
Watkins did not respond to an email seeking comment.
The “profiteering,” said Brennan, extends beyond Watkins and has intensified because of the recent crackdowns on social media. Following enforcement actions by the major technology companies, the conspiracy theory has been sustained in large part by a robust digital marketplace for goods branded with its iconography.
The theory’s evangelists criticize Big Pharma, and then promote alternative medicines to those whom they’ve convinced. They warn of an impending social collapse, and then peddle “prepper” equipment for newly minted survivalists.
QAnon marketplaces bridge mainstream and fringe online platforms. The most popular site is Amazon, with more than 8,000 individual QAnon-branded products, according to an analysis conducted by Alethea Group and the Global Disinformation Index.
Amazon Books has at least 30 different QAnon-related texts, many of which end up on top-seller lists for religious philosophy, controversial religious knowledge and other obscure genres.
“Amazon is the one helping QAnon influencers make money, and it might even be more important than Facebook,” Brennan said. “They need payment processes as much as they need social media. If they don’t have payment processes, it doesn’t matter how viral they are — they can’t collect donations.”
An Amazon spokeswoman, Mary Kate McCarthy, declined to comment. (Amazon founder and CEO Jeff Bezos owns The Washington Post.)
The e-commerce site eBay, which did not respond to a request for comment, is another popular destination for QAnon gear, with 3,000 products related to the conspiracy theory, according to the analysis by Alethea Group and the GDI. Etsy, which once hosted similar material, acted against the conspiracy theory’s adherents this month, following the crackdown by Facebook.
A QAnon entrepreneur
Meanwhile, numerous independent e-commerce sites offer merchandise related to the conspiracy theory. While their owners present themselves as the bane of mainstream platforms and media gatekeepers, these networks benefit crucially from their associations with more established brands.
Exemplifying the arrangement is the White Hat Movement, a collection of sites operated by Dustin Nemos, a 33-year-old father of two whose real name is Dustin Krieger. Having amassed a lengthy rap sheet in North Carolina, including drug-related and burglary charges, Nemos said he relocated to southwest Florida. There, he makes a living by selling QAnon merchandise and participating in revenue shares with more prominent brands.
Among these is an emergency preparedness business, My Patriot Supply, which uses his name and likeness to sell emergency food supply products. The business, which enjoys a vast online presence, including more than 100,000 followers on Facebook, did not respond to a request for comment about its ties to Nemos.
“We’re not getting rich off the QAnon movement, which people want to say we are,” Nemos said.
Sales of his QAnon-related book on Amazon were foundational, Nemos said, helping to fund some of his other ventures. In all, the White Hat Movement has a reach of more than 130,000 Facebook users through seven pages and about 6,000 Twitter users on two accounts, Alethea Group’s analysis found. His ventures include right-wing news sites and other corporations registered in Wyoming and active since 2019.
Nemos has also sought to create a health insurance company trading on the “Make America Great Again” slogan, as well as an independent cellphone service, according to Alethea Group and GDI, which cited leaked logs from the social media site Discord. The 2019 logs suggest Nemos was consulting Judy Mikovits, the discredited virologist who gained viral popularity in far-right communities when she was featured this spring in the debunked “Plandemic” video.