The Washington Post
Democracy Dies in Darkness

Conspiracy theorists, banned on major social networks, connect with audiences on newsletters and podcasts

Newsletter company Substack is making millions off anti-vaccine content, according to estimates

Demonstrators rally against vaccine mandates at D.C.'s Lincoln Memorial on Sunday. (Eric Lee/Bloomberg News)
Correction

Mercola’s newsletter is among the top 20 most popular in the political category on Substack, and subscribers pay $50 a year. A previous version of this story incorrectly said subscribers pay $50 a month and that Mercola’s newsletter was among the top 10 most popular on Substack. This article has also been updated to clarify that the $2.5 million is an estimate for revenues.

Joseph Mercola, a leading anti-vaccine advocate whose screeds have been restricted by YouTube and Facebook, this month warned that the unvaccinated might soon be imprisoned in government-run camps. The week before, he circulated a study purporting to use government data to prove that more children had died from covid vaccines than from the coronavirus itself.

Shut down by major social media platforms, Mercola has found a new way to spread these debunked claims: on Substack, the subscription-based newsletter platform that is increasingly a hub for controversial and often misleading perspectives about the coronavirus.

Substack, which researchers from the nonprofit Center for Countering Digital Hate say makes millions of dollars off anti-vaccine misinformation, on Wednesday defended its tolerance for publishing “writers with whom we strongly disagree.”

Prominent figures known for spreading misinformation, such as Mercola, have flocked to Substack, podcasting platforms and a growing number of right-wing social media networks over the past year after getting kicked off or restricted on Facebook, Twitter and YouTube.

Now these alternative platforms are beginning to face some of the scrutiny that has imperiled social media services. But there’s a fundamental difference in the architecture of newsletters and podcasts when compared to that of social media companies. Social networks use algorithms to spread content — sometimes misinformation — to users who don’t want to see it. Newsletters and podcasts don’t.

These newer platforms cater to subscribers who seek out specific content that accommodates their viewpoints — potentially making the services less responsible for spreading harmful views, some misinformation experts say. At the same time, the platforms are exposing tens of thousands of people to misinformation each month — content that can potentially lead people to engage in behaviors that endanger themselves and others.

Earlier this month, 250 doctors and scientists wrote an open letter to the music streaming platform Spotify asking the company to drop host and comedian Joe Rogan — one of its most popular podcasters — for discussing conspiracy theories about vaccines. Neil Young asked the company to remove his music in protest this week, saying in a letter that Spotify “can have Rogan or Young. But not both.” (Spotify dropped Young on Wednesday.) Former Trump adviser Stephen K. Bannon, who was booted from Spotify in 2020, used his popular podcast, available on multiple platforms, to disseminate violent rhetoric and false claims about the election in the weeks leading up to the Capitol siege on Jan. 6.

Substack, which was founded in San Francisco in 2017, is part of a growing crop of subscription-based services whose mission is to help creators, authors and other influencers get paid for building more intimate relationships with devoted audiences. Readers pay per month to subscribe to a certain author, and the author keeps 90 percent of the revenue, while Substack takes 10 percent. The subscription model has become so popular that Twitter recently launched a subscription service and Facebook has outlined plans for paid subscription-based newsletters for authors and creators.


Mercola has been banned from YouTube, and his content has been restricted on Facebook. He uses his remaining public channels — like Twitter — to direct people to a “Censored Library” of articles he publishes in his newsletter, which is one of the top 20 most popular political newsletters on Substack.

Mercola did not respond to a request for comment.

This type of content is “so bad no one else will host it,” said Imran Ahmed, CEO of the Center for Countering Digital Hate, a nonprofit that focuses on combating misinformation and has researched Substack. By splitting subscription profits with creators, the group estimates, Substack generates at least $2.5 million a year in revenue from just five anti-vaccine leaders who have amassed tens of thousands of subscribers, each paying $50 a year.

Substack declined to comment, but shortly after The Washington Post made inquiries, CEO Chris Best and his two co-founders published a blog post saying that putting up with “the presence of writers with whom we strongly disagree” was a “necessary precondition for creating more trust in the information ecosystem as a whole.”

“The more that powerful institutions attempt to control what can and cannot be said in public, the more people there will be who are ready to create alternative narratives about what’s ‘true,’ spurred by a belief that there’s a conspiracy to suppress important information,” they wrote.

Facebook groups and other closed forums have long been plagued with misinformation because they are essentially echo chambers in which users share similar viewpoints, experts say, and newsletters face similar problems. They can make like-minded people more radicalized in their beliefs. And a popular newsletter can be picked up and amplified by other outlets, as well as forwarded to others.

Early on, social media companies took a hands-off approach to policing content. Only posts directly advocating violence or lawbreaking were removed. But Silicon Valley firms like Facebook, YouTube and Twitter have changed their approach over the past four years in response to controversies, including the use of their services for online bullying and sowing disinformation. They have crafted policies that police many forms of harmful material, including banning misinformation about the coronavirus, and have hired small armies of moderators who scan content and delete what breaks the rules. They also work with fact-checkers that help the companies label content that is inaccurate.

The rules social media companies have designed for advertising are even stricter because companies do not want to be perceived as profiting off hate and other social ills.

Still, misinformation creeps through and proliferates.


Substack, by contrast, is operating under standards that resemble those of social media companies in their early days. Chief executive Best said he wants to build a platform for “questioning conventional wisdom,” where “dissent is allowed.” He said he disagrees with the way social media companies have been pushed into becoming “censorship police.”

Best has even made a point of contrasting his business model with that of social media companies, saying the purpose of firms like Substack is to let people “take back” their minds from their social media feeds, which he refers to as “amplification machines.”

Substack’s limited content guidelines say the company bans “harmful content” but offer little detail beyond saying that this includes “material that advocates, threatens, or shows you causing harm to yourself, other people, or animals.” The company did not respond to questions about how the rules are being enforced, other than to say it maintains a hands-off approach.

Joan Donovan, research director of the Technology and Social Change Project at the Shorenstein Center on Media, Politics and Public Policy, said the attitude of companies like Substack was only going to invite further scrutiny.

“Openness is easily exploited, so a lack of policy means the brand’s reputation will be dragged anytime there is a major scandal,” she said. “Substack’s brand will be tied to its most controversial creators. Clear policy will ensure they can enforce their terms early on before a creator has caused so much damage that it’s impossible to separate bad actors from a bad product.”

The Centers for Disease Control and Prevention has said approved coronavirus vaccines are effective in preventing covid hospitalizations and deaths. Mercola’s claim that they’ve killed more children than covid has been debunked by PolitiFact, which cited the CDC as saying there is no clear evidence that the vaccines have caused any deaths. The claim about potentially putting the unvaccinated in prison camps is a misleading reference to a New York bill from 2015 that relates to Ebola, not the coronavirus, according to Reuters.


The Center for Countering Digital Hate (CCDH) calculated its minimum profit figure for Substack by looking at five authors who are known anti-vaccine advocates or have expressed skeptical opinions about vaccines.

Substack does not publish exact subscriber numbers but indicates whether an author’s subscribers number in the thousands or tens of thousands. The CCDH took those rough subscriber counts and estimated annual revenue by using low-end figures for how much each subscriber pays. It then calculated Substack’s share using the 90-10 split, with 90 percent of revenue going to the author and 10 percent to Substack.
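The methodology described above amounts to simple arithmetic, which can be sketched as follows. Note that the per-author subscriber counts and prices below are hypothetical placeholders consistent with the article’s figures, not the CCDH’s actual per-author data.

```python
# Back-of-envelope sketch of a CCDH-style revenue estimate.
# Each author is a (estimated_subscribers, annual_price_usd) pair.
def substack_revenue_estimate(authors, substack_share=0.10):
    """Return (gross revenue, Substack's cut, authors' cut) under a 90-10 split."""
    gross = sum(subscribers * price for subscribers, price in authors)
    return gross, gross * substack_share, gross * (1 - substack_share)

# Hypothetical low-end figures: five authors, each with 10,000
# subscribers paying $50 a year (the article's quoted price).
authors = [(10_000, 50)] * 5
gross, substack_cut, author_cut = substack_revenue_estimate(authors)
# gross = $2,500,000 a year, of which Substack keeps 10% ($250,000).
```

Under these illustrative inputs, the gross figure matches the article’s estimate of at least $2.5 million a year generated across five anti-vaccine writers.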

“Substack should immediately stop profiting from medical misinformation that can seriously harm readers,” said Ahmed, the CCDH chief.
