The Washington Post | Democracy Dies in Darkness

Opinion: Running a disinformation campaign is risky. So governments are paying others to do it.

Members of Anti-Vax Watch in Washington on March 25. (Katherine Frey/The Washington Post)

There’s a risk to running a disinformation operation yourself: If the operation gets caught, you get caught, too. So governments, politicians and parties around the globe have found a solution. They pay someone else to do it for them.

Facebook’s latest report on so-called coordinated inauthentic behavior spotlights an attempt to spread lies about coronavirus vaccines. The effort flopped — but more important than its immediate effects is what it tells us about the trajectory of such campaigns. Facebook describes a “disinformation laundromat” that attempted in two phases to discredit Pfizer-BioNTech and AstraZeneca, planting misleading material on platforms such as Medium and Reddit and then spreading the content through fake accounts on social media sites. The organizers also tried to recruit influencers to pump out the message; those plays ultimately undid the enterprise when targets blew the whistle instead of taking the bribe.


The cross-platform nature of the attempt hammers home the need for a whole-of-society approach to disinformation. The tactics, however, are only one piece of the puzzle. Another is who’s carrying them out: in this case, a subsidiary of a British-registered marketing firm whose activities, according to a BBC investigation, are primarily conducted from Russia. The subsidiary, Fazze, describes itself as a “marketing agency” — placing it among a cadre of similar entities worldwide allegedly willing to provide propaganda services for undisclosed beneficiaries. These “black PR” firms will, as Israel’s Archimedes Group put it, “use every tool … to change reality according to our client’s wishes.” The services aren’t typically as sophisticated as government campaigns, but they offer obfuscation. They’re also cheap.

The Oxford Internet Institute unearthed third-party contractors targeting at least 48 countries last year, and some are much closer to home than to Moscow. A campaign under the banner India vs. Disinformation publishing “fact checks” in favor of Prime Minister Narendra Modi turned out itself to be disinformation, connected to a Canadian communications company called Press Monitor. (Press Monitor has acknowledged that it had contracts with the Indian government and some of its foreign embassies, but said India vs. Disinformation was an “independent” project.) A group of Facebook pages doing more or less the same thing to prop up Bolivia’s right-wing government was traced to CLS Strategies, a communications firm based in Washington. (CLS put the head of its Latin America practice on administrative leave after Facebook revealed its involvement last summer, but disputed that the work qualified as foreign interference.) The trouble, as with Fazze, is that it is hard to tell where a company is really headquartered, and harder still to tell whom the company is working for, regardless of its location.

The proper policy response is elusive, not least because of the implications for free corporate expression. A vow among nations not to pay private companies to do their disinformation dirty work — and to prevent those within their borders from engaging in foreign interference? A sort of export control regime for disinformation as a service? President Biden, after a false start, might have had some success persuading Russian President Vladimir Putin to crack down on the worst ransomware actors in Russia. Now countries interested in Internet arms treaties have another weapon that needs controlling.