Renee DiResta is the technical research manager at the Stanford Internet Observatory and a Mozilla Fellow. Michael McFaul is director of the Freeman Spogli Institute for International Studies and a Hoover fellow at Stanford University. Alex Stamos, a former chief security officer of Facebook, is director of the Stanford Internet Observatory and a visiting scholar at the Hoover Institution.

In 1983, an anonymous letter from an author claiming to be an American scientist appeared in an Indian newspaper, asserting that HIV, the virus then raging across the world, was a bioweapon released by the United States. Over the next several years, similar claims appeared in leftist and alternative newspapers around the world and came to be widely believed among those predisposed to distrust the Reagan administration. As late as 2005, a study showed that 27 percent of African Americans still believed that HIV was created in a government lab.

We now know that these claims were part of a massive Soviet disinformation campaign. And as successful as this operation was, its methods look modest and primitive in the age of the Internet. During the 2016 election campaign, Russian intelligence used the same technique, known as “narrative laundering,” to inject its preferred stories into mainstream American media. This time, Russian intelligence officers and their proxies supercharged their misleading stories with real documents: emails stolen from the Democratic National Committee and Hillary Clinton’s campaign manager. (Roger Stone, who has just been found guilty on charges of lying to Congress and witness tampering, served as a conduit for some of that material.)

Our research, published in a report appearing this week, describes and analyzes the Russian narrative laundering playbook. It is quite possible that these exact techniques will be used again. And why shouldn’t they? We’ve done almost nothing to counter the threat.

At least we know the primary culprit: Russia’s military intelligence agency, the GRU. During the 2016 U.S. presidential election, GRU operatives used WikiLeaks and fake personas (“DC Leaks” and “Guccifer 2.0”) to disseminate the hacked emails, which came to dominate coverage in both traditional and social media. That is yet another lesson that has survived from Soviet days: Narrative laundering is especially effective when the stories are built on real documents.

Although it is difficult to measure precise effects, the GRU was undoubtedly successful in changing the way Americans were talking about the two candidates at the time. (The WikiLeaks dump was timed to distract from the coverage of Trump’s “Access Hollywood” sexual harassment scandal.) Of the five distinct forms of Russian interference, the “hack and leak” campaign by the GRU, and the subsequent media coverage it inspired, likely had the greatest impact.

As our data set reveals, the Russians are now perfecting these techniques worldwide — mostly to shape public discourse on topics of geostrategic interest to Russia, such as the ongoing Syrian civil war. As described in the report, GRU agents created a variety of false identities, such as the Inside Syria Media Center, a nonexistent think tank that successfully pushed pro-Assad and anti-Western narratives. Another GRU-created identity called herself Sophia Mangal, the purported co-editor of the ISMC, who frequently contributed to fringe sites and engaged on Medium, Quora and Twitter while hiding behind stolen profile pictures.

The big tech companies have embarked on some reforms in response to Russian mischief-making, such as enhancing advertising transparency and algorithmic down-ranking of divisive political content. But such moves are of little use against intelligence professionals who are willing to conjure up fake media organizations, invent think tanks and support Kremlin-aligned conspiratorial voices. Social media platforms need to devote far more human resources to the task. The tech industry also lacks an official coordinating body to enable collaboration between companies and with democracies; creating an independent organization to do so should be a top priority.

Traditional and social media are intertwined in a manner that makes spotting and stopping the viral spread of propaganda difficult, and there is no easy way to protect ourselves against these sorts of “active measures” while defending our cherished First Amendment rights.

Most of the attention in the battle against foreign disinformation has focused on bots, trolls and other digital actors on social media, but it must also include traditional media organizations. Editors and reporters should consider how they will react to these situations now, rather than improvising reactions to the wave of disinformation we know is on the way. Newsrooms should carefully consider how the volume of their coverage might be manipulated by strategic leaks.

Most importantly, they need to break the cycle of amplifying disinformation by “covering the controversy.” Propaganda professionals such as the GRU have a strong track record of inserting wild conspiracy theories and false claims into the media environment; repeating those theories and claims, even to debunk them, gives the propagandists the amplification victory they seek.

Many Americans are counting on journalism to help lead our country out of an age of democratic erosion and fake news. Journalists can succeed in that mission only if they avoid becoming unwitting accomplices of disinformation themselves. We hope that our findings will raise awareness of the threat among media professionals and help them prepare for adversarial action in 2020.