Studying disinformation is a little like being a virologist: The object of your investigation is challenging to detect, the symptoms can vary widely, and, most importantly, the source can be difficult to identify. This is especially true when studying disinformation about a virus.

Unlike misinformation, disinformation isn’t simply inaccurate: It’s actively designed to mislead. On social media, the two can appear identical, as the message is sometimes the same. In practice, both can be just as dangerous, especially now. Factual information is essential to public health, particularly when it comes to something as urgent as how we respond to the novel coronavirus spreading in our communities. Online disinformation during a health crisis has literal life-or-death consequences for people in the real world. It’s easy to blame foreign actors like the Russians and Chinese for spreading disinformation across social media. Doing so would be accurate but incomplete. Sadly, foreign actors aren’t the biggest danger. We Americans are.

Domestic production of misinformation is booming. Anything from false hope sold through fake cures to the creation of politically motivated memes and hashtags can sow confusion, making it more difficult to respond to the crisis in a cohesive, sensible manner. These efforts have made the crisis worse, and people are dying, and more will die, as a result.

While it is true there is Russian and Chinese disinformation pushing various narratives regarding the coronavirus, foreign actors are often merely curating messages that we Americans created. Instead of making stories up from whole cloth, foreign adversaries take the misinformation we give them and launder it into disinformation.

Take, for instance, the persistent theory that the virus was created by scientists (it wasn’t) and escaped a lab in China (it didn’t). This theory did not spring forth from a St. Petersburg troll farm or a GRU bunker in Moscow. (The GRU is a Russian military intelligence agency.) According to our research, the first English-language tweet suggesting this theory was on Jan. 20. It came from an anonymous conservative American woman who, in her profile, claims to love Jesus, her family, her country, her freedom and her guns. Lest you think that reflects some partisan bias, consider that the second English-language tweet was sent the next day from a liberal, blue check-marked academic. The creation and dissemination of misinformation does not discriminate by ideology.

Foreign actors are, of course, capable of creating their own false narratives and fake news, and this is particularly true when it comes to public health issues. The Russians, in fact, have a long history of doing exactly that. In the 1980s, the Soviet KGB successfully spread the story that AIDS was a biological weapon created by the CIA. More recently, early work done by the Russian Internet Research Agency (the disinformation organization made famous by the Mueller investigation) spread false public health rumors on social media.

Russian trolls maintained this approach for some time, but with diminishing returns. In 2014, for example, they created fictional, fearmongering stories that included an Ebola crisis in Atlanta and an outbreak of salmonella in Upstate New York (linked to quintessentially American Thanksgiving turkeys!). Despite spitting out a range of content, including videos and photos of the “events,” and using hundreds of social media accounts to distribute their false stories, the Russians never gained any traction with these efforts. Research has shown that most social media users don’t actually spread fake news. Though Americans often insist they don’t trust the “mainstream” media when it comes to real-time events on the ground, it seems they may trust it after all.

After these failed attempts, the Russian Internet Research Agency changed tactics in 2015. It stopped creating fake news and began to rely on real events and real conversations, playing into our uncertainties and anxieties about the things we know (or think we do) instead of inventing narratives. Rather than carpet-bombing a broad audience with generic messaging from anonymous accounts, it began engaging communities on the left and right, giving them specifically tailored messaging. The tactics addressed health-related issues, including vaccination and climate change, among a range of topics.

In accordance with the new strategy, it stopped making up lies and began telling people how to think and feel about existing stories, which often meant a change from selling fiction to selling spin. It turns out the latter is more efficient and more effective. Telling people what they want to hear rather than what you want to peddle can be far more persuasive. Other times, such as with its anti-vaccination messaging, the tactic meant joining ongoing conversations and amplifying lies America was already telling itself. It helps that Americans across the ideological spectrum create plenty of divisive content to borrow from. The Russians never have to look far to find a spicy meme or blood-boiling video to share in an effort to infuriate their followers and make them ever more disgusted with the other side.

The same continues to be true with coronavirus disinformation. In our research, we have found multiple networks of fake accounts — one of which we can attribute to Russia — that use conversations about coronavirus as a tool for political attacks. To right-leaning Americans, these trolls criticize the response from liberals, suggest the coronavirus is being used to take away their freedoms, and point the finger of blame at China. To left-wing Americans, they suggest the administration’s response is immoral and inadequate and point the finger of blame at Trump. On both sides, these are arguments that real Americans are also making, typically with honest intentions. The attacks play to the trolls’ goals, however, and so they repeat them, making the loudest and ugliest versions more mainstream. In doing so, they dangerously widen existing divisions in a time of crisis, making critical compromise more difficult. As before, these networks rely on hashtags from organic American conversations, such as #TrumpLiedPeopleDied and #ReopenAmerica. They aren’t creating the divisions, but they are working hard to make them wider.

It’s not just our hashtags and memes that fuel foreign disinformation, however. Websites peddling conspiracy theories, such as Alex Jones’s, were retweeted and reposted routinely by Internet Research Agency trolls in 2016. In recent weeks, these websites have discussed the Chinese lab origin theory, among other fearmongering stories related to the virus. Such stories have since circulated around social media thousands of times. Among those accounts circulating the stories are pro-Russian and Russian state media-affiliated social media influencers. It’s worth noting, though, that these sites also continue to draw on narratives once disseminated by the KGB. In 2019, for example, they shared inaccurate stories about the conspiracy theory that the CIA had a covert role in spreading HIV. These companies may not explicitly set out to spread disinformation, instead intending, most obviously, to profit. But they, and many companies like them, profit while echoing the efforts of foreign powers working to undermine our safety and security.

Many commentators have discussed various ways in which the United States has acted to make the coronavirus crisis worse than it could or should be. The public’s own role in spreading global disinformation needs to be added to that list. We have to address our own culpability in the problems that are fomented by disinformation. At a time when most news and information people digest is socially mediated, we need to create citizens and platforms that are more resilient to lies and more accepting of facts.

Above all, however, we need to stop doing the trolls’ jobs for them.