But it’s clear that online disinformation is dismantling our democracy. This year, replete with coronavirus conspiracies and misleading claims about mail-in ballots, has demonstrated that disinformation has downstream effects on ordinary people, and that our ailing information ecosystem will not be healed by a change in administration alone. It is affecting public safety, public health and the functioning of our democratic institutions themselves. And to make matters worse, disinformation is here to stay. If Joe Biden is elected, Russia will not suddenly shut down the Internet Research Agency; copycat adversaries won’t cancel their efforts; and neo-Nazi forums will not immediately go dark. The widespread harm that flows from such outlets will keep on seeping into our lives and undermining our institutions.
I recently testified before the House Intelligence Committee about this threat. The attendance at the hearing on “online misinformation and conspiracy theories” itself supported my thesis: Not a single Republican member showed up, citing “security concerns” about the virtual format. Unwilling to recognize the danger of the coronavirus that kept the committee from meeting in person, while simultaneously misleading the public about the reason for their de facto boycott, Republicans advanced the misguided perception that countering disinformation is a partisan topic, effectively abdicating Congress’s bipartisan responsibility to conduct oversight of it.
We’ve also seen this politicization touch our intelligence community. In a news conference last week, Director of National Intelligence John Ratcliffe discussed Iran’s email campaign to intimidate voters by impersonating the Proud Boys, presenting it as a threat to the election. A much greater concern — Russian access to some state and local government networks — would be announced less than 24 hours later. Both revelations present serious concerns about the integrity of our voting infrastructure, trust in our election processes and the status of our democracy more generally. Ratcliffe, however, claimed that Iran’s campaign, which targeted Democrats, was somehow detrimental to the president, turning what should have been an issue of collective concern into a one-sided pity party.
This was entirely in keeping with a running pattern. Throughout the lead-up to the election, we’ve seen Ratcliffe and other Trump administration officials attempt to shift the narrative away from Russia toward assessments that portray the president as a victim. They seem to have forgotten disinformation’s ultimate victim is democracy. People cannot participate in the democratic process without quality information, and those who knowingly mislead are willfully undermining that pillar, whether they actively seek to support specific candidates or merely act as chaos agents.
This democratic dismantling continues beyond the halls of Congress and outside of the realm of foreign interference. Twenty-four adherents of the QAnon conspiracy theory are running for Congress this election, buoyed, no doubt, by the president’s repeated refusal to disavow the extremist movement. Instead, Trump has claimed he knows nothing about it, except that, in his words, “they are very much against pedophilia. They fight it very hard.” It’s a claim that both undersells the movement’s real beliefs and overrepresents its actual activities. But his deceptive framing is, of course, one of the dog whistles that allows QAnon to ensnare new followers and evade social media crackdowns. The movement has spread around the world. It will morph and adjust its doctrine if Trump loses the election, but it will not disappear.
Disinformation and conspiracies also have implications for the democratic representation of women and minorities. A plot to kidnap Michigan Gov. Gretchen Whitmer was hatched on social media, and social media platforms were apparently none the wiser. It was an FBI mole — not a tip from a tech firm — that brought down the plan. And in an ongoing Wilson Center project tracking the use of gendered and sexualized disinformation against female politicians in the 2020 election, my team has observed an increase in individuals spreading malicious, false, racist, sexualized narratives against Sen. Kamala D. Harris ahead of Election Day. On the night of the vice-presidential debate, instances of hashtags containing sexualized disinformation or violence against Harris on “alternative” social media platforms Parler and 4chan increased 631 percent and 1,078 percent, respectively. Abuse and harassment of this kind raise the cost of seeking elected office or participating in democratic discourse. They make such activities unpleasant, and at times unsafe, and are meant to keep women and people of color out of the public conversation. A Biden-Harris victory will not end this sort of discourse; it has become the new normal.
Finally, the tools and tactics of disinformation often affect participation in the democratic process itself, as well as public health. Using online microtargeting tools, the 2016 Trump campaign was able to target Black voters for “deterrence” from voting — that is, for voter suppression. But these tactics can also be more surreptitious. A new documentary, “People You May Know,” details how evangelical churches use online data profiles to identify vulnerable individuals — people who have battled addiction, who are at risk of divorce — ensnare them in the organization, and later share that data with conservative political campaigns for further exploitation. The campaigns then direct their misleading messaging — for instance, that the coronavirus is no worse than the flu — to exactly the at-risk groups that will find it most convincing.
Such microtargeting, which makes disinformation especially potent, has become the subject of intense debate on Capitol Hill, but all campaigns use it. The difference is how they use it and what they use it for. Some Democrats, including Biden, have taken pledges “not to fabricate, use or spread data or materials that were falsified, fabricated, doxed or stolen for disinformation or propaganda purposes,” but there is no incentive for other candidates to do the same. And whether the disinformer is a political aspirant, a media mogul or an online influencer, cloaked in the trappings of free speech, they all know what the Russian government knows: Disinformation not only works, it works in direct opposition to democracy.
So yes, disinformation has always existed. But today, even as it works against democracy, disinformation itself has been normalized. No longer is it a tool reserved for well-resourced campaigns and government apparatuses that can invest in a metaphorical Trojan horse. Anyone with a social media account and a basic understanding of the online environment — from a troll in St. Petersburg, to “Q,” whoever or wherever he may be, to an American political operative — can launch such a campaign. It’s not the novelty of the tactics that should concern us, but the fact that they have become our democracy-altering norm.