THE STORMING of the Capitol on Jan. 6 was shocking — but not, for those following the mounting story of misinformation and disinformation surrounding the election, all that surprising. Now, many of the researchers observing most closely have released a report that helps the rest of us understand what happened, as well as how to keep it from happening again.
The Election Integrity Partnership sprang up last summer as a collaboration among institutions devoted to studying the spread of falsehoods, propaganda and conspiracy theories that infect today’s Internet. The aim was to connect the almost 10,000 state and local election offices, government agencies, social media sites, traditional media and academics who otherwise weren’t communicating all that effectively, despite a mutual interest in protecting democracy and preserving the truth. The partnership was able to intervene as distorting narratives swept the nation — and able also to see where, when and how interventions fell short.
The basic story is this: The then-president and his allies, including prominent right-wing commentators on both traditional news networks and newer platforms, primed supporters to expect a stolen election. After that, misleading claims spread from the bottom up and the top down. Individuals interpreted innocuous reports of Sharpie pens distributed at polling places, or tallied vote shares becoming bluer, as evidence of rigging. Influencers spread those assertions and also created elaborate lies of their own. All these mini-narratives, serving the mega-narrative of the steal, ricocheted across platforms. High-profile repeat spreaders took advantage of sites’ particularities: exploiting their essential features for maximum reach, and exploiting differences in policy and enforcement to avoid being completely shut down.
There are lessons for everyone. Platforms expanded their policies on the fly throughout the election cycle, even though they could have anticipated some of the content and structure of disinformation campaigns. The policies were also sometimes unclear, and enforcement spotty; repeat spreaders in particular should face escalating consequences. Meanwhile, the federal government must treat the information ecosystem as critical infrastructure — clarifying roles and responsibilities across agencies for protecting election-related discourse and issuing public threat assessments, for starters. This will involve coordinating with and among platforms, as well as with state and local officials, who themselves should improve communication with voters about what happens to their ballots and when.
In the end, as the partnership points out, we will never rid society of bad information. The task instead is to “build resilience” — so that when the trolls come marching in next time, we know how to disarm them. That will take prediction, preemption, detection and debunking, for which all actors can plan ahead better — together. Yet it may also require a more ambitious reckoning. The problem, more than any specific claim, or narrative, or mega-narrative, is the environment that allows them all to fester and eventually explode. This has to do not with individual lies or liars, but with the very design of online forums that encourage sensationalism and foster echo chambers. Fixing that isn’t easy, but it’s essential.