The Mueller investigation has just indicted 13 Russian nationals and three Russian organizations, including the notorious St. Petersburg troll factory, the Internet Research Agency, for their roles in the 2016 U.S. election. The indictment came as a surprise to many observers. It provides a wealth of new information about how Russian trolling operations work and what they tried to accomplish, and it contradicts many popular beliefs about Russian social media operations.
Russian operations probably did not change voters’ minds
One of the persistent myths of the election cycle is that Russian influence operations helped change voters’ minds, and hence helped Donald Trump get elected. This argument seems plausible, even compelling. We know Russian sources circulated lots of bogus stories on social media, and many people read them. It is easy to jump to the conclusion that this fake news gave Trump the margin he needed in a few swing states.
The problem, as Dartmouth political scientist Brendan Nyhan discusses, is that this account does not really hold together. First, it is hard to be sure how many people (rather than bots) actually read these fake stories. Second, even if people did read them, the stories were buried among many, many other posts, some of them equally alarmist. Finally, a large body of political science research shows it is really hard to change people’s minds. Even massive TV advertising campaigns appear to have only tiny effects on people’s decisions about whom to vote for. Social media posts, buried among a multitude of similar posts, are likely to have even smaller effects.
That does not mean Russian activities were not important
To understand what Russia has been doing abroad, you need to understand what it has been doing at home. Political scientists such as Margaret Roberts and Josh Tucker of the Monkey Cage have documented how Chinese and Russian leaders discovered a new way of dealing with inconvenient voices in their own society. Rather than just trying to censor them, they have increasingly looked to drown them in a flood of other perspectives, arguments, claims, counterclaims and nonsense.
American libertarians like to say the best antidote to bad speech is good speech. What Russian and Chinese rulers have discovered is that the best antidote to good speech is bad speech — and lots of it.
In a description by Adrian Chen, a journalist who has done extensive research on the Internet Research Agency, the consequence of Russian activities against their own public “was not to brainwash readers but to overwhelm social media with a flood of fake content, seeding doubt and paranoia, and destroying the possibility of using the Internet as a democratic space.” What Roberts describes as “flooding” tactics make it more or less impossible to conduct ordinary democratic conversation and argument.
Russia has weaponized “flooding” against the U.S.
This is what lay behind Russian trolling tactics. Certainly, Russian trolls wanted to discredit the candidates whom they detested. However, they did not seek to change Americans’ minds, but to create enough doubt, confusion and paranoia to destabilize democracy. The Mueller investigation documents this clearly. As the indictments describe it, “By in or around May 2014, the ORGANIZATION’s strategy included interfering with the 2016 U.S. Presidential Election,” with the stated goal of “spread[ing] distrust towards the candidates and the political system in general.” By spreading rumors, flooding the zone with disinformation, stirring up protests and counterprotests, and otherwise creating confusion, Russia wanted to gravely weaken the U.S. political system.
Russian trolls probably did not expect Trump to be elected president. Rather, they wanted a United States sufficiently divided against itself that a President Hillary Clinton would have difficulty governing, let alone taking decisive action abroad.
This may also explain why Russian actors both spread rumors that Clinton was guilty of vote fraud and probed the vulnerabilities of online U.S. electoral records. Their likely intention was not to rig the vote but to create enough paranoia over the possibility that the vote had been rigged that a President Hillary Clinton’s legitimacy would have been seriously damaged. As an expert report concludes about this kind of attack, “simply put, the attacker might not care who wins; the losing side’s belief that the election was stolen from them may be equally, if not more, valuable.”
This has implications for what comes next
It will obviously be difficult for the Mueller investigation to act on these indictments. The targets are in Russia, and Russia is not going to extradite them. It may also be that the indictments serve a larger and as yet unknown purpose in the web of legal arguments and evidence gathering that the Mueller investigation is putting together.
What the indictments tell us right now is that the problems lie less in Russian activities than in the weaknesses of U.S. democracy that those activities exploit. Now that Russian actors have identified these weaknesses, others can exploit them too.
We used to think American democracy’s reliance on free speech provided its own defenses. Now we see how this reliance has created vulnerabilities that foreign adversaries can exploit. Flooding attacks can destabilize democracy, especially when they are amplified by Americans’ own tendency to stoke the paranoia.
Tackling these problems will require major new policy initiatives. It will also require major new social scientific investigations. If we are to secure democracy against the kinds of attacks that the Mueller investigation has described, we need to know far more about the relationship between democracy, speech and information, and to identify the vulnerabilities in this relationship. This may mean, for example, that we should start thinking about informational resources such as the U.S. Census as key determinants of national security.
This article is one in a series supported by the MacArthur Foundation Research Network on Opening Governance that seeks to work collaboratively to increase our understanding of how to design more effective and legitimate democratic institutions using new technologies and new methods. Neither the MacArthur Foundation nor the Network is responsible for the article’s specific content. Other posts in the series can be found here.