An underlying theme running through special counsel Robert S. Mueller III’s investigation is that Russia’s ultimate goal was to make sure Donald Trump was elected president. That’s just part of the picture.
Last month, Mueller’s team released the details of the grand jury indictments of 13 Russian nationals, as well as a shadowy Russian firm known as the Internet Research Agency, for conducting information warfare against the United States and breaking three U.S. federal laws.
Our research looks at Russian cyber and information warfare activity — and distinct patterns begin to emerge. But this is a nonlinear strategy and a long-term assault on Russia’s adversaries. Although boosting the Trump campaign may have been one of Russia’s primary goals in 2016, the 2020 goal could just as easily be helping the president’s Democratic challenger.
The Russian playbook: What do we know about it?
Russia has been conducting information warfare for a very long time — current doctrine has roots that go back to Soviet times. The nonlinear warfare (or reflexive control) tactics include the ability to control information — inject, alter, obfuscate or withhold altogether — as well as the timing for these actions. The Kremlin’s aim is to sow perpetual discord in governments and populations, beyond just one election cycle.
By changing an adversary’s perception of reality, Russia hopes to exert pressure on that country’s decision-making. For example, in Ukraine, Russia has used a wide array of measures to divide that country’s population and keep it in perpetual conflict. This includes overt tactics such as spreading disinformation about pro-Western candidates on social media, as well as covert tactics such as contracting organized crime to commit atrocities to create fear in unsuspecting populations.
Cyber-strategy is a way to push back
For Russia, cyber-operations are a subset of overall information warfare and a way to integrate hacking seamlessly into influence campaigns. Russia considers NATO expansion and Western incursions into post-Soviet space existential threats, providing pressing motivations for a response. Furthermore, Moscow views popular uprisings in Ukraine or the Middle East as results of Western interference campaigns, and Russia is fighting back while reestablishing itself as a great power.
However, it is very difficult to measure the true impact of Russian activities. We can measure the number of clicks, likes and shares, for instance, but how do we make a linear connection to outcomes?
We can’t. Even counting the number of people who show up for rallies initiated by Russians — a form of disruption — doesn’t give a precise metric. Furthermore, a recent study finds that more than 60 percent of Americans now get their news from social-media platforms such as Facebook, where Russia was very active in spreading disinformation. And 1 in 4 Americans was exposed to fake news and Russian bots during the 2016 election cycle. Yet the link between Russian actions on social media and changed votes remains elusive.
Keeping Hillary Clinton out of the White House was very likely one of Russia’s objectives — but not the primary one, as the Mueller indictments suggest. Long-term strategic thinking in Russia looks far beyond one election cycle, allowing for more comprehensive and enduring foreign policy goals. The election cycles of Western democracies, by contrast, tend to discourage long-term strategic planning on foreign policy.
Russian political interference is about keeping an adversary nation domestically divided for a long period of time. Russia looks to spread division, exacerbate any conflict possible and ultimately destabilize the political system and erode trust in the government and institutions. Therefore, had Clinton been elected president, Russia arguably could have achieved these same goals, given the rancorous and divisive campaign.
So what happens in 2018?
With the U.S. congressional midterm elections coming up in November, a firm and credible threat by the Trump administration could bring Russian information activities to a halt. But if such a threat failed to deter Moscow, relations between the two nuclear powers could erode, potentially escalating tensions elsewhere. And a U.S. pushback would play into the Russian rhetoric of being threatened by the United States.
U.S. elected officials as well as government agencies could put more pressure on social media companies such as Facebook and Twitter — although this could risk running counter to core values such as free speech. Facebook, for instance, has acknowledged that its platform was used to sow discord. Silicon Valley has promised to be more transparent and follow new Federal Election Commission rules about disclosure of foreign ads on its platforms, although these may not be ready for November. Also, the State Department budget includes $120 million to fight disinformation this year; this money has not yet been used. Putin has shrugged off the Mueller indictments as “not my problems.”
Fight disinformation with a better-informed population
The more nuanced possibility is to counter Russia’s influence through a better-informed populace — making people fully aware that Russia has taken advantage of partisan news outlets and of the technology underpinning the Internet. This response, however, requires a better understanding of Russia’s playbook, which is agile and quickly evolving.
Russia is using Western democratic norms to divide Western countries. Boosting government transparency and ensuring the stability of institutions would also be proactive policy steps, leaving fewer opportunities to divide populations on contentious issues. Stable institutions and a population’s trust in its government can help thwart influence campaigns and ultimately make Russian cyber and information operations inconsequential to Western political discourse.
Ryan C. Maness is an assistant professor in the Defense Analysis Department at the Naval Postgraduate School.
Margarita Jaitner is a warfare analyst at the Swedish Defense Research Agency.
The views expressed in this article are the authors’ own and do not reflect the positions of the U.S. Navy, the U.S. government, the Swedish Defense Research Agency or the government of Sweden.
This article is one in a series supported by the MacArthur Foundation Research Network on Opening Governance that seeks to work collaboratively to increase our understanding of how to design more effective and legitimate democratic institutions using new technologies and new methods. Neither the MacArthur Foundation nor the network is responsible for the article’s specific content.