In November 2016, Facebook chief executive Mark Zuckerberg dismissed the idea that the social network’s content influenced the outcome of the 2016 election on two grounds: “Voters make decisions based on their lived experience” and “fake news on Facebook … is a very small amount of the content.”
Although our experiences and political party affiliation both affect voting, messaging matters, too, sometimes decisively. In the final week of the 2000 election, as my research with Richard Johnston and Michael Hagen demonstrated, while Democratic nominee Al Gore was widening his popular-vote advantage, Republican George W. Bush secured the presidency in the critical state of Florida by dominating the airwaves with ads reassuring seniors that he would safeguard Social Security. Similarly, in 2008, as the amount by which Barack Obama outspent John McCain on ads increased, so, too, did voters’ belief that electing the Arizona Republican meant a third Bush term, a conclusion that increased the likelihood of an Obama vote.
In 2016, the extent and virality of Russian Web content was great enough to plausibly affect the outcome of an election decided in three states by about 78,000 votes. And, importantly, key messages seemed designed to activate indispensable Donald Trump constituencies or demobilize a critical Democratic one.
Until last week’s congressional hearings, the disclosed level of Russian-created Web content seemed too small to matter. No more. “It’s clear they were able to drive a significant following … for a relatively small amount of money,” Facebook’s general counsel, Colin Stretch, told the House Intelligence Committee. Specifically, 150 million Facebook and Instagram users were exposed to Russian-generated content. Kremlin-tied messengers also published more than 131,000 messages on Twitter and uploaded more than 1,000 videos to Google’s YouTube.
The power of the ads and posts was magnified by “liking” and sharing. “Who is behind this mask? A man? A woman? A terrorist?” asked one post. Superimposed on a picture of women in burqas, the appeal was clear: “Like and share if you want Burqa banned in America. Stop all invaders.” That one prompted 1,100 comments and 55,000 shares.
Even if the original source hides behind a benign pseudonym, likes, shares and comments telegraph that these are ideas our peers accept. Belief that our community values the content increases both uncritical acceptance and sharing with our like-minded network of acquaintances. At the same time, liking, sharing or commenting increases our own commitment to the message.
Many of the hot-button appeals from the Russian trolls were consistent with Trump’s messages. A post promoting a rally in Pennsylvania reiterated his promise to “put miners back to work,” while depicting him in a miner’s helmet and surrounded by supporters brandishing “Trump digs coal” posters. Other posts echoed the need to protect U.S. borders and preserve gun rights. Research suggests that political messages like those can increase the importance of particular issues when voters evaluate candidates — a finding especially relevant in an election in which almost a fifth of the electorate disliked both major-party candidates.
Content was also strategically aligned with the objectives of the Trump campaign. Two traditional Republican constituencies — churchgoers and military families — were unlikely supporters of a thrice-married candidate who confessed that his celebrity status permitted him to kiss and grope women, secured multiple deferments to avoid military service, dismissed the heroism of a prisoner of war, and joked that dodging venereal disease was his Vietnam. While Trump needed high turnout from those groups, a victory for Clinton required rallying black voters in numbers close to those achieved by Obama.
And, indeed, Russian messaging seemingly sought to neutralize claims that Trump was the less moral of the two candidates. For example, a video produced by the state-owned network formerly called Russia Today, which garnered 6 million views, spread the lie that 100 percent of the Clintons’ charitable contributions “went to themselves.” Trolls cast Hillary Clinton as Satan, promoted the debunked claim that former president Bill Clinton fathered an out-of-wedlock son and assured Christians that the ability to wish others a Merry Christmas would be protected by Trump. Russian hacking also shifted the message terrain against the Democratic nominee among evangelicals and Catholics (a key voting bloc in Philadelphia, Detroit and Milwaukee) when WikiLeaks released, and conservative media touted, the email of a highly placed Clinton staffer that seemed dismissive of both evangelical Christians and Catholic Republicans.
Other messages conceivably helped shore up support for Trump among veterans. For example, a Russian Facebook post declared without evidence that “Hillary Clinton has a 69 percent disapproval rating among all veterans,” a notion reinforced by its assertion that an overwhelming majority of veterans “despise” her. Signaling disapproval by respected peers is a potent means of persuasion.
Russian trolls also promoted content with the potential to suppress minority voting. Fake ads (one featuring actor Aziz Ansari, another a black woman in front of an “African Americans for Hillary” sign) encouraged: “Avoid the line. Vote from Home.” People were instructed to text or tweet their support for Clinton, instead.
Although the reach and strategic alignment of the recently released materials had the potential to influence the outcome of the presidential election, two factors diminished the power of some of the Russian messages: the sometimes glaring misuse of the English language in the posts and the targeting of states that weren’t in play.
Just as Democratic candidate John F. Kerry signaled his outsider status in 2003 when ordering a cheesesteak with Swiss cheese (as opposed to “Whiz [Cheez Whiz] wit or widdout [onions]”), the Russian troll content contained jarring cues that the messengers were part of an alien community. Consider that “miners for Trump” event post, which noted, “The state of Pennsylvania rose owing to multiple enterprises mining coal, producing steel, and creating the need for other jobs . . .,” followed by the sentence fragment: “As far as Mr. Trump pursues the goal of creating more jobs and supports the working class.” A revealing characteristic of the Russian language, the absence of definite and indefinite articles, is evident in statements such as “out of cemetery” and “burqa is a security risk.” Given the frequency with which errors in grammar appear on social media, however, those mistakes may not have stood out to everyone.
Skeptics of Russian influence in the election have also pointed to posts targeting non-battleground states, such as Texas. But that puzzling focus may reveal Russian ignorance about the electoral college. Underscoring that possibility is the obliviousness evident in the trolls’ assertion that, should Clinton be elected, “the American army should be withdrawn from Hillary’s control according to the amendments to the Constitution.”
Interestingly, clear evidence that the Russian bait caught some people comes from the Lone Star State, as well, in the form of dueling rallies at a Houston Islamic center in May 2016. Prompted by the Russian-created Facebook group “Heart of Texas,” a handful of people showed up to protest the “Islamization of Texas,” while a small counterprotest gathered across the street, orchestrated by the Russian-created Facebook group “United Muslims of America.” Had the clash turned violent, it might have generated national news and, with it, increased the salience in battleground states of divisions that the Trump campaign was attempting to magnify. If that was the trolls’ goal, they failed.
It’s hard to know what effect Russian ads, posts and tweets had on voters in the states that narrowly decided the 2016 election. We don’t know how much of the content that showed up in the downstream news feeds reached susceptible voters there and was digested by them. We also don’t know how social media amplification and news media preoccupation with the Russia-hacked Democratic National Committee and Clinton campaign emails affected voters’ perceptions of Clinton. But the wide distribution of strategically aligned messages increases the likelihood that Russian efforts — including posts, ads, tweets and the release of stolen emails — changed the outcome of the 2016 election.