Christine Emba edits The Post's In Theory blog.
As if we needed more evidence that Facebook influenced the election.
Last week, the social-media company revealed that during the 2016 presidential campaign it sold more than $100,000 in ads to a Kremlin-linked "troll farm" seeking to influence U.S. voters. An additional $50,000 in ads also appears suspect but was less verifiably linked to the Russian government.
In the grand — at this point, far too grand — scheme of campaign spending, $150,000 doesn't sound like much. It's a minor TV ad buy, perhaps, or a wardrobe makeover for one vice-presidential candidate. But in the context of Facebook, it matters quite a bit. Not just for what it might have done to the election but also for what it says about us.
Apart from Web marketers and media-company employees, few seem to be fully aware of how influential Facebook can be. And why would they be? Our use of the site tends to be mindless — ogling a relative's new baby, scrolling past 60-second cooking videos and maybe liking an article or two if they catch our eye.
Yet there's no question Facebook has a big influence on our worldview, whether we realize it or not. Sixty-six percent of U.S. Facebook users admit that they get news from the site, a figure that amounts to 44 percent of the general U.S. population. And people are more likely to believe news shared by their friends.
So what news do they get? Here's where a seemingly minor ad buy becomes alarming. Because of its millions of users and its focus on sharing, Facebook has a news reach that can exceed that of traditional media such as print or television. And that reach comes oddly cheap. One hundred dollars in Facebook ads could deliver a buyer's message to thousands of viewers, whose further sharing would allow it to ripple out exponentially.
Now, turn that into $100,000 and inject it with malice. And imagine being able to target this message with minute precision: say, telling black voters in swing counties that Hillary Clinton was an incorrigible racist, or enraging white, male gun lovers with her supposed plans to roll back the Second Amendment. Imagine how quickly such misinformation could spread and metastasize.
And imagine no one knowing it was happening.
Our obliviousness is unsettling enough, but the way that our Russian adversaries used it against us positively stings. After the ad sales were revealed, Facebook's own chief security officer, Alex Stamos, shed some light on what those purchases might have looked like. "The vast majority of ads run by these accounts didn't specifically reference the US presidential election, voting or a particular candidate," he wrote. "Rather, the ads and accounts appeared to focus on amplifying divisive social and political messages across the ideological spectrum — touching on topics from LGBT matters to race issues to immigration to gun rights."
In other words, the United States is so caught up in partisanship that we've lost our ability to keep a level head, and the whole world knows it — including our adversaries.
Americans can more reliably be counted on to share wild-eyed rumors than to think critically about social issues. Basic civic debates have become so inflammatory that foreign actors can use them as cattle prods, sending us running mindlessly to whichever side we're told is safe. We're easily distracted from real facts and flock to news that confirms our biases. The echo-chamber effect has been known for some time; the fact that it has become so dependable a way to divide us is damning.
Russia spent at least $100,000 on Facebook ads because of Americans' known susceptibility to partisan division, our willingness to outsource the work of analysis to social-media algorithms and our tendency to not think too hard about what we see. No, the money isn't minor. But the real problem is us.