That’s how Mark Turnbull, managing director of Cambridge Analytica, the election consulting firm that worked for Donald Trump, the Brexit campaign, political parties in Kenya, Mexico and beyond, and dozens of other clients, described the company’s tactics to a “customer” who was, in fact, a reporter for Britain’s Channel 4 News. The “information” in question was false information, which the company itself was proposing to create. Perhaps a little sting operation involving some “beautiful” Ukrainian girls? The company could catch it on a secret camera, and get it “into the bloodstream of the Internet.”
In the end, it was Turnbull and Alexander Nix, the Cambridge Analytica CEO, who were caught in a sting operation on a secret camera. But there is a deeper irony here, too. This is a company that bills itself as a purveyor of a cutting-edge technology, “psychographics.” Its former employees talk about “psy-ops” and secret algorithms. And yet here were its chief executives bragging about their use of the most primitive political tactic known to mankind: the smear campaign.
They also bragged about their use of another ancient tool: fear. Cambridge Analytica studies the data, Turnbull claimed, and looks for anxieties you didn’t know you had, “until you saw something that evoked that reaction from you. And our job is … to understand deep-seated, underlying fears.” Since this is more or less what Machiavelli was writing about half a millennium ago, you can hardly call that a discovery either.
But even if you doubt whether the company has learned anything new about human nature, you can still observe that Cambridge Analytica, like every other political consultancy in existence, does have access to a new kind of delivery system: the amazing marketing tool known as Facebook. Cambridge Analytica used it both legally (Facebook not only allows but encourages all marketers, whether pushing xenophobia or soap, to “target” their advertising at people who they think might particularly appreciate it) and possibly illegally: The company surreptitiously gained access to the Facebook data of 50 million people through a fake research project and used it to fine-tune its delivery of fear and smears. Worse, Facebook knew about this purported violation of its platform policies in 2015, yet chose not to suspend the company from the network until the story was exposed this month.
But the scandal that has erupted over this reported breach of contract disguises the larger issue: Even when operating legally, the company’s advertising would have been opaque to the people who received it. Political dirty tricks haven’t changed much since the Renaissance, but the delivery method has fundamentally changed the game. Political persuasion used to take place in the open, in Congress or on the hustings. Now it is covert. The average person opening his Facebook feed on a smartphone does not know he has been “targeted” as someone who might read about a sting operation involving Ukrainian girls. He may not know that his data suggest he inclines to the right, which is why he saw a lot of articles in 2016 about the terrible threat of immigrant crime, or that his data show sympathies with the left, which is why he kept reading stories denouncing Hillary Clinton as a sellout not worth voting for. However he voted, he may also not have understood how bots contrived to make particular stories or pages seem more popular than they are in reality, or that part of the commentary underneath the articles was the work of paid trolls.
And if he didn’t know, that’s because companies such as Cambridge Analytica and Facebook didn’t want him to know. Until now, we’ve been focused on the ways in which Russian operatives manipulated the Internet. But they are hardly alone. Covert political advertising makes a mockery of election laws in every country that has them. As Turnbull put it so eloquently, the new practitioners of propaganda don’t want their old-fashioned smear campaigns to look like “propaganda,” because if they did, you might ask, “Who’s put that out?” But “Who’s put that out?” is exactly what voters have the right to know. If the Internet platforms won’t conform to that minimal standard on their own, it’s time to regulate them.