The Washington Post
Democracy Dies in Darkness

Four and a half reasons not to worry that Cambridge Analytica skewed the 2016 election


This week, Cambridge Analytica made headlines after whistleblower Christopher Wylie revealed that the company had used data from millions of Facebook profiles to psychologically profile U.S. citizens and target them with political messages, including during the 2016 presidential election. Newly named national security adviser John Bolton’s PAC was among its clients, records show.

Observers have pointed out many reasons to be concerned about all this: The way that the data was collected from Facebook arguably did not allow for informed consent. The researcher who collected the data was not authorized to pass it on to Cambridge Analytica. Cambridge Analytica itself may have broken U.S. election laws, if British individuals without U.S. green cards worked on any U.S. election campaigns.


But here’s one thing you probably should not be concerned about: whether Cambridge Analytica successfully used this profile data to manipulate millions of Americans’ political behavior. When Cambridge Analytica took credit for Donald Trump’s 2016 election victory, social scientists mostly responded with eye-rolling and references to “snake oil.”

Why did social scientists so quickly dismiss the manipulation claims? Here are four reasons the company’s claim of psychological manipulation doesn’t pass the social scientist’s smell test.

1. Personality is not a good predictor of political views.

The “Big 5” personality traits (which Cambridge Analytica claimed to use in its work) predict only about 5 percent of the variation in individuals’ political orientations. So even accurate personality data would add very little useful information to a data set that already includes people’s partisanship, which is what most campaigns work with.

2. Predicting personality is hard.

Yes, it’s possible to predict personality from online data. But a recent meta-analysis shows that even if you have access to someone’s digital footprint, you can only learn so much about their Big 5 traits. Even if your model does well at first, it will probably be out of date soon, as the things people “like” on Facebook change.

3. Changing individuals’ choices based on their personality profiles is harder than it sounds.

You can improve online advertisements by targeting them using personality data, but the effects tend to be small. In one successful study, researchers targeted ads, based on personality, to more than 1.5 million people; the result was roughly 100 more purchases of beauty products than they would have seen without targeting. That works out to about one additional purchase for every 15,000 people reached.

And trying to change political behavior would have an even lower success rate. Most people probably do not identify with their beauty regimens as strongly as many Americans identify with a political party.

4. They had stiff competition from other campaigns.

Once you know that personality prediction probably didn’t add much value to Cambridge Analytica’s approach, what it did starts to look a lot like the microtargeting other campaigns also used, a technique the Obama 2008 campaign in particular was famous for. And even these more traditional microtargeting approaches don’t have a clear track record of success.


And it’s not clear that Cambridge Analytica could do any of this.

In case all this isn’t persuasive, here is a fifth, slightly less scientific reason to doubt Cambridge Analytica’s success. By most accounts, Cambridge Analytica does not seem capable of pulling off the large-scale and complex personality-based profiling operation that it claims to have mastered. Before the 2016 general election, Republican strategists were already expressing less-than-stellar opinions of the company. And in the videos that Britain’s Channel 4 released this week, Cambridge Analytica appears to recruit new clients by focusing on dirty tricks, rather than by promoting its supposedly slick psychometric persuasion machine. Even the researcher who gave it the Facebook data in the first place now says that Cambridge Analytica’s claims “quickly fall apart” upon inspection.

Feel free to worry anyway — just not about Cambridge Analytica’s boasting

Of course there is still plenty to worry about. This episode has raised many important topics for discussion, including how Facebook and other platforms handle private data; whether these platforms should be regulated; how their business models are in direct tension with data privacy protection; and the consequences of living in a world in which attempts to manipulate us based on our digital footprints are ubiquitous.

But Cambridge Analytica’s specific attempts at psychographic profiling do not need to rank highly on this already crowded list of concerns.

Kris-Stella Trump (@kstrump) is director of the Anxieties of Democracy program at the Social Science Research Council.