Shortly before noon, Fox Business aired a segment discussing testimony offered to the Senate last month. Robert Epstein, a psychologist who at one point was editor in chief of Psychology Today, told senators on July 17 that his research suggested Google had given millions of votes to Trump’s opponent Hillary Clinton in the 2016 election. A guest on Fox Business named Oz Sultan, who worked with Trump’s 2016 campaign, looped that claim back into the broader, ongoing criticism of social-media companies that’s currently in vogue among conservatives.
Trump, though, quickly picked out — and exaggerated — the claim about Clinton votes.
That’s not what Epstein said in his testimony. He estimated that Google had shifted somewhere between 2.6 million and 10.4 million votes in 2016, with 15 million being the number of votes that could shift in 2020. That 2.6 million, he said, was the “rock bottom” estimate. Epstein identifies himself as a Democrat who backed Clinton, but for Trump it’s a convenient figure, since Clinton won the popular vote by about 2.9 million.
There’s just one problem: Those estimates deserve far, far more skepticism than Trump would ever give them.
On their face, the numbers are dubious. In his prepared remarks, Epstein estimated that Google “gave at least 2.6 million votes” to Clinton, a claim that isn’t well defined. Gave … how? Were these nonvoters inspired to vote? Trump voters who switched? Without knowing that, it’s hard to evaluate the claim’s accuracy.
A claim, mind you, that is very bold. Getting millions of voters to vote a particular way is the sort of thing that political parties spend enormous effort trying to figure out. Epstein is claiming that nearly 2 percent of all 2016 voters were influenced to vote for Clinton by Google alone. The scale is massive.
So why does he make this claim? He appears to have combined two bits of research he conducted: a 2015 study of how search-engine results can influence political opinion and a collection of search results gathered from users before the 2016 contest. Over the last 25 days of the campaign, a summary of the latter research says, “we found that search results were, on average, biased to favor Hillary Clinton on all of those days.” Given that his earlier research found that search results could influence views of candidates, the top-line claim follows.
What does “biased to favor Hillary Clinton” mean? We don’t know. The summary doesn’t explain what that looks like.
It does, however, suggest it found results emailed to his research team from Google’s email system (Gmail) to be unusually unbiased.
“Perhaps Google identified our confidants through its gmail system and targeted them to receive unbiased results,” it reads. “We have no way to confirm this at present, but it is a plausible explanation for the pattern of results we found.” So they threw those results out.
One of the more baffling aspects of this research is that it gives no indication of how the searches were conducted. Google’s search results are specific to each user, and nothing in the summary (a mention of using incognito mode, for example) suggests any effort was made to obtain unweighted results from the search engine. Nor is there information about who participated in the study. Collecting results from a group of well-to-do city dwellers, for example, might help explain any “bias.”
This is all the more problematic because, while the research points to thousands of analyzed search results, only 95 people actually provided them. If the results were driven by the identities of those individuals, the effective sample size was 95, not thousands. Oh, and of that group? Only 21 were undecided. If the 2.6 million figure derives from that subgroup alone, its value is almost nil.
It’s worth noting that bias by Google is precisely what Epstein expected to see. In August 2015, he wrote an essay for Politico in which he predicted that Google might be able to influence the election and, voilà, so it did.
In his testimony, Epstein was also asked by Sen. Ted Cruz (R-Tex.) to talk about another way in which Google might have influenced a federal election. On Election Day last year, Google’s home page changed its iconic logo to read “Go Vote.” Writing for the Epoch Times, Epstein claimed that Google could have spurred an additional 500,000 people to vote.
In 2010, Facebook ran an experiment aimed at boosting turnout by showing users which of their friends had already cast a ballot. It estimated that the experiment — which used data about individuals to identify likely voters and showed people images of friends who had voted — increased turnout among a target pool of 60 million people by about 340,000 votes, or roughly 0.57 percent. In what doesn’t seem much like a coincidence, Epstein’s estimate of 500,000 additional votes out of the 87 million people he claims saw Google’s logo works out to almost exactly the same rate.
So the same results, in effect — but without any of the use of photos of friends or targeting of likely voters. Sure.
Epstein also dances around the question of intentionality. Google insists it doesn’t re-rank its results to influence politics — meaning it didn’t tamper with the results of its initial algorithmic ranking. Epstein says he “never claimed it did,” which suggests he’s finding fault with the algorithm itself. But, in his prepared remarks, he also pointedly claims “[a] growing body of evidence suggests that Google employees deliberately engineer ephemeral experiences to change people’s thinking.”
If you want to allege bias but can’t prove bias, that pair of claims seems like a needle you might want to thread.
This is one claim from one person that, as far as I can tell, hasn’t been peer-reviewed or replicated. On its surface, it’s dubious, as is the methodology underlying it. It’s the sort of thing that people in positions of authority — such as, say, a senator or a president — might be cautious about spreading.
But, on the other hand, it also lets Trump claim almost-victory in the 2016 election. And when something does that, Trump rarely shows any signs of hesitation about getting it in front of as many people as possible.