Georgia Gov. Nathan Deal speaks during a ceremony announcing a $300 million expansion of Google's data center operations in Lithia Springs, Ga., in June. (David Goldman/AP)

The experiment was simple: Take a diverse group of undecided voters, let them research the candidates on a Google-esque search engine, then tally their votes — never mentioning that the search was rigged, giving top link placement to stories supporting a selected candidate.

The researchers expected the bias would sway voters, but they were shocked by just how much: Some voters became 20 percent more likely to support the favored candidate.

And almost none of the voters caught on to how the results were being skewed. In fact, those who did notice the preferential treatment, the researchers said, felt even more certain that they'd made the right choice.

The series of studies exploring the "search engine manipulation effect," to be published soon in the Proceedings of the National Academy of Sciences, highlights the vast and invisible influence that tech giants such as Google, which handles two-thirds of all U.S. searches, can wield on a national scale.

That search results color our thinking is nothing new: American companies spend tens of billions of dollars every year to get their sites to the top of the pile. But in an age when learning about candidates is only ever a search away, researchers say the effect could exert a worrying level of influence, and that it may already have helped swing some votes.

These studies first attracted attention a few years ago, but researchers Robert Epstein and Ronald E. Robertson now say they have replicated the results across five double-blind, randomized experiments, covering a diverse pool of more than 4,500 undecided voters in the United States and India.

The tests, using a specially designed search engine called Kadoodle, found that voters tended to shift their preferences toward candidates ranked higher in the search results, especially if they were unfamiliar with the names on the ballot.

"I couldn't even believe what we got," said Epstein, a senior research psychologist at the American Institute for Behavioral Research and Technology. "It seemed impossible ... to be able to shift that many undecided voters toward whoever we chose."

That effect, Epstein said, could grant leaders of the world's search engines extraordinary power over how voters cast their ballots. In the study, the researchers wrote that the effect has "perhaps already been affecting the outcomes of close elections."

Skeptics of the research have noted that voters are immersed in a swirl of information beyond just what they type into Google and can also be swayed by factors like political, religious and social ties.

The rigged search also appeared less influential in one experiment involving Indian voters in last year's Lok Sabha elections (the world's biggest election to date, with more than 800 million eligible voters), perhaps suggesting that the effect is more subdued amid the tumult of a real-life vote.

Even without a sweeping (and unproven) search-engine conspiracy, the effect gives further insight into how we decide what to trust online.

Recent research published by the American Psychological Association found that just searching the Internet for information makes us feel smarter, probably because we confuse our own knowledge with the boundless wisdom on the Web.

A Pew survey in 2012 found that 73 percent of Americans believed most or all of what they found on search engines was "accurate and trustworthy"; a similar-sized group said the engines themselves were "a fair and unbiased source of information."

That becomes a problem, researchers said, because search engines, through their users, the media and the many others who contribute to the Web, can end up placing greater weight on more popular candidates, a snowball effect that skews the race.

And all it takes is securing the first page of search results: In a 2007 eye-tracking study in the Journal of Computer-Mediated Communication, searchers fixated on and trusted the higher-listed results far more, even when lower-ranked links were more relevant.


The study measured how people clicked on results in a Google-like search engine. Note how the top results pull in nearly all the clicks, with a slight bump for links at the bottom of the six-result page. Links in the virtual wasteland of page two and beyond are mostly forgotten or ignored. This pattern appears repeatedly in search-engine studies. (Courtesy of the study)

Google says it has taken pains to preserve the integrity of its namesake search, and earlier this year a team of Google researchers announced they were looking into ways to deliver results based not on popularity but on "trustworthiness," a measure of how well a link's facts agree with a comprehensive "knowledge vault."

"Controlling the truth value of what is being circulated, information quality — absolutely, it should be something that gets checked," Luciano Floridi, a University of Oxford philosophy professor and strategist on a Google advisory board, told Quartz last week.

In a statement, a Google spokesperson said, "Providing relevant answers has been the cornerstone of Google's approach to search from the very beginning. It would undermine people's trust in our results and company if we were to change course."


But studies show that algorithms, like people, can discriminate, too. Even the coldest, most calculating computer instructions can end up perpetuating the biases in what people search for, potentially skewing the results.

As for how to solve this problem? Epstein says he and Robertson are following up with further research, but early ideas, such as giving a panel of nonpartisan judges the power to choose top links, seem difficult or impractical in a modern age of nearly limitless search.

Epstein, the former editor-in-chief of Psychology Today, has tussled with Google before over what he calls its unchecked influence and threat to civil liberties, and he has routinely called for the search giant to submit to greater regulatory oversight.

But he is not the only one to caution against the risks of one search engine's dominance. Floridi, the Google-advising philosopher, also supports encouraging voters to use many different, competing search engines, so they can better assess the results for themselves.