In the aftermath of Tuesday's vote, that idea seems absurd. Netanyahu has not only won the election, but he has done so by a very comfortable margin. Observers are now beginning to wonder whether Israel's political polling companies, which had recently given Netanyahu rival Isaac Herzog an edge, painted a misleading portrait of the country's electorate. "So, about those Israeli pollsters...." the Atlantic's Jeffrey Goldberg tweeted wryly as the scale of Likud's win became clear.
Did Israel's pollsters fail? And if so, why? Examining both the country's pre-election polls and its exit polls paints a complicated picture.
When pre-election polls don’t match election results, a return to first principles of polling is a good starting point to understand what went wrong. In this instance, the pre-election polls were not only inaccurate, they actually suggested the wrong winner. Final polls showed Herzog's Zionist Union alliance with a three- to four-seat edge over Netanyahu’s Likud party, but Likud ended up prevailing by six seats.
There are two primary explanations for the polling errors: either voters' opinions changed after the polls were taken, or the polling methodology was flawed.
The first explanation is that the polls were right at the time they were conducted but that voters changed their minds in the days between the final published polls and election day. Israeli election law prohibits the publication of pre-election polls in the last four days of a campaign. Robust campaigning by Netanyahu in those final days, or strategic voting in reaction to the polls, may have shifted preferences substantially and made the final published polls look wrong.
There is anecdotal evidence of this late switch among voters. While the media are prohibited from publishing polls, campaign pollsters continued to conduct internal polls. Those results have remained private, but some Twitter whispers suggest that the race narrowed toward the end.
The second explanation goes to the fundamentals of polling methodology. This remains a big question because many public polls in Israel lack fundamental disclosures about how the surveys were conducted. The Huffington Post's Mark Blumenthal, who closely tracked the Israeli pre-election polls, found a "profound lack of transparency of their methods." Blumenthal also noted that while poll forecasters underestimated the share of seats won by Netanyahu's party, they were fairly accurate in gauging the number of seats won by pro-Netanyahu parties.
Understanding how polls are conducted helps to explain inherent biases in the methods of sampling and data gathering. For example, a poll that gathers data by telephone may not include cellphones. Internet-based polls typically only include those who have Internet access. Each of these polling methods would exclude significant portions of the population that are systematically different from the entire voting population.
Pollsters try to account for these differences by adjusting the data to known characteristics of the population. But without publishing these details, there is little way to understand what went wrong and why.
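The adjustment pollsters make is often a form of post-stratification weighting: each demographic group in the sample is scaled up or down until it matches that group's known share of the population. A minimal sketch of the idea, with all figures invented for illustration (none come from the Israeli polls):

```python
# Hypothetical post-stratification weighting. Suppose a poll over-samples
# middle-aged respondents; weighting by known population shares corrects
# the headline number. All numbers below are made up for illustration.

# Known population shares by age group (e.g., from a census)
population_share = {"18-34": 0.35, "35-54": 0.40, "55+": 0.25}

# Raw poll sample: respondents per group, and the share in each group
# saying they will vote for Party A
sample_counts = {"18-34": 200, "35-54": 500, "55+": 300}
support_for_a = {"18-34": 0.30, "35-54": 0.45, "55+": 0.55}

total = sum(sample_counts.values())

# Unweighted estimate: every respondent counts equally
unweighted = sum(sample_counts[g] / total * support_for_a[g]
                 for g in sample_counts)

# Weighted estimate: each group counts by its population share instead
weighted = sum(population_share[g] * support_for_a[g]
               for g in population_share)

print(f"unweighted: {unweighted:.4f}")  # 0.4500
print(f"weighted:   {weighted:.4f}")    # 0.4225
```

The two estimates differ by more than two points on the same raw data, which is why the undisclosed details of how a pollster weights matter so much: without them, outsiders cannot tell whether an error came from the sample or from the adjustment.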
Even if Netanyahu's last-minute campaigning changed minds in the final few days, that wouldn't account for the results of five exit polls conducted Tuesday by Israeli media companies. On average (and with a remarkable lack of divergence), these polls found a dead heat between Likud and the Zionist Union.
Exit polls are fundamentally different from pre-election polls. Rather than asking potential voters who they might vote for, polling companies set up outside voting stations and ask people who they voted for. While they might not get every single voter to reveal their choice, they take demographic information that should allow them to make predictions about how the entire country might vote.
Joe Lenski, an expert on exit polls at Edison Research, is quick to caution that even at best, polls can only make predictions. "Exit polls have margin of errors like any other survey," he says in a phone interview. "It just becomes magnified when races are this close."
Lenski does not work on polling in Israel, but he did point to media reports he had seen that suggested possible problems in Israeli exit polling. First, he noted that some Israeli polling experts believe Likud supporters had refused to participate in exit polls more than any other group.
"In certain voting stations, voting stations in places where there are a lot of new immigrants, pro-Likud ballot boxes, the percent of those who voted (in the exit polls) was especially low," Channel 2 TV's pollster Mina Tzemach told Israel's Army Radio, according to Reuters. This is a common issue in exit polls, Lenski explains, and generally polling companies take demographic details from non-responses and adjust results accordingly.
Another potential factor Lenski noted was reports that the polling companies had stopped interviewing voters about two hours before voting ended, in a bid to have numbers ready for news reports at the 10 p.m. close of polls. By stopping early, pollsters may have missed a last-minute rush of right-wing voters. Counting the votes of members of the Israeli army was also a known problem, Lenski added: soldiers vote earlier than the rest of the population and follow different procedures.
Errors in exit polls occur all over the world — in the 2004 U.S. presidential election, for instance, a number of polling companies, including Edison, were found to have made errors that led to inflated estimates of support for John Kerry in exit polls. Israel, with its diverse range of parties, may present an especially complicated task for pollsters.
"They have a really tough time in Israel," Lenski says. "They're not only estimating the top two candidates, they're trying to make seat estimates for 10 parties."