
The New York Times rocked the polling world over the weekend. Here’s why.


The New York Times made a big decision over the weekend: It announced a collaboration with an Internet polling company and unveiled the results of the first survey from that partnership. I reached out to David Leonhardt, the editor of The Upshot, the Times' data and visual storytelling site, in the hope of getting answers to a few questions about how that decision was made -- and why. He graciously obliged. Our exchange, conducted via email, is below. It has been edited only for grammar.

FIX: The Times shook up the polling world this weekend when the Upshot decided to start including Internet polling -- conducted by YouGov -- in its election models. Scott Keeter, the director of survey research at Pew, called the decision "a very big deal in the survey world." How much time did you guys spend thinking about including Internet polling, and what tipped the scales for you?

Leonhardt: As a longtime Fix fan, I’m excited to be doing this, Chris. And I like this question, because it lets me explain what’s different about this project and what’s not different.

Let’s start with what’s not different. The Times has long included Internet polling in our election models, as do other well-known models. Our Upshot Senate model has included online polls since we launched it in April, and the FiveThirtyEight model included online polls when it was at The Times (as it presumably will in the future).

As you know, the world of polling is changing. Telephone polls based on random-number dialing – which have long been the gold standard and remain vital to any model – are coping with declining response rates. In 1997, telephone polls had a response rate of 36 percent, according to Pew. By 2012, the rate had fallen to 9 percent.

Online surveys, even the very best, have their own limitations, of course. Only 89 percent of Americans are on the Internet, Pew says. And unlike telephone polls, online surveys can’t randomly select people, because you can’t randomize email addresses the way you can randomize phone numbers. That’s a big drawback.

But here’s the good news: The best phone polls and the best online surveys have different challenges. Phone polls tend to have low response rates among younger people. (How often do you think 25-year-olds answer calls on their mobile phones from numbers starting with 888?) Older people are less likely to be online.

As is often the case, you get a more accurate picture of reality when you are able to look at it from different perspectives. That’s why looking at both the most rigorous online polls and most rigorous phone polls gives you a better sense of a campaign than looking at only one.

Another thing that hasn’t changed: The New York Times/CBS Poll. It’s obviously a telephone poll – conducted by SSRS, one of the best survey-research firms – and it remains at the heart of Times political coverage. Times/CBS polls have broken all kinds of new ground over the years – with revelatory polling on race, class, unemployment, the Tea Party and other subjects – and they’ll continue to do so.

So what is different about this new project? In the last couple of elections, CBS collaborated with YouGov. This year, we are working with both YouGov and CBS on a significantly expanded panel. We think we’ll be able to do some good work as part of the collaboration.

FIX: At the heart of the debate is the fact that Internet polls typically rely on an opt-in system -- people select themselves to take part -- while more traditional surveys select their participants via random-digit dialing. How do you answer critics -- including one of the Post's own pollsters -- who say this is an unnecessary break from decades of "quality research methods"?

Leonhardt: There is no question that random-dialing telephone polls start with a significant advantage over an online survey: the ability to randomly select a sample that resembles the country.

At the same time, when only 9 percent of people sampled by a telephone poll respond, we know they’re unlikely to look like the country as a whole. That 9 percent are opting to respond to a poll. They are, in important ways, a self-selected group, different from people who do not respond to polls. Mark Blumenthal, of HuffPollster, discussed these issues in a post this week, in which he noted that both phone polls and online surveys now face some similar issues. Telephone polls must weight their respondents – by demographic groups, for example – so that the respondents resemble the country as a whole.

The evidence suggests that the best telephone polls are doing a good job of weighting, which is great news. But it’s not easy. Over the last few election cycles, many telephone polls appear to have missed some young adults, which may be part of the reason most of them underestimated President Obama’s 2012 vote share in their final pre-election polls. Again, you get a more accurate picture of the horse race when you look at it from several angles.
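For a concrete sense of the weighting Leonhardt describes, here is a minimal sketch of post-stratification on a single variable. It is purely illustrative -- not the Times', CBS's or YouGov's actual procedure -- and the age groups and population shares below are invented numbers chosen only to show the mechanics.

```python
# Illustrative post-stratification weighting on one variable (age group).
# The sample and the population shares are invented for demonstration only.
from collections import Counter

respondents = ["18-29"] * 5 + ["30-64"] * 55 + ["65+"] * 40     # a sample that skews old
population_share = {"18-29": 0.22, "30-64": 0.55, "65+": 0.23}  # hypothetical census targets

counts = Counter(respondents)
n = len(respondents)

# Weight for each group = target share / observed share, so that the
# weighted sample mirrors the target population distribution.
weights = {group: population_share[group] / (counts[group] / n) for group in counts}

for group in sorted(weights):
    print(f"{group}: observed {counts[group] / n:.0%}, weight {weights[group]:.2f}")
```

Real surveys weight on several variables at once, often through iterative raking, but the principle is the same: respondents from under-represented groups count for more, and those from over-represented groups count for less.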

FIX: Why did you guys choose to partner with YouGov and not another online poll provider? Is there something about their panel of respondents or their methods that stood out to you as unique or more tested?

Leonhardt:  Online surveys, which tend to recruit people via e-mail or web links, need to clear a high bar to show that they can begin to make up for their inability to do random dialing. They need a strong, transparent methodology, and most online surveys don’t have one. Frankly, the same goes for a huge number of surveys out there, especially partisan surveys and “robopolls.”

YouGov is different in crucial respects. It goes to significant lengths to match its respondents to Census data (as Nate Cohn explains in more detail here), so they’re similar to the population as a whole. YouGov is also quite transparent about its methods. Doug Rivers, the Stanford professor who helps run YouGov, has published journal articles about the methodology, and releases all kinds of details about individual surveys.

This approach helps explain why academics, particularly political scientists, so often use YouGov’s surveys in their research. YouGov’s prior work with CBS, in 2012 and earlier, also went well. Maybe most important, YouGov’s record is public: Its 2012 surveys correctly pointed to the winner in every state but one (Florida, where it had Obama losing by 1 point rather than winning by 1 point). YouGov also underestimated Obama’s national margin by a bit less than many telephone polls did, and the firm fared well in 2010 too. In Colorado, for instance, YouGov was one of the few firms to show Senator Michael Bennet ahead.

Let me be clear, though: I’m not suggesting that YouGov’s data is better than everyone else’s data or that you should put too much emphasis on a single result from it. Political analysts shouldn’t put [too] much weight on any single poll finding from anywhere. They should look at an array of polls. That’s why my colleagues Amanda Cox and Josh Katz have built a model that combines many poll results – as well as data on fundraising, a state’s political history, candidate characteristics and other factors – and offers probabilities. I’m merely saying that the evidence seems overwhelming that YouGov should be a healthy part of that mix.
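To make the logic of an aggregate concrete, here is a toy sketch -- emphatically not the model Cox and Katz built, which also folds in fundraising, political history and candidate characteristics -- of how averaging several noisy poll margins shrinks the error and, with an assumed error for any single poll, turns a margin into a probability. Every number below is invented.

```python
# Toy poll aggregation: average several margins, then convert the averaged
# margin into a win probability under a normal approximation.
import math
from statistics import mean

poll_margins = [2.0, -1.0, 3.5, 0.5]   # hypothetical candidate margins, in points
single_poll_sd = 3.0                    # assumed error of any one poll, in points

avg_margin = mean(poll_margins)
# Averaging independent polls shrinks the error by the square root of their number.
std_error = single_poll_sd / math.sqrt(len(poll_margins))

# Probability that the true margin is above zero.
z = avg_margin / std_error
win_probability = 0.5 * (1 + math.erf(z / math.sqrt(2)))

print(f"average margin: {avg_margin:+.1f} points, win probability: {win_probability:.0%}")
```

The point is simply that several mediocre measurements, combined, say more than any one of them alone.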

FIX: There's a real push for transparency in polling -- both from the survey research community and from our readers. Releasing detailed information about the data and the results has become de rigueur for many pollsters and the news organizations with which they collaborate. Will the Times be releasing those sorts of details for the YouGov polling it is using? Why or why not?

Leonhardt: Yes, indeed. YouGov has posted a rich array of data on the first wave of results from its 2014 panel – including crosstabs for every state and raw numbers of respondents. The data will keep flowing in the three subsequent waves between now and election day.

FIX: Is the use of online polling an experiment limited to this election for the Times, or is this methodology something we will see come 2016 as well?

Leonhardt: I don’t think anyone knows what the future of public-opinion research will look like, but it seems very likely to continue changing. Rigorous public-opinion research has a wonderful record in this country, and it has relied on the telephone for its success. To state the obvious, the way people are using their telephones has changed and isn’t done changing.

I can imagine all kinds of ways that survey research will change in coming years. If text messaging becomes nearly universal and occurs mostly through people’s mobile phones, surveyors could use random-digit texting as a way to poll people, for instance. Or maybe someone will invent a way to sample email or web addresses randomly. Then again, maybe random sampling will continue to become more difficult, and surveyors will need to keep looking for alternatives. It will be fascinating to watch.

Two predictions seem reasonable: One, as the boundary between telephones and computers continues to blur, so will the boundary between phone polls and online surveys. Two, technological change usually brings more benefits than drawbacks. I’m fundamentally optimistic that the best telephone polls and the best online surveys will find ways to address these issues.
