The Washington Post

    What America Thinks
    Can Americans Just Say No to Pollsters?

    By Richard Morin
    Washington Post Polling Director
    Monday, June 7, 1999

    It's an article of faith in the polling world that response rates are falling as more and more people are just saying no when a pollster calls. But are they, really?

    A review of data collected in time-series survey projects from around the world suggests the answer may not be so simple. Response rates may not be going down uniformly in the United States or elsewhere. And while more people may be initially refusing to answer a survey, many polling organizations are making up the difference by using more sophisticated and aggressive strategies to squeeze every possible interview out of a target sample.

    First, a definition, then some data. Pollsters figure response rates in all sorts of ways. One simple way is to divide the total number of completed interviews by the total number of telephone numbers dialed. Thus if 1,000 people completed a poll but 2,000 telephone numbers were dialed, the simple response rate would be 50 percent.

    If, over time, more people refuse to do interviews, aren't home when a pollster calls, or let answering machines pick up instead of answering themselves, then the denominator (2,000 in our example) swells and response rates go down. And that's bad, at least in theory, because a low response rate suggests there's a good chance your sample doesn't accurately reflect the population you're trying to measure.
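    The simple formula described above can be sketched in a few lines of code. This is only an illustration of the arithmetic in the article's example; the function name and the second, swollen-denominator scenario are hypothetical, not any polling organization's actual bookkeeping.

```python
def simple_response_rate(completed: int, total_dialed: int) -> float:
    """Completed interviews divided by all telephone numbers dialed."""
    if total_dialed <= 0:
        raise ValueError("total_dialed must be positive")
    return completed / total_dialed

# The article's example: 1,000 completes out of 2,000 numbers dialed.
print(f"{simple_response_rate(1000, 2000):.0%}")  # 50%

# The denominator effect: the same 1,000 completes against a pool
# swollen by refusals, answering machines, and never-home numbers.
print(f"{simple_response_rate(1000, 4000):.0%}")  # 25%
```

    Note that the numerator never changes in the second call; only the pool of dialed numbers grows, which is exactly the mechanism the article describes.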

    That's exactly what many people believe has been happening: Fewer people are cooperating with pollsters, and response rates allegedly have been declining around the world. But what's believed and what's true may be two different things.

    Here's the data: "A review of 56 time series around the world showed 22 declines in response rates, 16 with no change, 14 with variable trends (ups and downs), and four with rising response rates," wrote Tom Smith, director of the National Opinion Research Center's General Social Survey, in a message to AAPORNET, the online bulletin board maintained by the American Association for Public Opinion Research. "While declines clearly greatly outnumber increases, the pattern is hardly one of general, unfettered decline," wrote Smith, who cited data first reported in the International Journal of Public Opinion Research in 1995.

    A more recent study found the same thing. Edith de Leeuw, a Dutch researcher, reports that the next issue of the Journal of Official Statistics will contain a study of response rates over time and in different countries. "Some countries have a clear downward trend, others remain stable overall, but show an increase in refusal and a decrease in non-contacts, indicating more fieldwork effort," she writes.

    That pattern – or lack of a pattern – may be true for major time-series polls, the big-budget surveys such as the annual General Social Survey or the government's monthly household survey. "But to complicate matters, time series are not representative of all surveys," says Smith. "They are based on studies with figures in the public domain and contain few commercial polls and no market research surveys."

    One AAPORNETter offers evidence that response rates in commercial surveys may not be uniformly in decline. "A review of 10 years' worth of Time Inc.'s subscriber studies – all postal surveys based on random samples of subscribers, designed to monitor things like customer satisfaction or response to editorial innovations – showed no overall pattern of decline," reports Scott McDonald, director of research at Time Warner. "Similarly, our telephone polls over the same time period were not showing lower cooperation rates, though contact rates [the percentage of people actually contacted out of all the telephone numbers dialed] were lower," a function perhaps of increased use of answering machines, data lines, etc.

    Based on my experience with Washington Post surveys, the picture is mixed. On major polling projects, response rates have not fallen and may even have increased – even though people are more likely to refuse to participate now than five or 10 years ago.

    We've countered this eroding cooperation by increasing efforts to convert those who refuse to answer surveys, lengthening the periods over which surveys are conducted, adding callbacks for hard-to-reach groups such as young people, and other strategies to capture reluctant or elusive respondents.

    On surveys The Post conducts with the Henry J. Kaiser Family Foundation and Harvard University, we require a minimum response rate of 40 percent, generally the minimum response rate for surveys reported in academic journals. This strictly figured response rate is the equivalent of many of those 60 percent cooperation rates reported by other polling organizations using a more forgiving formula.
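    The contrast between a strictly figured 40 percent response rate and a more forgiving 60 percent cooperation rate can be sketched as follows. These are simplified illustrations of the two kinds of formula, not the exact definitions The Post or any other organization uses, and the field counts are hypothetical.

```python
def response_rate(completes: int, refusals: int, non_contacts: int) -> float:
    """Strict version: completes over ALL eligible numbers,
    whether anyone was ever reached at them or not."""
    return completes / (completes + refusals + non_contacts)

def cooperation_rate(completes: int, refusals: int) -> float:
    """Forgiving version: completes over contacted persons only,
    so never-reached numbers simply drop out of the denominator."""
    return completes / (completes + refusals)

# Hypothetical field results: 400 completes, 267 refusals,
# and 333 numbers where no one was ever reached.
print(f"response rate:    {response_rate(400, 267, 333):.0%}")    # 40%
print(f"cooperation rate: {cooperation_rate(400, 267):.0%}")      # 60%
```

    The same fieldwork yields both figures; the only difference is whether the never-contacted numbers stay in the denominator, which is why a strict 40 percent can be "the equivalent" of a forgiving 60 percent.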

    But that's what we do when we're being good. Growing non-cooperation with surveys is a problem. But an even bigger non-response problem in survey research is caused by our growing addiction to overnight surveys.

    Here, response rates plummet to the teens, as more and more numbers are called to yield samples that may be best described as random samples of people who were home at the time the poll-taker called – a group that in terms of attitudes, behaviors and demographics may or may not be representative of the country as a whole.

    Overnights are seductive. They're quick, cheap and they're easy to do: Time constraints demand a limited number of questions on a limited number of topics. Often, they're good enough. Sometimes they're even preferable to more rigorously done polls with longer field periods, such as during the early weeks of the Kosovo campaign when the situation was changing rapidly and anything longer than an overnight poll would have been overtaken by events.

    Overnights are fast becoming the norm to measure attitudes in times of war and peace, even in times of election campaigning. Given their inherent unreliability, pollsters are playing with fire (and loving every minute of it, given the increase in overnight polling). With improvements in techniques and strategies, polling can survive declining cooperation rates. The question is, will polling survive overnight polls?

    Richard Morin is director of polling for The Washington Post. "What Americans Think" appears Mondays in The Washington Post National Weekly Edition. Morin can be reached at morinr@clark.net .

    © Copyright 1999 The Washington Post Company

