When to See Through Polls
You don't need a survey to know that Americans love polls. Election polls. Issue polls. Polls on whether Michael Jackson was murdered. Polls on boxers or briefs for men.
Polls are proliferating, and so are problems with the way they're conducted and reported. The Post is taking steps to tighten controls on both.
Several weeks ago, the news staff was given updated standards that spell out do's and don'ts.
The goal, said Post polling director Jon Cohen, is to remind reporters and editors that "not all polls are equal." While many are statistically valid, others are pseudo-surveys masquerading as serious scientific research. And some are pure hokum.
"The numbers [of polls] are going up," said Richard A. Kulka, until recently president of the American Association for Public Opinion Research, which promotes quality polling. "It's cheaper than ever to do them . . . especially if you don't use accepted probability standards."
Public opinion research has been an important part of The Post since 1935, when it started carrying "America Speaks," a new syndicated column by polling pioneer George Gallup. The paper even paid for a blimp to advertise it above Washington.
Today, The Post is one of a handful of American newspapers with a major commitment to its own polling. It conducts a national poll as often as once a month in a decades-long partnership with ABC News. The Post's methodology is available on its Web site, including the wording of all questions.
The updated standards -- Cohen said they will eventually be made public -- are all about credibility.
"We need to be as skeptical about numbers as we are other types of information," managing editors Liz Spayd and Raju Narisetti wrote in a note to the staff accompanying them. "Just because we see a number and a percentage sign assigned to something does not make it fact."
Quality news organizations such as The Post know to ask baseline questions to establish a poll's reliability. Who paid for the survey (are they pushing a point of view)? How were the questions worded (were they leading)? How was the sample chosen (was it truly random)? How large was it (100 or 1,000)? What's the margin of error (is the reported gap within it)? Who was surveyed (for an election poll, were they "likely" voters)? When were they surveyed (was it before the leading candidate's big gaffe)? How were they questioned (personal interview, automated phone response or on the Internet)?
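The sample-size and margin-of-error questions are linked by simple arithmetic. A minimal sketch, using the standard formula for a simple random sample (the function name is illustrative, not from any polling organization's toolkit), shows why the difference between 100 and 1,000 respondents matters:

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """Half-width of a 95 percent confidence interval for a proportion
    from a simple random sample.

    n: sample size
    p: observed proportion (0.5 is the worst case, giving the widest margin)
    z: critical value (1.96 corresponds to 95 percent confidence)
    """
    return z * math.sqrt(p * (1 - p) / n)

# A sample of 100 vs. a sample of 1,000, in percentage points:
print(round(margin_of_error(100) * 100, 1))   # about 9.8 points
print(round(margin_of_error(1000) * 100, 1))  # about 3.1 points
```

With 100 respondents, a candidate's 5-point "lead" sits well inside a roughly 10-point margin of error and means little; with 1,000 respondents, the margin shrinks to about 3 points and the same lead is meaningful.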
But Post reporters are being asked to probe deeper and to be especially wary of unproven new polling techniques.
Among them are "click-in" polls, where participants respond to survey questions online. They are widely considered unreliable because the sampling isn't random.