washingtonpost.com  > Live Discussions > Politics
Transcript

Washington Post-ABC News Poll

Christopher Muste & Richard Morin
Senior Polling Analyst & Polling Editor
Friday, September 10, 2004; 2:00 PM

The latest Washington Post/ABC poll shows a significant boost to President Bush's ratings coming off of the Republican National Convention.

Senior polling analyst Christopher Muste and polling editor Richard Morin took your questions on the latest poll numbers, the trends and what the results mean for the 2004 election.



The transcript follows.

Editor's Note: Washingtonpost.com moderators retain editorial control over Live Online discussions and choose the most relevant questions for guests and hosts; guests and hosts can decline to answer questions.

_____________

Columbus, Ohio: I think it's safe to say the post-RNC polls have been all over the place. The Time and Newsweek polls had Bush up 11, Zogby had Bush up 2, the Gallup poll had Bush up 1 (among registered voters), ICR had Bush and Kerry tied, Fox had Bush up 2, and now your poll has Bush up 9. And some of the state polls have been even weirder.

I noticed a little golden nugget in the ABCNews write-up of this poll: "After the Democratic convention there were eight points more Democrats than Republicans among likely voters; today, there are six points more Republicans than Democrats."

Now which is more likely: Your poll, Time's poll and Newsweek's poll have detected a huge shake-up in political partisan breakdown the likes of which we have not seen since the New Deal or you guys are simply oversampling Republicans or more accurately oversampling Bush voters?

Richard Morin: It is not surprising to us or to ABC that there is a party identification "bounce" after each convention, just as there usually is an uptick in support for the candidate whose party just held its convention.
Remember what our horserace question asks: "If the election were being held today...." That means the results reflect existing conditions--this week, that means a slight uptick in Republican identifiers. The results are not a prediction of what will happen in November, but rather an accurate--albeit perhaps transitory--measure of what is happening now. I expect these party ID bounces to smooth themselves out in a few weeks.

_______________________

Charlotte, N.C.: Why do you poll people on a three-way race involving Nader? Nader is only on 11 ballots, give or take. Currently he is not on Florida's (subject to another court ruling). Why not just pose a question about a two-way race? Also, if you are going to construct questions involving a fringe candidate like Nader, why not throw the Libertarian into the mix?

Richard Morin: Hello, it's great to be back answering your good questions. And please welcome Christopher Muste, my colleague and the Post's senior polling analyst. He will be working with me until after the election while Claudia Deane, the deputy director of polling, is on maternity leave.
As for Nader: That rascal has caused us so many problems this year! Currently we only ask the three-way horserace question of people who live in states where Nader is on the ballot or where the issue is not yet resolved. In states where Nader is definitely off the ballot, respondents are asked the two-way. We are interested in Nader and not the Libertarians because of his impact in 2000.

_______________________

Atlanta, Ga.: Although I'm in my forties, I've never paid attention to a presidential election until now. Which means I never really paid attention to polls either. Here's what I don't get: Why are polls so divergent in their results? If you ask a simple question like "Who are you voting for?" and you ask in about the same time frame, how can one poll show a 1 or 2 percent difference and another show a 10 percent difference? And if the audience itself is somehow the factor, then how can any one single poll be considered accurate?

Christopher Muste: Good question, one that many people have.

There are a number of factors that cause poll results to differ, even if we're talking only about polls done with careful attention to good polling practices.

Many of the differences you see are what we would normally expect given that we are polling only a sample of people, and not every voter in the country. Most polls done by media organizations have about 1,000 people in the sample. According to statistical theory, this means that the figures we report on a question like who people plan to vote for will probably be within plus or minus three percentage points of the true figure among all voters. So our figure of 50% for Bush could actually be anywhere from 47% to 53%.
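
[webmaster's aside: the "plus or minus three points" comes from the standard formula for the margin of error of a sample proportion. A minimal Python sketch, assuming the conventional 95% confidence level (z = 1.96) and a simple random sample; the function name is ours, not the Post's:]

```python
import math

def margin_of_error(p, n, z=1.96):
    """95% margin of error for a sample proportion from a simple random sample."""
    return z * math.sqrt(p * (1 - p) / n)

# A 50% result from a 1,000-person poll:
moe = margin_of_error(0.50, 1000)
print(f"+/- {moe * 100:.1f} points")  # prints "+/- 3.1 points"
```

Note the margin shrinks with larger samples and is widest when the reported share is near 50%.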

Another source of differences is that it's very hard to know who's going to vote, and since we want to know how the actual voters are going to vote, that's a big issue in polling. Pollsters have developed a variety of questions to get at the group of people who are most likely to vote, but differences in the techniques different pollsters use will lead to differences in the results they report.

Timing can make a big difference as well. Polls taken during or just after a party convention are likely to show that party's nominee doing better than in a poll taken a week later. Other events during the course of a campaign can affect poll results as well.

There are also fairly consistent differences between the results different polling organizations get -- one might often report a higher vote for Republican candidates compared to other polling organizations.

There are a host of other issues, but I think if you look at many of the good polls together, you can see both a range of answers, which is your concern, and also consistencies, such as the general agreement among the polls that President Bush received a modest bounce after his convention. It's important to be a savvy consumer when it comes to polls, and you're learning fast.

_______________________

Ashland, Mo.: Two questions:

1. Question 30 of your poll asks for a response on whether "most Americans" are better off financially, etc. Why don't you ask whether the individual is better off? There is some evidence that people tend to exaggerate the collective pain because only bad news is reported. Is your question a better indicator of how the person would vote?

2. Of what relevance is how a candidate is doing in the "Battleground" states as a collective? A candidate could be ahead in those states but still be losing if he is winning fewer of those states than was won in the last election.

Richard Morin: Two good, tough questions.
Political scientists have studied the impact of the economy on voters and they have found something very interesting. In the main, people don't vote their pocketbooks--they vote how they think the country as a whole is doing. So if they're doing well but they believe most Americans are faring poorly, they'll punish the incumbent party's candidate even though they're doing okay personally. The opposite is also true, these scholars found. If you are not doing well but think the country as a whole is doing well, you tend to reward the incumbent party. In practice, we often ask both questions.
Your question about global samples in battleground states is important. I believe these results are somewhat useful, though limited. That's why I mentioned them in just a single paragraph in today's story. The numbers do not tell you which states are going for Kerry or Bush, but they do give you a sense of how the campaign is playing out across these states. Just as national surveys have proven to be a reasonably accurate estimator of the national vote (and of the winner), polls in a subset of these states can give you a reasonable look at the state of play across them.

_______________________

Atlanta, Ga.: Why should we believe poll results taken from "likely" voters instead of from actual "registered" voters?

Richard Morin: In our surveys, all likely voters are self-described registered voters. "Self-described" is an important caveat--a fair number of people--15 percent or so--claim to be registered but actually are not. I guess they want to impress the interviewer! The problem in interviewing actual registered voters is that it is impossible to get complete lists of all registered voters in the country. Many states--but far from all--have such lists, and we have used registration list-based samples on some state projects, including surveys we are doing of Latino voters with Univision and the Thomas Rivera Policy Institute at USC.

_______________________

Philadelphia, Pa.: In light of your latest poll, is there anything that I as a Democrat can find solace in?

Richard Morin: Yes--it's two months until election day. And you might take heart (or recall with dismay) that Al Gore had a fairly big lead in surveys done around this time four years ago.

_______________________

Chicago, Ill.: Your recent polls show that President George Bush is ahead of Senator John Kerry by double digits. My question is: how can you make a report that reflects the views of the nation based on 1,202 randomly selected adults nationwide, and more importantly 952 self-identified registered voters? Clearly, when the media report such poll numbers they are only trying to discourage individuals from voting, because there is no way that 1,202 people can speak on behalf of millions.

Christopher Muste: This is a good follow to the last question. According to statistical theory, if your survey has a representative sample of 1,000 people, the results you get will usually be within plus or minus three percentage points of what the results would be if you were able to ask everyone. Sampling theory also says that occasionally (once out of twenty times) the results from a 1,000 person sample will be off by more than three points. The key is getting a representative sample, which good polling organizations go to great lengths to accomplish, primarily by ensuring that every adult has the same chance of being interviewed.

This is hard to believe for many people, but sampling theory has been used extensively in a wide range of applications, and as long as the sample is representative, sampling theory works well.
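
[webmaster's aside: the "once out of twenty" figure can be checked with a quick simulation. A sketch under assumed parameters (a true 50/50 electorate, 1,000-person samples); it draws many hypothetical polls and counts how often a result lands more than three points from the truth:]

```python
import random

def simulate_polls(true_p=0.50, n=1000, trials=2000, seed=42):
    """Fraction of n-person polls that miss the true figure by more than 3 points."""
    rng = random.Random(seed)
    misses = 0
    for _ in range(trials):
        # Each respondent independently supports the candidate with probability true_p.
        sample_p = sum(rng.random() < true_p for _ in range(n)) / n
        if abs(sample_p - true_p) > 0.03:
            misses += 1
    return misses / trials

print(simulate_polls())  # in the neighborhood of 0.05: about one poll in twenty
```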

As for trying to turn off voters, we'd actually much rather everyone voted, since it would make the job of figuring out who's going to vote much easier!

_______________________

Washington, D.C.: What did you find when you looked at battleground states vs. the national results? And what does the gender gap look like post-convention? Did more women switch from Kerry to Bush?

Richard Morin: Bush had a four-point advantage among likely voters in 19 battleground states, about half his lead nationally. Bush did better among women in this survey than a month ago, but Bush did even better among men, resulting in an 18-point gender gap. A majority of men are still voting for Bush while most women still favor Kerry.

_______________________

Jonesport, Maine: Don't the polls themselves have an effect on the outcome of an election? Especially for "undecided" voters? After all, everybody loves a winner. As a supporter of Kerry, even I have felt depressed and defeated already by those polls.

Richard Morin: Aw, take heart, Kerry supporter! There's lots more time and lots more interesting politics to come.
While everybody loves a winner, so-called "bandwagon" effects due to polling seem to play little or no role in elections. Ballots typically feature a number of races and measures, so there are lots of reasons to vote besides the presidential contest. Also, I doubt that the kind of person who would be swayed by a poll finding would even be registered to vote.

_______________________

Washington, D.C.: Do national polls have any real meaning? Do you attempt to determine what a poll result might mean in terms of Electoral College impact?

Richard Morin: Studies by the Gallup Organization confirm that national surveys are reasonably accurate though not flawless predictors of the final results, both in terms of each candidate's share of the vote and the ultimate winner. In other words, the person who leads in the polls typically wins the presidential election.

_______________________

Charlotte, N.C.: What is a "likely voter"?

Richard Morin: In our most recent poll, a "likely voter" is anyone who is certain to vote and voted in 2000 (or was too young to vote in 2000 but is closely following the presidential campaign).

_______________________

Laurel, Md.: As a lifelong Democrat and Kerry supporter, I am puzzled at these recent polls that show a huge Bush bounce after the lies and distortions at the Republican National Convention.

Since the vast majority of people in this country either hate or love George W. Bush, I wonder how these types of huge shifts in voter preferences could have happened or if they in fact are not accurate.

I know that virtually no one who voted for Gore in 2000 is voting for Bush (according to many political commentators and reporters) so where are these new people who suddenly support Bush coming from?

Richard Morin: Actually, both Bush and Kerry's bounces were modest, at best: four points for Bush, three for Kerry among likely voters.
I would suggest another way to think about swing voters to explain shifts in preference. I believe there are not a large number of voters who can't make up their minds between Bush and Kerry, but I do believe there are a significant number of registered voters who may or may not vote in November. These voters don't participate in every election, so while they may favor one candidate or the other, the key decision for these "swing" voters is whether to vote at all. I did a focus group in Erie, Pa., with several of these voters, and it is interesting to talk to people who aren't as passionate about these candidates as most other voters seem to be. Get-out-the-vote efforts are largely designed to get this type of voter to the polls.

_______________________

Bowie, Md.: Would you please explain how you obtain your polling data? Specifically: (1) Are responses gathered by telephone calls? (2) What screening questions are used to cull "likely voters" or self-identified registered voters? (3) Are attempts made to include minorities, urban residents and residents of Native American reservations in your analyses? I ask these questions because none of my elderly relatives, all with published phone numbers, nor black people at all, have ever been contacted by any presidential polling organization. Thus, all polls are viewed by minorities as exclusive to Caucasian, suburban voters. If you choose to respond, many thanks.

Christopher Muste: Thanks for your question, Bowie.

1) Washington Post / ABC News polls are done by telephone.

2) To find out who is registered, we simply ask respondents whether they are registered to vote. To determine likely voters, we ask those who have already said they're registered how certain they are to vote. This is a particularly tough problem for pollsters, since there's no "gold standard" way to determine who is or isn't going to vote.

3) The methods we use are designed to give everyone who has a telephone an equal chance of being called, regardless of who they are or where they live (although we don't interview in Alaska or Hawaii). There are some groups among whom phone ownership is low, mostly very poor and rural households, not urban households, so they are often slightly undersampled in telephone surveys.

But we also "weight" the surveys slightly before we analyze them, so that the people in our survey match the national adult population in terms of age, race, sex, and education.
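
[webmaster's aside: weighting of this kind can be sketched simply: if a demographic group makes up a different share of the sample than of the population, each of its respondents is counted by the ratio of the two shares. The age groups and percentages below are made-up illustrations, not actual Post-ABC figures:]

```python
def post_stratification_weights(sample_share, population_share):
    """Weight each group by (population share / sample share)."""
    return {g: population_share[g] / sample_share[g] for g in sample_share}

# Hypothetical example: the raw sample slightly underrepresents 18-29 year olds.
sample = {"18-29": 0.15, "30-64": 0.60, "65+": 0.25}
population = {"18-29": 0.20, "30-64": 0.58, "65+": 0.22}

weights = post_stratification_weights(sample, population)
# Each 18-29 respondent now counts as 0.20 / 0.15, roughly 1.33 respondents,
# while each 65+ respondent counts as 0.22 / 0.25 = 0.88 of a respondent.
```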

_______________________

Wellington, New Zealand: There have been serious concerns raised about the Time and Newsweek polls around the question of how many Republican voters were included in the samples. Add in some controversy over a recent Los Angeles Times poll and other polls in California, and this issue of sample weighting has become a biggie.

What are the proportions of Republicans, Democrats and Independents in your latest poll and how do these compare with previous polls?

Regards

Richard Morin: Hello, New Zealand (or a Washington consultant vacationing in New Zealand).
We, like other polling organizations, had a larger proportion of Republicans in our latest sample...and we had a larger proportion of Democrats in our post-Democratic convention sample. Our overall sample was 32 percent Democratic, 31 percent Republican and 27 percent independent. So far this year our samples have averaged 4.3 percentage points more Democratic than Republican. The GOP skew was larger among registered voters and likely voters. I believe these post-convention surveys accurately reflect attitudes at the time they were taken and obviously are affected by the conventions--that's what we're trying to measure!

_______________________

Washington, D.C.: I have to admit that as a Kerry supporter I'm crushed by the latest polling data. But I also have to say you pollsters are all over the map. Some polls have Bush ahead by double digits, others have Kerry with a slight lead, and others yet have Bush with a much smaller lead.

What gives? How much stock should we place in these silly polls? Why is there so much variation in the polling data? Also, why is there so much fluctuation in an electorate that has supposedly "made up its mind" already (i.e., Kerry up 10 points in Ohio among likely voters two weeks ago, now down 11 points)?

Arrggghh! Can you pollsters cut us some slack!

Richard Morin: Hey, we pollsters will cut you some slack if you give us some! I think the recent round of polls has been very, very consistent: in all of the major media polls (Post-ABC, Gallup, CBS, Newsweek and Time), Bush is ahead and he improved his standing with voters.
When evaluating state polls, make sure the results are from the same polling organization. Also, remember that not all polls are created equal--there are a lot of bad polls out there.

_______________________

Washington, D.C.: Any quirks in the response rates for the subgroups? Did you have problems reaching people in Florida during the storms?

Also, was your likely voter screen based on how strongly the respondent indicated they would vote (e.g., those that say certain to vote even if there is a family problem) or on voting history (I've voted in every election since I turned 18)?

Richard Morin: Interesting question. I did look at the Florida subsample. It was very small--70 or so people, so it's hard to say anything with certainty. We did have a bit of a problem with people not answering the phone--perhaps because the phone was still buried under a mound of storm debris or the homeowner was at a shelter. The horserace in Florida had Bush up a couple of points, about where Gallup had the race immediately before the GOP convention.

_______________________

Little Rock, Ark.: At what juncture in a (pre)campaign does a state get designated as a swing state? The red/blue idea seemed to be on the radar far ahead of the nominating process. Are tracking polls a continuous process?

Richard Morin: The phrase "swing state" is a term of art, not science. We look at recent polls as well as past election results to determine which states are currently in play. So the list can vary from poll to poll. For example, we dropped Louisiana from the list of swing states.

_______________________

Harrisburg, Pa.: Question on the three candidate polling: if a poll in Pennsylvania has Bush leading by 1 percentage point and Nader getting 3% of the vote, what does this mean since Nader won't be on the Pennsylvania ballot? Might this mean, in fact, that the race is dead even in Pennsylvania?

Christopher Muste: If the poll in Pennsylvania was a typical poll with 1,000 respondents, that 1% Bush lead is in reality probably somewhere between a 7% Bush lead and a 5% Kerry lead. That assumes the sample is representative of Pennsylvania voters and that the questions were good ones.

The Nader results are likewise an estimate of Pennsylvania as a whole, and probably fall in a somewhat smaller range (1-5%, I'm guessing off the top of my head). Nader is fighting to get or stay on the ballot in a number of states, but if he ends up not on the ballot, the big question is what will his voters do? They can stay at home, write in Nader, or vote for one of the candidates on the ballot. The problem trying to answer this question from a polling perspective is that 2% of a 1,000 person poll is about 20 people. With that few people it's very hard to project what they'll do.
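
[webmaster's aside: the wide range around a one-point lead follows from the fact that the margin of error on the *difference* between two candidates' shares is roughly double the margin on a single share, because the two estimates move in opposite directions. A rough sketch under the usual simple-random-sample assumptions; the function is ours, for illustration:]

```python
import math

def lead_margin(p1, p2, n, z=1.96):
    """Approximate 95% margin of error for the lead (p1 - p2) in a two-way race."""
    # For multinomial shares, Var(p1 - p2) = (p1(1-p1) + p2(1-p2) + 2*p1*p2) / n,
    # since Cov(p1, p2) = -p1*p2/n.
    var = (p1 * (1 - p1) + p2 * (1 - p2) + 2 * p1 * p2) / n
    return z * math.sqrt(var)

# A 48%-47% race in a 1,000-person poll:
m = lead_margin(0.48, 0.47, 1000)
print(f"lead of 1 point, +/- {m * 100:.1f} points")  # prints "+/- 6.0 points"
```

A one-point lead with a six-point margin is consistent with anything from a seven-point lead to a five-point deficit, which is exactly the range described above.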

_______________________

Bethesda, Md.: Isn't it too close to the Republican convention to accurately poll? Aren't we still in the post convention "glow"?

Richard Morin: Yes, the last poll absolutely was affected by the GOP convention, just as our post-Democratic convention poll was influenced by what happened in Boston. But we wanted to measure the effects of the Democratic and Republican conventions on voter attitudes. We know this will change, and we intend to track this change. So watch this space! Lots more polls to come.

_______________________

Bethesda, Md.: I'm a Bush supporter. How happy should I be with these poll results?

Christopher Muste: You should be very happy with the results of our and most other polls out this week. John Kerry's supporters should be happy that there are still two months left in the campaign.

_______________________

Richard Morin: Thank you! That was fun. We look forward to doing this again after our next poll.

_______________________


© 2004 Washingtonpost.Newsweek Interactive