Election 2008: The Latest Polling
Thursday, September 25, 2008; 2:00 PM
Washington Post polling director Jon Cohen and polling analyst Jennifer Agiesta were online Thursday, Sept. 25, at 2 p.m. ET to discuss the latest numbers from recent polling in Virginia and nationwide.
For more answers, read the Behind the Numbers blog.
The transcript follows.
Jennifer Agiesta: Hi everyone. Thanks for joining us for the chat today. We've got a lot of data to discuss -- we released both national and Virginia data this week -- so let's get right to it...
College Park, Md.: I believe your poll reflects The Post's inherent bias against Republicans. If you look at the particular phrasing of each question etc., it is very clear what answers you were seeking to receive. Your poll does not stand up alongside any other poll besides the New York Times, which is no surprise. Rasmussen, Zogby, etc. are all at odds with your findings.
Jon Cohen: Good afternoon. We welcome comments and (constructive) criticisms about our polls. We publish our questionnaires in full so that you (or anyone) can see exactly what we asked and in what order. What questions do you find unbalanced?
Great Falls, Va.: After all these years of being ignored in the presidential elections, it's exciting to actually have the candidates in the state to campaign. Does Warner have strong enough coattails (funny to have to put it that way) to actually put Virginia in the Democratic column this year?
Jennifer Agiesta: A great question from Great Falls. Our poll found an incredibly tight race for president in Virginia, meaning it is a real battleground for the first time in many years, and Warner's popularity seems to be a big contributing factor. Overall, 46 percent of Virginia voters said they plan to vote a Democratic ticket, backing both Warner and Obama, and 29 percent indicated they plan to support both Republicans, McCain and Gilmore.
And in the battle for the remainder of voters, the Warner "coattails" appear to be real: Nearly all of Obama's supporters also back Warner, and Obama hangs on to a majority of non-Democrats who support Warner.
Cambridge, Mass.: With an increasing number of people using cell phones exclusively, can we really trust the recent polling data?
Jon Cohen: Greetings Cambridge. This is one of the key questions this year. It was also the cause of much angst among pollsters in 2004. While (many) polls were good then, the percentage of adults abandoning landline telephone service has continued to soar.
Nevertheless, most data continue to show little systematic bias to polls that don't include cell phones. Among the reasons: Young adults who have home phones are as apt to support Obama as are those without. Therefore, if we know how many young people we're missing (and we do, thanks to the Census) and we adjust the sample to make up for this miss, the end result is little to no bias.
That said, there's a new Pew report out today that shows how things may be shifting ever so slightly. We have included cell phones in our polling, and continue to closely monitor developments.
Rolla, Mo.: It looks like one of the things people knocked Obama for -- his calm, cool demeanor -- actually helped him in the past couple of weeks. McCain seemed to freak out a bit, calling for the SEC (or was it FEC?) chair to be fired, contradicting himself in the span of a day or two, etc. Do people expect leaders to react differently to different types of crises? For instance, would they prefer decisive, immediate action on a foreign policy crisis vs. an economic one?
Jennifer Agiesta: The battle over which candidate lives up to voters' expectations of leadership has been a close one all along. We ask voters which candidate they consider the stronger leader; in this poll, McCain and Obama are just about deadlocked: 47 percent Obama, 46 percent McCain. And despite the shift in the top-line numbers, this measure has remained relatively stable since the primary contests ended.
On a related question, Obama has gained some ground in the past two weeks. When asked whom they trust more to handle an unexpected major crisis, 47 percent prefer McCain, 46 percent Obama -- another near tie. Immediately following the Republican Convention, McCain held a big 17-point edge; back in August, his lead was 11 points.
Washington: As you guys know, surveys results are only as good as the sample that they are based upon (or the post-data-collection methodology used for adjusting a sample frame's representation or lack thereof). Can you give us a short explanation as to how The Post adjusts for factors that we know will result in biased findings -- specifically, the greater reliance on cell phone use and potentially higher voter turnout from younger persons and African Americans? What is the basis for adjusting your findings one way or the other?
Jon Cohen: Washington, as in D.C.? We conduct telephone polls of randomly selected respondents (typically landline only, but also on cells when appropriate). We adjust our samples to reflect age, race, sex and education. We do not do any adjustments based on party identification, a big topic today.
Seattle: Question: When Day 1's result is 50-50 with a margin of error of 3 percent, why does it matter that Day 2 is 51-49 with the same margin? Aren't those two results the same thing?
Jon Cohen: I certainly would not characterize those as different results. Sampling error is real, even on exit polls, which many view as "fact."
Pine Grove, Calif.: I noticed that most people polled were not first-time voters. We've had a huge registration bump this year. How do you think the new voters affect these numbers?
Jon Cohen: There has been a big jump in new registrations this year; there typically is in presidential election years. We do random-digit dialing; we don't poll off registered voter lists that may or may not include those who are new to the voter rolls. So those new registrants would be in our samples; it'll be interesting to see how big a factor they prove to be in November.
Washington: Do you plan to update your boilerplate methodology wording to include something about how mobile phones are included in the frame?
Jon Cohen: Yes, we are long overdue on updating those items. Our polling partners at ABC News did a write-up of what we did in August. Check it out.
Burke, Va.: Thanks for taking questions on polling. As an independent Virginia voter, I'm curious to know if any polls strictly limit their polling to independent voters. Because this voter demographic will have a great impact on the presidential election outcome, how do pollsters measure the impact of independent voters like me?
Jennifer Agiesta: Hi Burke. Political independents are an important analysis point in our polling, and we pay close attention to their impact. In fact, we were so interested in independents that last summer, we did an in-depth survey of independents with the Kaiser Family Foundation and Harvard University to get a real sense of the group before the campaign got underway. That poll included a Virginia component as well, so if you'd like to get a profile of your fellow Virginia indies, check it out.
In our current polling, we do not limit our respondents by party identification -- we interview everyone, and as Jon said, we do not adjust for party identification.
washingtonpost.com: Post Polling on The Independents
Jennifer Agiesta: Here's the link to our Independents survey from last year.
Seattle: Who are these "so-called" undecideds, how can they be undecided after all this time, and how can one recognize them on sight to observe them like some rare and unusual new species?
Jennifer Agiesta: Hi Seattle. While we haven't published a field guide to undecideds, here's how we determine who falls into that category. They are made up of two groups: first, voters who named a candidate (Obama, McCain, Nader, Barr or anyone else specific) but say there's a chance they could change their mind before Election Day; and second, those who say they are likely to vote but don't support any of the current candidates or haven't yet made up their minds.
In this poll, they make up about 20 percent of all voters, down from 25 percent in early September and 30 percent about a month ago.
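For readers who want to see the two-group definition spelled out, here is a minimal sketch in Python. The field names (`candidate`, `firm_supporter`) and sample data are purely illustrative, not the Post's actual survey variables:

```python
def is_movable(voter):
    """A voter is 'movable' (uncommitted) under the two-group rule:
    they name a candidate but say they might change their mind,
    or they name no candidate at all."""
    named = voter.get("candidate") is not None
    firm = voter.get("firm_supporter", False)
    return (named and not firm) or not named

# Illustrative respondents (hypothetical data, not poll records).
voters = [
    {"candidate": "Obama", "firm_supporter": True},   # committed
    {"candidate": "McCain", "firm_supporter": False}, # could switch
    {"candidate": None},                              # no pick yet
]
print(sum(is_movable(v) for v in voters))  # 2
```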
Portland, Ore.: How do you determine who is a "likely voter"? Is it someone who says they are certain to vote in this election, someone who has voted in the most recent election, or do you use some other criteria?
Jon Cohen: Great question -- my sense is that there's far more variation in polls based on likely voter models than whether cell phones are included. We construct a range of models, using respondents' answers to questions about their registration status, their certainty of voting, their voting history and others.
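One common family of likely-voter models is a simple cutoff index over self-reported screen items, in the spirit of the classic Gallup approach. The sketch below is a hypothetical illustration of that idea; the specific items, scoring and cutoff are invented, not the Post's model:

```python
def likely_voter_score(respondent):
    """One point per pro-voting answer on hypothetical screen items."""
    score = 0
    if respondent.get("registered"):
        score += 1
    if respondent.get("certain_to_vote"):
        score += 1
    if respondent.get("voted_last_election"):
        score += 1
    if respondent.get("follows_campaign_closely"):
        score += 1
    return score

def likely_voters(sample, cutoff=3):
    """Keep respondents scoring at or above the cutoff."""
    return [r for r in sample if likely_voter_score(r) >= cutoff]

# Illustrative respondents.
sample = [
    {"registered": True, "certain_to_vote": True,
     "voted_last_election": True, "follows_campaign_closely": False},
    {"registered": True, "certain_to_vote": False,
     "voted_last_election": False, "follows_campaign_closely": False},
]
print(len(likely_voters(sample)))  # 1
```

Varying the cutoff (or the items) is exactly why two polls of the same electorate can report different "likely voter" results.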
Exeter, N.H.: In polling, is a one-point margin of error the standard deviation?
Jon Cohen: The error margin we typically report is the "margin of sampling error," the error that derives from drawing a random sample of a population instead of interviewing everyone. There are other sources of error in polls, but this is the one we can enumerate easily.
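As a rough illustration of how that sampling error is computed (the standard textbook approximation for a proportion, not necessarily the Post's exact formula):

```python
import math

def margin_of_error(p, n, z=1.96):
    """Approximate margin of sampling error for a proportion.

    p: observed proportion (e.g. 0.50)
    n: sample size
    z: critical value (1.96 for 95 percent confidence)
    """
    return z * math.sqrt(p * (1 - p) / n)

# For a poll of about 1,000 respondents split 50-50, the margin
# is roughly plus or minus 3 percentage points.
moe = margin_of_error(0.50, 1000)
print(round(moe * 100, 1))  # 3.1
```

This is also why a one-point day-to-day shift in a poll with a three-point error margin is not a meaningful change.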
Reston, Va.: Did you assess the effects of new residents in Northern Virginia? If so, what do the patterns tell us? Are there differences in race, nationality and other cultural factors?
Jennifer Agiesta: Hi Reston. In a poll of about 1,000 people, it's difficult to get a large enough sample size to analyze a group that small. However, we can look at those who have moved to Virginia in the past 10 years, and those voters actually lean a bit more toward the Republican candidates than do those who have lived in the state for longer periods.
But that is a statewide measure. Looking specifically at Northern Virginia, that area's changing nature is clear from vote results in recent elections. In 2000, Gore and Bush split the area nearly evenly: 49 percent for Gore, 47 for Bush. By 2004, Kerry held a larger advantage over Bush: 54 percent to 46 percent. Democrats have suggested they need to win 60 percent of voters in this region to win statewide, and our poll finds Obama coming very close to that mark.
Rockville, Md.: Historically, how accurate have polls taken at this point in the campaign been at predicting the winner in November?
Jon Cohen: Our aim is not to predict the future -- though wouldn't it be nice if we could know where the stock market is headed? Polls are reasonably good at estimating where things stand now.
In the two previous presidential elections, polls showed an extraordinarily tight contest throughout the fall, just as it was on Election Day.
This year's polls typically have been on either side of a competitive race. Our new national poll has a clear Obama lead, the first time either candidate has had a significant edge in Post-ABC polling. For the moment, the poll is a bit on the high side for Obama compared with other polls, so we'll see if it sticks.
What about the Bailout?: How likely is that to play a role in shifting the polls?
Jennifer Agiesta: Our poll found mixed feelings on the actions of the Federal Reserve and Treasury Department, but those feelings cut across party lines rather than dividing along them. Obama holds an advantage among those on both sides of the bailout.
A more important measure is whether voters feel the current economic situation is a cyclical downturn or a more serious, long-term decline. Those who say it's a long-term downturn favor Obama 70 percent to 26 percent; those who say it's normal break for McCain, 62 percent to 33 percent.
Washington: Is there a website that you know of that posts meta poll data? In other words, I appreciate The Washington Post polls and other pollsters that I know to be of value; the problem is that sometimes the ones I'm interested in simply are not available by the source I'm looking at because they haven't worked on that particular line of questioning or geographic area. So I'm curious if there is any sort of site, professional or casual, that links to all sorts of different polls so I can go to try to find what I'm looking for without visiting each pollster site.
Jon Cohen: Four main sites that people turn to for poll summaries are pollster.com, pollingreport.com, realclearpolitics.com and fivethirtyeight.com. All have advantages -- check them out.
Among the things to note, however, is that with few exceptions the focus is on the horse race -- often the least interesting part of a poll. Also, there's a wide range in poll quality, something that sometimes is overlooked.
Mountain View, Calif.: Is the pool of respondents the same every time for any given poll? Also, isn't the selected pool or the weighting already influenced by preconceptions about how a particular demographic or group of people is going to vote, leading to some sort of a self-fulfilling poll?
Jennifer Agiesta: Hi Mountain View. This may get a bit into the methodological weeds, but in short, each of our polls is conducted using a freshly drawn sample, meaning we do not talk to the same people each time we poll. Our weighting methods have nothing to do with voter preferences -- we weight to Census parameters for all adults by age, sex, race and education. From there, as Doris Day said, que sera, sera!
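For the methodologically curious, here is a toy illustration of that kind of weighting on a single variable (age group). The target shares below are made up for the example, not actual Census figures, and real weighting uses several variables at once:

```python
from collections import Counter

def cell_weights(sample_groups, target_shares):
    """Weight for each group = target share / sample share."""
    n = len(sample_groups)
    counts = Counter(sample_groups)
    return {g: target_shares[g] / (counts[g] / n) for g in counts}

# Hypothetical: suppose 18-34s are 30 percent of the population
# but only 20 percent of the completed interviews. Each young
# respondent is then up-weighted, older respondents down-weighted.
sample = ["18-34"] * 20 + ["35+"] * 80
targets = {"18-34": 0.30, "35+": 0.70}
w = cell_weights(sample, targets)
print(round(w["18-34"], 2), round(w["35+"], 2))  # 1.5 0.88
```

Note that nothing in the calculation refers to whom respondents support, which is why the weighting cannot steer the result toward a preconceived outcome.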
Burke, Va.: I guess this is an inherently unanswerable question, but I wonder whether people who refuse to respond to telephone polls trend towards one party or another?
Jon Cohen: While there are many unknown unknowns (pardon the paraphrase), the data indicate that our polls remain broadly representative of the adult population. The biennial National Election Study is conducted in person, so we also get data from that, which we'll be able to compare to phone polling, as well as the (mostly) in-person exit poll. Historically, the three are basically in sync in terms of partisan identification.
Montgomery Village, Md.: Some of the polls report from 6 percent to 12 percent of voters "undecided." After all these months, do you think there are that many people who have not yet made up their minds?
Jennifer Agiesta: Great question. There's actually a distinction to be made between those who say they are "undecided" and those who are actually uncommitted. In our latest national poll, for instance, just 4 percent said they did not yet support a candidate or weren't sure whom they would vote for, but we consider 20 percent of voters to be movable or uncommitted (i.e., those who don't yet support a candidate or who said they could change their minds between now and Election Day).
The "undecided" numbers often reported are usually the first type, made up only of those who do not name a candidate in the vote question. We tend to think variation in that result is more a function of polling technique, and that determining "uncommitted" voters is a more complex enterprise.
Jon Cohen: Thanks for all your questions; sorry we couldn't get to more. Have a great day.
Editor's Note: washingtonpost.com moderators retain editorial control over Discussions and choose the most relevant questions for guests and hosts; guests and hosts can decline to answer questions. washingtonpost.com is not responsible for any content posted by third parties.