Many news consumers know by now to take any single election-year poll with a grain of salt. Because of sampling variation and the vagaries of survey research, the best approach is to focus not on individual polls but on polling averages.
Our research suggests yet another reason not to overreact to news stories about the newest poll: Media outlets tend to cover the surveys with the most “newsworthy” results, which can distort the picture of where the race stands.
Why? Consider the incentives of the news business. News outlets cover polls because they fit the very definition of newsworthiness: they’re new and timely, they often generate conflict, and they allow political reporters to appear objective by simply telling readers and viewers what the public thinks. Horse-race stories are also popular.
Given that readers are drawn to drama and uncertainty, polls that offer intrigue or new developments — such as a close race or signs that one candidate is surging — are more likely to be deemed newsworthy. In particular, polls with unusual results may be more likely to make the news.
On the other hand, surveys that reveal stability or a lack of drama — such as one candidate maintaining a modest, steady lead — are less likely to get attention. Such judgments may lead news outlets to distort the true state of the race.
Though all this is well known, systematically observing this selection process is hard to do. The best way to study “gatekeeping” is to compare what gets on the news to all of the stories that could have made it on the news. And scholars have a hard time knowing all the possible stories that don’t make it past the gates, although some researchers have tried.
Here’s how we did our research
To get around that problem, we compared two data sets about polling during the 2008 presidential campaign. First, we collected the entire array of publicly available national presidential vote polls. Second, we analyzed poll coverage on the three broadcast networks (ABC, CBS, and NBC) and on CNN, Fox News, and MSNBC for the last five months of the campaign, from June 4 to Nov. 4. We did this by analyzing transcripts on these channels between 6 and 10 p.m., recording every mention of a poll as well as the margin it reported between Barack Obama and John McCain.
That meant we could compare all the polls that the TV networks could have reported on to the polls that they actually did report on.
Television news reported on “dramatic” polls more than the average poll
Here’s what we found. Television news was indeed more likely to report on polls that showed a tight race between Obama and McCain than on polls in which one candidate had a larger lead.
In the graph below, we show our data. The red line shows the shape of poll results, that is, the probability of a given result across all released polls. (For purposes of comparison, the margins have been standardized.) The polls were roughly normally distributed. Many showed a 0, essentially a tied race, and others showed either small leads for Obama (the positive numbers) or McCain (the negative numbers).
But as the blue line shows, TV news was more likely to report on polls in which the two candidates were close, those between +1 and -1. You can see that in the blue spike right around the 0 mark. In other words, television news viewers got a picture of the 2008 presidential race that was much closer than the polls as a whole typically showed it to be. The longer tails of the blue line also suggest that television news gave disproportionate air time to a few extreme, outlier polls.
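The comparison behind these graphs can be sketched in a few lines of code: standardize the poll margins, then compare how concentrated the covered polls are around zero relative to all released polls. The numbers below are illustrative placeholders, not the study’s actual data.

```python
import statistics

# Hypothetical poll margins (Obama minus McCain, in percentage points).
# These values are made up to illustrate the comparison, not taken from the study.
all_polls = [0, 1, 2, -1, 3, 5, 4, 0, -2, 6, 2, 1, 3, 0, -1, 4]
covered_polls = [0, 1, -1, 0, 0, 1, -1, 0, 9]  # mostly close races, one outlier

def standardize(margins):
    """Rescale margins to mean 0 and standard deviation 1, as in the graphs."""
    mean = statistics.mean(margins)
    sd = statistics.stdev(margins)
    return [(m - mean) / sd for m in margins]

def share_within(margins, lo=-1.0, hi=1.0):
    """Fraction of polls whose margin falls between lo and hi points."""
    return sum(lo <= m <= hi for m in margins) / len(margins)

# If coverage were representative, these two shares would be similar;
# a much larger share among covered polls indicates a tilt toward close races.
print(f"Share of all polls within +/-1 point: {share_within(all_polls):.2f}")
print(f"Share of covered polls within +/-1 point: {share_within(covered_polls):.2f}")
```

With real data, one would plot the two standardized distributions (the red and blue lines in the graphs) rather than a single summary share, but the underlying comparison is the same.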
TV news also spent more time reporting on the more dramatic polls
Not only that, but we also compared the polls that made it on the air with the amount of attention those polls received. Sure enough, television news spent more time reporting on surveys in which the race seemed close than was warranted by the actual polling.
In the graph below, the red line shows that among the polls that were reported on, Obama tended to hold a lead; that’s why the curve is shifted to the right of zero. Real Clear Politics’ (RCP) polling average had Obama in the lead on 144 of the 154 days in the period we examined, so that picture is fairly accurate.
But check out the blue line, which shows which of these polls got the most air time. That curve is clustered closer to the zero line. On average, viewers likely heard more about polls that portrayed the race as close than about polls that more accurately showed Obama in the lead.
Finally, TV news gave more air time to surveys that showed a big jump in one direction or the other. In other words, big changes in poll margins are more dramatic — and more likely to be discussed on air.
So for TV news consumers, the message remains the same: Keep your eye on those polling averages.
Kathleen Searles is an assistant professor of political communication in the Manship School of Mass Communication at Louisiana State University.
Martha Humphries Ginn is an associate professor of political science at Augusta University.
Jonathan Nickens is a Ph.D. student in political science at Louisiana State University.