John Sides is an associate professor of government at George Washington University and a contributor to Behind the Numbers.

How often do you listen to the radio on a typical day? Would you say never, less than two hours, between two and four hours, or more than four hours?

That seems like a simple question, and public opinion surveys are full of similar questions — attempts to measure exposure to radio, television, newspapers and the Internet. But do these questions elicit accurate answers?

Quite often they do not, at least not with much precision. And this raises big problems for those seeking to understand the relationship between media usage and political views.

Political scientists Lynn Vavreck and Michael LaCour investigated radio usage with a novel technology developed by a company called IMMI (which is now part of Arbitron). IMMI provided participants with small cellphones that also functioned as measurement devices. For 10 out of every 30 seconds, the device digitally encoded the sounds in the person’s environment. The digital encoding was then matched to the radio and television programming in the participant’s media market to identify which programs, if any, they had listened to or watched. (Note that this is not an audio recording, and thus it cannot be “listened to” or “played back.”)
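In broad strokes, that matching step works like content fingerprinting: encoded snippets from the phone are looked up against fingerprints computed from the known broadcast streams in the market. The sketch below is only a toy illustration of that general idea, written in Python with made-up fingerprint values; it is not IMMI’s actual algorithm, which the article does not describe.

```python
# Toy illustration of fingerprint matching (hypothetical, not IMMI's method):
# each known broadcast segment has a fingerprint; a snippet captured by the
# phone is matched by looking it up in that reference index.

from typing import Optional


def build_reference_index(broadcast_segments: dict[str, list[int]]) -> dict[int, str]:
    """Map each segment fingerprint to the program it came from.

    broadcast_segments: program name -> fingerprints of its segments.
    """
    index = {}
    for program, fingerprints in broadcast_segments.items():
        for fp in fingerprints:
            index[fp] = program
    return index


def match_snippet(snippet_fingerprint: int, index: dict[int, str]) -> Optional[str]:
    """Return the program whose segment matches the captured snippet, if any."""
    return index.get(snippet_fingerprint)


# Hypothetical usage: the integers stand in for real fingerprints.
index = build_reference_index({"Morning Drive": [101, 102], "Evening News": [201]})
print(match_snippet(102, index))  # -> "Morning Drive"
print(match_snippet(999, index))  # -> None (no radio or TV match in the market)
```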

Vavreck and LaCour then compared how much respondents said they listened to the radio with how much they actually did. Here’s their graph, which separates participants by their response to the survey question and then divides each group into three categories: those who accurately reported, overestimated or underestimated their radio listening.

Most people who gave lower estimates were fairly accurate. For example, just about everyone who said “less than two hours” really did listen to less than two hours of radio that week (although a few listened to three or even four hours). But most of the people who gave higher estimates (between two and four hours, or more than four hours) actually listened to much less radio. They overestimated their exposure.
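To make that accurate/overestimated/underestimated grouping concrete, here is a minimal sketch, in Python, of how one might classify respondents by comparing a self-reported category with metered hours. The cutoffs follow the survey wording quoted above; the function and variable names are illustrative, not taken from the study.

```python
# Classify a respondent as accurate, overestimating, or underestimating,
# given the survey answer and the metered listening (hours per day).
# Illustrative only; the cutoffs mirror the survey categories quoted above.

CATEGORY_BOUNDS = {
    "never": (0.0, 0.0),
    "less than two hours": (0.0, 2.0),
    "between two and four hours": (2.0, 4.0),
    "more than four hours": (4.0, float("inf")),
}


def classify(reported_category: str, measured_hours: float) -> str:
    low, high = CATEGORY_BOUNDS[reported_category]
    if low <= measured_hours <= high:
        return "accurate"
    # Less actual listening than claimed means the claim was an overestimate.
    return "overestimated" if measured_hours < low else "underestimated"


# Example: someone who said "more than four hours" but was measured at 1.5 hours.
print(classify("more than four hours", 1.5))  # -> "overestimated"
```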

For each category, Vavreck and LaCour also present a “box-and-whisker” plot. The dark bar in the middle of each box represents the median value for that category. If you compare those who say they never listen to the radio with those who say they listen for four or more hours every day, you will see that the median values are actually less than one hour apart. This particular survey item thus exaggerates the variation in how much people actually listen to the radio.
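A box-and-whisker plot of this kind is straightforward to reproduce in principle: group the metered hours by survey answer and let a plotting library draw the box (the interquartile range), the median bar, and the whiskers. The sketch below uses matplotlib with placeholder numbers purely to show the mechanics; it does not reproduce Vavreck and LaCour’s data.

```python
import statistics
import matplotlib.pyplot as plt

# Placeholder values for illustration only; not the study's data.
hours_by_answer = {
    "never": [0.0, 0.1, 0.3, 0.6],
    "less than two hours": [0.4, 0.9, 1.3, 1.6],
    "between two and four hours": [0.5, 1.0, 1.7, 2.6],
    "more than four hours": [0.6, 1.2, 2.1, 3.2],
}

# The dark bar in each box is the group median.
for answer, hours in hours_by_answer.items():
    print(f"{answer}: median = {statistics.median(hours):.1f} hours")

# One box per survey answer: box = interquartile range, bar = median, whiskers = range.
plt.boxplot(list(hours_by_answer.values()))
plt.xticks(range(1, len(hours_by_answer) + 1), list(hours_by_answer.keys()), rotation=20)
plt.ylabel("Metered listening (hours per day)")
plt.tight_layout()
plt.show()
```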

Something similar happens when people estimate their television exposure. Political scientist Markus Prior has found that, if surveys were to be believed, the audience for the nightly network news would be 300 percent larger than it actually is (that is, roughly four times its measured size). Note that this is not because people are deliberately fudging the truth; Prior has also found that people are simply making honest mistakes. We just aren’t very good at remembering these sorts of details.

Between now and November 2012, we will see all sorts of estimates of the impact of political ads, presidential debates, and so on. It’s natural to want answers. But it’s not always clear how we should interpret the numbers. Obviously, analysts will want to compare people who say they watched or read or heard some media content with those who say they did not, and thereby infer whether that content affected their opinions. But the challenge of precisely measuring media consumption can make those inferences tenuous at best.
