If you are asked to answer a 15-minute survey about your health habits, what format would you prefer? A phone call to your land line? A cellphone call? A questionnaire in the mail? An e-mailed one? Or maybe a survey you can access through a smartphone app?
Inquiring research minds want to know.
Surveys may have a bad rap as sneaky ways to sell consumers more stuff or find out more about them. But health surveys have a higher goal: They might ask whether you smoke, whether you have been tested for HIV and whether you got a flu shot this year, because the answers are crucial for government agencies, research scientists and others who assess health issues and make policy decisions. The information is often used to determine funding and initiate programs that might improve lives and extend life spans.
For example, in New Mexico, health experts cited data from the Behavioral Risk Factor Surveillance System (BRFSS) showing that colorectal-cancer screening rates were significantly better in states with mandatory insurance coverage for the testing. That prompted New Mexico’s legislature to pass a law requiring that insurers cover colorectal screening for residents 50 and older.
In the District, BRFSS data have been used to compile a comprehensive overview of how often people get themselves screened for breast cancer and cervical cancer, and to provide baseline data for the city’s diabetes program. (This information was used to create consumer fact sheets and to aid in applying for grants.)
BRFSS, a national survey coordinated annually by the Centers for Disease Control and Prevention, has long been the model to follow for health surveys. Traditionally done by random-digit dialing of land-line telephone numbers, it got responses from more than 350,000 adults in 2010. But growing privacy concerns and changes in communications technology have caused a drop in response rates for the BRFSS and other surveys. And health researchers are desperate to adapt to the new reality.
“Telecommunication is changing around us, and we have to adapt as fast as we can in order to stay in business,” says Ali Mokdad, a former director of the BRFSS who is now a professor of global health at the Institute for Health Metrics and Evaluation at the University of Washington.
Thirty percent of U.S. households now have no land line and rely on cellphones, according to the National Center for Health Statistics. That’s up 14 percentage points from three years earlier. So the BRFSS began calling cellphone numbers in 2009.
But adding cellphone numbers, which brings in cellphone-only users, raises the question of how representative the survey sample will be.
Research published in 2009 by Stephen Blumberg, a senior scientist at the NCHS, found that cellphone-only users were more likely than land-line-only and land-line-plus-cellphone users to be male, Hispanic and young, with incomes below the poverty level. In addition, they were more likely than others to binge-drink and smoke, to have been tested for HIV and to have no health insurance. As a result of these differences, simply adding cellphone users to the survey can skew results.
Blumberg says that “even when factors such as age and income are accounted for, there are still differences between the two groups.” Those differences mean the new data will be hard to compare with the old. But not calling cellphones also exacts a price: The data collected only from land lines skew the picture of the nation’s health by excluding a growing group.
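To see how statisticians can correct for that kind of skew, here is a minimal sketch, in Python, of post-stratification weighting, a standard survey technique; the age groups, respondent counts and population shares below are hypothetical, invented for illustration, and are not figures from the article or from Blumberg’s research.

```python
# Minimal sketch of post-stratification weighting: keeping a group that
# is overrepresented in the sample (here, a hypothetical "under 35"
# group, common among cellphone-only users) from skewing the results.

# Hypothetical respondent counts by age group (not real survey data).
sample_counts = {"under_35": 400, "35_and_over": 600}

# Hypothetical population shares, e.g. drawn from census figures.
population_share = {"under_35": 0.30, "35_and_over": 0.70}

total = sum(sample_counts.values())

# Each respondent's weight = population share / sample share, so that
# weighted tallies reflect the population rather than the raw sample.
weights = {
    group: population_share[group] / (sample_counts[group] / total)
    for group in sample_counts
}

# Under-35 respondents make up 40 percent of this sample but only
# 30 percent of the population, so each one counts as 0.75 of a person.
```

Real surveys such as the BRFSS weight on several variables at once, but the principle is the same: answers from overrepresented groups count for less, and answers from underrepresented groups count for more.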
And adding cellphones poses other challenges as well: No published lists of cellphone numbers exist, so interviewers dial random numbers. That’s the same method used for calling land lines, but cellphone users behave differently: Some use their phones mainly for texting and are less likely than land-line users to answer calls; others simply ignore calls from unknown numbers. These missed connections increase the time it takes to reach a respondent, thus raising the cost of the study.
Cellphone surveys also cost more because many research organizations reimburse subjects, with gift cards or cash, for the incoming minutes used, in order to boost response rates. Mokdad estimates that it costs about $70 to get a complete survey response from a land-line user; that cost doubles with a cellphone user.
Doubts had been raised about the value of including cellphone responses in surveys, the thought being that respondents might be more distracted by noise or poor call quality than people who answer land lines. But a recent doctoral dissertation found no strong evidence that the answers from cellphone respondents are inferior.
Mokdad and some other researchers think that in these days of heightened concerns about losing privacy and disclosing personal information, people are more likely to respond, and respond fully and accurately, if they can choose how to take a survey. “Perhaps even responding on Facebook could be an option down the road,” says Mokdad.
New databases that link phone numbers to addresses also let surveys use multiple modes of contact.
Michael Link, a former senior researcher at the BRFSS who is now head of survey methodology at the polling firm Nielsen, says that in 50 to 60 percent of cases, researchers are able to match land-line phone numbers to addresses. That provides more than one way to reach a potential respondent, which can increase the response rate: A researcher might send a postcard to an address asking that an adult household member call the phone number listed; if there’s no response to the postcard, the researcher can follow up with a phone call.
Link says that telephone interviews are still preferred over, say, written questionnaires, because phone interviewers use computerized forms that prompt additional questions based on the responses, a process not easily done with a paper format. But researchers say they can adapt to any format.
In January, Mokdad’s group at the University of Washington began work on a survey for the National Institutes of Health on health-care-access disparities and chronic conditions. Researchers sent postcards asking people to go to a Web site to choose how they would like to answer survey questions — by mail, e-mail, phone or in-person interview. Mokdad says that 10 percent of the postcard recipients took the survey, which his team considered a strong response.
But Mokdad points out that in this case the research team was starting from scratch, and so could design the survey in a variety of ways. Surveys that already have an infrastructure, such as the BRFSS phone banks, will be able to tinker, such as by adding follow-up methods, but won’t be able to make drastic changes for years, largely because the funding doesn’t exist to make broad changes all at once.
One technology that is being studied for survey use is smartphone apps.
In a small unpublished study, Trent Buskirk, an associate professor of biostatistics at the St. Louis University School of Public Health, found that those responding to surveys via a smartphone app were more likely than respondents in other formats to answer all the questions. Incomplete responses are a problem in many surveys.
Nielsen’s Link says that survey firms and government groups are recognizing that they are probably facing lower response rates for many pivotal studies.
“People are inundated by requests for information on the many platforms — phone, e-mail, social networking — they use to communicate, and that’s also lowering the response rate for even crucial government-sponsored surveys. So we are looking to have a representative sample of survey participants instead. . . . That way, if we’re going to fewer people, we are improving the chances they are a more representative sample.
“A 60 to 70 percent [response] rate was a good goal,” says Link. “That’s becoming less and less a key indicator. Instead, it’s ‘Did I have the right mix?’ ”
Kritz is a freelance writer.