In May, CEOs from 13 Fortune 500 companies called on Congress and President Trump to craft public policies that will combat the economic and environmental effects of climate change. The call received considerable media attention, not only because the group included members from several large energy and automotive companies, but also because these groups have substantial political clout.
However, whether lawmakers feel compelled to respond depends, at least in part, on what ordinary Americans think about climate change. Depending on what Americans believe, legislators are likely to consider different policies for preventing or mitigating climate change’s risks. If Americans doubt that human activities have caused the climate to change, they may not care whether their legislators craft policies to respond to its effects.
So how many Americans believe that climate change is caused by human activities? The answer is surprisingly complicated.
Public opinion researchers come to very different conclusions about Americans’ beliefs about climate change
Public opinion researchers have been studying this topic since the late 1980s, so you’d think they would know what Americans believe. You’d be wrong. Some polls suggest that nearly two-thirds of Americans think that humans have changed the climate and are worried that it’s getting worse. Others find that percentage is just under or just over 50 percent.
Some recent polling data and academic studies suggest that most Republicans believe that humans are causing the climate to change but don’t really support policy measures to combat it. Others find that fewer than a quarter of Republicans believe in anthropogenic climate change.
Why are different polls and academic studies arriving at such different conclusions?
Minor differences in how researchers phrase the questions can lead to major differences in how people answer
We find that minor and seemingly arbitrary differences in the way public opinion researchers ask questions about climate change can have major effects on their findings.
For example, some surveys ask respondents to choose whether they think climate change is caused by human activity or natural causes (we call this a “discrete choice” question). Others ask respondents how much they agree or disagree with the idea that climate change is human-caused, using Likert scales ranging from “strongly agree” to “strongly disagree.”
Because surveys take time to administer and ask questions about topics that some might find uninteresting, some people simply agree with questions they are asked — a phenomenon known as acquiescence bias. Likert-style questions can therefore inflate the number of people who say they “agree,” no matter what the question is about.
And because many Americans do not consider climate change to be a high priority, some surveys explain what climate change is (“introductory text”) before asking respondents questions about it. On politically contentious issues, like climate change, the use of introductory text like this has been shown to increase partisan disagreement.
How we did our research
In a paper recently published in Climatic Change, we investigated how minor changes in question format like those listed above might influence results. To do this, we surveyed more than 7,000 Americans online via Lucid’s Fulcrum Academic service.
Lucid invited people to participate in our study from a massive opt-in panel of potential respondents, using quota sampling. This means that respondents were sampled proportional to known census benchmarks on a variety of demographic factors, including age, race, sex, educational attainment and income. We then weighted the data to account for any remaining deviations between our sample and nationally representative demographic benchmarks. Although our data are not formally nationally representative, past research has shown that the composition of Lucid samples tends to closely match nationally representative survey data.
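To give a sense of what survey weighting involves, here is a minimal sketch of post-stratification raking (iterative proportional fitting), one common way a sample can be weighted toward census benchmarks. This is purely illustrative and is not the authors’ actual procedure; the category shares and the single “education” factor below are invented for the example.

```python
import numpy as np

def rake(sample, targets, n_iter=50):
    """Compute raking weights.

    sample:  dict mapping factor name -> array of category labels,
             one per respondent.
    targets: dict mapping factor name -> {category: population share}.
    Returns one weight per respondent, normalized to mean 1.
    """
    n = len(next(iter(sample.values())))
    w = np.ones(n)
    for _ in range(n_iter):
        # Adjust weights toward each factor's benchmark in turn.
        for factor, shares in targets.items():
            labels = np.asarray(sample[factor])
            for cat, share in shares.items():
                mask = labels == cat
                current = w[mask].sum() / w.sum()
                if current > 0:
                    w[mask] *= share / current
    return w / w.mean()

# Toy example: a sample that over-represents college graduates
# (70 percent) relative to a hypothetical 35 percent benchmark.
sample = {"educ": ["college"] * 70 + ["no_college"] * 30}
targets = {"educ": {"college": 0.35, "no_college": 0.65}}
w = rake(sample, targets)
```

After raking, the weighted share of college graduates matches the 35 percent target, so weighted estimates behave as if the sample had the benchmark composition.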
After recruiting respondents to participate in the study, we then randomly assigned them to answer one of eight questions about anthropogenic climate change. We varied these questions in three different ways to see whether the way they were posed might change the answers.
First, the questions used either a discrete-choice response format or an agree-disagree Likert scale. In the first, respondents were asked, “which of these three statements about the Earth’s temperature comes closest to your view” and could choose from the following options: “the Earth is getting warmer mostly because of human activity, such as burning fossil fuels”; “the Earth is getting warmer mostly because of natural patterns”; or “there is no solid evidence that the Earth is getting warmer.” In the second, respondents were asked to tell us how much they agreed or disagreed with the statement “The Earth is getting warmer mostly because of human activity, such as burning fossil fuels,” marking their answers on a seven-point scale, with 1 being “strongly agree” and 7 being “strongly disagree.”
We also randomly varied whether respondents were expressly told that they could say “don’t know,” and whether respondents read some introductory text about climate change before we asked them questions.
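The design above is a 2 × 2 × 2 experiment: response format, an explicit “don’t know” option, and introductory text, each randomized independently, yielding eight question versions. A minimal sketch of that kind of random assignment (not the authors’ actual code; factor names are ours) might look like this:

```python
import random
from collections import Counter

random.seed(42)  # for reproducibility of this illustration

# The three randomized design factors, crossed to give 8 conditions.
FORMATS = ["discrete_choice", "likert"]
DONT_KNOW = [True, False]
INTRO_TEXT = [True, False]

def assign_condition(rng=random):
    """Independently randomize each of the three design factors."""
    return {
        "format": rng.choice(FORMATS),
        "explicit_dont_know": rng.choice(DONT_KNOW),
        "intro_text": rng.choice(INTRO_TEXT),
    }

# Assign 7,000 simulated respondents and tally the conditions.
counts = Counter(
    tuple(sorted(assign_condition().items())) for _ in range(7000)
)
print(len(counts))  # 8 distinct conditions
```

Because each factor is randomized independently, roughly one-eighth of respondents land in each cell, letting the researchers attribute differences in answers to question format rather than to who was asked.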
These seemingly trivial variations produced dramatically different results. For example, our discrete choice questions suggested that only a slim majority of the American public, and less than a third of self-identified Republicans, believe in anthropogenic climate change. In contrast, our Likert-style format found that belief was held by more than two-thirds of the general public, and by more than 60 percent of self-identified Republicans.
Our research cannot say for certain what Americans actually believe. Instead, it shows that researchers, pollsters and reporters must carefully consider what conclusions they draw from opinion polls about climate change. It’s important to compare any polling results with the results from similarly structured questions — not questions that are structured differently.
What should public opinion researchers do now?
We hope that the academic community and survey researchers will work to standardize how they ask questions about climate change. This is important for two reasons.
First, a consensus on question design would help rule out question wording as the reason polls reach such different results. That, in turn, would help us better study whether changes in media coverage are responsible for changes in belief about climate change over time.
Second, without standardization, polls send mixed signals to policymakers. Uncertainty could make it difficult for lawmakers to understand whether citizens want them to pursue policies to mitigate climate change.
For now, we encourage those who study or share climate change opinion to openly discuss their survey design choices. Given the policy implications, accurately measuring opinion on climate change is more important than ever.
Matt Motta (@matt_motta) is a science of science communication postdoctoral fellow at the Annenberg Public Policy Center at the University of Pennsylvania and is based at the Yale Law School.
Dan Chapman is a science of science communication postdoctoral fellow at the Annenberg Public Policy Center at the University of Pennsylvania and is based at the Yale Law School.
Dominik Stecula (@decustecu) is a science of science communication postdoctoral fellow at the Annenberg Public Policy Center at the University of Pennsylvania and a nonresident postdoctoral fellow at the Centre for Public Opinion and Representation at Simon Fraser University.
Kathryn Haglin is a science of science communication postdoctoral fellow at the Annenberg Public Policy Center at the University of Pennsylvania and is based at the Yale Law School.