In the past few days, the climate change debate has reverted to a familiar mode: Follow the money. The New York Times and many other outlets reported on climate-doubting researcher Wei-Hock (“Willie”) Soon and his apparent receipt of energy industry monies to support his research programs. Soon, who has not responded to comment requests from The Washington Post, has often argued that the sun is the real factor driving climate change — a position that is rejected by the U.N.’s Intergovernmental Panel on Climate Change, the world’s leading authority on climate science.
The Soon story prompted much outrage — and now, it has also prompted another attempt to follow the money. On Tuesday, House Democrats on the Committee on Natural Resources sent letters inquiring after the funding sources of several other researchers who have challenged various aspects of climate science — requests that, some suggest, may go too far.
Funding sources can certainly have a significant influence on scientific results; there is ample research to support this idea, which is why disclosure is so important. Nonetheless, funding alone may not fully explain the odd but persistent phenomenon we witness in the climate debate: politicized scientific disagreement.
On the one hand, the vast majority of climate scientists accept the idea that humans are causing global warming — and a large number of predominantly liberal and centrist people agree with them and think they're the real, credible "experts." On the other hand, a small minority of scientists that includes Soon — often allied with politically conservative think tanks — disagrees. But plenty of (predominantly conservative) people think that members of this smaller group are the real "experts" — the Galileos, as they might put it, bravely standing up to an oppressive scientific majority.
This is a psychologically interesting situation, to say the least. Aliens who arrived on Earth and examined the climate debate might infer that human beings don’t identify scientific experts primarily based on their knowledge or their credentials, but rather based upon the actual positions they hold — and how congenial those positions are to their own ideological or religious identities.
Those aliens would be on to something.
To understand why, you first need to understand the theory of motivated reasoning, which explains many mysteries about humans — in particular, how they cling to personally important beliefs in the face of contrary evidence and come up with intricate and sophisticated arguments to explain why those beliefs are right.
The theory posits that because certain views are so important to our identities, our senses of self and our tribal or group affiliations (e.g., with the Democratic or Republican Party), our brains emotionally tag new, incoming information based on how it fits into a broader, preexisting worldview. Much of this happens unconsciously; we aren’t even aware of it. But because it happens, we are primed to feel suspicious about — or immediately reject — new information that challenges existing beliefs, and also to welcome new information that favors those beliefs.
This spills directly over into arguments about science, but with a twist. The structure of science, you see, is such that if you want to use it to build arguments that make you feel validated, then you need to cite studies, reports and experts. This means that out of the huge volume of information being churned out on the subject of, say, climate change, you must pick and choose your studies and your experts.
And how do you do so? The answer is obvious. You do so based on your preexisting beliefs and commitments.
Thus, in one 2011 study, Yale researcher Dan Kahan and two colleagues examined what they called the “cultural cognition of scientific consensus,” or how our in-group affiliations help determine whom we believe to be a scientific expert. Research subjects were asked to identify whether a fictional scientist was a “knowledgeable and trustworthy expert” after being shown some credentials and an excerpt from a book the expert had supposedly written. The excerpt took a pro or con stand on one of three issues — global warming, nuclear waste disposal or the carrying of concealed firearms.
The study found that political ideology was a very strong predictor of whom subjects considered to be a scientific expert on these issues. For example, 89 percent of liberal-leaning "egalitarian communitarians," but only 23 percent of conservative-leaning "hierarchical individualists," agreed that a scientist who described global warming as real and caused by humans was a "trustworthy and knowledgeable expert."
So we self-select and validate the experts who agree with us and who make us feel good. But what’s in it for the experts themselves?
If you believe the "follow the money" narrative, then the answer is, well, a paycheck. But the motivated reasoning interpretation sees things rather differently. Without denying that funding sources can influence how we see the world, it further posits that a) the experts are often ideological, too; b) they are very unwilling to admit errors in their past work and past positions; and c) they are just as susceptible as the rest of us to "myside biases" in how they interpret new information.
Thus, conservatives who are trained in science or engineering may be more inclined not only to credit scientific explanations that challenge climate change, but also to seek out the arguments of other sympathetic experts who agree with them.
Meanwhile, funders, too, seek out viewpoints they agree with and are more inclined to give them financial support.
Indeed, it may even be the case that experts are capable of being more biased than those with less expert knowledge, at least up to a point. Thus, numerous studies have shown that as people become more scientifically and mathematically adept, politics has more of a polarizing influence on how they interpret the science of climate change.
However, this may not be true at the highest echelons of science, where research has shown that those who have published “most actively” in the field of climate change research overwhelmingly agree with the mainstream scientific consensus.
Nonetheless, motivated reasoning helps us understand how we can have a group of scientists who repeatedly challenge the scientific consensus on global warming, who come up with very sophisticated and technical arguments to explain why that consensus is wrong and who are not swayed when scientists who support the consensus refute their arguments. And why these doubting scientists are such heroes to those who, for political reasons, don’t want to see action on climate change.
This analysis could lead to a position of radical subjectivity: Everybody’s biased, even the experts, and there’s no truth out there to be discerned. Not by flawed humans, anyway.
But that would be a mistake. Rather, the more reasonable position to take in light of the theory of motivated reasoning is that we should trust the scientific community as a whole but not necessarily any individual scientist. Individual scientists will have many biases, to be sure. But the community of scientists contains many voices, many different agendas, many different skill sets. The more diverse, the better.
Thus, in a large community of researchers, where criticism is encouraged, biases offset and check each other — in theory. It may not always work out perfectly, but it’s the best system we have in an imperfect world. And that’s why the scientific process must be respected and why scientific consensus matters and should be taken very seriously — on climate change, or anything else.
In a sense, scientific consensus is simply what is left standing after all the biases run their course.
In the end, then, the problem isn’t what any individual researcher believes, or any individual’s biases. Rather, it’s a political and media structure in which people are encouraged to argue about science by citing their own facts and their own experts — and where scientific consensus is often disregarded or attacked and not given the weight it deserves.
We’re biased, yes, but it’s up to us how much we let those biases shape the world.