But wait. Aren’t they saying the same thing about us? Take a controversial food issue — GMOs, organics, pesticides, pollution — and there are decent people on each side who believe the blindness and prejudice are all on the other.
Years ago, I read Jonathan Haidt’s book “The Righteous Mind: Why Good People Are Divided by Politics and Religion” (and food). It’s a cogent and persuasive account of just how crappy we humans — all of us — are at evidence-based reasoning. Like Daniel Kahneman’s “Thinking, Fast and Slow,” it makes the case that we lead with our guts. Our decision-making, while feeling evidence-based to us, is really value-based. We’re driven by our tribes, our affiliations and our instincts, and we see evidence through that prism. Confirmation bias rules the human psyche.
That book scared the bejeezus out of me because my job — like that of every other science journalist — is to sort through the body of evidence to find what’s most likely to be true. Yet my brain — like that of every other human — is optimized to focus on evidence that supports my prior beliefs and dismiss evidence that doesn’t. Yikes.
That book changed my approach to journalism. My top priority became finding ways to identify, and compensate for, my own biases. Like everyone, I see other people’s ideological rabbit holes with ease and clarity, but my own are maddeningly elusive. In the process of trying to do my job better, a funny thing happened. As I met farmers, and consumers, and scientists, and policymakers, and even industry bigwigs, some of my positions softened. I became more sympathetic to people who disagree with me, even those who have views that I believe are not supported by evidence. My relationship with certainty changed.
There’s not a robust body of evidence on how to go about finding your own biases, so I’ve basically been winging it. After years of that, I’ve concluded that the best antidote for bias is also Sartre’s conception of hell: other people.
When I ask scientists how they protect against their own bias, the most frequent answer is that they read a lot. The body of evidence. The meta-analyses. The most authoritative voices in the field. And I do that too, but I don’t think it works very well. We’re just too good at confirmation bias to read ourselves out of our ideological corner. If you go into it believing pesticide residues are deadly — or not — you’re likely to come out of it believing pesticide residues are deadly, or not.
But face-to-face contact is different. “You realize the humanity,” says Dominique Brossard, chair of the department of life sciences communication at the University of Wisconsin at Madison. “It reminds you that those of us who don’t agree are actually alike in so many ways. They’re real human beings.”
I asked both Brossard and her Wisconsin colleague Dietram Scheufele how they challenge their own biases. Brossard talked about a life that’s taken her from Argentina to Nicaragua to Ethiopia to France to Wisconsin (with a few other places in between). It meant she was always in rooms where she didn’t quite fit in, and it made her comfortable with different points of view. For Scheufele, it’s teaching undergraduates. “I’m at a university that’s fairly liberal,” he told me. “I’m surrounded by faculty that’s liberal. But an undergrad class has a diversity of opinion. I get a lot of pushback from students, viewpoints I wouldn’t have seen or anticipated.”
Ratcheting down partisanship starts with our own, and that effort starts by being convinced that humans really don’t assess evidence dispassionately. Passions, after all, are those things reason is slave to. If you don’t believe it, read Haidt or Kahneman or the vast body of interesting empirical work of the Cultural Cognition Project by Yale’s Dan Kahan. While there are controversial issues galore out there, there’s very little dissent over the idea that humans are value-based, rather than evidence-based, reasoners.
Once you’ve arrived at that uncomfortable truth, then what?
Try spending time with smart, thoughtful people who disagree with you. And disagree with each other. An argument on paper is too easy to dismiss. An argument on social media is even easier. An argument from a living, breathing human whom you have reason to respect is much harder.
And so, here are the tactics I use to try to stay out of my own ideological rabbit holes:
Find the smartest person who disagrees with me, and listen.
Make the strongest case I can for the position I don’t hold. It’s easy to cut down bad arguments.
Try to understand what motivates views that don’t jibe with the preponderance of the evidence. People believe vaccines aren’t safe because they’re suspicious of the pharma industry and they’re trying to protect their children. People believe GMOs aren’t safe because they are suspicious of the ag industry and trying to opt out of industrialized farming. People believe in young-earth creationism because it bonds them with their community. I am sympathetic to all those positions, although I believe vaccines are essential to protecting public health, GMOs are perfectly safe to eat, and human evolution can be traced back to the primordial ooze.
Keep my social media populated with those same smart people who disagree with me. And not just to keep tabs on the enemy, but to try to really understand their position.
Ideological bias isn’t the only kind. Because I am paid both to write and to speak, there’s funding bias as well, and the protection is similar: other people and multiple points of view. I’ve worked with media ethicists to craft guidelines designed to prevent conflicts of interest, and I disclose everything I do. But whether or not money changes hands, I try to spend time in rooms filled with as many different perspectives as possible. Nothing entrenches a bias like room after room of people who agree with you.
Talking to someone who disagrees with you is different from talking to your comrade in arms. You throttle back. Your tone changes. You soften. “Basket of deplorables,” Scheufele says. “Would Clinton have said that in a room full of conservatives?” Um, probably not.
Don’t get me wrong. I’m still fully capable of being snarky on Twitter or digging my heels in on a position. But I’ve also started to use mind-changing as a metric of success for my efforts to make myself more persuadable. It doesn’t happen often, even though I go out in the world actively looking to change my mind, but I have changed my positions on several issues. I used to oppose subsidizing fruits and vegetables because we can’t peg health improvements to the intervention, but I have since decided that those programs do much good even if we don’t (yet) have solid evidence that they improve public health. I changed my mind on whether we need stronger oversight of dietary supplements. (I used to think we did.) And there are a couple of others.
My biggest about-face, though, was about the nature of human decision-making: I used to believe I was a reliably rational arbiter of evidence, and it’s now clear to me that I’m not. My efforts to make myself more persuadable are my best shot at compensating.
But in making myself more persuadable I hope to also make myself more persuasive. Ultimately, my work is still about fact-finding, but facts don’t persuade people. People persuade people, and if I’m less dug-in, maybe other people will be, too, and we can actually talk to each other. And talking to each other is where partisanship goes to die.