Your risk of contracting covid-19, the disease caused by the coronavirus, depends on a number of variables: where you live, how old you are, whether you work indoors or out, whether you have high blood pressure or other so-called co-morbidities, how many people in your community have been tested, and so on. But ultimately, the difficulty of grasping the threat of this virus may have less to do with data and more to do with fundamentals: Humans are simply not that good at judging risk.
Howard Kunreuther, co-director of the Wharton Risk Management and Decision Processes Center at the University of Pennsylvania, studies why people tend to be poor judges of whether they will experience a natural disaster, and what that means for policy. In the 2017 book “The Ostrich Paradox: Why We Underprepare for Disasters,” which he co-wrote, he explains that people often don’t like to think too far in the future; they also misunderstand threats and are influenced by those around them. Kunreuther sees parallels between the way people downplay the threat of natural disasters and the way they dismiss the threat of the coronavirus.
According to scientists, risk assessment involves two basic types of thinking: intuitive and deliberative. The intuitive kind is “thinking without thinking,” explains Ralf Schmälzle, a biological psychologist at Michigan State University, and “is rooted in the evolutionary need to survive.” By contrast, deliberative thinking, he says, is “mostly conscious” and “effortful ... whereby one reflects about reasons and weighs available evidence, perhaps comparable to comparing options in a chess game and deciding which move is best.”
Intuitive thinking is why, for example, toddlers can be such picky eaters, tending to avoid things with a bitter taste, and why we tend to be put off by bad smells such as rotting food. But there are threats that we haven’t evolved to perceive, and need to learn about, which is why your kid won’t eat broccoli but will happily toddle toward an electrical outlet, bobby pin in hand.
In the instance of a threat like the coronavirus, information without feeling is largely ineffective. “Knowledge alone is not enough to motivate,” explains Schmälzle. So, even though we hear that hundreds of thousands have died of covid-19, that risk may feel distant if we don’t know anyone who has had it. Our unwillingness to change our minds based on information alone can lead in the opposite direction, too: Many people remain fearful of flying, even though statistically it’s incredibly safe.
Risk perception is also highly individual. In 2013, Schmälzle and a team from the University of Konstanz in Germany published a study that measured the brain activity of people who had watched a documentary about the dangers of the H1N1 virus. Before watching the documentary, the participants were asked how seriously they took that threat. Those who gave the most extreme answers had their brain activity measured with magnetic resonance imaging. In those who said they perceived the risk of H1N1 to be high, the part of the brain that reacts to threat stimuli lit up when they watched parts of the documentary that highlighted the dangers of contracting it. By contrast, those who said they didn’t think the virus was much of a threat did not show the same brain activity. The results, Schmälzle says, “provided a glimpse into the neural underpinnings of something that we already knew, namely that risk is inherently subjective.”
Our judgments are also shaped by those around us. “The communities that we are a part of play a huge role in how we interpret information. Oftentimes, the things that we believe connect us to certain communities,” says Meghan Moran, an associate professor of health, behavior and society at the Johns Hopkins Bloomberg School of Public Health. “We are more likely to believe things that fit in with our existing worldview or value system, so information that does that becomes more appealing.” This can contribute to a belief in conspiracy theories or a disdain for expert advice.
Scientists can do only so much to persuade us, and it doesn’t help that they’re frequently hesitant to speak in definitive terms. Schmälzle, for instance, is excited by the findings from his 2013 study but cautions against reading too much into them because of the small sample size and other limitations. This is how evidence-based science works. With a new threat like covid-19, experts’ understanding is rapidly evolving, and so are their recommendations. Take masks. In February, before widespread lockdowns, the Centers for Disease Control and Prevention said healthy people didn’t need to wear them in public. By April, as multiple studies began to show people can spread the virus without knowing they have it, the CDC reversed that recommendation, angering some and confusing many.
“In general, folks who are not scientists are seeking certainties, so they want to know what’s going to happen, and it will always be that way,” Moran says. “I think we get into trouble when we rely on terms and phrases that are a little more ambiguous than what the public is comfortable with hearing. It can be misinterpreted that scientists don’t know what they are doing. It can result in a lack of trust when it’s not conveyed in a way that’s clear to the public.”
So, what’s the solution? There are no magic bullets. But until there is a vaccine, perhaps recognizing our inherent cognitive limitations is one more way to protect ourselves.
Lia Kvatum is a writer in Maryland.