
The EPA's 'cancer premium' shows how fear can trump real risks

The EPA proposal tries to respect those feelings. Under its "cancer premium," the VSL of lives saved from cancer would be 50 percent more than that of lives saved from other causes of death. So this premium would incorporate our greater fear of cancer into the analysis of whether regulations are worth the cost. That may seem pretty democratic: Cancer is scarier, and shouldn't our government protect us from the things we're more afraid of?

The danger of this approach, which is under review by the Economics Advisory Committee of the EPA's Science Advisory Board and open for public comment, gets back to the psychology of risk perception.

Our assessment of risk is not a matter of pure, conscious reason. It's a quick, subconscious, instinctive combination of facts and feelings, reason and gut reaction. This is neither smart nor dumb, rational nor irrational. It's just part of who we are, and despite its pitfalls, it's pretty good at keeping us safe.

But for all its success at getting us this far, a system that depends on affect and not just fact can also get us into trouble, because sometimes, we get risk wrong.

We're more afraid of some risks than the evidence indicates we need to be, or not as afraid as the evidence says we ought to be. Some people are excessively worried about vaccines, or fluoride (as a carcinogen), or radiation from nuclear power, but they aren't worried enough about the measles or whooping cough the vaccines prevent, or the tooth decay the fluoride staves off, or the tens of thousands of people killed every year by particulate pollution because we get more of our electricity from burning coal and oil than from splitting atoms.

And America has many more laws, and spends far more on research, as part of the war on cancer than it devotes to the fight against heart disease, which kills 50,000 more of us per year.

These perception gaps are risky in and of themselves.

If you as an individual choose to decline vaccination or fluoridation, or if you're more afraid of nuclear power than fossil fuel power, that's your concern. Your feelings about the risks might not match the facts, and your personal perception gap might cause you harm, but unless your perceptions and actions hurt me or society, they're your business alone.

But when we all fear similar risks because they have psychological characteristics that make them scary, and as a community we're more afraid of smaller risks and less afraid of bigger ones, our individual perception gaps become societal. Together, we push for government policies that protect us more from what we're afraid of than from what's more likely to kill us. Resources devoted to lesser risks aren't available to protect us from the bigger ones - meaning that our overall risk goes up.

The EPA itself recognized this in the late '80s in a study called "Unfinished Business: A Comparative Assessment of Environmental Problems." It found that the agency was spending too much to reduce some relatively small risks - hazardous-waste sites, underground fuel tanks and garbage dumps, all of which were hot topics in the news back then - and not enough on some bigger ones, such as radon, global warming (this was back in 1987!) and chemicals being dumped into rivers and coastal waters. The report said it directly: "EPA's priorities appear more closely aligned with public opinion than with our estimated risks."

This is just what the EPA's cancer premium proposal would enshrine as a matter of policy. It would give an advantage to regulations to control carcinogenic chemicals in the air, for instance, and disadvantage rules to control particulate air pollution, which contributes to cardiovascular deaths - which are far more common but, we think, less scary. As unpleasant as it may seem to argue against the cancer premium, it could increase the overall environmental death toll in America.

So how, in a democracy, is government supposed to deal with the risk of risk perception? How do we square the subjective way we perceive risk as individuals with the social and governmental goal of rationally using our communal resources in ways that will do the most good?

Here's a suggestion: We need to recognize that, just as there are physical risks that we study and try to manage, there are very real risks from the perception gap that also need to be recognized, studied and accounted for in policymaking. Getting risk wrong is risky.

We use tools such as toxicology and epidemiology and economics to identify and analyze how to deal with those physical threats. We should also use neuroscience and psychology and sociology and economics to recognize the dangers posed by our misperceptions and to analyze those threats the same way we analyze and manage any others.

That can help us handle the gap between the facts and our feelings about the facts.

David Ropeik is an instructor in the Harvard Extension Program, a consultant on risk perception and risk management, and author of "How Risky Is It, Really? Why Our Fears Don't Always Go with The Facts."


© 2011 The Washington Post Company