In an emergency, our future may hang on a few words, provided by someone in authority, as we face a fateful decision. The clock might be ticking quickly, as when a severe storm approaches, or slowly, as when we ponder treatment for a severe illness.
Good information doesn't make the world better, but it does tell us what we're up against. The back-to-back calamities on the Gulf Coast have shown what can happen when people have bad or incomplete information to act upon. Sadly, many of the problems there were predictable, given how official communications were managed and how rampant rumors became. Preparations could have been a lot better had state, local and federal authorities spoken with a single voice and made sure that their words were being understood.
There are lots of things that might be said in advance about any looming risk, whether immediate or long-term. Far from the action, it's interesting to think about how oceanic loop currents may (or may not) affect hurricane intensity or how tighter airport security does (or does not) reduce the odds of a terrorist attack or how dietary fat may (or may not) affect cancer risk. But when we need to make a quick decision, what we really need are the most critical facts, such as:
* Will there be enough gas available on the evacuation route?
* Is the third rail still active when the lights are out in the Metro?
* Is vaccination at all effective after exposure to smallpox?
* How dangerous is the fallout from a dirty bomb?
* Can I trust the schools to protect my children, while I ride out the crisis at work?
* Just how bad are the side effects of those painkillers?
It's not that hard to understand the answers to these questions. It's not even hard to understand a well-prepared explanation of the research that allows us to say that the answers are true and accurate. But if people who need to know these facts don't get them in a timely and understandable way, then those responsible for communicating the facts have failed.
Without a scientific analysis of what happened after Hurricanes Katrina and Rita, we won't know exactly what officials said and what citizens heard. A preliminary assessment suggests that the messages worked much better for some people than for others. In both hurricanes, many people got out of harm's way. In Katrina, many others remained behind and suffered. Some stayed behind willfully; some didn't understand their predicament; some had no options. With Rita, many people left who could have stayed home had they only understood -- and believed -- official communications. Instead, many fled, creating a massive traffic jam in which a bus overheated and exploded, killing 23 passengers.
This confusion has unnerved people far from Hurricane Alley, who are left wondering whether they can rely on what the authorities are saying. That uncertainty raises new questions. Will they be forced to choose among competing experts? Will they feel duped into actions that leave their families vulnerable?
Communicators fail for known reasons. Here are some possibilities that a Katrina Commission should examine looking backward, and that every citizen should consider looking forward, in assessing the communicators in a crisis:
They don't understand how to talk to their public. People tend to exaggerate how well they can put themselves in others' shoes. That happens sometimes among friends. It is even more likely when experts address nonexperts. As a result, the experts say things that people already know, in terms that the public doesn't understand, while omitting important information. When talking to patients, physicians have some chance of recognizing that they're missing the mark. Experts making official announcements on radio and television have no chance at all.
Nor is there any substitute for asking people in the intended audience how they interpret a message, before releasing it to the world. Seemingly simple terms, such as "shelter in place," "climate," "rare side effect" and even "safe sex" mean different things to different people. No one would put a drug on the market without testing it. Yet we rely on labels that leave users guessing at the extent of the risks and benefits. We issue emergency instructions without running them by anyone.
They don't trust their public. Sadly, once we misunderstand other people's predicaments, we often judge them harshly if their actions don't make sense to us. We don't guess that a woman might have stayed behind in a hurricane because she didn't know that her ex-husband had already taken their child to safety, or that a man on probation might decide he couldn't leave because he hadn't been able to reach his probation officer.
Unless they know their public well, officials are not immune to these biases. One often hears "experts" predict mass panic in an emergency. Yet studies since the London blitz during World War II have shown that people behave responsibly, even bravely, in crises.
In surveys, Americans say that they want to be leveled with, even if things are bad. Both Israel and the United Kingdom have communication policies that embody such a fundamentally respectful stance. Living in Jerusalem during the 1973 Yom Kippur War, my wife and I were not happy to hear the Israeli authorities say "Our troops are still in the process of slowing down the enemy." However, we were grateful to know where things stood.
They don't care solely about their public. Risk communication is a public health function, helping people make the best choices for themselves and their loved ones. Yet the communications job is often given to public affairs, or public relations, or even marketing people. These professionals have valuable skills. Any organization without them can get eaten alive. However, they naturally focus on their employers' welfare, and not just that of their audience. When people feel that their lives are on the line, they just want unspun facts.
Only the most cynical and shortsighted public affairs person would emphasize spin over substance in a crisis. However, research finds that changing hats is not that easy. Conventional concerns so permeate people's thinking that they cannot set aside their habitual ways of communicating, however hard they try.
My wife and I were in London during the third week of July. On the 21st, the day of the second bombing that month, we heard a public health-focused response, soberly revealing the facts as they became known. On the 22nd, we heard the same agencies provide what turned out to be an inaccurate account of the police killing of an innocent Brazilian immigrant, creating a crisis of confidence that is still reverberating.
They have nothing to tell their public. Without the proper research in advance, an agency cannot assess the risks that its public faces. In that case, it's better to plead ignorance than to offer confident guesswork. Of course, no one wants to be told, "Beats me. You're on your own." But that's a more useful message than an unsupportable "Trust me." Still, there's always the temptation to pretend to know what should have been known, and to blame the victims for not doing the impossible.
Communication is part of any relationship. After Katrina and Rita, after the anthrax mailings, the D.C. sniper attacks and the abortive smallpox vaccination campaign -- indeed, after any public crisis -- citizens will ask, "Did you listen to us, so that you could tell us what we needed to know?" When the answer is yes, the authorities increase their standing as information providers. When the answer is no, the authorities undermine our resilience as a society.
Author's e-mail: firstname.lastname@example.org
Baruch Fischhoff is a professor of social and decision sciences at Carnegie Mellon University in Pittsburgh, and president of the Society for Risk Analysis. He does research and writes frequently about assessing and conveying accurate information about risks.