At 8:07 a.m. Saturday, the Hawaii Emergency Management Agency activated its civilian early warning system with a message sent to cellphones in the state:
“BALLISTIC MISSILE THREAT INBOUND TO HAWAII. SEEK IMMEDIATE SHELTER. THIS IS NOT A DRILL.”

The one part of the message that was correct was that it was indeed not a drill. It was an accident. No missile was incoming, and the warning was, according to the agency, the result of simple human error. As part of a routine test of readiness, a staffer selected “live alert” rather than “test alert.” That was all that it took for panic and fear to spread. Hawaiian officials scrambled to tell the public that it was a false alarm, but it still took 38 minutes for the same system to broadcast a message of safety.

We live in a tense time. Former secretary of defense William J. Perry recently warned on Twitter that “we are at greater risk of nuclear catastrophe now than we were during the Cold War,” and although experts and wonks can argue the point, we certainly have all the appearances of being in the grip of a “nuclear war scare” of the sort not seen since the Cold War. Sen. Lindsey O. Graham (R-S.C.) suggested recently that there was a 30 percent chance of Washington using a “military option” against North Korea, rising to 70 percent if North Korea tested another nuclear weapon. While some in the Trump administration, leaks suggest, favor a “bloody nose strike” against the regime, Graham at least acknowledges that any attack on North Korea would probably be an existential one for the regime: “There is no surgical strike option …. So if you ever use the military option, it’s not to just neutralize their nuclear facilities — you’ve got to be willing to take the regime completely down.” North Korea would surely suspect the same, and would react to any attack as an existential threat — the one scenario in which it could be readily assumed to use its nuclear arsenal against U.S. allies and/or American cities.

How much of the war talk is a real reflection of internal administration thinking, how much of it is speculation, and how much is a bluff? Not only is it hard to know from the outside, it may be impossible to know at all at this point. If the past tells us anything, it is that these decisions may not yet be made, and certainly may not be finalized — they may not be part of some grand strategy or plan. They rest on the judgment and determination of a very small number of people, on the balance of rivalries and disagreements among Cabinet members, staffers and whoever might have President Trump’s ear at any given moment (or even whomever he may be watching). In the case of nuclear weapons use, the authority to order an attack lies with a single person: the president himself.

What does the Hawaii mishap have to do with this? It is, if we needed one, a powerful reminder that we live in a world where information and misinformation spread like wildfire, where tensions run high, and where the systems to detect, warn and react are all immensely complex and rely on the participation of fallible human beings.

There are, of course, better and worse ways to design these systems. For instance, it was a surprise to learn that the difference between a test alert and an actual alert wasn’t better marked, and maybe more than one confirmation box could have popped up before “THIS IS NOT A DRILL” was broadcast to a million people. Of course, in a world full of confirmation boxes screaming for attention, we have all become somewhat desensitized to them, but user-interface designers have surely come up with systems that are less likely to be triggered accidentally.
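To make the design point concrete, here is a minimal sketch — entirely hypothetical, and not a description of Hawaii’s actual software — of a dispatcher in which a test alert goes out freely but a live alert is blocked unless the operator types an exact confirmation phrase, rather than merely clicking through a dialog box. All names here (send_alert, ConfirmationRequired, the phrase itself) are invented for illustration.

```python
class ConfirmationRequired(Exception):
    """Raised when a live alert is attempted without explicit confirmation."""


def send_alert(mode, message, confirmation=None):
    """Dispatch an alert message.

    Test alerts are clearly labeled and require no confirmation.
    Live alerts are refused unless the operator has typed the exact
    phrase 'SEND LIVE ALERT' -- a deliberate speed bump that a stray
    click on the wrong menu item cannot satisfy.
    """
    if mode == "test":
        # Test traffic is unmistakably marked so recipients (and logs)
        # can never confuse it with a real warning.
        return "[TEST ONLY] " + message
    if mode == "live":
        if confirmation != "SEND LIVE ALERT":
            raise ConfirmationRequired(
                "Live alert blocked: operator must type 'SEND LIVE ALERT'"
            )
        return message
    raise ValueError("unknown alert mode: " + repr(mode))
```

The design choice being illustrated is that the dangerous path should require a qualitatively different action from the routine one — typing a phrase instead of clicking a button — so that habit and muscle memory cannot carry an operator across the line by accident.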

But if it’s not one thing, it’s another. As the sociologist Charles Perrow wrote decades ago in his aptly titled book “Normal Accidents,” “trivial events in nontrivial systems” are unavoidable and should be expected. On Saturday, it was the click of a wrong button. In the future, it might be a malfunctioning or hacked computer, a malicious internal operator, an erroneous sensor — who knows? Perrow’s research found that accidents of an unexpected character, especially those that resulted from some kind of unlikely technical malfunction coupled with human error, happened on an alarmingly regular basis, no matter how much effort was put into quashing the bugs. Usually the damage done was local and reversible. But in some systems, such as those involved with nuclear weapons, the consequences could be massive.

The Cold War was full of such computer and sensor malfunctions, in both the United States and the Soviet Union, which set off warnings that missiles were inbound. Those that came during times of reduced tension were easily understood as probable errors. Those that came during times of fear and mistrust — such as the Cuban missile crisis or the War Scare of 1983 — pushed the world perilously close to the nuclear brink.

Some of the stories are so absurd that they would scarcely be believable if the documentation did not back them up. Scott Sagan, in his book “The Limits of Safety,” relates several, but here is the strangest: At a military base in Minnesota during the height of the Cuban missile crisis, a guard mistook a bear for a Soviet saboteur. He sounded the intruder alarm, which rang warning bells at other bases in the region. At Volk Field, however, the warning bell was erroneously wired, and instead a nuclear attack warning went off, causing nuclear-armed jets to scramble, searching for enemies. You can’t make this stuff up.

One of the lessons that President John F. Kennedy and Soviet Premier Nikita Khrushchev took away from the crisis as a whole was that once tensions are high, even the supposed leaders are no longer in complete control. All it would take is one stray fighter, one malfunctioning system, one bad decision at a low level, one incorrectly communicated message, and hundreds of millions would die. As it was, there were many false alarms during the Cuban missile crisis alone — including one in which an American warning system erroneously detected an incoming missile from Cuba. It was not the robustness of our systems, or even the judgment of our politicians and military, that saved us then or during the rest of the Cold War. It was, as then-Defense Secretary Robert McNamara would later emphasize: “In the end, we lucked out. It was luck that prevented nuclear war.”


We don’t have to accept this as our fate, as yet another terrible aspect of the world we happen to live in. There are ways to make war — whether by poor decision or by malfunction — less likely. At one extreme, of course, we could find a way to dismantle the global nuclear weapons system. This does not seem likely in the next year or so, especially given that the soon-to-be-released Nuclear Posture Review of the United States seems to put (if a leaked draft is an indication) even more emphasis on the importance of nuclear weapons. And it seems unlikely that North Korea will be scared into giving up its nuclear arsenal in the near term. No state that has given up a nuclear arsenal it possessed did so out of fear; fear is what drives states to acquire and improve their weapons, not to give them up.

Similarly, one could hope that the Hawaii incident will be a wake-up call for the state and federal agencies that are meant to communicate with the public on these matters, prompting them to scrutinize their systems more closely for the possibility of human error, system malfunction or deliberate manipulation by hackers.

But ultimately the short-term solution would be to decrease international tensions, to get us out of this war scare. Our leaders and diplomats have the power to defuse this crisis, if they choose to use it. They probably can’t force North Korea to denuclearize. But they can send a strong, unambiguous signal: The United States does not seek war if it can be avoided, and the United States would not attack North Korea unless it crossed a well-defined and credible red line (such as attacking our bases or allies). If North Korea didn’t think we were itching to destroy it, it might be less likely to do something fatal.

Because if our systems can fail, so can theirs. And consider this: What’s scarier than a false warning of a North Korean nuclear missile? A North Korean early warning system, all too fallible itself, ever searching for that incoming attack.
