Preparation is the only real defense against the impacts of tornadoes, but the average warning lead time is just 13 minutes. Residents of Moore, Okla., had just over 30 minutes, and the toll was still horrific.
Scientists are working feverishly to improve tornado forecasts and warnings, and extend lead times, but the challenge is daunting.
Forecasters must keep an eye on the roughly 100,000 thunderstorms that form in the U.S. each year, of which only about 1,000 (1 percent) spawn tornadoes. Of those 1,000 tornadoes, just one may reach the intensity of the Moore, Okla., twister (about 0.1 percent of tornadoes reach level 5 on the 0-5 Enhanced Fujita scale).
Doppler radar has brought about dramatic improvements in tornado detection, but it’s often difficult to ascertain tornado intensity from radar, and false alarm rates are high. Radar can detect which storms are spinning, but in most instances it can’t affirmatively determine whether a tornado is on the ground (unless there is an obvious debris ball signature). For every four tornado warnings issued in the U.S., only about one actual tornado touches down.
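The statistics above reduce to simple ratios. Here is a minimal sketch that checks the arithmetic; the counts come from the article itself, while the variable names are purely illustrative:

```python
# Numbers cited in the article (approximate annual U.S. figures).
annual_thunderstorms = 100_000   # thunderstorms forming each year
annual_tornadoes = 1_000         # storms that spawn tornadoes
ef5_tornadoes = 1                # roughly one reaches EF5 intensity

tornado_fraction = annual_tornadoes / annual_thunderstorms  # 1 percent
ef5_fraction = ef5_tornadoes / annual_tornadoes             # 0.1 percent

# Roughly one verified tornado for every four warnings issued.
warnings_issued = 4
verified_tornadoes = 1
false_alarm_ratio = (warnings_issued - verified_tornadoes) / warnings_issued

print(f"{tornado_fraction:.1%} of thunderstorms spawn tornadoes")
print(f"{ef5_fraction:.2%} of tornadoes reach EF5 intensity")
print(f"False alarm ratio: {false_alarm_ratio:.0%}")
```

The one-in-four verification figure works out to a false alarm ratio of about 75 percent, consistent with the "current 70 percent" rate VORTEX2 scientists cite below.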
Last year, Capital Weather Gang’s Jack Williams profiled the cutting-edge VORTEX2 research program, which is exploring why some rotating, or “supercell,” thunderstorms produce tornadoes and others don’t. Scientists involved with the project, set to finish around 2017, are optimistic it will advance predictions. Williams writes:
Josh Wurman, one of the eight scientists on the VORTEX2 steering committee, says that in addition to increasing the warning time from today’s average of 13 minutes before a tornado hits a particular community to 20 or 30 minutes, VORTEX2 might help push the false alarm rate from the current 70 percent down to 50 percent or lower.
Cliff Mass, a professor of atmospheric science at the University of Washington, believes the most promise for gains in tornado prediction lies in powerful supercomputer model simulations, which knowledge gained from projects like VORTEX2 could feed into.
“There IS the potential for major forecasting advances in the period from 1 to roughly 6 hours before the storm, if we can run models with enough resolution and can get enough information to describe the initial 3D atmosphere with lots of detail,” Mass writes in his blog.
Mass concludes: “Skillful 1-6 hr forecasts are potentially achievable since the forecasts are short enough that the growth in forecast error is modest. And a few hour warning of a major storm allows sufficient time to evacuate folks from areas [in] which severe weather is probable.”
As we reported last week, a tenfold boost in the muscle of the National Weather Service’s supercomputers is in the pipeline thanks to an infusion of funds from Hurricane Sandy relief legislation. Louis Uccellini, director of the National Weather Service, said this ramp-up in computer power will enhance the resolution of its so-called “convection-allowing models,” which simulate thunderstorms.
Irrespective of forecasting improvements, tornadoes will continue to pose major societal challenges in future years and decades. Even if lead times improve dramatically, allowing people to evacuate vulnerable areas, will they heed warnings? What’s an optimal lead time? Is there such a thing as too much lead time?
And, of course, even if improved forecasts and warnings help get people out of harm’s way, growing populations, wealth and infrastructure will cause damage costs to escalate when violent tornadoes strike. The NY Times’ Andrew Revkin explores some of these difficult issues in his insightful blog post, “A Survival Plan for America’s Tornado Danger Zone.”
Beyond better forecasts and preparation, is there any legitimate form of weather modification to physically destroy a violent tornado before it strikes?
Lots of schemes have been proposed, but there is no serious strategy for this on the table. Think about it: the Moore, Okla., tornado contained the power of multiple nuclear bombs. Good luck defusing such a monster.