Satellite image of the President’s Day Storm of 1979. (NOAA)

On Feb. 18 and 19, 1979, an epic winter storm took Mid-Atlantic forecasters by complete surprise. The storm buried much of the Washington, D.C.-Baltimore area under more than 20 inches of snow, sometimes falling at rates up to a virtually unprecedented five inches per hour.

It would go down in history as the infamous President’s Day Storm of 1979. While it wasn’t the first surprise storm for forecasters — nor would it be the last — these surprises are far less frequent thanks to hard-won advances in meteorology that were indisputably stimulated by the storm, and the development of operational weather prediction models and strategies that came after it.

On May 24, 2014, a colloquium was held on the advances in understanding and prediction of extra-tropical cyclones in the 35 years since the surprise snowstorm. A number of eminent atmospheric scientists participated. Wes Junker, Jeff Halverson and Steve Tracton represented the Capital Weather Gang at the meeting. Here we present some of the key advances in cyclone prediction discussed at the colloquium.

A look back at the President’s Day Storm

Paul Kocin and Louis Uccellini, the director of the National Weather Service (NWS), began the program by presenting overviews of the storm. One of the slides presented (see below) displayed a comparison between the observed analyses and model forecasts of the storm at high altitudes (500 mb) and at the surface. The model drastically under-predicted the strength of the system 36 and 24 hours prior to the event. Even the forecast just 12 hours in advance depicted a storm that was too weak and too far offshore. The storm intensified enough to produce an eye-like structure in visible satellite imagery.

A summary of the storm’s synoptic structure as seen from satellite (note that the storm had a central eye), upper air and surface analyses. Also shown (bottom right) are LFM forecasts of the storm. (Paul Kocin, NOAA)

Uccellini noted that the blizzard came in two parts: around four inches of snow fell from a warm front prior to the explosive deepening that occurred during the early morning hours of Feb. 19. The snow stopped at around midnight, and that lull prompted forecasters to drop the warning, with most feeling that only another inch or two might fall with the upper-level center.

Then the rapid intensification kicked in and another 20 inches fell in some areas. In between the two periods of snow, the moon showed faintly through the clouds in some areas. The snow during the early morning hours fell so rapidly that it produced whiteout conditions leading to total accumulations of up to 27 inches on the east side of the city. Luckily, that Monday morning was a holiday.

A catalyst for change

The poor forecasts of the storm prompted a number of investigators to try to understand what went wrong and get a better grip on processes that led to such rapid deepening and excessive snowfall rates. Uccellini noted that the storm ushered in the use of model simulation to better understand the physical processes involved in such storms.

Prior to that time, most model research was aimed at severe weather outbreaks. A number of researchers led by Uccellini and Lance Bosart dove into studying the storm. In that era, case studies took years to complete; today, similar diagnostic studies can be completed almost instantaneously. Such studies eventually aided in the development of three-dimensional conceptual models of East Coast snowstorms.

Steve Zubrick, the science and operations officer at the NWS office serving Washington, D.C., noted that local forecast operations have changed markedly since 1979. He interviewed one of the forecasters who had worked during the 1979 storm. That forecaster noted that only limited model fields were available, and the two forecast models of the day had much lower vertical, horizontal and temporal resolution than current models (see the table below). Satellite data were not yet assimilated into the models, and boundary layer processes were missing from them.

The models highlighted in red were the only ones available to forecasters in 1979; the models in black text constitute the suite in 2014. Note that the models listed are only those run by the NWS and do not include those of other international centers. The grid spacing for the GEFS and GFS applies through day eight; the resolution of both becomes coarser beyond that time.

Suffice it to say that advances in models are far more complex in principle and practice than the table suggests. A number of physical processes such as radiation, cloud physics, convection, evaporation and soil properties (to name a few) must be parameterized because they occur below the scale of the grid boxes; parameterizations are essentially estimations of what happens during these processes. As models have become more sophisticated, these parameterizations have grown more refined but are still far from perfect.

The increase in model resolution and the addition of more sophisticated physics have greatly improved the ability to forecast major storms. They have also made it more likely that models will correctly forecast the snow banding structure that often occurs in major snowstorms. Snow banding tends to be a wildcard in the forecast, but improvements in high-resolution models now enable more precise predictions of snow band location, movement and intensity trends. Snow bands create extreme gradients in snowfall — the proverbial “boom scenario” for some locations — and these gradients can now be delineated more precisely a few hours in advance.

The next catalyst that forced changes in the way storms were forecast was the “no surprise” snowstorm of Jan. 25, 2000. That storm was infamous because, just one week earlier, NWS Headquarters had released a statement that the introduction of a new supercomputer “puts us closer to reaching our goal of becoming America’s no-surprise weather service.” Unfortunately, the general public received almost no warning of the record-breaking snow the storm produced in Raleigh, N.C., or the near-blizzard that struck the nation’s capital. Uccellini noted that NCEP ran a number of ensemble members on the storm after the fact, using data that had been archived before the event. Slight tweaks to the initial conditions of the various members allowed several of them to correctly predict the storm. This experiment helped initiate the increased use of ensemble model runs to quantify the uncertainty in forecasts of storms and their associated precipitation. The NWS Weather Prediction Center now routinely produces probabilistic forecasts of snowfall amounts.
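The ensemble idea described above can be sketched very simply: the probability of exceeding a snowfall threshold is just the fraction of perturbed members that forecast at least that amount. The member values below are invented for illustration, not real model output:

```python
# Toy illustration of probabilistic guidance from an ensemble: the chance of
# exceeding a snowfall threshold is the fraction of members that exceed it.
# The member forecasts below are invented numbers, not real model output.

def exceedance_probability(member_forecasts, threshold):
    """Fraction of ensemble members forecasting at least `threshold` inches."""
    hits = sum(1 for f in member_forecasts if f >= threshold)
    return hits / len(member_forecasts)

# Hypothetical snowfall forecasts (inches) from 10 perturbed ensemble members.
members = [2.0, 3.5, 4.0, 6.5, 7.0, 8.0, 9.5, 12.0, 14.0, 22.0]

for t in (4, 8, 12):
    print(f"P(snow >= {t} in) = {exceedance_probability(members, t):.0%}")
    # -> 80%, 50%, 30%
```

Real ensemble products are, of course, far richer than this counting exercise, but the principle of turning member spread into exceedance probabilities is the same.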

A new emphasis on uncertainty

Back in 1979, there was little contact between the forecast office and decision-makers. Even at the height of the storm, the forecaster on duty that night recalled, few if any phone calls were made to the National Weather Service. Today, numerous methods of communication are available for coordination between the Weather Service and the communities that need to act on its forecasts.

In addition to the operational models, the NWS office serving the D.C. area utilizes 32 ensemble members to help make more probabilistic forecasts and to produce best- and worst-case scenarios. The NWS is also now staffed with warning coordination meteorologists, whose job is to talk with public officials — including the Virginia Department of Transportation, the Maryland Department of Transportation and the federal Office of Personnel Management (OPM) — before a storm and provide them with worst-, best- and most-likely-case scenarios based on the ensemble guidance.

An example of snowfall probability forecasts generated at the NWS Sterling office. Such products greatly improve the communication of forecast uncertainty to the public. (NWS)

At the colloquium, Bob Ryan, a well-known and respected TV meteorologist in the D.C. area, noted that a storm’s start time is often even more important than the total amounts. How best to convey forecast uncertainty to the general public is still a work in progress, especially for extreme events, since some individuals may never have seen a similar storm. For some end users, forecast uncertainty, more than snow amount, drives critical decisions. For example, state highway agencies will assign crews to sander trucks if told that one to three inches of snow is likely. However, if six to eight inches of snow is also possible, the agency may opt to equip the sander trucks with plows — just in case.

In another common scenario, airlines at Reagan National and other airports weigh the risk of a snow event against their risk tolerance; if certain pre-determined thresholds are exceeded (e.g., a high probability of three-hour snowfall exceeding 1.5 inches, or of 24-hour snowfall exceeding six inches), airlines will cancel flights before the snowstorm begins.
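Decision rules like the airline example reduce to simple threshold checks on the forecast probabilities. In this sketch, the 1.5-inch/three-hour and six-inch/24-hour thresholds come from the article, but the function name and the 50 percent probability trigger are illustrative assumptions:

```python
# Hedged sketch of threshold-based flight-cancellation logic. The snowfall
# thresholds mirror the article's examples; the 50% probability trigger and
# the function name are illustrative assumptions, not an airline's actual rule.

def should_cancel_flights(p_3hr_snow_over_1p5in, p_24hr_snow_over_6in,
                          trigger_probability=0.5):
    """Cancel pre-emptively if either exceedance probability hits the trigger."""
    return (p_3hr_snow_over_1p5in >= trigger_probability
            or p_24hr_snow_over_6in >= trigger_probability)

print(should_cancel_flights(0.7, 0.3))  # heavy short-term rates likely -> True
print(should_cancel_flights(0.2, 0.1))  # low probabilities on both -> False
```

The value of pre-committed thresholds is that the costly decision is made on the probability itself, before the snow ever starts, rather than on a single deterministic forecast.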

The big ones don’t catch us off-guard anymore

However, improvements to forecasts have not been driven by better resolution and physics alone. Cliff Mass, a professor of atmospheric science at the University of Washington, noted that intense extra-tropical cyclones on the West Coast are sometimes accompanied by hurricane-force winds. A case can be made that these Pacific coastal storms are perhaps the most powerful extra-tropical cyclones to strike North America. He noted that storms such as the 1962 Columbus Day Storm, with winds up to 145 mph, were much more intense than the 1993 Storm of the Century. More than 11 billion board feet of timber were blown down in northern California, Oregon and Washington during the Columbus Day Storm, and another intense windstorm destroyed the Hood Canal Bridge.

But neither of these storms was predicted. Even when today’s higher-resolution models were rerun with the data from 1966, the more sophisticated model still missed the intensity of the 1966 storm. Mass indicated that, prior to around 1990, most of these major windstorms were missed by the models, but since then the models almost always correctly predict such storms.

Several attendees noted that the big change in the 1990s was the increased use of satellite data in the models and improvements in the way those data were assimilated. The combination of better data coverage, increased resolution and improved physics has greatly improved forecasts of the interactions between these storms’ circulations and the region’s complex terrain. Now, according to Mass, such storms are rarely missed.

Others noted that there have also been significant advances in the detection and observation of extreme oceanic storms, thanks to advances in satellite remote sensing and in the algorithms applied to the various sensors and model products. These improvements have considerable economic benefits since, on a global basis, fully 90 percent of goods by weight are shipped via the oceans. Extreme winds can now be remotely sensed by scatterometers, and models now predict oceanic swell as well as wave heights and directions.

Scatterometer data (colored swaths) showing hurricane-force winds (white color) in the southwest quadrant of a western Atlantic wintertime cyclone. Scatterometry data reveal that the strongest winds are typically found in this quadrant. (Joseph Sienkiewicz, NWS Ocean Prediction Center)

Looking ahead: The role of extratropical cyclones at the weather-climate interface

Despite all of the advances the weather community has made in forecasting technology since the President’s Day Storm, we still have a long way to go.

Ensemble forecast systems, especially for smaller-scale events, need to be improved, since they often have too little spread and tend to cluster around their parent model. We need a better understanding of what governs the predictability of weather systems. Large-scale features are easier to forecast than smaller-scale ones, but what other factors govern predictability?

The most extreme, high-impact storms now have good advance predictability, in some cases up to eight days out (e.g., the 1993 Storm of the Century and Superstorm Sandy in 2012) and, in most cases, two or three days in advance. Yet lower-magnitude, moderate-impact storms remain problematic. Social scientists are needed to help the NWS understand the psychology of risk perception and to find better ways of conveying prediction uncertainty to the public.

Another question that needs further study is what mechanisms drive atmospheric pattern, or regime, changes. Prior to the President’s Day Storm, the winter was extremely cold — even colder than our frigid winter of 2013-2014. Yet the winter of 1979 effectively ended with the blizzard: the rapidly deepening cyclone marked an abrupt transition from an anomalously cold and dry winter to a warm and wet regime across the East Coast.

Bosart and grad students at SUNY Albany are studying the role of cyclones in climate regime connections. Bosart showed examples of extreme cyclone activity coinciding with major pattern changes across the Northern Hemisphere during the 2009-2010 and 2010-2011 winters.

For instance, a negative Arctic Oscillation (AO) preconditions the atmosphere for intense storm formation, or cyclogenesis, along the southern periphery of its arctic air mass. Once one or more intense storms develop along the East Coast or over the western Atlantic, they reinforce the strong Greenland blocking pattern, intensifying the negative phase of the AO. Alternatively, a single cyclone can initiate a downstream block and the onset of a negative AO. An example of how multiple East Coast cyclones reinforced the AO blocking pattern from Dec. 31, 2009, to Jan. 8, 2010, is shown below.

Example slide taken from Bosart’s presentation on how cyclones and large-scale flow regimes such as the AO/NAO strongly interact. (Lance Bosart)

Climate regimes in the deep tropics are also connected to mid-latitude storms over North America. Intense tropical storminess over the west Pacific warm pool — including the Madden-Julian Oscillation (MJO) and typhoons — serves as a precursor to intense cyclones arriving along the U.S. West Coast (and even traversing the entire continent). For instance, the MJO can intensify the subtropical jet across the southern U.S., and typhoons feed moisture into developing West Coast cyclones.

Exploring the climate-weather interface in this manner offers many exciting prospects for improved short-range prediction of major flow regime changes; indeed, cyclones — once considered “noise” in the climate system — seem to be part of a complex feedback determining the evolution of stormy and quiescent periods across the United States.

Steve Tracton contributed to this post.