Guest commentary

Many times in this winter of storms, the U.S. global weather model has been outperformed by models from other nations. And we’re not just talking forecasts of 8 inches versus 10 inches. In some cases the difference has been storm versus no storm.

On multiple occasions, for example, the well-respected European model has accurately predicted a storm to slam the East Coast with substantial snow, sleet and/or freezing rain several days in advance, while the American model has simultaneously forecast the same storm to barely graze the coast or stay out to sea. Recent performance scores show the U.S. model now lags behind models from Canada and the United Kingdom as well.

Similar scenarios have played out time and again in recent years, and often for the storms with the greatest human and financial impact. It happened most notably in 2012 with Superstorm Sandy, when the European model accurately showed a direct hit on the Northeast six days in advance, while the U.S. model predicted the storm would harmlessly stay out to sea. The U.S. model eventually caught up, as it does with most storms. But less lead time means less time to prepare, fewer lives saved and more money lost.

Alarmingly, despite an expected upgrade to the supercomputer that runs the American model, U.S. weather forecasting is on track to get worse, not better.

You see, weather models are only as good as the data that goes into them, and we are now facing a potentially imminent gap in critical weather data collected by U.S. weather satellites, which are operated by the National Oceanic and Atmospheric Administration (NOAA). This gap in observations of temperature, pressure, moisture and other important variables is expected to last 17 to 53 months or more, according to the Government Accountability Office.

NOAA’s apparent lack of urgency in doing something about the gap is surprising given it has known of the risk for more than three years, and given a November report that found an “unacceptably high probability” of a gap that could have “catastrophic national consequences.”

Congress, on the other hand, is concerned enough that it asked NOAA to provide a detailed plan for addressing the gap along with its Fiscal Year 2015 budget submission earlier this month. Yet the published budget materials contain no such plan for mitigating a gap that could begin as early as this year, and say little about the various options assessed in last February’s gap-mitigation analysis conducted for NOAA by Riverside Technology.

Nor do they respond to Congress’s repeated urging, most recently at a hearing in September titled “Dysfunction in Management of Weather and Climate Satellites,” to use data from U.S. commercial sources to mitigate the gap, avoid future gaps, and reduce skyrocketing program costs. Only today did NOAA issue a request for industry input to “improve our understanding of existing and potential options … to mitigate the loss of data,” something it should have, and could have, done long ago.

Multiple U.S. companies are planning to launch weather satellites that could provide high-quality data in time to reduce the impact of a gap, and also build much-needed resiliency into vital weather data for years and decades to come, all for tens of millions of dollars less than the cost of government-operated satellite programs. That frees up money and resources that could instead be devoted to upgrading weather models and improving the flow of science and research into them.

NOAA has now been working on its congressionally mandated plan to obtain satellite data from commercial sources for more than six years. To date, the agency has yet to support any of these companies by simply signaling a non-binding intent to purchase data once it is available, and only if funding permits. That signal alone would unlock millions of dollars in private capital that is poised to drive the commercialization of weather satellite data and, ultimately, a better forecast.

The markets don’t need an iron-clad commitment. They just need an indication of support, not unlike how the National Geospatial-Intelligence Agency supported development of a commercial industry for satellite imagery around a decade ago. That industry has stabilized the supply of mission-critical information used by the U.S. intelligence community and the warfighter, dramatically lowered government costs, and led to technology breakthroughs including Google Earth and online mapping.

In similar fashion, a commercial weather satellite industry promises to augment government programs with a steady supply of the data most important to accurate forecasts and early warnings, at a much lower cost. Such a public-private partnership would also spur much faster innovation in sensors and satellites than the government could ever accomplish alone.

Until now, NOAA has been the sole U.S. provider of weather satellite data, and has even footed much of the bill for the world’s weather data needs. That is an admirable undertaking given the role of the weather forecast as a global social good. It is also completely unrealistic given modern-day budget constraints, and, worse, it is an approach that has now become detrimental to the weather forecast itself.

The status quo in weather modeling and weather satellite development and management has led to delays and spiraling costs that now seriously threaten the accuracy and timeliness of weather forecasts that are so important to our daily lives and economic stability. Once the weather forecast leader, the U.S. has a lot of work to do to catch up with the rest of the world.

It’s past time to get started.

Anne Hale Miglarese is President and Chief Executive Officer of PlanetiQ, and formerly Chief of Coastal Information Services at NOAA. PlanetiQ plans to fly the first commercial constellation of weather satellites, providing data to government and commercial customers worldwide by 2017.

The views expressed here are the author’s alone and do not represent any position of the Capital Weather Gang.