Update, 7:50 p.m.: See statement from Louis Uccellini, director of the National Weather Service, at the bottom of this post.
Original post, from 11:48 a.m.
On the two-year anniversary of Superstorm Sandy, the U.S. global weather forecast model has gained no ground on the European model, which proved superior in its long-range prediction of the epic storm. While the U.S. has committed to upgrading its modeling, it has not kept pace with its counterparts in Europe and is not positioned to be the world leader in global weather prediction.
Recent statistics show the accuracy (or “skill”) of the primary U.S. model – the GFS (Global Forecast System) – ranks third behind the global model of the European Centre for Medium-Range Weather Forecasts (ECMWF), frequently referred to as the “European model,” and the United Kingdom’s Met Office (UK Met Office) global model.
After the European model schooled the U.S. GFS during Sandy, Congress took notice and authorized $23.7 million for forecasting equipment and supercomputer infrastructure in the National Weather Service (NWS).
“I rallied my colleagues on a bipartisan basis to put money in the federal checkbook to come up with the computational capacity to have a new American model that will be the fastest, the most accurate, have the greatest resolution of any facility in the world,” Senator Barbara Mikulski (D-Md.) said in July 2013.
But it seems the ECMWF and the UK Met Office are winning the weather arms race.
In 2013, the ECMWF invested approximately $64 million in a “Cray multi-petaflops supercomputing infrastructure.”
On Monday, the UK Met Office awarded Cray a $128 million supercomputer contract. “It will lead to a step change in weather forecasting and climate prediction, and give us the capability to strengthen our collaborations with partners in the South West, UK and around the world,” said Met Office chief executive Rob Varley.
Cliff Mass, a professor of meteorology at the University of Washington and a dogged advocate for more powerful NWS computing, notes in a blog post that the computing power of the UK Met Office, the ECMWF, and even the Korean Meteorological Administration will exceed that of the NWS by many multiples.
For example, the UK Met Office computer will have a processing power of 16 petaflops by 2017, according to the BBC (a petaflop is one quadrillion floating-point operations per second, a measure of how many calculations a computer can perform each second). A high-level NWS PowerPoint presentation titled “Current and Future High Performance Computing,” dated August 5, 2014, projects that NWS computing will reach an estimated 2.7 petaflops by 2018-2020.
“We will increase our supercomputing capacity to a total of 1.4 petaflops by January 2015 under an existing contract,” says Susan Buchanan, spokesperson for the NWS. “The procurement process for further upgrades is underway, and we cannot discuss details or provide a timeline for the project until a contract is awarded.”
Consider that in 2013, the NWS projected it would have 2.6 petaflops of processing power by fiscal year 2015. That goal appears to have been pushed back about three years.
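To put those figures in perspective, here is a back-of-the-envelope comparison using only the capacities cited above (a rough sketch; the dates are projections, and real forecast skill depends on far more than raw petaflops):

```python
# Projected supercomputing capacities cited in this post, in petaflops
# (1 petaflop = 10**15 floating-point operations per second).
capacities = {
    "UK Met Office, by 2017": 16.0,
    "NWS, by 2018-2020 (projected)": 2.7,
    "NWS, by January 2015 (existing contract)": 1.4,
    "NWS, FY2015 target set in 2013": 2.6,
}

# How far ahead the UK Met Office projection sits relative to the
# NWS projection for roughly the same period.
gap = capacities["UK Met Office, by 2017"] / capacities["NWS, by 2018-2020 (projected)"]
print(f"UK Met Office projected capacity is about {gap:.1f}x the NWS projection")
```

By this crude measure, the UK Met Office’s planned machine would be roughly six times more powerful than what the NWS projects for a later period.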
“The message is clear: the U.S. is rapidly falling behind in the computational resources necessary for high quality numerical weather prediction,” Mass writes.
Ryan Maue, a meteorologist for the private sector firm WeatherBell, put it this way in two scathing tweets on Monday:
“Year-and-half ago NOAA touted ‘game-changing’ improvements for US weather forecasting. So far — zip, zero, zilch.”
“NOAA leadership has fumbled ball on weather forecasting computer upgrades. ECMWF & MetOffice are a decade ahead (!)”
Statement from Louis Uccellini, director of the National Weather Service, in response to this post:
The National Weather Service and its research partners have been working together to improve the operational computer models, as we expanded our computing capacity. For example, we have recently implemented the upgraded Hurricane WRF and the High Res Rapid Refresh (HRRR), which could only have been done with the increased computing capacity we’ve already brought on board since 2013. And by December 2014, we will implement the new 13-kilometer Global Forecast System run out to 10 days, a major advancement in our global modeling capabilities.
The procurement process for further computer upgrades is underway for increasing our computing capacity through 2015. While we are involved in contract negotiations, we can’t discuss specifics but with these increases, we’ll be addressing improvements in the GFS that will include 4-D data assimilation, and also higher resolution ensembles derived from the Global Ensemble Forecast System — all of which will have major impacts on our forecast capabilities by the end of 2015.