Euro ensemble forecasts four days out and one day out for Hurricane Irma. (Gabriel Florit/The Washington Post)

Of the massive, high-powered computer models run by governments and institutions to forecast hurricanes, the vaunted European model had the best performance during Irma. But there’s a little-known entity that preliminary data show outperformed the European model as well as the others.

Enter Panasonic, the electronics company best known for making televisions.

Panasonic has a subgroup in its aviation division, known as Panasonic Avionics, that works on a weather model. The model’s foundation is based on the National Weather Service’s well-known Global Forecast System model, often referred to as the American model.

But the Panasonic model is beefed up with additional data, not incorporated into the GFS, that may be helping it produce even more accurate forecasts. Panasonic gained access to this valuable data after its 2013 acquisition of AirDat, a company that engineers weather-sensing instruments carried aboard commercial jets.

On balance, forecast data released by Panasonic reveals its forecasts were the most accurate leading up to Irma’s landfall on Marco Island, Fla. (Note that this forecast data, provided by Panasonic, has not been independently evaluated. But Panasonic has posted the data online and welcomes scholars to review it.)


Average forecast track error for Hurricane Irma by Panasonic model (red), American GFS model (blue), and European ECMWF model (green) in the days leading up to Irma. (Neil Jacobs/Panasonic)

The Panasonic model forecasts were especially good about four to seven days in advance, the data show, outperforming the European and American GFS models. Its forecast for Irma’s track had a substantially smaller error, on average.

The European model, operated by the European Centre for Medium-Range Weather Forecasts (ECMWF), based in Reading, England, narrowly had the most accurate track forecast in the period two to four days before Irma struck. Within 36 hours of landfall, all the models were comparably good.
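For readers curious how the "average track error" numbers in the chart are typically computed: track error is conventionally the great-circle distance between a model's forecast storm center and the observed storm center at the same valid time, averaged over all matched forecast/observation pairs. A minimal sketch in Python (the function names and sample coordinates are illustrative, not from Panasonic's data):

```python
import math

def track_error_km(lat_f, lon_f, lat_o, lon_o):
    """Great-circle (haversine) distance in km between a forecast
    storm-center position and the observed position."""
    R = 6371.0  # mean Earth radius, km
    phi_f, phi_o = math.radians(lat_f), math.radians(lat_o)
    dphi = math.radians(lat_o - lat_f)
    dlmb = math.radians(lon_o - lon_f)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi_f) * math.cos(phi_o) * math.sin(dlmb / 2) ** 2)
    return 2 * R * math.asin(math.sqrt(a))

def mean_track_error(forecast_track, observed_track):
    """Average error over matched (lat, lon) pairs at the same valid times."""
    errors = [track_error_km(f[0], f[1], o[0], o[1])
              for f, o in zip(forecast_track, observed_track)]
    return sum(errors) / len(errors)
```

A forecast verifying "within 50-100 miles" six days out, as Noll describes below, corresponds to roughly 80-160 km of track error by this measure.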

The Panasonic model forecast performance stood out most for its consistency, said Benjamin Noll, a meteorologist at the National Institute of Water and Atmospheric Research in Auckland, New Zealand. He analyzed Panasonic’s data and found the model first predicted six days out that Irma would strike Florida’s Gulf Coast and did not waver much from that accurate prediction.

“In one word, the model was impressive,” Noll told the Capital Weather Gang. “When you see a model [forecast] verifying within 50-100 miles [of the storm position] 6 days out, it’s compelling to say the least. It would be nice to see more of the model and examine more cases.”

The Panasonic model is not publicly available, as the company sells its data and information for commercial purposes. Panasonic is not alone in this space: IBM is working on an in-house model called Deep Thunder.

“There’s absolutely no reason that Panasonic and IBM can’t make a superior version of the GFS,” said Ryan Maue, a private-sector meteorologist and scholar at the Cato Institute. “As a private-sector company, they have the ability to do research and test various configurations of models using their own in-house expertise. They have computing power and manpower dedicated to this task.”

As new computer models are born in the private sector, with the ability to improve on the performance of government models, questions arise about access to their information — an issue with ethical dimensions. If private companies have data that can help protect life and property, should they make it available? Is it the government’s role to step in and buy this information for the purpose of making it available to the public and decision-makers?

These are questions the weather community is grappling with.

While new players enter the weather forecasting arena, the traditional large institutions, such as the U.S. National Weather Service and the ECMWF, are investing considerable effort to improve their models.

In his post-Irma story on the state of the American and European weather models, Mashable’s Andrew Freedman noted the ECMWF plans to build a “next-generation supercomputer center in Italy” for its model, and that the National Weather Service also has upgrades in the works for the GFS.

“NOAA has further improvements planned,” Freedman reported. “For example, the primary and backup computer system is set to go from 2.8 petaflops to 4.2 petaflops of computing power. In addition, the agency is upgrading how the model ingests and processes weather data, which could yield big improvements.”

Ultimately, perhaps, the competition between the governments and the private sector will lead to even better forecasts — a positive outcome all around.

More stories on the aftermath of Irma

Wunderground founder: Want better hurricane forecasts? Invest in the research.

16 million people without power and 142-mph winds: Hurricane Irma, by the numbers

Why Hurricane Irma wasn’t far worse, and how close it came to catastrophe

Why the storm surge forecast for Irma wasn’t so bad, just incomplete

Hurricane Irma drained the water from Florida’s largest bays — but it wasn’t gone for long

Irma’s track forecast was adequate, but there’s significant room for improvement

Before and after Hurricane Irma massively mauled the land and mixed the ocean