In his post on AccuWeather’s new 25-day forecasts, Jason Samenow says, “Never mind, the skill in forecasts rapidly deteriorates beyond 5-7 days.”

Notice that he didn’t say “accuracy”: for meteorologists, the score that counts in judging forecasts is “skill,” not “accuracy.”

In simple terms, without getting into mathematics or the actual comparisons used for calculating the scores, skill measures how much better a forecast was compared to a “no-brainer” forecast, such as using persistence or climatology.
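The post deliberately skips the math, but the general idea can be sketched in a few lines. This is a generic form of a skill score (improvement over a reference forecast, scaled so that 1 is perfect and 0 is no better than the reference); the function name and numbers here are illustrative, not any agency's official formula.

```python
def skill_score(forecast_accuracy, reference_accuracy, perfect_accuracy=1.0):
    """Generic skill score: how much of the possible improvement over a
    no-brainer reference forecast (persistence or climatology) was achieved.
    1.0 = perfect forecast, 0.0 = no better than the reference,
    negative = worse than the reference."""
    return (forecast_accuracy - reference_accuracy) / (perfect_accuracy - reference_accuracy)

# A forecast that is 90% accurate where climatology alone is 80% accurate
# captured half of the possible improvement:
print(round(skill_score(0.90, 0.80), 2))  # 0.5
```

Note that a forecast can have a high accuracy score and still show zero skill, which is exactly the point of the Los Angeles example below.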

For persistence forecasts, you “predict” that the future weather will be the same as the current weather. For example, to forecast whether .01 inch or more of precipitation will fall tomorrow, you say that whatever happened today will happen again tomorrow.
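Scoring a persistence forecast against a run of observations takes only a few lines; this sketch (the function name is my own) treats each day as wet or dry and counts how often “tomorrow same as today” verified.

```python
def persistence_accuracy(wet_days):
    """Score a persistence forecast against observations.

    wet_days: list of booleans, True = .01 inch or more of precipitation fell.
    Each day's forecast is simply the previous day's observation."""
    pairs = list(zip(wet_days, wet_days[1:]))  # (yesterday, today) for each forecast day
    hits = sum(1 for yesterday, today in pairs if yesterday == today)
    return hits / len(pairs)

# A streaky stretch: persistence does well whenever the weather doesn't change.
print(persistence_accuracy([False, False, False, True, True, False]))  # 0.6
```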

Climatology means you’ll “forecast” that the weather will be the same as the “normal” (the 30-year average) for the date.

At times you can look like a meteorological star using persistence or climatology, if you pick the right location.

The NWS office in downtown Los Angeles is a perfect place to strut your stuff as a forecaster if you go for July precipitation forecasts. In this case, you forecast whether .01 inch or more of rain will fall on each day of the month.

July rain in downtown Los Angeles averages close to zero. From 1921 through 2005, .01 inch or more of rain fell on only 11 days. Most years, you’d have a 100% accuracy score using persistence.

Even if you happened to hit one of the extremely rare days when it rained, you’d still be extremely “accurate” using persistence, because your “forecast” would be wrong only twice: the day you forecast dry that turned out wet, and the next day, when you forecast “wet” by sticking with persistence and it turned dry again. Being wrong on only two of 31 days gives you an accuracy score of 93.5%. You’d do even better with climatology by forecasting “dry” for each day of the month: being wrong on only one of 31 days gives you an accuracy score of 96.8%.

Of course, neither of these no-brainer forecasts would work nearly as well in Washington, with our much more variable weather. For example, a quick look at the last 10 days of March in the Preliminary Monthly Climate Data from Washington National Airport shows that persistence forecasts for .01 inch or more of rain would have had an accuracy score of 30%; that is, they would have been correct on only three of the 10 days.

One way to do your own climatology “forecasts” for National Airport in May starts with the fact that DCA averages 11 May days with .01 inch or more of precipitation.

Since May has 31 days, count out 31 small pieces of paper. Write an “R” on 11 of them and toss all 31 into the proverbial hat. Then draw them out one by one. If the first one you draw has an “R,” mark May 1 down as a rainy day. If the next piece is blank, mark May 2 down as a dry day, and so on for 31 days.
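The hat draw above can also be simulated in a few lines of Python, with `random.shuffle` standing in for mixing the slips:

```python
import random

# 31 slips of paper for the 31 days of May; 11 are marked "R" for rain,
# matching DCA's May average of 11 days with .01 inch or more of precipitation.
slips = ["R"] * 11 + [""] * 20
random.shuffle(slips)  # toss them in the hat and mix

# Draw the slips one by one: the first drawn is the forecast for May 1, etc.
for day, slip in enumerate(slips, start=1):
    outlook = "rain" if slip == "R" else "dry"
    print(f"May {day}: {outlook}")
```

Running it twice gives two different “forecasts,” which is the point: climatology fixes how many rainy days to expect, not which days they fall on.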

This could be a good project for a middle school earth science class, with the class checking each day to see how its forecast is working out.