As mentioned in an earlier post, the National Climatic Data Center (NCDC) has released new climatic “normals” for the entire country for the period 1981-2010. The recalculation, as recommended by the World Meteorological Organization (WMO), is performed by most countries at the close of every decade to reveal climate change. In case you missed it, average nationwide temperatures showed a 0.5 degree F increase over the 1971-2000 period, making the last decade 1.5 degrees F warmer than the 1970s, a particularly cool decade.
Although climate records exist for most American cities going back 140 years or so, the recalculation omits records from earlier years, because including them would greatly obscure recent climate change. On the other hand, including fewer than 30 years might overemphasize the vagaries of particular years.
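To see how simple the 30-year calculation really is, here is a minimal sketch. The yearly values below are made up for illustration (a gentle warming trend), not NCDC data, and the function name is my own:

```python
# Hypothetical annual mean temperatures (deg F), keyed by year -- for
# illustration only, NOT actual station data.
annual_means = {year: 55.0 + 0.02 * (year - 1900) for year in range(1871, 2011)}

def thirty_year_normal(data, end_year):
    """Average the 30 years ending with end_year (e.g. 1981-2010)."""
    window = [data[y] for y in range(end_year - 29, end_year + 1)]
    return sum(window) / len(window)

# The "1981-2010 normal" is just the mean over that window.
print(round(thirty_year_normal(annual_means, 2010), 2))  # -> 56.91
```

Rerunning the same function with `end_year=2000` gives the previous (1971-2000) normal, which is why the published normals shift each decade: the 30-year window slides forward, dropping a cool decade and picking up a warm one.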
With the release of the new “normals,” the perennial question rears its head, as it always seems to do every 10 years, if not sooner: “What does normal weather really mean?” Is it the weather to be expected or is it just an average of weather over the past 30 years?
Weather averages, or “normals,” have been used informally for hundreds of years, if not longer. But according to the NCDC, it wasn’t until the advent of official record-keeping that the idea really took hold. (Which makes sense, of course, because prior to that, there were no official “normals.”) Weatherbug meteorologist Chad Merrill says that the current method of calculating “normals” came into existence “[officially] in 1956 after the numbers were crunched for the 1921-1950 period.” Normals have been updated in accordance with WMO standards every decade since.
But the word “normal” itself, as used in statistical analysis, relates to a “normal distribution,” a series of values in a so-called bell-shaped curve. Within that bell-shaped curve, three common measures of “central tendency” are often used: the mode (most frequent value); the median (the value above and below which the same number of values exist); and the average, or arithmetic mean of all values. The latter is sometimes referred to as the “normal.” Therein lies the problem, because as a rule, when it comes to weather, the public regards “normal” as the weather to be expected.
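To make the three measures concrete, here is a small sketch using Python’s standard `statistics` module. The temperature values are invented for illustration, not taken from any station record:

```python
import statistics

# Hypothetical daily high temperatures (deg F) -- illustration only
temps = [58, 60, 60, 61, 62, 63, 63, 63, 65, 70]

mode = statistics.mode(temps)      # most frequent value
median = statistics.median(temps)  # middle value of the sorted list
mean = statistics.mean(temps)      # arithmetic mean -- the "normal"

print(mode, median, mean)  # -> 63 62.5 62.5
```

Note that even in this tidy example no single day actually hit the mean of 62.5, which is Landsberg’s point below: the “normal” is a statistical index, not a temperature anyone should expect to experience.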
In 1954, Dr. Helmut E. Landsberg, then Director of Climatology of the U.S. Weather Bureau, stated it well when he said:
The layman is often misled by the word [normal]. In his every-day language, the word … means something ordinary or frequent. … When (the meteorologist) talks about ‘normal’, it has nothing to do with a common event. … For the meteorologist the ‘normal’ is simply a point of departure or index which is convenient for keeping track of weather statistics. … We never expect to experience ‘normal’ weather.
Now, almost 60 years later, the public perception of weather “normals” seems to be no different: when the temperature is, say, 15 degrees below normal, people tend to view such an aberration as “abnormal.” To counter this interpretation, TV weathercasters, I believe, have recently tried to use the word “average” more frequently, at least when referring to temperatures.
However, as stated in my post about Washington “normals” in April of last year, many of us seem to delight in being told that if we had to suffer through a long stretch of “abnormal” cold or heat, at least we suffered more than anyone else ever did during those dates in history.
The charts shown in this post, however, focus on 2011 only, and on how this year’s temperature and precipitation compare with the latest “normal” tables (1981-2010). Clearly, what stands out more than anything else are the exceptional rainfall totals for August and September, compared with normal. Not to be overlooked, of course, is the fact that every month of the year except January and March has been above normal temperature-wise, although it initially looked like September would be below normal.