The Case For Reporting High & Low Temperatures to One or Two Decimals
Sporting events like track & field and swimming determine their winners using times carried to two decimal places, and digital thermometers report body temperature to one decimal. Yet daily high and low temperatures are still reported as whole numbers. With computer models and radar technology having greatly improved forecasting, why are we still using whole numbers for highs and lows? (One place where temperature is reported to one decimal place is average monthly or annual temperature.) In our statistics-crazed world, changing this quirk would seem to be a no-brainer.
Precipitation has been measured to two decimal places since the 19th century, so why not temperature? (I've come across one explanation, but it's not very convincing.) Perhaps the best reason for adopting more precise figures is that it would break many ties in temperature records. For example, two years that share a record high of 90° might actually be 89.6° and 90.4°, which is clearly not a tie. Through the end of 2015, only 23 record highs and 91 record lows stood alone; meanwhile, ties are rare among daily rainfall or snowfall records. This change would likely help break the logjams on March 31, June 2 and Sept. 8, each of which has a six-way tie for its record low (in New York).
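The tie-breaking point can be illustrated with a quick sketch. The two years below are invented, and the 89.6°/90.4° readings are the hypothetical figures from the example above; whole-degree reporting is assumed to round half-degrees up:

```python
import math

# Two hypothetical years whose record highs get reported as whole degrees.
actual_highs = {1936: 89.6, 1953: 90.4}  # years and values invented for illustration

def report_whole(temp):
    # Round half up, the usual convention for reported whole-degree temperatures.
    return math.floor(temp + 0.5)

reported = {year: report_whole(t) for year, t in actual_highs.items()}
print(reported)  # both years report 90 -> an apparent tie in the record book

# To one decimal place the "tie" disappears: 90.4° beats 89.6° by 0.8 degrees.
record_year = max(actual_highs, key=actual_highs.get)
print(record_year, actual_highs[record_year])
```

Both hypothetical years land in the 90° column even though one was nearly a full degree warmer than the other.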
In the opening paragraph I mentioned that monthly and annual averages are expressed to one decimal, but those averages might be slightly different if they were calculated from daily highs and lows also expressed to one decimal. This makes me wonder how the ten coldest and warmest rankings for each month might change, since many months are separated by just 0.1 or 0.2 degrees (along with numerous ties). For instance, October 1947 and October 2007 are tied as the warmest on record, while just 0.1 degree separates the two mildest Novembers, the two hottest Julys and the two coolest Septembers. Furthermore, there's a tie for second-mildest January (1990 and 1950) and second-chilliest May (1967 and 1907), and the second-, third- and fourth-coldest Novembers are each 0.1 degree apart.
Finally, another aspect of weather record-keeping that would change is the number of 90-degree or 100-degree days reported each summer, as well as winter highs and lows of 32° or colder (and readings of 0°). Temperatures now reported as 89.5° to 89.9° would no longer go in the 90-degree column, nor would highs/lows of 32.1° to 32.4° fall into the 32°-or-colder category. This might serve as a tiebreaker between the summers of 1991 and 1993, which are tied for the most 90-degree+ days (39 each); 1991 had 12 days with highs of exactly 90° while 1993 had seven such days. Food for thought.
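The effect on threshold counts can be sketched the same way. The daily highs below are invented for illustration, and whole-degree reporting is again assumed to round half-degrees up:

```python
import math

# A hypothetical week of daily highs, recorded to one decimal (invented values).
daily_highs = [89.5, 89.9, 90.0, 90.3, 88.8, 91.2, 89.4]

# Whole-degree reporting (round half up): 89.5-89.9 all become 90.
whole_reports = [math.floor(t + 0.5) for t in daily_highs]
days_90_whole = sum(1 for t in whole_reports if t >= 90)

# One-decimal reporting: only true 90.0°+ readings make the 90-degree column.
days_90_decimal = sum(1 for t in daily_highs if t >= 90.0)

print(days_90_whole, days_90_decimal)  # 5 vs. 3 for this made-up week
```

Two of this week's five "90-degree days" exist only because of rounding, which is exactly how a 39-39 tie between two summers could come apart under decimal reporting.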
Here's another thing: 1990, 1991 and 1998 are tied as the second warmest year on record, and 2012 is the warmest (but only by a tenth of a degree). This post brought to mind another question: why are snowfall amounts measured only in tenths and never in hundredths like rainfall amounts are?
Posted by: William | 11/13/2017 at 01:37 PM
I am interested in historical temperatures to one decimal point, but I assumed that part of the argument is that we don't have daily temperatures recorded to the nearest 0.1 and we cannot go back and change history. It would be a welcome practice to start recording to the nearest 0.1 in official measurements (weather.gov, for instance), if that isn't already done. It shouldn't matter whether the temperatures individuals phone in to the local TV weather have this level of precision, just the official records.
Posted by: Greg Werner | 12/03/2020 at 09:34 AM