July 2012 is no longer the hottest month ever recorded in the United States.
In March, the country's heat record returned to the previous winner: July 1936. The switch happened when the National Oceanic and Atmospheric Administration (NOAA) updated the enormous data set it uses to track national, state and regional land temperatures.
But no one noticed the flip-flop until this week, when it was reported by bloggers who are skeptical of how the government determines its temperature data. The discovery has breathed new life into an old conspiracy theory: that NOAA is manipulating temperature records to enhance the dire impact of global warming.
"Is history malleable? Can temperature data of the past be molded to fit a purpose? It certainly seems to be the case here," blogger Anthony Watts told Fox News last year.
But the accusation that a government agency is secretly manipulating its records is simply untrue. Here's why.
Image: Temperatures from the NCDC. (Credit: NCDC)
First, the database update was no secret. NOAA previewed the changes years in advance by publishing descriptions of its methods in peer-reviewed scientific journals. The government agency also announced the new data set through public statements, and created a tool for users to compare and contrast temperatures from before and after the update. NOAA also makes its data and computer code available for anyone who wants to check the numbers. The new data set is called nClimDiv, and you can find more information about it on NOAA's National Climatic Data Center website.
Second, NOAA never changes the actual temperatures that were so carefully recorded over the decades. But it's no simple task to compare the present with the past. Methods of measuring temperature have changed markedly over the past century. The database tweaks are meant to make the comparisons between modern and obsolete technology more accurate.
In a June 29 blog post, Watts called this practice of adjusting temperatures "unsupportable" and said NOAA offered "no explanation to the public as to why" the July temperatures had changed.
But every scientific group that analyzes long-term climate trends does the same kind of tweaking. It's called standardizing, or homogenizing, the data. Independent climate analysis groups, such as the Berkeley Earth Project, have validated NOAA's approach.
For example, some weather stations once measured temperature in the morning, and others did so at sundown. Evening temperatures run warmer than morning ones, and directly comparing the two could artificially skew long-term temperature trends. Instead, NOAA has standardized all its stations to morning reporting, a correction that cooled many older records by about 1 degree Fahrenheit (more than half a degree Celsius).
Other algorithms correct for changes in the number and location of weather stations. Even the thermometers themselves have been modernized, from glass instruments to electronic systems. The nClimDiv update also included thousands of digitized temperatures, painstakingly added from old paper records, which shifted some older temperature trends. Scientists with the National Climatic Data Center also sifted through old data to fix typos in the records and improve the monthly records for individual states.
Finally, the new data set also includes more high-elevation weather stations, so some regions are now cooler than they used to be, because mountainous regions are generally colder year-round.
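The time-of-observation correction described above can be sketched in a few lines of Python. The station names, readings, and the flat 1-degree offset here are hypothetical placeholders; NOAA's actual homogenization algorithms are far more sophisticated, but the basic idea of standardizing every station to a common observation time is the same:

```python
# A minimal sketch of time-of-observation standardization.
# All station data and the bias value below are hypothetical.

# Hypothetical daily mean temperatures (degrees F) and observation times.
raw_readings = [
    {"station": "A", "temp_f": 77.4, "obs_time": "morning"},
    {"station": "B", "temp_f": 78.6, "obs_time": "evening"},
]

# Assumed average warm bias of evening readings relative to morning ones.
EVENING_BIAS_F = 1.0

def standardize_to_morning(reading):
    """Return a copy of the reading adjusted to a morning-equivalent value."""
    adjusted = dict(reading)
    if reading["obs_time"] == "evening":
        adjusted["temp_f"] = reading["temp_f"] - EVENING_BIAS_F
    adjusted["obs_time"] = "morning"
    return adjusted

standardized = [standardize_to_morning(r) for r in raw_readings]
for r in standardized:
    print(r["station"], round(r["temp_f"], 1))
```

In this toy version the evening station's reading is nudged down by the assumed bias, so all stations can be compared on a morning-equivalent footing; this is why applying the correction cools the older, evening-reported records.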
Image: Temperature records in 2012. (Credit: Karl Tate, Live Science)
Same argument, new data
The new data set is simply the latest in a long line of improvements to the methods NOAA uses to calculate national, state and regional temperature trends. The temperature records have shifted before (to the consternation of climate skeptics) and will likely shift again, as computers get faster and more records become available.
"This is a great example of why data sets are living things," said Derek Arndt, chief of NOAA's Climate Monitoring Branch at the National Climatic Data Center. "They can continually be refined and improved, and we can catch things today that we couldn't catch before."
And although the community of global warming skeptics focuses on temperature, the data update also affected precipitation and humidity records, yet no one seems to be complaining about those online.
"This is progress," Arndt said. "If this were maybe a little less visible data set, these kinds of improvements would be welcome advances."
Arndt noted that the majority of the record changes are tiny, except for the typos caught by hand. For the two hot Julys, the temperatures recorded in 1936 and 2012 are now so close that it's more accurate to consider the top spot a tie, he said.
"When you consider the uncertainty, they're effectively tied, and if they're not tied, it was a photo finish," Arndt told Live Science.
The old temperatures were 77.6 F (25.3 C) for July 2012 and 77.4 F (25.2 C) for July 1936.
The new, revised record pushes both temperatures down slightly, with July 1936 at 76.80 F (24.89 C) and July 2012 just a hair lower, at 76.77 F (24.87 C).
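As a quick sanity check on the figures quoted above, the Celsius values follow from the standard Fahrenheit-to-Celsius conversion, and the revised records differ by only a few hundredths of a degree:

```python
# Verifying the revised record values with the standard F-to-C conversion.

def f_to_c(temp_f):
    """Convert degrees Fahrenheit to degrees Celsius."""
    return (temp_f - 32.0) * 5.0 / 9.0

july_1936_f = 76.80
july_2012_f = 76.77

print(round(f_to_c(july_1936_f), 2))  # 24.89
print(round(f_to_c(july_2012_f), 2))  # 24.87
print(round(july_1936_f - july_2012_f, 2))  # a gap of just 0.03 F
```

A 0.03 F gap is well within the measurement uncertainty Arndt describes, which is why the two months are best treated as tied.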
The year 2012 also holds the title of hottest year on record for the United States, and that conclusion hasn't changed with the new update, Arndt said.
The update also did not significantly change overall trends for the rise in national temperature since 1895, when the government first started its tracking. The new data set shows an increase of 1.33 F (0.74 C) per century, compared to 1.30 F (0.72 C) per century in the previous data set.
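The per-century trend figures above can also be checked: a temperature *difference* converts from Fahrenheit to Celsius with the 5/9 factor alone, without the 32-degree offset used for absolute temperatures. A minimal check:

```python
# Checking the per-century warming trends quoted above.
# A temperature difference converts with the 5/9 factor only.

def delta_f_to_c(delta_f):
    """Convert a temperature difference in degrees F to degrees C."""
    return delta_f * 5.0 / 9.0

print(round(delta_f_to_c(1.33), 2))  # new data set: 0.74 C per century
print(round(delta_f_to_c(1.30), 2))  # old data set: 0.72 C per century
```

Both conversions match the article's figures, and the two trends differ by only 0.03 F (about 0.02 C) per century, underscoring that the update barely moved the long-term picture.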
Editor's note: This story was updated on July 2 to correct Anthony Watts' name.
Copyright 2014 LiveScience, a TechMediaNetwork company. All rights reserved. This material may not be published, broadcast, rewritten or redistributed.