By Joseph D’Aleo, CCM, AMS Fellow
Temperature Measurement Timeline Highlights
Virtually every month and year, stories in the once-reliable media and from formerly unbiased data centers proclaim the warmest such period in the entire record, back to 1895 or earlier. The timeline below suggests that much of that record is model-based guesswork.
1975 – The National Academy of Sciences made the first attempt at determining 'global' temperatures and their trend, which it limited to Northern Hemisphere land areas (the U.S. and Europe) because it recognized that reliable data on a larger scale and over the oceans were simply not available or trustworthy. The data it could access showed a dramatic warming from the 1800s to around 1940, then a reversal into a matching cooling by the late 1970s, when even the CIA wrote that the consensus of scientists was that we might be heading toward a dangerous new ice age. The cooling continued to the end of the decade, roughly erasing the previous six decades of warming.

1978 – The New York Times reported there was too little temperature data from the Southern Hemisphere to draw any reliable conclusions. The report it referenced, prepared by German, Japanese, and American specialists, appeared in the December 15 issue of the British journal Nature. It stated: "Data from the Southern Hemisphere, particularly south of latitude 30 south, are so meager that reliable conclusions are not possible. Ships travel on well-established routes so that vast areas of ocean are simply not traversed by ships at all, and even those that do may not return weather data en route."

1979 – Satellite measurement of the temperature of the global atmosphere begins; the records are maintained by UAH and RSS.
1981 – NASA's James Hansen et al. reported that "Problems in obtaining a global temperature history are due to the uneven station distribution, with the Southern Hemisphere and ocean areas poorly represented" (Science, 28 August 1981, Volume 213, Number 4511).
1989 – In response to the need for an accurate, unbiased, modern historical climate record for the United States, personnel at the U.S. Department of Energy's Global Change Research Program and at NCEI defined a network of 1,219 stations in the contiguous United States whose observations would constitute a key baseline dataset for monitoring U.S. climate. The USHCN dataset has since been revised several times (e.g., Karl et al., 1990; Easterling et al., 1996; Menne et al., 2009). The three dataset releases described in Quinlan et al. (1987), Karl et al. (1990), and Easterling et al. (1996) are now referred to as the USHCN version 1 datasets.
The documented changes that were addressed include changes in the time of observation (Karl et al., 1986), station moves, and instrument changes (Karl and Williams, 1987; Quayle et al., 1991). Apparent urbanization effects were also addressed in version 1 with a specific urban bias correction (Karl et al., 1988).
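The time-of-observation issue is easy to demonstrate with a toy simulation (a minimal sketch using synthetic daily highs, not the actual Karl et al. adjustment): a max/min thermometer reset in the warm afternoon can carry a hot day's maximum over into the next day's reading, inflating the recorded average.

    import random

    # Toy simulation of time-of-observation bias (synthetic data, not the
    # actual Karl et al. 1986 adjustment). A max/min thermometer reset in
    # the warm afternoon can carry a hot day's maximum into the next day.
    random.seed(42)
    true_highs = [25 + random.gauss(0, 5) for _ in range(1000)]  # true daily max, deg C

    # Midnight reset: each day's recorded max is its own true max.
    midnight_mean = sum(true_highs) / len(true_highs)

    # Afternoon reset: a day's recorded max is the warmer of today's and
    # yesterday's true max (yesterday's heat is still in the instrument).
    afternoon = [max(true_highs[i], true_highs[i - 1]) for i in range(1, len(true_highs))]
    afternoon_mean = sum(afternoon) / len(afternoon)

    print(f"midnight-reset mean max:  {midnight_mean:.2f} C")
    print(f"afternoon-reset mean max: {afternoon_mean:.2f} C")
    print(f"warm bias: {afternoon_mean - midnight_mean:+.2f} C")

The size of the bias here is an artifact of the invented numbers; the point is the mechanism, which the time-of-observation adjustment was designed to remove.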

Tom Karl, with Kukla and Gavin, wrote in a 1986 paper on urban warming: "Secular trends of surface air temperature computed predominantly from [urban] station data are likely to have a serious warm bias… The average difference between trends [urban siting vs. rural] amounts to an annual warming rate of 0.34°C/decade (3.4°C/century)… The reason why the warming rate is considerably higher [may be] that the rate may have increased after the 1950s, commensurate with the large recent growth in and around airports. Our results and those of others show that the urban growth inhomogeneity is serious and must be taken into account when assessing the reliability of temperature records."

1989 – The New York Times reported that the U.S. data failed to show the warming trend predicted by Hansen in 1980.

1992 – NOAA's first global monthly assessment began (GHCNm; Vose). Subsequent releases include version 2 in 1997 (Peterson and Vose, 1997), version 3 in 2011 (Lawrimore et al., 2011), and most recently version 4 (Menne et al., 2018). GHCNm v4 consists of mean monthly temperature data only.
1992 – The National Weather Service (NWS) Automated Surface Observing System (ASOS), which serves as the primary data source for more than 900 airports nationwide and is used for climate data archiving, was deployed in the early 1990s. Note that its acceptance criteria specified an RMSE of 0.8°F and a maximum error of 1.9°F. ASOS was designed to supply key information for aviation such as ceiling, visibility, wind, and indications of thunderstorms and icing. It was not designed for assessing climate.
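For context, the RMSE criterion is simply the root-mean-square of the sensor's errors against a reference standard. A minimal sketch of checking readings against the stated ASOS criteria (the observed/reference pairs below are invented for illustration):

    import math

    # Hedged sketch: check hypothetical sensor readings against the stated
    # ASOS acceptance criteria (RMSE <= 0.8 F, max absolute error <= 1.9 F).
    # The (observed, reference) pairs below are invented for illustration.
    pairs = [(71.2, 70.5), (68.9, 69.6), (73.2, 72.1), (65.0, 65.8), (70.1, 70.0)]

    errors = [obs - ref for obs, ref in pairs]
    rmse = math.sqrt(sum(e * e for e in errors) / len(errors))
    max_err = max(abs(e) for e in errors)

    print(f"RMSE        = {rmse:.2f} F  (criterion: <= 0.8 F)")
    print(f"max |error| = {max_err:.2f} F  (criterion: <= 1.9 F)")
    print("PASS" if rmse <= 0.8 and max_err <= 1.9 else "FAIL")

Even a sensor that passes these aviation-grade criteria carries per-reading errors comparable in size to the small decadal climate trends being sought.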
1999 – The USHCN temperature record still trailed 1934, as it had a decade earlier. James Hansen noted: "The U.S. has warmed during the past century, but the warming hardly exceeds year-to-year variability. Indeed, in the U.S. the warmest decade was the 1930s and the warmest year 1934." When asked about the discrepancy, Hansen said the U.S. covers only 2% of the world, so both could be right.

2000 – A network of nearly 4,000 diving buoys (Argo) was deployed worldwide to provide the first real-time monitoring of ocean temperatures and heat content.

2001 – The IPCC in its Third Assessment Report (2001) conceded: "In climate research and modelling, we should recognize that we are dealing with a coupled non-linear chaotic system, and therefore that the long-term prediction of future climate states is not possible." (Chapter 14, Section 14.2.2.2)
2004 – The U.S. Climate Reference Network was established with the guidance of John Christy of UAH to provide uncontaminated temperatures in the lower 48 states. Its 114 stations met siting specifications that kept them away from local heat sources.
2005 – Pielke and Davey (2005) found that a majority of stations, including climate stations in eastern Colorado, did not meet requirements for proper siting. They extensively documented poor siting and land-use change issues in numerous peer-reviewed papers, including "Unresolved issues with the assessment of multi-decadal global land surface temperature trends" (2007).
2007 – A new version, USHCNv2, removed the explicit urban warming adjustment and replaced it with a 'homogenization' scheme, among other significant changes. The trend reversed, with 1998 now warmer than 1934 and the recent mean trend higher than that of the 1930s.

David Easterling, Chief of the Scientific Services Division at NOAA's climate center, expressed concern in a letter to James Hansen at NASA: "One fly in the ointment, we have a new adjustment scheme for USHCNv2 that appears to adjust out some, if not all of the local trend that includes land use change and urban warming." The scheme also reduced the "bothersome 1940 warm blip" that Tom Wigley wanted minimized.

2008 – In a volunteer survey project, Anthony Watts and more than 650 volunteers at http://www.surfacestations.org found that over 900 of the first 1,067 stations surveyed in the 1,221-station U.S. climate network did not come close to meeting the Climate Reference Network (CRN) siting criteria. Only about 3% met the ideal siting specification. They found stations located next to the exhaust fans of air conditioning units, surrounded by asphalt parking lots and roads, on blistering-hot rooftops, and near sidewalks and buildings that absorb and radiate heat.
They found 68 stations located at wastewater treatment plants, where the waste digestion process keeps temperatures higher than in surrounding areas. In fact, they found that 90% of the stations fail to meet the National Weather Service's own siting requirement that stations be 30 m (about 100 feet) or more from any artificial heating or reflecting source.


2009 – From the Climategate emails, on the "bothersome 1940 warm blip" and on data not supporting the models:
From: Tom Wigley, Date: Sun, 27 Sep 2009
“So, if we could reduce the ocean blip by, say, 0.15 degC, then this would be significant for the global mean – but we’d still have to explain the land blip. I’ve chosen 0.15 here deliberately. This still leaves an ocean blip, and i think one needs to have some form of ocean blip to explain the land blip (via either some common forcing, or ocean forcing land, or vice versa, or all of these).”
From: Tom Wigley, Date: Fri, 06 Nov 2009
“We probably need to say more about this. Land warming since 1980 has been twice the ocean warming — and skeptics might claim that this proves that urban warming is real and important.”
From: Kevin Trenberth, before Wed, 14 Oct 2009
“The fact is that we can’t account for the lack of warming at the moment and it is a travesty that we can’t. The CERES data published in the August BAMS 09 supplement on 2008 shows there should be even more warming: but the data are surely wrong. Our observing system is inadequate.”
2009 – NASA's Dr. Edward R. Long analyzed the new version of the U.S. data, examining both raw and adjusted NCDC (now NCEI) records for a selected contiguous-U.S. set of rural and urban stations, 48 of each, one per state. The raw data showed temperature increases of 0.13 and 0.79 °C/century for the rural and urban environments respectively, consistent with urban influences. The adjusted data yielded 0.64 and 0.77 °C/century respectively.
Comparing the adjusted rural set to the raw data shows a systematic treatment that makes the adjusted rural rate of increase five-fold greater than the raw rate. This suggests that the consequence of NCDC's adjustment protocol is to cause historical data to take on the time-line characteristics of urban data. The consequence, intended or not, is to report a false rate of temperature increase for the contiguous U.S., consistent with modeling based on greenhouse theory.
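A °C/century figure of this kind is simply the slope of an ordinary least-squares fit of annual means against year, scaled by 100. A minimal sketch, using synthetic series built only to mimic the rural raw vs. adjusted contrast described above (not Long's actual data):

    # Minimal sketch of how a deg C/century trend is computed: the ordinary
    # least-squares slope of annual means vs. year, times 100. The two series
    # below are synthetic, chosen only to mimic the ~0.13 vs ~0.64 C/century
    # rural raw/adjusted contrast described above (not Long's actual data).
    def trend_per_century(years, temps):
        n = len(years)
        my, mt = sum(years) / n, sum(temps) / n
        slope = (sum((y - my) * (t - mt) for y, t in zip(years, temps))
                 / sum((y - my) ** 2 for y in years))
        return slope * 100.0  # deg C per century

    years = list(range(1900, 2000))
    raw = [10.0 + 0.0013 * (y - 1900) for y in years]       # ~0.13 C/century
    adjusted = [10.0 + 0.0064 * (y - 1900) for y in years]  # ~0.64 C/century

    print(f"raw rural trend:      {trend_per_century(years, raw):.2f} C/century")
    print(f"adjusted rural trend: {trend_per_century(years, adjusted):.2f} C/century")
    print(f"ratio: {trend_per_century(years, adjusted) / trend_per_century(years, raw):.1f}x")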

2010 – A review of temperature issues, entitled "Surface Temperature Records: A Policy Driven Deception", was published by a large group of climate scientists. It discussed many issues in the U.S. and globally. Even as the stations incorporated into the global surface data sets increased in number and coverage, their reliability became a challenge, with many large continents having a large percentage of missing months in the station data. That forced the data centers to estimate the missing values in order to compute monthly and then annual averages.

[Figure: missing station data over time. Analysis and graph: Verity Jones]
Many may be surprised to see in the figure above that this missing-data problem still exists today; in fact it appears worse, with missing values estimated using data from the nearest stations, sometimes many hundreds of miles away. Note the data-sparse regions in September 2018 that were filled in by algorithms, including a large data-void region filled in with a record-warmth assessment (Heller 2018).
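Infilling of this kind is typically some form of distance-weighted interpolation from whichever stations did report. A minimal inverse-distance-weighting sketch, with hypothetical station anomalies and coordinates (the data centers' actual procedures are more elaborate):

    import math

    # Hedged sketch of inverse-distance-weighted infilling: estimate a
    # missing station's monthly anomaly from nearby reporting stations.
    # Coordinates and anomalies are hypothetical; the data centers' actual
    # infilling procedures are more complex than this.
    def idw_estimate(target, neighbors, power=2):
        """neighbors: list of ((lat, lon), anomaly) for reporting stations."""
        num = den = 0.0
        for (lat, lon), anom in neighbors:
            d = math.hypot(lat - target[0], lon - target[1])  # crude distance, degrees
            w = 1.0 / d ** power
            num += w * anom
            den += w
        return num / den

    missing_station = (40.0, -100.0)
    reporting = [((42.0, -95.0), 1.2), ((37.5, -104.0), 0.4), ((45.0, -102.0), 0.9)]
    print(f"infilled anomaly: {idw_estimate(missing_station, reporting):+.2f} C")

The farther away the reporting neighbors are, the less the infilled value can reflect local reality, which is the concern with neighbors hundreds of miles distant.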

2010 – A landmark study, "Analysis of the impacts of station exposure on the U.S. Historical Climatology Network temperatures and temperature trends", authored by Souleymane Fall, Anthony Watts, John Nielsen-Gammon, Evan Jones, Dev Niyogi, John R. Christy, and Roger A. Pielke Sr., followed. It represented years of work studying the quality of the United States' temperature measurement system.
2010 – In a review sparked by this finding, the GAO found that "42% of the active USHCN stations in 2010 clearly did not meet NOAA's siting standards. What's more, just 24 of the 1,218 stations (about 2 percent) have complete data from the time they were established."
2010 – Phil Jones, the CRU scientist at the center of the Climategate scandal at East Anglia University, made a candid admission on the BBC (2010): his surface temperature data are in such disarray they probably cannot be verified or replicated; there had been no statistically significant global warming for the previous 15 years; and the 2002–2009 trend showed a slight cooling of 0.12 °C/decade. Jones specifically disavowed the "science-is-settled" slogan.
2013 – NOAA responded to the siting papers and the GAO admonition by removing and/or replacing the worst stations. Yet its monthly press releases never mention satellite measurements, although NOAA had claimed satellites were the future of observation.
2015 – A pause in warming that started around 1997 was finally acknowledged in the journal Nature by IPCC Lead Author Kevin Trenberth and attributed to the cyclical influence of natural factors, such as El Niño and ocean cycles, on global climate. The AMS Annual Meeting in 2015 had three panels addressing 'the pause'.
2015 – NOAA, under pressure, put an end to the pause by adjusting the Argo buoy ocean temperatures to better match ship intake temperatures, which had been the dominant method in prior decades despite concerns about warm contamination from ship engines. This made the global surface data fit the theory of greenhouse warming better. John Bates, a data quality officer at NOAA, detailed how Tom Karl, in a June 2015 paper in Science published just a few months before world leaders were to meet in Paris to agree on a costly Paris Climate Accord, removed the inconvenient pause by altering ocean temperatures. Since the oceans cover 71% of the globe, even small adjustments have a major impact.
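The mechanics can be illustrated in miniature: if buoys read cooler than ship intakes and the buoys' share of observations grows over time, adding a fixed offset to the buoys warms the late end of the blended record more than the early end, steepening the trend. A toy sketch with invented numbers (not the actual ERSSTv4 procedure):

    # Toy illustration (invented numbers, not the actual ERSSTv4 procedure):
    # if buoys read cooler than ship engine intakes and the buoy share of
    # observations grows over time, adding a fixed offset to the buoys warms
    # the late end of the blended record more than the early end.
    OFFSET = 0.12  # deg C added to buoy SSTs to match ships (reported magnitude)

    years = [2000, 2005, 2010, 2015]
    ship_sst = [20.00, 20.02, 20.04, 20.06]    # hypothetical ship-intake means
    buoy_sst = [19.88, 19.90, 19.92, 19.94]    # hypothetical buoy means (cooler)
    buoy_share = [0.2, 0.4, 0.6, 0.8]          # buoys' growing share of obs

    def blend(adjust_buoys):
        out = []
        for s, b, f in zip(ship_sst, buoy_sst, buoy_share):
            bb = b + OFFSET if adjust_buoys else b
            out.append((1 - f) * s + f * bb)
        return out

    before, after = blend(False), blend(True)
    print("blended, buoys unadjusted:", [f"{t:.3f}" for t in before])
    print("blended, buoys adjusted:  ", [f"{t:.3f}" for t in after])
    print(f"extra 2000-2015 warming from the adjustment: "
          f"{(after[-1] - after[0]) - (before[-1] - before[0]):+.3f} C")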
2017 – A new U.S. climate data set, nClimDiv, with climate-division model reconstructions and statewide averages, was gradually deployed and replaced USHCNv2. The result: NOAA gave 40 out of 48 states 'new' warming. The Drd964x decadal CONUS warming rate from 1895 to 2012 was 0.088°F/decade; the new nClimDiv rate from 1895 to 2014 is 0.135°F/decade, roughly one and a half times the old rate.
2017 – In the addendum to the research report "On the Validity of NOAA, NASA and Hadley CRU Global Average Surface Temperature Data & The Validity of EPA's CO2 Endangerment Finding" (Abridged Research Report by Dr. James P. Wallace III, Joseph S. D'Aleo, and Dr. Craig D. Idso, June 2017), the authors provided ample evidence that the Global Average Surface Temperature (GAST) data are invalid for use in climate modeling and for any other climate change policy analysis purpose.
“The conclusive findings of this research are that the three GAST data sets are not a valid representation of reality. In fact, the magnitude of their historical data adjustments, that removed their cyclical temperature patterns, are totally inconsistent with published and credible U.S. and other temperature data. Thus, it is impossible to conclude from the three published GAST data sets that recent years have been the warmest ever – despite current claims of record setting warming.”
2019 – Tony Thomas in Quadrant Online reported on Dr. Mototaka Nakamura, who, in a book on "the sorry state of climate science" titled "Confessions of a climate scientist: the global warming hypothesis is an unproven hypothesis", wrote: "The supposed measuring of global average temperatures from 1890 has been based on thermometer readouts barely covering 5 per cent of the globe until the satellite era began 40-50 years ago." Further, he was contemptuous of claims that models have been "validated", saying the modelers are merely "trying to construct narratives that justify the use of these models for climate predictions." And he concluded: "With values of parameters that are supposed to represent many complex processes being held constant, many nonlinear processes in the real climate system are absent or grossly distorted in the models. It is a delusion to believe that simulation models that lack important nonlinear processes in the real climate system can predict (even) the sense or direction of the climate change correctly."