Do-It-Yourself Climate Analysis

This article was first posted at Watts Up With That on July 12, 2014

People in different places are wondering: What are temperatures doing in my area? Are they trending up, down or sideways? Of course, from official quarters, the answer is: The globe is warming, so it is safe to assume that your area is warming also.

But what if you don’t want to assume, and don’t want to take someone else’s word for it? You can answer the question yourself if you take on board one simplifying concept:

“If you want to understand temperature change,
  you should analyze the changes, not the temperatures.”

Analyzing temperature change is in fact much simpler and avoids data manipulations like anomalies, averaging, gridding, adjusting and homogenizing. Temperature Trend Analysis starts by recognizing that each micro-climate is distinct, with its own climate patterns. So you work on the raw, unadjusted station data produced, validated and submitted by local meteorologists. These data are available in the HADCRUT3 dataset made public in July 2011. Of course, there are missing datapoints, which cause much work for climatologists. They are not a big deal for trend analysis.

The dataset includes 5000+ stations around the world, and only someone adept with statistical software running on a robust computer could deal with all of it. But the Met Office provides it in folders that cluster stations according to their WMO codes.
http://www.metoffice.gov.uk/research/climate/climate-monitoring/land-and-atmosphere/surface-station-records

I am not the first one to think of this. Richard Wakefield did similar analyses in Ontario years ago, and Lubos Motl did trend analysis on the entire HADCRUT3 in July 2011. With this simplifying concept and a template, it is possible for anyone with modest spreadsheet skills and a notebook computer to answer how area temperatures are trending. I don’t claim this analysis is better than those done with multimillion-dollar computers, but it does serve as a “sanity check” against exaggerated claims and hype.

The method involves creating for each station a spreadsheet that calculates a trend for each month for all of the years recorded. Then the monthly trends are averaged together for a lifetime trend for that station. To be comparable to others, the station trend is presented as degrees per 100 years. A summary sheet collects all the trends from all the sheets to provide trend analysis for the geographical area of interest.
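The monthly-slope method just described can be sketched in a few lines of Python. This is a minimal illustration with synthetic data; the function name is mine, not something taken from the workbook, and the real analysis is done in spreadsheet formulas.

```python
import numpy as np

def station_trend(years, temps):
    """Average of 12 monthly OLS slopes, scaled to deg C per 100 years.
    temps: 2-D array, one row per year, one column per calendar month;
    missing months are NaN and simply drop out of that month's fit."""
    slopes = []
    for m in range(12):
        y = temps[:, m]
        ok = ~np.isnan(y)
        if ok.sum() >= 2:
            slope, _ = np.polyfit(years[ok], y[ok], 1)  # deg C per year
            slopes.append(slope)
    return 100.0 * np.mean(slopes)  # deg C per century

# Toy example: a station warming 0.01 C/yr in every month -> 1.0 C/century
years = np.arange(1900, 2000)
temps = 10.0 + 0.01 * (years[:, None] - 1900) + np.zeros((100, 12))
print(round(station_trend(years, temps), 2))  # 1.0
```

Because each month is fitted separately, a missing January here or there simply shortens that month’s fit rather than biasing the station trend.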

I have built an Excel workbook to do this analysis, and as a proof of concept, I have loaded in temperature data for Kansas. Kansas is an interesting choice for several reasons:

1) It’s exactly in the middle of the US with little change in elevation;
2) Kansas has a manageable number of HCN stations;
3) It has been the subject lately of discussion about temperature processing effects;
4) Kansas legislators are concerned and looking for the facts; and
5) As a lad, my first awareness of extreme weather was the tornado in The Wizard of Oz, after which Dorothy famously said: “We’re not in Kansas anymore, Toto.”

For the Kansas example, we see that BEST shows on its climate page that the State has warmed 1.98 +/-0.14°C since 1960. That looks like temperatures will be another 2°C higher in the next 50 years, and we should be alarmed.

Well, the results from temperature trend analysis tell a different story.
From the summary page of the workbook:

Area State of Kansas, USA
History 1843 to 2011
Stations 26
Average Length 115 Years
Average Trend 0.70 °C/Century
Standard Deviation 0.45 °C/Century
Max Trend 1.89 °C/Century
Min Trend -0.04 °C/Century

So in the last century the average Kansas station has warmed 0.70 +/- 0.45°C, with at least one site cooling over that time. The +/- 0.45 deviation shows that climate differs from site to site, even when all are located on the same prairie.

And the variability over the seasons is also considerable:

Month °C/century Std Dev
Jan 0.59 1.30
Feb 1.53 0.73
Mar 1.59 2.07
Apr 0.76 0.79
May 0.73 0.76
June 0.66 0.66
July 0.92 0.63
Aug 0.58 0.65
Sep -0.01 0.72
Oct 0.43 0.94
Nov 0.82 0.66
Dec 0.39 0.50

Note that February and March are warming strongly, while September is sideways. That’s good news for farming, I think.

Temperature change depends on your location and time of the year. The rate of warming here is not extreme and if the next 100 years is something like the last 100, in Kansas there will likely be less than a degree C added.

Final points:

When you look behind the summary page at BEST, it reports that the Kansas warming trend since 1910 is 0.75°C +/- 0.08, close to what my analysis showed. So the alarming number at the top was not the accumulated rise in temperatures; it was the rate per century projected from 1960. The actual observed century rate is far less disturbing. And the variability across the state is considerable, and much more evident in the trend analysis. I had wanted to use raw data from BEST in this study, because some stations showed longer records there, but for comparable years the numbers didn’t match HADCRUT3.

Not only does this approach maintain the integrity of the historical record, it also facilitates what policy makers desperately need: climate outlooks based on observations for specific jurisdictions. Since the analysis is bottom-up, micro-climate trends can be compiled together for any desired scope: municipal, district, region, province, nation, continent.

This example analyzed monthly average temperatures at a set of stations. This study used HADCRUT3, but others are done with CRUTEM4 and GHCN. The same technique can be applied to temperature minimums and maximums, or to adjusted and unadjusted records. And since climate is more than temperatures, one could also study precipitation histories, or indeed any weather measure captured in a time series.

The trend analysis workbook is provided below. It was the first iteration and the workbook was refined and enhanced in subsequent studies, also posted at this blog.

HADCRUT3 Kansas TTA

Analyzing Temperature Change using World Class Stations

This article was first posted on July 28, 2014 at Watts Up With That.

This is a study to see what the world’s best stations (a subset of stations I selected as “world class” by explicit criteria) are telling us about climate change over the long term. There are three principal findings.

To be included, a station needed at least 200 years of continuous records up to the present. Geographical location was not a criterion for selection, only the quality and length of the histories. The average length of service in this dataset, extracted from CRUTEM4, is 247 years.

The 25 stations that qualified are located in Russia, Norway, Denmark, Sweden, Netherlands, Germany, Austria, Italy, England, Poland, Hungary, Lithuania, Switzerland, France and Czech Republic. I am indebted to Richard Mallett for his work to identify the best station histories, and to gather and format the data from CRUTEM4.

The Central England Temperature (CET) series is included here from 1772, the onset of daily observations with more precise instruments. Those who have asserted that CET is a proxy for Northern Hemisphere temperatures will have some support in this analysis: CET at 0.38°C/Century nearly matches the central tendency of the group of stations.

1. A rise of 0.41°C per century is observed over the last 250 years.

Area WORLD CLASS STATIONS
History 1706 to 2011
Stations 25
Average Length 247 Years
Average Trend 0.41 °C/Century
Standard Deviation 0.19 °C/Century
Max Trend 0.80 °C/Century
Min Trend 0.04 °C/Century

The average station shows an accumulated rise of about 1°C over the last two and a half centuries (0.41°C/Century over an average record of 247 years). The large standard deviation, and the fact that at least one station shows almost no warming over the centuries, indicates that warming has not been extreme, and varies considerably from place to place.

2. The warming is occurring mostly in the coldest months.

The average station reports that the coldest months, October through April, are all warming at 0.3C/Century or more, while the hottest months are warming at about 0.2C/Century or less.

Month °C/Century Std Dev
Jan 0.96 0.31
Feb 0.37 0.27
Mar 0.71 0.27
Apr 0.33 0.28
May 0.18 0.25
June 0.13 0.30
July 0.21 0.30
Aug 0.16 0.26
Sep 0.16 0.28
Oct 0.34 0.27
Nov 0.59 0.23
Dec 0.76 0.27

In fact, the months of May through September warmed at an average rate of 0.17C/Century, while October through April increased at an average rate of 0.58C/Century, more than 3 times the rate. This suggests that the climate is not getting hotter; it has become less cold. That is, the pattern suggests milder winters, earlier springs and later autumns, rather than hotter summers.
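That seasonal split can be checked directly from the table above with simple arithmetic (the monthly values below are transcribed from the table):

```python
# Monthly trends (deg C/century) transcribed from the table above
trend = {"Jan": 0.96, "Feb": 0.37, "Mar": 0.71, "Apr": 0.33,
         "May": 0.18, "Jun": 0.13, "Jul": 0.21, "Aug": 0.16,
         "Sep": 0.16, "Oct": 0.34, "Nov": 0.59, "Dec": 0.76}

warm = [trend[m] for m in ("May", "Jun", "Jul", "Aug", "Sep")]
cold = [trend[m] for m in ("Oct", "Nov", "Dec", "Jan", "Feb", "Mar", "Apr")]

print(round(sum(warm) / len(warm), 2))  # 0.17
print(round(sum(cold) / len(cold), 2))  # 0.58
```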

3. An increase in warming is observed since 1950.

In a long time series, there are likely periods when the rate of change is higher or lower than the rate for the whole series. In this study it was interesting to see period trends around three changepoints:
1. 1850, widely regarded as the end of the Little Ice Age (LIA);
2. 1900, as the midpoint between the last two centuries of observations;
3. 1950, as the date from which it is claimed that CO2 emissions begin to cause higher temperatures.

For the set of stations the results are:

°C/Century Start End
-0.38 1700’s 1850
 0.95 1850 2011
-0.14 1800 1900
 1.45 1900 1950
 2.57 1950 2011

From 1850 to the present, we see an average upward rate of almost a degree per century, 0.95°C/Century, or an observed rise of 1.53°C up to 2011. Contrary to conventional wisdom, the aftereffects of the LIA lingered on until 1900. The average rate since 1950 is 2.57°C/Century, higher than the 1.45°C/Century of the preceding 50 years. Of course, this analysis cannot identify the causes of the 1.1°C/Century added to the rate since 1950. However, it is useful to see the scale of warming that might be attributable to CO2, among other factors.

Conclusion

Of course climate is much more than surface temperatures, but the media are full of stories about global warming, hottest decade in history, etc. So people do wonder: “Are present temperatures unusual, and should we be worried?” In other words, “Is it weather or a changing climate?” The answer in the place where you live depends on knowing your climate, that is the long-term weather trends.

Note: These trends were calculated directly from the temperature records without applying any adjustments, anomalies or homogenizing. The principle is: To understand temperature change, analyze the changes, not the temperatures.

Along with this post I provide below the World Class TTA workbook for readers to download for their own use and to check the data and calculations.

World Class TTA

Temperatures According to Climate Models

In December 2014, Willis Eschenbach posted GMT series generated by 42 CMIP5 models, along with HADCRUT4 series, all obtained from KNMI.

CMIP5 Model Temperature Results in Excel


The dataset includes a single run showing GMT from each of 42 CMIP5 models. Each model estimates monthly global mean temperatures in degrees Kelvin backwards to 1861 and forwards to 2101, a period of 240 years. The dataset from CMIP5 models includes 145 years of history to 2005, and 95 years of projections from 2006 onward.

The estimated global mean temperatures are considered to be an emergent property generated by the model. Thus it is of interest to compare them to measured surface temperatures. The models produce variability year over year, and on decadal and centennial scales.

These models can be thought of as 42 “proxies” for global mean temperature change. Without knowing what parameters and assumptions were used in each case, we can still make observations about the models’ behavior, without assuming that any model is typical of the actual climate. Also the central tendency tells us something about the set of models, without necessarily being descriptive of the real world.

What temperatures are projected by the average model?

Periods HADCRUT4 ALL SERIES ALL MINUS HADCRUT4
1850-1878 0.035 0.051 0.016
1878-1915 -0.052 0.024 0.076
1915-1944 0.143 0.050 -0.093
1944-1976 -0.040 -0.008 0.032
1976-1998 0.194 0.144 -0.050
1998-2014 0.053 0.226 0.173
1850-2014 0.049 0.060 0.011

The rates in the table are C/decade. Over the entire 240-year time series, the average model has a warming trend of 1.26C per century. This compares to the UAH global trend of 1.38C per century, measured by satellites since 1979.

However, the average model over the same period as UAH shows a rate of +2.15C/century. Moreover, for the 30 years from 2006 to 2035, the warming rate is projected at 2.28C/century. These estimates are in contrast to the 145 years of history in the models, where the trend shows as 0.41C per century.

Clearly, the CMIP5 models are programmed to warm in the future at more than 5 times the rate of the past.
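The period rates in these tables are straightforward to reproduce from any annual series: fit an ordinary least-squares slope over the sub-period and scale to degrees per decade. A minimal sketch with a synthetic series (my own construction, not the actual KNMI data):

```python
import numpy as np

def decadal_trend(years, gmt, start, end):
    """OLS slope of gmt over [start, end], in deg C per decade."""
    mask = (years >= start) & (years <= end)
    slope, _ = np.polyfit(years[mask], gmt[mask], 1)  # deg C per year
    return 10.0 * slope

# Synthetic series: flat to 1975, then warming at 0.02 C/yr
years = np.arange(1850, 2015)
gmt = np.where(years < 1976, 14.0, 14.0 + 0.02 * (years - 1975))
print(round(decadal_trend(years, gmt, 1976, 2014), 2))  # 0.2
```

Applied to each model run and to HADCRUT4 over the same periods, this produces the kind of comparison tabulated above.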

Is one model better than the others?

In presenting the CMIP5 dataset, Willis raised a question about which of the 42 models could be the best one. I put the issue this way: Does one of the CMIP5 models reproduce the temperature history convincingly enough that its projections should be taken seriously?

I identified the models that produced a historical trend of nearly 0.5K/century over the 145-year period, and those whose trend from 1861 to 2014 was in the same range. Then I looked to see which of that subset could match the UAH trend from 1979 to 2014.

Out of these comparisons the best performance was Series 31, which Willis confirms is output from the INMCM4 model. Rates in the table below are C/decade.

Periods HADCRUT4 SERIES 31 31 MINUS HADCRUT4
1850-1878 0.035 0.036 0.001
1878-1915 -0.052 -0.011 0.041
1915-1944 0.143 0.099 -0.044
1944-1976 -0.040 0.056 0.096
1976-1998 0.194 0.098 -0.096
1998-2014 0.053 0.125 0.072
1850-2014 0.049 0.052 0.003

Note that this model closely matches HADCRUT4 over 60 year periods, but shows variances over 30 year periods. That is, shorter periods of warming in HADCRUT4 run less warm in the model, and shorter periods of cooling in HADCRUT4 run flat or slightly warming in the model. Over 60 years the differences offset.

It shows warming of 0.52K/century from 1861 to 2014, with a plateau from 2006 to 2014, and 0.91K/century from 1979 to 2014. It projects 1.0K/century from 2006 to 2035 and 1.35K/century from now to 2101. Those forward projections are much lower than the consensus claims, and not at all alarming.

In contrast with Series 31, the other 41 models typically match the historical warming rate of 0.05C/decade by accelerating warming from 1976 onward and projecting it into the future. For example, while UAH shows warming of 0.14C/decade from 1979 to 2014, the CMIP5 model estimates average 0.215C/decade, ranging from 0.088 to 0.324C/decade.

For the next future climate period, 2006-2035, CMIP5 models project an average warming of 0.28C/decade, ranging from 0.097 to 0.375C/decade.

The longer the plateau continues, the more overheated these model projections become.

What’s different about the best model?

Above, I showed how one CMIP5 model produced historical temperature trends closely comparable to HADCRUT4. That same model, INMCM4, was also closest to Berkeley Earth and RSS series.

Curious about what makes this model different from the others, I consulted several comparative surveys of CMIP5 models. There appear to be 3 features of INMCM4 that differentiate it from the others.

1. INMCM4 has the lowest CO2 forcing response, at 4.1K for 4xCO2. That is 37% lower than the multi-model mean.

2. INMCM4 has by far the highest climate system inertia: deep ocean heat capacity in INMCM4 is 317 W yr m^-2 K^-1, 200% of the mean (a mean which excluded INMCM4 because it was such an outlier).

3. INMCM4 exactly matches observed atmospheric H2O content in the lower troposphere (215 hPa), and is biased low above that. Most others are biased high.

So the model that most closely reproduces the temperature history has high inertia from ocean heat capacities, low forcing from CO2 and less water for feedback. Why aren’t the other models built like this one?

Conclusions:

In the real world, temperatures go up and down. This is also true of HADCRUT4. In the world of climate models, temperatures only go up. There is some variation in rates of warming, but it is always warming, nonetheless.

Not all models are created equal, and the ensemble average is far from reality and projects unreasonable rates of future warming. It would be much better to take the best model and build upon its success.

Excel workbook is here: CMIP5 VS hADCRUT

Is It Warmer Now than a Century ago? It Depends.

This study was first published on August 20, 2014 at No Tricks Zone. The dataset ends with 2013 records.

In a previous study of World Class station records, the effects of urban development could not be discounted since the 25 long service records come from European cities. This is a study to see what the best sites in the US can tell us about temperature trends in the last century. There are two principal findings below.

Surfacestations.org provides a list of 23 stations that have the CRN#1 Rating for the quality of the sites. I obtained the records from the latest GHCNv3 monthly qcu report, did my own data quality review and built a Temperature Trend Analysis workbook.

As it happens, the stations are spread out across the continental US (CONUS): NW: Oregon, North Dakota, Montana; SW: California, Nevada, Colorado, Texas; MW: Indiana, Missouri, Arkansas, Louisiana; NE: New York, Rhode Island, Pennsylvania; SE: Georgia, Alabama, Mississippi, Florida.

The records themselves vary in quality of coverage, but all are included here because of their CRN#1 rating. The gold medal goes to Savannah for 100% monthly coverage, with a single missing daily observation since 1874. Pensacola was a close second among the four stations with perfect monthly coverage. Most stations were missing fewer than 20 months, with coverage above 95%.

1. First Class US stations show little warming in the last century.

Area FIRST CLASS US STATIONS
History 1874 to 2013
Stations 23
Average Length 118 Years
Average Trend 0.16 °C/Century
Standard Deviation 0.66 °C/Century
Max Trend 1.18 °C/Century
Min Trend -1.93 °C/Century

The average station shows a rise of about 0.16°C/Century. The large standard deviation, and the fact that multiple stations had cooling trends, shows that warming has not been extreme and varies considerably from place to place. The observed warming for this group is less than half the rate reported in the European study.

2. Temperature trends are local, not global.

Most remarkable about these stations is the extensive local climate diversity that appears when station sites are relatively free of urban heat sources. 35% of the stations (8 of 23) reported cooling over the century. Indeed, if we remove the 8 warmest records, the average rate flips from +0.16°C/Century to -0.14°C/Century.

And the multidecadal patterns of warming and cooling were quite variable from place to place. Averages over 30-year periods suggest how unusual these patterns are.

For the set of stations the results are:

°C/Century Start End
0.78 Start 1920
-1.21 1921 1950
-1.11 1951 1980
1.51 1981 2013
0.99 1950 2013

The first period varied in length from each station’s beginning to 1920. Surprisingly the second period cooled in spite of the 1930s. Warming appears mostly since 1980. As mentioned above, within these averages are many differing local patterns.

Conclusion:

Question: Is it warmer now than 100 years ago?
Answer: It depends upon where you live. The best observations from US stations show a barely noticeable average warming of 0.16°C / Century. And 35% of stations showed cooling at the same time that others were warming more than the average.

Note about Data Quality.

The attached workbook for Truman Dam & Reservoir is an example of my data quality review method. There are sheets showing the incoming qcu values, removal of flags and errors, audit of outliers (values exceeding 2 St. Dev.) and CUSUM and 1st Differences analyses to test for systemic bias. Note that Truman missed out entirely on warming from 1956 to 2002, in contrast to the conventional notion of global warming from the 1970s to 2000.

Truman Dam & Reservoir also provides a cautionary tale about temperature analysis. The station’s annual averages appear to rise dramatically from 2003 to present. On closer inspection, that period is missing values for 6 Decembers, 8 Januarys and 5 Februarys. So the annual warming is mostly the result of missing data-points.

This shows why analyzing the temperatures themselves can be misleading. By relying only on the station’s monthly slopes, TTA analysis effectively places missing values on the trend line of the existing values.
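The Truman effect can be demonstrated with a toy example (my own construction, not the actual Truman data): a station with zero trend, but with recent winters missing, appears to warm under simple annual averaging, while the monthly-slope approach is unaffected.

```python
import numpy as np

years = np.arange(1990, 2010)
# A flat climate: cold winters, warm summers, no trend at all
monthly = 10.0 + 10.0 * np.sin(2 * np.pi * (np.arange(12) - 3) / 12)
temps = np.tile(monthly, (len(years), 1))

# Knock out Dec, Jan, Feb in the last five years, as happened at Truman
gappy = temps.copy()
gappy[-5:, [0, 1, 11]] = np.nan

# Averaging the surviving months makes recent years look ~3 C warmer
annual = np.nanmean(gappy, axis=1)
print(round(annual[-1] - annual[0], 1))  # 3.0 -- spurious warming

# Monthly slopes skip the gaps: every month is flat, so the trend is ~0
slopes = []
for m in range(12):
    ok = ~np.isnan(gappy[:, m])
    slopes.append(np.polyfit(years[ok], gappy[:, m][ok], 1)[0])
print(abs(round(100 * np.mean(slopes), 2)))  # 0.0 -- no spurious trend
```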

Note about Fall et al. (2011).

This was the first study to use CRN 1 to 5 ratings to look at US temperature trends in relation to station siting quality. Much discussed at the time was the finding of CRN 1&2 showing warming of 0.155 C/Decade for the period 1979 to 2008. The comparable finding from this analysis is 0.151 C/Decade for CRN 1 stations.

Little noticed was Figure 10 on page 10 of Fall et al. That graph shows that the CRN 1&2 rate of warming, Tavg unadjusted, was about 0.2 C/Century for the period 1895 to 2008. This analysis shows a comparable 0.16 C/Century for CRN 1 for the same period up to 2013.

Excel Workbook is here: First Class US TTA

Truman Dam and Reservoir is here: 238466

Auditing the Abuse of Temperature Records

What I don’t get is the disrespect of the adjusters for the reality of micro-climates. BEST acknowledges that >30% of US records show a cooling trend over the last 100 years. Why can’t reported cooling be true?

I did a 2013 study of the CRN top-rated US surface stations. Most remarkable about them is the extensive local climate diversity that appears when station sites are relatively free of urban heat sources. 35% (8 of 23) of the stations reported cooling over the century. Indeed, if we remove the 8 warmest records, the rate flips from +0.16°C/Century to -0.14°C/Century. In order to respect the intrinsic quality of temperatures, I calculated monthly slopes for each station, and combined them for station trends.

Recently I updated that study with 2014 data and compared adjusted to unadjusted records. The analysis shows the effect of GHCN adjustments on each of the 23 stations in the sample. The average station was warmed by +0.58 C/Century, from +0.18 to +0.76, comparing adjusted to unadjusted records. 19 station records were warmed, 6 of them by more than +1 C/Century. 4 stations were cooled, with most of the total cooling coming at one station, Tallahassee. So for this set of stations, the chance of adjustments producing warming is 19/23, or 83%.

In the quest for the mythical GMST, these records have to be homogenized, and also weighted for grid coverage, with the result that cooling is removed as counter to the overall trend.

The pairwise homogenization technique assumes that two local climates move in tandem. Thus, if one of them diverges, it must be adjusted back in line.  But I question that premise. It’s obvious that a mountaintop site will typically show lower temps than a nearby sea level site. But it is wrong to assume that changes at one should be consistent with changes in the other. Not only are the absolute readings different, the patterns of changes are also different. Infilling does violence to the local climate realities. It is perfectly normal that one place can have a cooling trend at the same time another place is warming.

Weather stations measure the temperature of air in thermal contact with the underlying terrain. Each site has a different terrain, and for a host of landscape features documented by Pielke Sr., the temperature patterns will differ, even in nearby locations. However, if we have station histories (and we do), then trends from different stations can be compared to see similarities and differences.

In summary, temperatures from different stations should not be interchanged or averaged, since they come from different physical realities. The trends can be compiled to tell us about the direction, extent and scope of temperature changes.

What Paul Homewood, Steven Goddard, Booker and others are doing is a well-respected procedure in financial accounting. The Auditors must determine if the aggregate corporation reports are truly representative of the company’s financial condition. In order to test that, samples of component operations are selected and examined to see if the reported results are accurate compared to the facts on the ground.

Discrepancies such as those we’ve seen from NCDC call into question the validity of the entire situation as reported. The stakeholders must be informed that the numbers presented are misrepresenting the reality. The Auditor must say of NCDC something like: “We are of the opinion that NCDC statements of global surface temperatures do not give a true and fair view of the actual climate reported in all of the sites measured.”

The several GHCN samples analyzed so far show that older temperatures have been altered so that the figures are lower than the originals. In some cases, more recent temperatures have been altered to become higher than the originals. Alternatively, recent years of observations are simply deleted. The result is a spurious warming trend of 1-2F, the same magnitude as the claimed warming from rising CO2. How is this acceptable public accountability? More like “creative accounting.” Once a researcher believes that rising CO2 causes rising temperatures, and since CO2 keeps rising, then temperatures must continue to rise, cooling is not an option. In fact 2015 dare not be cooler than 2014.

We are learning from this that GHCN only supports the notion of global warming if you assume that older thermometers ran hot and today’s thermometers run cold. Otherwise the warming does not appear in the original records; they have to be processed, like tree proxies. Not only is the heat hiding in the oceans, even thermometers are hiding some.

Once you accept that facts and figures in the historical record are changeable, then you enter Alice’s Wonderland, or the Soviet Union, where it was said: “The future is certain; only the past keeps changing.” The apologists for NCDC confuse data and analysis. The temperature readings are facts, unchangeable. If someone wants to draw comparisons and interpret similarities and differences, that’s their analysis, and they must make their case from the data to their conclusions. Usually, when people change the record itself it’s because their case is weak.

My submission to the International Temperature Data Review project has gone to the panel.

https://rclutz.wordpress.com/2015/04/26/temperature-data-review-project-my-submission/

About US CRN Station Ratings

The USCRN rating system classifies the sites of weather stations with ratings from 1 to 5 (1 being the best).

From the USCRN manual:

The USCRN will use the classification scheme below to document the “meteorological measurements representativity” at each site.  This scheme, described by Michel Leroy (1998), is being used by Meteo-France to classify their network of approximately 550 stations. The classification ranges from 1 to 5 for each measured parameter. The errors for the different classes are estimated values.

  • Class 1 – Flat and horizontal ground surrounded by a clear surface with a slope below 1/3 (<19deg). Grass/low vegetation ground cover <10 centimeters high. Sensors located at least 100 meters from artificial heating or reflecting surfaces, such as buildings, concrete surfaces, and parking lots. Far from large bodies of water, except if it is representative of the area, and then located at least 100 meters away. No shading when the sun elevation is >3 degrees.
  • Class 2 – Same as Class 1 with the following differences. Surrounding Vegetation <25 centimeters. Artificial heating sources within 30m. No shading for a sun elevation >5deg.
  • Class 3 (error 1C) – Same as Class 2, except no artificial heating sources within 10 meters.
  • Class 4 (error >= 2C) – Artificial heating sources <10 meters.
  • Class 5 (error >= 5C) – Temperature sensor located next to/above an artificial heating source, such as a building, roof top, parking lot, or concrete surface.

Click to access X030FullDocumentD0.pdf

Starting in 2007, the Surfacestations.org project undertook to make field inspections of weather stations to classify them according to the CRN criteria. The website provides the names of 23 stations that have the CRN#1 Rating for the quality of the sites, along with all stations that have been rated thus far.

http://www.surfacestations.org/USHCN_stationlist.htm

As it happens, the CRN#1 stations are spread out across the continental US (CONUS): NW: Oregon, North Dakota, Montana; SW: California, Nevada, Colorado, Texas; MW: Indiana, Missouri, Arkansas, Louisiana; NE: New York, Rhode Island, Pennsylvania; SE: Georgia, Alabama, Mississippi, Florida.

Update to Adjustments Warming US CRN#1 Stations

In response to a comment, this post shows the effect of GHCN adjustments on each of the 23 stations. The average station was warmed by +0.58 C/Century, from +0.18 to +0.76, comparing adjusted to unadjusted records.

19 station records were warmed, 6 of them by more than +1 C/century. 4 stations were cooled, most of the total cooling coming at one station, Tallahassee.

So for this set of stations, the chance of adjustments producing warming is 19/23 or 83%.

ID  Station  Years in Record  Unadjusted Trend °C/Century  Adjusted Trend °C/Century  Adjusted minus Unadjusted °C/Century
351862 CORVALLIS 125 0.38 1.05 0.67
350412 BAKER CITY 125 -0.01 1.48 1.49
51564 CHEYENNE WELLS 118 0.84 1.18 0.34
83186 FT MYERS 121 1.18 1.05 -0.12
121873 CRAWFORDSVILLE 115 -2.00 -0.43 1.57
97847 SAVANNAH 141 -0.09 0.56 0.65
42941 FAIRMONT 93 0.90 1.91 1.01
48702 SUSANVILLE 119 0.04 0.84 0.81
80211 APALACHICOLA 111 -0.07 0.95 1.02
86997 PENSACOLA 135 0.27 0.10 -0.17
88758 TALLAHASSEE 123 0.07 -0.48 -0.54
160549 BATON ROUGE 122 -0.07 0.74 0.81
226177 NATCHEZ 121 -0.78 0.59 1.37
238466 TRUMAN DAM & RSVR 122 -0.56 0.62 1.19
245690 MILES CITY 123 0.28 0.31 0.03
269171 WINNEMUCCA 137 0.39 1.11 0.71
308383 SYRACUSE 112 0.78 0.73 -0.05
322188 DICKINSON 120 0.59 0.69 0.11
325479 MANDAN 102 0.43 0.68 0.25
369728 WILLIAMSPORT 120 0.10 1.01 0.92
376698 PROVIDENCE 130 0.68 1.25 0.56
417945 SAN ANTONIO 130 0.30 0.92 0.62
15749 MUSCLE SHOALS 74 0.54 0.58 0.04
Averages 0.18 0.76 0.58
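The summary figures quoted above (the average trends and the warmed/cooled counts) can be audited directly from the table. A quick arithmetic check, transcribing the two trend columns:

```python
# Unadjusted and adjusted trends (deg C/century) transcribed from the
# table above, in row order from Corvallis to Muscle Shoals
unadj = [0.38, -0.01, 0.84, 1.18, -2.00, -0.09, 0.90, 0.04, -0.07, 0.27,
         0.07, -0.07, -0.78, -0.56, 0.28, 0.39, 0.78, 0.59, 0.43, 0.10,
         0.68, 0.30, 0.54]
adj = [1.05, 1.48, 1.18, 1.05, -0.43, 0.56, 1.91, 0.84, 0.95, 0.10,
       -0.48, 0.74, 0.59, 0.62, 0.31, 1.11, 0.73, 0.69, 0.68, 1.01,
       1.25, 0.92, 0.58]

warmed = sum(1 for u, a in zip(unadj, adj) if a > u)
print(round(sum(unadj) / len(unadj), 2))  # 0.18
print(round(sum(adj) / len(adj), 2))      # 0.76
print(warmed, len(unadj) - warmed)        # 19 4
```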

About GHCN Temperature Data

A post below this one presents a comparison between unadjusted and adjusted GHCN temperature data. This article provides some background information on these data and how they are treated in Temperature Trend Analysis methodology.

GHCN acts as a global repository for surface weather station records submitted to it by National Weather Services (NWS). Each NWS reviews local records according to its procedures and certifies that the data accurately represent the weather experienced in its jurisdiction. The GHCN version 3 qcu file is composed of these data (qcu signifies quality controlled unadjusted). Various bloggers, ranging from E.M. Smith (chiefio) to Nick Stokes (moyhu), are satisfied that this file is close to the data submitted by NWS agencies.

The quality control consists of attaching flags to values appearing in the file. Because my home computer has limited power, I worked with the Taverage monthly datasets. There a monthly value is flagged with an “a” if 1 daily value was missing in calculating the average, “b” if 2 dailies were missing, and so on up to 9 omissions. With 10 or more missing dailies, the month is assigned “-9999”, indicating a blank for the month. An additional column beside each month identifies outlier values.

My principle is to include all data unless there is good reason to exclude them. The data preparation procedure involves unzipping the downloaded file and opening it as a Word document. The station records of interest are copied into a new Word document, which my notebook can handle without processing delays. The text data are then put into an Excel workbook and spread into cells; -9999s are converted to blanks, and the flags and additional columns are removed.
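The same cleaning step can be done programmatically rather than through Word and Excel. A sketch, assuming the documented GHCN-M v3 fixed-width layout (11-character station ID, 4-character year, 4-character element, then twelve 5-character values in hundredths of a degree, each followed by 3 flag characters):

```python
import numpy as np

def parse_ghcn_line(line):
    """Parse one GHCN-M v3 monthly record, assuming the documented
    fixed-width layout: 11-char station ID, 4-char year, 4-char element,
    then twelve groups of a 5-char value (hundredths of a degree C,
    -9999 = missing) followed by 3 flag characters."""
    station, year = line[:11], int(line[11:15])
    temps = []
    for m in range(12):
        start = 19 + m * 8
        v = int(line[start:start + 5])
        temps.append(np.nan if v == -9999 else v / 100.0)
    return station, year, temps

# A made-up record: Jan = 1.23 C, Feb missing, Mar-Dec all 5.00 C
line = "USC00123456" + "1950" + "TAVG" + "  123   " + "-9999   " + "  500   " * 10
station, year, temps = parse_ghcn_line(line)
print(station, year, temps[0], temps[1])  # USC00123456 1950 1.23 nan
```

Dropping the flag characters and turning -9999 into blanks (NaN here) reproduces what the Word-and-Excel procedure does by hand.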

My data quality assurance practices include scrutinizing each value more than two standard deviations from the mean. I use CUSUM and first differences to test for step changes in the record, which would suggest a non-climatic change in the data (e.g. a change of equipment, procedure or location). In the US CRN#1 dataset I found no step changes, and the outlier values were few. I tested excluding some high or low values, but found no discernible effect on the slopes.
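These checks are easy to reproduce outside the spreadsheet. Here is a minimal sketch of the two-standard-deviation screen, the CUSUM, and the first differences (my code, not the workbook's formulas):

```python
# Sketch of the quality-assurance checks described above.
from statistics import mean, stdev

def outliers(values, k=2.0):
    """Indices of values more than k standard deviations from the mean."""
    m, s = mean(values), stdev(values)
    return [i for i, v in enumerate(values) if abs(v - m) > k * s]

def cusum(values):
    """Cumulative sum of deviations from the mean; a step change in the
    data shows up as a pronounced V or inverted-V in this series."""
    m = mean(values)
    out, total = [], 0.0
    for v in values:
        total += v - m
        out.append(total)
    return out

def first_differences(values):
    """Successive changes; one large jump that is never reversed
    points to a non-climatic shift in the record."""
    return [b - a for a, b in zip(values, values[1:])]
```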

The same procedure was followed for the qca file (quality controlled adjusted). This meant adding two additional sheets into each station’s data workbook, examples of which are provided through links below.

In the US CRN#1 unadjusted workbook, there is a sheet for each station with the data pasted into a template that calculates several measures. The basic analysis is to compute the slope for each calendar month (Jan, Feb, etc.) over the lifetime of that station. The 12 slopes are then averaged for the station trend. In addition, trends are calculated for several shorter periods of interest, again by combining the 12 monthly slopes for that station and period. A summary page brings together results from all the stations and generates averages of trends for the set of stations, by months, and by periods of years.
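The core calculation can be sketched in a few lines, using numpy's least-squares fit in place of Excel's SLOPE function (the function names here are mine, not the workbook's):

```python
# Sketch of the Temperature Trend Analysis calculation: fit a slope to
# each calendar month's series over the station's lifetime, then
# average the 12 slopes to get the station trend.
import numpy as np

def monthly_slopes(years, temps):
    """temps: 2-D array, one row per year, 12 columns (Jan..Dec),
    NaN for missing months. Returns 12 slopes in °C per year."""
    slopes = []
    for m in range(12):
        col = temps[:, m]
        ok = ~np.isnan(col)  # missing months are simply skipped
        slopes.append(np.polyfit(years[ok], col[ok], 1)[0])
    return slopes

def station_trend(years, temps):
    """Average of the 12 monthly slopes, expressed in °C per century."""
    return float(np.mean(monthly_slopes(years, temps))) * 100.0
```

Note that skipping blank months, rather than infilling them, is what makes the missing datapoints "not a big deal" for trend analysis: each month's slope is fitted only to the values actually reported.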

Data workbooks for two stations are provided here:
350412 Baker City, Oregon
417945 San Antonio, Texas

Adjustments Multiply Warming at US CRN1 Stations

A study of US CRN1 stations, top-rated for their siting quality, shows that GHCN adjusted data produces warming trends several times larger than unadjusted data.

The unadjusted files from ghcn.v3.qcu have been scrutinized for outlier values and for step changes indicative of non-climatic biases. In no case was the normal variability pattern interrupted by step changes. Coverage was strong: the typical history exceeded 95%, and some achieved 100% (measured by the percentage of months with a reported Tavg value out of the total months in the station's lifetime).

The adjusted files are another story. Typically, years of data are deleted, often several years in a row. Entire rows are erased including the year identifier, so finding the missing years is a tedious manual process looking for gaps in the sequence of years. All stations except one lost years of data through adjustments, often in recent years. At one station, four years of data from 2007 to 2010 were deleted; in another case, 5 years of data from 2002 to 2006 went missing. Strikingly, 9 stations that show no 2014 data in the adjusted file have fully reported 2014 in the unadjusted file.
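Finding the deleted years need not stay a manual process. A short sketch (mine, not part of the workbooks) that compares the years present in the unadjusted (qcu) record against the adjusted (qca) record, and locates gaps within a single record:

```python
# Sketch: automate the hunt for years deleted by adjustment.

def deleted_years(qcu_years, qca_years):
    """Years present in the unadjusted record but absent from the
    adjusted one, in order."""
    return sorted(set(qcu_years) - set(qca_years))

def year_gaps(years):
    """Gaps within one record's own year sequence, as (start, end)
    pairs, both inclusive."""
    ys = sorted(years)
    return [(a + 1, b - 1) for a, b in zip(ys, ys[1:]) if b - a > 1]
```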

It is instructive to see the effect of adjustments upon individual stations. A prime example is 350412 Baker City, Oregon.

Over 125 years, GHCN v3 unadjusted shows a trend of -0.0051 °C/century; the adjusted data shows +1.48 °C/century. How does the difference arise? The coverage is about the same, though 7 years of data are dropped in the adjusted file. However, the values are systematically lowered in the adjusted version: average annual temperature is +6 °C +/-2 °C in the adjusted file versus +9.4 °C +/-1.7 °C unadjusted.

How then is a warming trend produced? In the distant past, prior to 1911, adjusted temperatures are cooler than unadjusted by more than 2 °C each month. The adjustment eases to -1.8 °C for 1912-1935, deepens again to -2.2 °C for 1936-1943, ranges from -1.2 to -1.5 °C for 1944-1988, and then shrinks to -1 °C. From 2002 onward, adjusted and unadjusted values are the same.

Some apologists for the adjustments have stated that cooling is applied as often as warming. Here it is demonstrated that by cooling the past selectively, a warming trend can be created even though the adjusted record ends up cooler on average over the 20th century.
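The mechanism is easy to demonstrate with synthetic data. Below, a perfectly flat record is cooled only in its early decades, with offsets loosely modeled on the Baker City pattern above (the numbers are illustrative, not the actual adjustments), and a warming trend appears even though every change cooled the record:

```python
# Synthetic demonstration: selective cooling of the past manufactures
# a warming trend from a record with zero trend.
import numpy as np

years = np.arange(1890, 2015)
flat = np.full(years.size, 10.0)  # flat 10 °C record: zero trend by construction

adjusted = flat.copy()
adjusted[years <= 1943] -= 2.0                       # deep cooling of the early record
adjusted[(years > 1943) & (years <= 2001)] -= 1.2    # milder cooling mid-century
# 2002 onward left untouched, as in the Baker City example

trend = np.polyfit(years, adjusted, 1)[0] * 100  # °C per century
# trend comes out positive, even though every adjustment cooled the
# record or left it alone, and the adjusted mean is below the original
```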

A different kind of example is provided by 417945 San Antonio, Texas. Here the unadjusted record had complete 100% coverage, and the adjustments deleted 262 months of data, reducing the coverage to 83%. In addition, the past was cooled, with adjustments ranging from -1.2 °C per month in 1885 gradually diminishing to -0.2 °C by 1970. These cooling adjustments were minor, reducing the average annual temperature by only 0.16 °C. Due to the deleted years of data, San Antonio went from an unadjusted trend of +0.30 °C/century to an adjusted trend of +0.92 °C/century, tripling the warming at that location.

The overall comparison for the set of CRN1 stations:

Area:        FIRST CLASS US STATIONS
History:     1874 to 2014
Stations:    23
Ave. Length: 119 years

Dataset            Unadjusted   Adjusted
Average Trend         0.18        0.76    °C/Century
Std. Deviation        0.66        0.54    °C/Century
Max Trend             1.18        1.91    °C/Century
Min Trend            -2.00       -0.48    °C/Century

These stations are sited away from urban heat sources, and the unadjusted records reveal a diversity of local climates, as shown by the deviation and contrasting Max and Min results. Six stations showed negative trends over their lifetimes.

Adjusted data reduces the diversity and shifts the results toward warming. The average trend is 4 times warmer, only 2 stations show any cooling, and at smaller rates. Many stations had warming rates increased by multiples from the unadjusted rates. Whereas 4 months had negative trends in the unadjusted dataset, no months show cooling after adjustments.

Periodic Rates from US CRN1 Stations

Start   End     Unadjusted     Adjusted
                (°C/Century)   (°C/Century)
1915    1944        1.22           1.51
1944    1976       -1.48          -0.92
1976    1998        3.12           4.35
1998    2014       -1.67          -1.84
1915    2014        0.005          0.68

Looking at periodic trends within the series, it is clear that adjustments at these stations increased the trend over the last 100 years from essentially flat to +0.68 °C/century. This was achieved by moderating the mid-century cooling and accelerating the warming prior to 1998.

Surfacestations.org provides a list of 23 stations that have the CRN#1 rating for the quality of their sites. I obtained the records from the latest GHCN v3 monthly qcu report, did my own data quality review and built a Temperature Trend Analysis workbook. I made a companion workbook using the GHCN v3 qca report. Both datasets are available here: ftp://ftp.ncdc.noaa.gov/pub/data/ghcn/v3/

As it happens, the stations are spread out across the continental US (CONUS): NW: Oregon, North Dakota, Montana; SW: California, Nevada, Colorado, Texas; MW: Indiana, Missouri, Arkansas, Louisiana; NE: New York, Rhode Island, Pennsylvania; SE: Georgia, Alabama, Mississippi, Florida.

In conclusion, the concern is not only that individual station histories are altered by adjustments; the adjusted dataset is also the one used as input to the programs computing global anomalies and averages. This much-diminished dataset does not inspire confidence in the temperature reconstruction products built upon it.

Update

In response to a comment, this update shows the effect of GHCN adjustments on each of the 23 stations. The average station was warmed by +0.58 °C/century, from +0.18 to +0.76, comparing adjusted to unadjusted records.

19 station records were warmed, 6 of them by more than +1 °C/century. 4 stations were cooled, with most of the total cooling coming at one station, Tallahassee.

So for this set of stations, adjustments produced warming at 19 of 23 stations, or 83%.
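As a rough check that is not part of the original workbooks: if adjustments were as likely to cool a station as to warm it, the chance of 19 or more of 23 stations coming out warmer can be computed directly from the binomial distribution.

```python
# Binomial check: under a 50/50 warm-or-cool null, how likely is it
# that 19 or more of 23 stations come out warmer after adjustment?
from math import comb

def prob_at_least(k, n, p=0.5):
    """P(X >= k) for X ~ Binomial(n, p)."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

p_value = prob_at_least(19, 23)  # roughly 0.0013, about 1 chance in 770
```

That is, a result this lopsided would rarely arise if the adjustments were truly even-handed.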

Station                      Years in   Unadjusted    Adjusted      Adjusted − Unadjusted
                             Record     Stn Trend     Stn Trend     Stn Trend
                                        (°C/Century)  (°C/Century)  (°C/Century)
351862 CORVALLIS 125 0.38 1.05 0.67
350412 BAKER CITY 125 -0.01 1.48 1.49
51564 CHEYENNE WELLS 118 0.84 1.18 0.34
83186 FT MYERS 121 1.18 1.05 -0.12
121873 CRAWFORDSVILLE 115 -2.00 -0.43 1.57
97847 SAVANNAH 141 -0.09 0.56 0.65
42941 FAIRMONT 93 0.90 1.91 1.01
48702 SUSANVILLE 119 0.04 0.84 0.81
80211 APALACHICOLA 111 -0.07 0.95 1.02
86997 PENSACOLA 135 0.27 0.10 -0.17
88758 TALLAHASSEE 123 0.07 -0.48 -0.54
160549 BATON ROUGE 122 -0.07 0.74 0.81
226177 NATCHEZ 121 -0.78 0.59 1.37
238466 TRUMAN DAM & RSVR 122 -0.56 0.62 1.19
245690 MILES CITY 123 0.28 0.31 0.03
269171 WINNEMUCCA 137 0.39 1.11 0.71
308383 SYRACUSE 112 0.78 0.73 -0.05
322188 DICKINSON 120 0.59 0.69 0.11
325479 MANDAN 102 0.43 0.68 0.25
369728 WILLIAMSPORT 120 0.10 1.01 0.92
376698 PROVIDENCE 130 0.68 1.25 0.56
417945 SAN ANTONIO 130 0.30 0.92 0.62
15749 MUSCLE SHOALS 74 0.54 0.58 0.04
Averages 0.18 0.76 0.58

The Excel workbooks with the data and analyses are provided for your interest and review.

US CRN1 Adjusted TTA 2014
US CRN1 Unadjusted TTA2 2014