Among the many tortures in store for us over the next few years, John Shimkus (R-IL) or Joe Barton (R-TX) will most likely be holding hearings on the "fraud" of human-caused global warming, expressing apologies to God and/or BP for any doubts that have been expressed about their environmental stewardship. Now may be a good time to help spread a little information on what actually became of the recent uproar over "Climategate" and the famous hockey-stick reconstruction of the temperature record over the last millennium.
First, we need to be clear about the role of these millennial reconstructions of climate history. They play no direct role in the modeling of the last century's temperature, for which we have a combination of good thermometry records and physics-based theory (see, e.g., my little primer). Why then does anybody care enough about the "hockey stick" graph (showing roughly constant temperature, T, until the rise over the last 100 years) to make a fuss about it?
Unfortunately, given its fairly large uncertainties, the hockey stick was used as a dramatic illustration of warming for public presentations. Within the scientific argument, however, it plays a more modest role. Essentially, the reconstruction serves as a check of whether there's any evidence that something major has been left out of the physical models. In some sense, it's a little too easy to fit the actual T data (last ~150 years) since the key feedback parameter is not well-determined by the models themselves. Although the physical models do a good job, and lead to alarming predictions for what's coming up, we would like to make sure that they aren't just picking up some stray signal from some unknown source. That's why it's good to look back for 1000 years to make sure that changes like those over the last 100 years aren't just happening for unknown reasons.
Thus there's been a program of looking at "proxy" physical variables which can be dated and which are sensitive to T to reconstruct T's history. These proxies include tree-ring widths, certain yearly geologic deposits, pollen records, etc. The reconstructions aren't especially easy, since the proxies are somewhat imperfect records of local T, and the set of usable proxies doesn't uniformly cover the whole northern hemisphere, much less the whole globe.
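The calibrate-then-invert logic behind a simple proxy reconstruction can be sketched in a few lines. This is a hypothetical toy, not any published method: it assumes a single proxy responding linearly to temperature, with made-up numbers, and calibrates the relation on a synthetic "instrumental era" before inverting it for the earlier period.

```python
import numpy as np

# Toy illustration (not any published reconstruction): a proxy such as
# ring width is assumed to respond linearly to temperature plus noise.
# All parameters here are invented for the sketch.
rng = np.random.default_rng(0)

# Synthetic "true" temperature anomalies for 300 years; the last 150
# are the instrumental era, where T is directly measured.
true_T = np.concatenate([
    rng.normal(0.0, 0.15, 150),                          # pre-instrumental
    np.linspace(0.0, 0.8, 150) + rng.normal(0, 0.1, 150) # instrumental warming
])

# The proxy record: linear response to T plus measurement noise.
alpha, beta = 1.0, 0.5
proxy = alpha + beta * true_T + rng.normal(0, 0.1, 300)

# Calibrate the linear relation on the instrumental era only.
T_instr = true_T[150:]
A = np.column_stack([np.ones(150), T_instr])
(a_hat, b_hat), *_ = np.linalg.lstsq(A, proxy[150:], rcond=None)

# Invert the fitted relation to reconstruct the pre-instrumental period.
T_recon = (proxy[:150] - a_hat) / b_hat
rmse = np.sqrt(np.mean((T_recon - true_T[:150]) ** 2))
print("calibrated slope:", round(b_hat, 2))
print("reconstruction RMSE:", round(rmse, 2))
```

Note how the inversion amplifies proxy noise by 1/beta: a weakly responding proxy makes the reconstructed T much noisier than the proxy itself, which is one reason the error bars on these reconstructions matter so much.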
The first widely publicized version of the hockey stick (Mann et al., 1998) had a mathematical error and, probably more seriously, failed to properly consider the role of various types of random time-varying errors (noise) in making the estimates of past T uncertain. These errors were seized upon with glee by some of the climate skeptics. More careful versions soon followed, but recently another (rather polemical) paper proposing larger error bars has been published, again with much reverberation in the non-scientific community.
One might fear that the whole reconstruction issue would remain in a polemical swamp, with bad winds wafting over toward the actual climate prediction process, to which the reconstructions are semi-relevant. Fortunately, however, several groups of serious statisticians have now gotten involved. The most recent paper (http://pubs.amstat.org/... from which earlier papers can be traced) in the Journal of the American Statistical Association (September 2010, Vol. 105, No. 491, pp. 883-911, DOI: 10.1198/jasa.2010.ap09379) applies what's known as Bayesian hierarchical modeling to the problem. Such modeling attempts to realistically consider the likelihood of obtaining all the various types of data found, given sensible, flexible assumptions about prior knowledge. The paper concludes that proxy methods can do a good enough job of reconstructing past climate even in the presence of random noise. In published comments, discussants of the paper consider various possible modifications, improvements, further tests, and simpler principal-component alternative techniques. I am not an expert in these methods, but can read the papers well enough to see a few things:
- The discussions are now normal science, with the advocates of different methods generally agreeing on the strengths and weaknesses of the different approaches, and on the most promising ways of going forward.
- There is no evidence of any previous instance in the last 1000 years in which there was a change like that of the last 100 years. Thus there is no reason to think the models are distracted by too much unexplained noise.
- It still looks most likely that we are currently at the warmest point in 1000 years, although I think that particular much-discussed point is less important for our predictions than the absence of previous unexplained large changes.
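To give a flavor of the hierarchical idea mentioned above, here is a deliberately simplified toy, not the JASA paper's model: several proxies each observe a single latent temperature anomaly through an assumed linear response, and a prior on the anomaly stands in for knowledge about past climate variability. With everything Gaussian, the posterior is available in closed form.

```python
import numpy as np

# Hypothetical toy of the hierarchical setup (not the published model):
# each proxy_i = beta_i * T + noise_i for one latent anomaly T, with a
# normal prior on T. All sensitivities and noise levels are invented.
def posterior_T(proxies, betas, noise_sd, prior_mean=0.0, prior_sd=0.5):
    """Conjugate normal update: combine proxy likelihoods with the prior."""
    precision = 1.0 / prior_sd**2 + np.sum(betas**2 / noise_sd**2)
    mean = (prior_mean / prior_sd**2
            + np.sum(betas * proxies / noise_sd**2)) / precision
    return mean, 1.0 / np.sqrt(precision)  # posterior mean and sd

rng = np.random.default_rng(1)
true_T = 0.6                           # latent anomaly to recover
betas = np.array([0.5, 0.8, 1.2])      # assumed proxy sensitivities
noise_sd = np.array([0.2, 0.3, 0.25])  # assumed proxy noise levels
proxies = betas * true_T + rng.normal(0, noise_sd)

mean, sd = posterior_T(proxies, betas, noise_sd)
print(f"posterior for T: {mean:.2f} +/- {sd:.2f}")
```

The point of the posterior standard deviation is exactly the error-bar question at stake in the reconstruction debate: combining several independent proxy types shrinks it, while noisier or less sensitive proxies leave it wide.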
What then does Climategate have to do with this science? Proxy reconstruction was the main scientific issue in Climategate. Although most of the revelations were irrelevant gossip, one wasn't. At some point, proxy reconstructors chose to make a presentation that would "hide the decline." That was not, contrary to many denialist descriptions, a decline in T, but rather a decline in a type of proxy. Over the period where we know T was going up, some tree-ring proxies first went up and then down. That means that these proxies were more complicated than one would like. (A tree might grow better in warmer climate, until it gets stressed as things heat up too much.) Obviously, it's wrong to shuffle that sort of complication into a side note, rather than being forthright with it.
Such complications add to the uncertainty about how effective the proxies would be at catching past anomalies. Using independent types of proxies (boreholes, pollen,...) is needed to make sure that anomalies aren't missed. It's important to note however, that uncertainty about the effectiveness of proxies in catching old anomalies is not the same as any positive evidence that there were such anomalies. And if (as seems unlikely) there were such anomalies, that would not be evidence that current global climate model estimates of warming should be reduced but rather evidence that they need larger error bars.
What's the bottom line? As I discussed before, the basic science of predicting the future effects of our greenhouse emissions continues to make slow progress. Uncertainty about past anthropogenic albedo changes will make it hard to reduce the uncertainties much until more data come in, i.e. until the best window for changing course has closed. The partially relevant side issue of reconstructing historical climates is now also making normal progress, after a start marred by sloppiness and by exaggerated polemical attacks. Whether or not we choose to be explicit about it, we are necessarily making a crucial decision under uncertainty, as we always do for everything important.