I've spent the past few days looking at the data and making this chart, which shows the history of global temperatures over the whole span of human civilization, and where we are currently projected to be at the end of this century. The next time someone tells you things won't get that bad because climate sensitivity is low, show them this chart.
Geeks wanting more details on how it was made can follow below the fold.
D. M. Anderson, E. M. Mauk, E. R. Wahl, C. Morrill, A. J. Wagner, D. Easterling, T. Rutishauser. Global warming in an independent record of the past 130 years. Geophysical Research Letters Volume 40, Issue 1, pages 189–193, 16 January 2013. DOI: 10.1029/2012GL054271.
Shaun A. Marcott, Jeremy D. Shakun, Peter U. Clark, Alan C. Mix. A Reconstruction of Regional and Global Temperature for the Past 11,300 Years. Science, Vol. 339, No. 6124, pp. 1198–1201, 8 March 2013. DOI: 10.1126/science.1228026.
Anderson's paper is behind Wiley's paywall, but the Supplemental Information contains all the relevant data: 173 proxy temperature series covering 1730-1995. Marcott's data are likewise in the supplemental materials: 76 proxy temperature series covering 11,300 BP to -10 BP (where "BP" means years before 1950). Tamino recently noted an issue with the final few datapoints in Marcott (proxy dropout was causing spurious spikes) and corrected it; I have followed Tamino's differencing procedure with both Marcott's data and Anderson's data.
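As I understand Tamino's differencing procedure, it works on the changes rather than the levels: average each proxy's step-to-step differences across whatever proxies report at both steps, then integrate the averaged differences back into a series. A minimal sketch in Python — the function name and toy array layout are my own, not Tamino's:

```python
import numpy as np

def difference_stack(series):
    """Stack proxies by averaging step-to-step differences, then integrating.

    series: 2D array (n_proxies, n_steps), NaN where a proxy has no data.
    Averaging differences (rather than raw levels) prevents a proxy that
    drops out from shifting the mean of the remaining stack.
    """
    diffs = np.diff(series, axis=1)          # each proxy's change per step
    mean_diff = np.nanmean(diffs, axis=0)    # average change across proxies
    stacked = np.concatenate([[0.0], np.nancumsum(mean_diff)])
    return stacked - np.nanmean(stacked)     # center as an anomaly
```

With two proxies at a constant offset from each other, one of which drops out halfway through, the stacked anomaly still evolves smoothly instead of spiking at the dropout — which is exactly the artifact Tamino flagged.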
The Anderson data were geographically weighted using a gridless scheme: the great-circle distance from each dataset to every other dataset in a given epoch was computed, each distance was squared (to mimic the area coverage of a location), and the squares were summed to give that location's weight. A standard weighted average was then taken using those weights. This was a bit of a hassle and probably wasn't worth it, as the weighted and unweighted data are virtually indistinguishable. I therefore skipped geographical weighting for the Marcott data, since it is clear from his graphs that it makes little overall difference there either.
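The gridless weighting described above can be sketched as follows. The haversine distance formula and the function names are my own; the sum-of-squared-distances weighting follows the description in the text, so an isolated site (nominally covering more area) ends up with a larger weight than sites clustered together:

```python
import numpy as np

def great_circle_km(lat1, lon1, lat2, lon2, r=6371.0):
    """Haversine great-circle distance in km between two points."""
    p1, p2 = np.radians(lat1), np.radians(lat2)
    dphi = p2 - p1
    dlam = np.radians(lon2 - lon1)
    a = np.sin(dphi / 2) ** 2 + np.cos(p1) * np.cos(p2) * np.sin(dlam / 2) ** 2
    return 2 * r * np.arcsin(np.sqrt(a))

def gridless_weights(lats, lons):
    """Weight each site by the sum of squared distances to all other sites,
    so isolated sites count more; weights are normalized to sum to 1."""
    n = len(lats)
    w = np.zeros(n)
    for i in range(n):
        for j in range(n):
            if i != j:
                w[i] += great_circle_km(lats[i], lons[i],
                                        lats[j], lons[j]) ** 2
    return w / w.sum()
```

A weighted epoch mean is then just `np.sum(values * gridless_weights(lats, lons))` over the sites reporting in that epoch.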
HadCRUT4 is the modern instrumental temperature record, available from the UK Met Office Hadley Centre.
Having all the data, I first reduced the HadCRUT4 and Anderson data to 20-year means centered on years evenly divisible by 20, so that they would match Marcott's resolution. Then I aligned the Anderson data against HadCRUT4 so as to minimize the variance between them during their period of overlap (1860-1980). The two datasets align almost perfectly during 1920-1980, with somewhat less agreement in earlier epochs.
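The 20-year binning step might look like this; the exact bin convention (each center's bin spanning the ten years on either side of it) is my assumption about the details:

```python
import numpy as np

def twenty_year_means(years, temps):
    """Average annual values into 20-year bins centered on years divisible
    by 20 (e.g. the 1900 bin covers 1890-1909)."""
    years = np.asarray(years)
    temps = np.asarray(temps, dtype=float)
    centers = np.arange(int(np.ceil(years.min() / 20)) * 20,
                        int(np.floor(years.max() / 20)) * 20 + 1,
                        20)
    means = [np.nanmean(temps[(years >= c - 10) & (years < c + 10)])
             for c in centers]
    return centers, np.array(means)
```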
Next I aligned the Marcott data against the Anderson data during their period of overlap (1740-1940). While both datasets show rising temperatures during this period, Anderson shows more of a rise, primarily because it rises smartly during the 18th century, where Marcott is fairly static.
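Aligning one anomaly series against another by a constant shift so as to minimize the variance of their difference over the overlap reduces to subtracting the mean difference. A sketch, with my own hypothetical dict-based inputs (bin-center year → anomaly):

```python
import numpy as np

def align(target, reference):
    """Shift `target` by a constant so it best matches `reference` over
    their overlap; least squares makes that constant the mean difference."""
    overlap = sorted(set(target) & set(reference))
    offset = np.mean([target[y] - reference[y] for y in overlap])
    return {y: v - offset for y, v in target.items()}
```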
Finally, I introduced three datapoints showing the temperature rise under the IPCC's A1B scenario (which involves modest reductions in greenhouse emissions). The IPCC projects radiative forcing of 6.03 W/m² by the end of the century. Since a doubling of CO2 yields 3.7 W/m² of forcing, that amounts to the equivalent of about 1.63 doublings. Multiplying that by the three chosen climate sensitivities (1.6, 2.4, and 3.2°C per CO2 doubling) gives the three possibilities for 2100 temperatures shown on the graph.
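The forcing arithmetic can be checked directly (6.03 / 3.7 ≈ 1.63 equivalent doublings):

```python
F_2X = 3.7           # W/m² of forcing per CO2 doubling
forcing_2100 = 6.03  # projected A1B forcing at 2100, W/m²

doublings = forcing_2100 / F_2X  # ≈ 1.63 equivalent doublings

for sensitivity in (1.6, 2.4, 3.2):  # assumed °C per doubling
    warming = doublings * sensitivity
    print(f"{sensitivity}°C/doubling -> {warming:.1f}°C by 2100")
```

Which works out to roughly 2.6, 3.9, and 5.2°C of warming over the baseline for the three sensitivities.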
There is a fair amount of uncertainty over the actual climate sensitivity. FWIW, I personally prefer one of the best long-term studies of sensitivity over the deep past (Royer et al. 2007), which indicates a climate sensitivity of about 2.8°C per CO2 doubling.