Most public discussion of global warming, whether in small-town letters to the editor or on blogs, consists of hurling talking points or citations to authorities. I have no illusion that can be changed, but perhaps a brief primer on the underlying science can equip some readers to go a little farther on their own before having to fall back on experts.
The average temperature (T) of the earth's surface is set almost entirely by a balance between the radiant energy coming in from the sun and going out from the earth. (Energy flow from the earth's innards exists, but is unimportant.) The way in which the spectrum of radiant energy depends on T of a perfectly absorbing/radiating object ("black-body") has been known since the late 19th century, and precisely characterized since 1900, when it formed the basis of Planck's initial quantum hypothesis. The hotter an object is, the more it radiates, and the more the spectrum extends to high frequencies. Since T of the sun's surface is precisely known (from that spectrum!), as are its diameter and distance, one can calculate precisely how hot an ideal black-body here would have to be for its radiation to equal the outgoing radiation. The result is very roughly the earth's average surface T, but not precisely.
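That balance can be sketched in a few lines. Here is a minimal version of the calculation, using standard reference values (my numbers, chosen for illustration) for the sun's surface temperature, radius, and distance:

```python
# Equilibrium temperature of an ideal black-body at the earth's distance
# from the sun: absorbed sunlight balances re-radiated heat.
T_sun = 5772.0        # sun's surface temperature, K
R_sun = 6.957e8       # sun's radius, m
d     = 1.496e11      # earth-sun distance, m

# Balance: pi*R_earth^2 * flux_in = 4*pi*R_earth^2 * sigma*T^4.
# The earth's radius cancels, leaving T = T_sun * sqrt(R_sun / (2*d)).
T_earth = T_sun * (R_sun / (2 * d)) ** 0.5
print(round(T_earth))  # ~278 K, near but not equal to the observed ~288 K
```

Note that the answer doesn't depend on the earth's size at all; both the collecting area and the radiating area scale the same way.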
The earth is not a perfect black-body. A substantial fraction of the incoming energy (mainly visible light) is reflected, e.g. by clouds and ice. An even higher fraction of the outgoing energy (almost all infrared) is trapped by greenhouse gases, especially water vapor, CO2, and methane. As a result, the earth is warmer than an ideal black body would be. Any change in either of those two effects will change the energy balance and thus drive the surface T up or down by an amount ΔT.
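The two corrections pull in opposite directions, and the reflection piece is easy to put in explicitly. A rough sketch, taking the commonly quoted albedo of about 0.3 and the measured solar constant (again, illustrative values of mine):

```python
# Reflection reduces absorbed energy by (1 - albedo); the equilibrium
# temperature scales as the fourth root of the absorbed power.
sigma = 5.67e-8       # Stefan-Boltzmann constant, W m^-2 K^-4
S     = 1361.0        # solar constant at the earth's distance, W/m^2
a     = 0.3           # albedo: fraction of sunlight reflected

absorbed = S * (1 - a) / 4           # averaged over the whole spherical surface
T_eff = (absorbed / sigma) ** 0.25   # temperature at which earth radiates this away
print(round(T_eff))                  # ~255 K

# The observed surface average is ~288 K; the ~33 K gap is the greenhouse
# trapping of outgoing infrared described above.
```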
Up to this point, everything I've written is truly universally agreed upon by every competent scientist. For example, Will Happer, distinguished physicist and noted vociferous climate contrarian, testified to the Senate:
"Atmospheric concentrations of carbon dioxide (CO2) have increased from about 280 to 380 parts per million over past 100 years. The combustion of fossil fuels, coal, oil and natural gas, has contributed to the increase of CO2 in the atmosphere. And finally, increasing concentrations of CO2 in the atmosphere will cause the earth’s surface to warm."
Furthermore, there is only modest uncertainty in the magnitude of these direct effects. Calculating the flow of radiation through a partially absorbing atmosphere is a straightforward engineering exercise, not a problem in complex modeling. Finding the ΔT needed to balance a given energy flow change is a simple sophomore homework problem. A fairly accurate version of this whole calculation was published by Svante Arrhenius in 1906.
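The "sophomore homework problem" is just linearizing the Stefan-Boltzmann law around the current radiating temperature. A sketch, using the widely cited figure of roughly 3.7 W/m² of extra trapping for a doubling of CO2 (my illustrative numbers, not from the text above):

```python
sigma = 5.67e-8   # Stefan-Boltzmann constant, W m^-2 K^-4
T     = 255.0     # effective radiating temperature of the earth, K
dF    = 3.7       # extra trapped energy for doubled CO2, W/m^2 (commonly cited)

# Outgoing radiation is sigma*T^4, so a small imbalance dF is rebalanced by
# dT = dF / (4*sigma*T^3): the derivative of sigma*T^4, inverted.
dT = dF / (4 * sigma * T**3)
print(round(dT, 1))  # ~1.0 C of direct, no-feedback warming
```

This ~1°C is the *direct* effect only; the whole dispute, as the next paragraphs explain, is about how much the feedbacks multiply it.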
Why then is there dispute as to what to expect? The uptake of CO2 by the oceans is somewhat uncertain, but this is not the major problem. The big uncertainty comes because a little warming releases more water vapor, which in turn creates more greenhouse trapping and more cloud reflection; it also melts some reflective ice, may melt areas in which methane is trapped, and so on. (Water plays a special role here because it evaporates and rains out quickly, so it adjusts to a level set by other parameters, such as solar input and CO2 concentration.) Some of these responses amplify ΔT, others counteract it. Quantifying these effects is hard, but essential for estimating the overall size of ΔT.
Let's say that the effects of a 1°C rise created secondary changes of x°C. Those would in turn trigger tertiary changes of x^2 °C, and so on. The net change would be 1°C*(1 + x + x^2 + ...) = 1°C/(1-x), so long as |x| < 1. If the feedback x is positive, it amplifies ΔT. If x is bigger than 1, the series diverges, and the temperature runs away to a new stable point at some completely different value. Despite various past changes in solar radiation and atmospheric properties, the earth's temperature has been reasonably stable for the last 500 million years (although with some ice ages and warm periods), so it seems quite unlikely that x > 1. If modest changes would trigger a runaway effect, it would have already happened. (Possible 'snowball' runaways in the distant past suggest that one can't exclude the possibility of runaway for larger perturbations.) However, there is strong reason to believe x > 0, so these feedback effects amplify the changes even if they don't lead to runaway. The complicated and uncertain modeling is mainly aimed at finding x.
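The geometric series above is easy to check numerically. With an illustrative feedback of x = 0.5 (a number I'm choosing only to show the convergence), the partial sums approach the closed form 1/(1-x) = 2:

```python
# Feedback amplification: a direct 1 C rise triggers x C more, then x^2 C, ...
x = 0.5                       # illustrative feedback strength, |x| < 1
direct = 1.0                  # direct warming, C

total = direct * sum(x**n for n in range(200))   # 1 + x + x^2 + ... (truncated)
closed_form = direct / (1 - x)

print(round(total, 6), closed_form)  # both 2.0: this feedback doubles the warming
```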
One way to estimate x is to look at known changes in solar input, approximately known changes in reflection driven by volcanoes, etc., and known changes in greenhouse absorption, and then see by what factor the direct effects of these 'forcing terms' have to be multiplied to get the recorded ΔT. Because we don't have long, reliable, precise records of some of these terms, this calibration is only approximate. The most important uncertainty is in the change in the atmosphere's reflectivity from industrial emissions. Furthermore, there is a complication: x for greenhouse gases doesn't have to be exactly the same as x for solar changes, because they have different geographical distributions.
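The calibration logic can be made concrete with a toy version, in which every number is invented purely for illustration:

```python
# Toy calibration of the feedback x: compare the direct (no-feedback)
# response to a known forcing with the warming actually recorded.
dF       = 2.0    # hypothetical net forcing over some period, W/m^2
lam      = 3.8    # W/m^2 of extra radiation per K of warming, illustrative
observed = 1.1    # hypothetical recorded warming over the period, K

direct = dF / lam                  # warming the forcing alone would give
amplification = observed / direct  # the multiplier, i.e. 1/(1-x)
x = 1 - direct / observed          # inferred feedback strength

print(round(amplification, 2), round(x, 2))  # 2.09 0.52
```

With real records the inputs carry large error bars, so x comes out as a range, not a number; that is the "only approximate" caveat above.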
So another method is to try to actually model the complex dynamics of the earth's climate with computer codes involving at least hundreds of parameters. There's no way to know all the parameters with sufficient accuracy. In practice, what's done is to combine the modeling with the observed constraints, including measured temperature changes in various locations (oceans, troposphere, stratosphere...). What's left is a large family of models and parameters which fit the observed data well enough. (The less sophisticated skeptics don't seem to realize that if one leaves out the effects of our greenhouse emissions, that family becomes the null set. The other forcing terms, such as recent solar changes, just aren't big enough to compete with the recent CO2 effect, and cannot account for recent warming.) Different members of that family extrapolate to different future effects. The consensus estimates and uncertainties are based on the behavior of this family of models.
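The procedure of keeping only the models that fit observations, then reading off the spread of their extrapolations, can be caricatured in a few lines. Everything here is invented for illustration: a one-parameter "model" stands in for codes with hundreds of parameters.

```python
# Toy version of constraining a family of models with observations.
forcing_past   = 2.0        # assumed past greenhouse forcing, W/m^2
forcing_future = 3.7        # assumed future forcing (e.g. CO2 doubling), W/m^2
lam            = 3.8        # direct response, W/m^2 per K (illustrative)
observed, err  = 1.0, 0.2   # assumed recorded warming and its uncertainty, K

# Candidate models differ only in the feedback strength x.
candidates = [i / 100 for i in range(0, 90)]      # x from 0.00 to 0.89
def warming(x, dF): return (dF / lam) / (1 - x)   # direct effect / (1 - x)

# Keep only the members that fit the 'record' well enough.
family = [x for x in candidates if abs(warming(x, forcing_past) - observed) < err]

# Surviving members extrapolate to a *range* of future warming.
projections = [warming(x, forcing_future) for x in family]
print(round(min(projections), 2), round(max(projections), 2))
```

Notice that a model with x = 0 (no greenhouse feedback at all) does not survive the observational cut here, which is the toy analog of the point about the null set above.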
This introduction has, for brevity, left many FAQs unanswered. Happer's basic contrarian point does need to be addressed, however. His general claim (apart from some non-expert arguments about the feedback modeling) is that a warmer world with more CO2 could be a very nice place. It's true that our current climate is not uniquely good. What is uniquely good is that it isn't making any very rapid changes. Pine forests cannot quickly move 500 miles north to escape new insect predators, and Bangladesh cannot quickly relocate to Greenland. Global warmth may be fine, but rapid global warming is dangerous for organisms like pine trees and us. The human costs of rapid climate change need to be compared with the costs of mitigation, allowing for uncertainties of both signs in both quantities. Given that the problems grow faster than linearly in ΔT, uncertainty about ΔT actually increases the average expected cost of the status quo, despite the general impression that uncertainties favor inaction.
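That last point, that uncertainty raises the expected cost when damages grow faster than linearly in ΔT, is just the convexity of the cost function (Jensen's inequality). A toy illustration with an invented quadratic damage function:

```python
# If cost grows like dT^2, a spread of possible outcomes costs more on
# average than their mean outcome would, even though the mean is the same.
def cost(dT): return dT**2   # invented convex damage function, for illustration

scenarios = [1.0, 2.0, 3.0]  # equally likely dT outcomes, mean = 2.0

expected_cost = sum(cost(d) for d in scenarios) / len(scenarios)
cost_of_mean  = cost(sum(scenarios) / len(scenarios))

print(expected_cost, cost_of_mean)  # ~4.67 vs 4.0: uncertainty adds cost
```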
"At the advent of danger, there are always two voices that speak with equal force in the human heart: one very reasonably invites a man to consider the nature of the peril and the means of escaping it; the other, with a still greater show of reason, argues that it is too depressing and painful to think of the danger since it is not in man's power to foresee everything and avert the general march of events, and it is better therefore to shut one's eye's to the disagreeable until it actually comes, and to think instead of what is pleasant." Leo Tolstoy