What do you think you know about the effects of global warming on forest fires?
For example, you probably believe that global warming has been going on for a decade or more - so do I. But do you think that global warming means more forest fires? It doesn't.
Senator Tester (MT) has introduced a forest bill affecting part of MT and Senator Wyden has a bill in the discussion draft stage that will affect all of OR if passed. I think both are very good bills, and I'd like to do a diary on why, but both bills are somewhat technical, so my goal in this diary is to lay a foundation about wildfire that I can reference if I actually get around to writing the other diary.
And in the process, I'd like to provide a clearer view of the interaction between climate change and wildfire.
The paper about climate change and wildfire that has probably received the most media attention is a 2006 paper by Westerling, et al entitled Warming and Earlier Spring Increase Western U.S. Forest Wildfire Activity. It's a fairly easy-to-read paper which, unfortunately, ignores many factors that influence the occurrence and size of fires (which, in turn, aren't even as important as some other attributes of fires generally), and relies on a "correlation equals causation" argument to draw its conclusions.
The Westerling paper argues that:
... we show that large wildfire activity increased suddenly and markedly in the mid-1980s, with higher large-wildfire frequency, longer wildfire durations, and longer wildfire seasons. The greatest increases occurred in mid-elevation, Northern Rockies forests, where land-use histories have relatively little effect on fire risks and are strongly associated with increased spring and summer temperatures and an earlier spring snowmelt.
Westerling, 2006
In the paper, the authors clearly state that 'wildfire' means fires larger than 1000 acres (400 ha), although "more large fires" almost always gets translated into just "more fires". The paper was widely reported in the media, and widely referenced since, almost always with the implication of "more and bigger fires" (as did one of the authors when interviewed on NPR), when in fact the paper really only claims "more big fires", and that's an important difference.
An understanding of fire basics is helpful in understanding this paper, and its failings. For a fire to happen, three things are necessary, which are usually represented as the three sides of the "fire triangle": heat, oxygen, fuel. Remove any one of the three sides, and the triangle (or fire) collapses.
Blowing out a candle or using a CO2 fire extinguisher removes oxygen; throwing water on a fire mostly removes heat; when wildland firefighters do a "backburn" or "burn out", they remove fuel. Alternatively, if you want to start a fire, you need all three elements.
Oxygen has been available at any time in the last 470 million years, and will be at any time in the foreseeable future, regardless of CO2 concentrations. Fuel, in the case of a forest fire, is not just trees (in fact trees don't count much at all when estimating fuel loads in forests) - it's grasses, forbs (wildflowers, lightweight plants), woody shrubs, dead or down trees, houses (as Californians well know) and even people in the worst conflagrations.
But global warming will never produce the heat necessary to ignite a fire. That takes an ignition source, and those come in two flavors: natural and human. Natural ignition sources can be a volcanic eruption or sparks from a rock rolling down a hill, but almost entirely, natural ignitions are caused by lightning. Human ignitions are nearly any other cause - a tossed cigarette, an escaped campfire, burning love letters in the woods, sparks from a locomotive or welding, downed electrical wires, arson, etc.
Humans cause fires more often than lightning by a ratio of more than 5 to 1. (Statistics are for 2001-2008 from this table from the National Interagency Fire Center). But in the western US (the region covered in the Westerling paper), lightning-ignited fires burn more acres by more than 3 to 1, or nearly 4 to 1 if CA fires are ignored. In S CA, nearly 90% of acres burned over the period tabulated were from human-ignited fires. In the eastern and southern US, lightning only accounted for 9% and 21% of acres burned respectively.
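To make the arithmetic behind those ratios and percentages explicit, here's a minimal sketch in Python. The acreage figures in it are hypothetical placeholders, not the actual NIFC numbers; it only shows how a share or a ratio of this kind is derived from a table of acres burned by ignition source:

```python
# Sketch of the ignition-statistics arithmetic described above.
# The acreage figures are hypothetical, NOT the real NIFC data;
# they only illustrate how the shares and ratios are computed.

def ignition_shares(acres_by_cause):
    """Return each cause's fraction of total acres burned."""
    total = sum(acres_by_cause.values())
    return {cause: acres / total for cause, acres in acres_by_cause.items()}

# Hypothetical multi-year western-US totals, acres burned by cause:
west = {"lightning": 7_500_000, "human": 2_500_000}

shares = ignition_shares(west)
ratio = west["lightning"] / west["human"]

print(f"lightning share of acres burned: {shares['lightning']:.0%}")
print(f"lightning-to-human acreage ratio: {ratio:.0f} to 1")
```

With these made-up inputs the lightning share comes out at 75%, inside the 73% to 78% range quoted below, and the ratio at 3 to 1; the real calculation is the same division applied to the NIFC table.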
That's not to say human caused fires can't be large - in 2002, the Rodeo fire was started by a firefighter hoping to create work and the Chediski fire by a lost hiker trying to signal a news helicopter, and when the fires combined, over 467,000 acres had burned. But on average, 73% to 78% of acres burned in the west were in fires ignited by lightning. That's because most unextinguished lightning fires (and most are extinguished before growing large) start in remote country, develop slowly, and then blow up and become difficult to fight.
Now here's the thing about lightning storms: they rarely produce only a single bolt. In fact during a typical storm, with or without rain, hundreds or thousands of strikes to ground occur. Not all ignite fires of course, and of the fires ignited, only about 3% grow to more than 10 acres. But when lightning strikes, it doesn't know if it's going to cause no fire, a small fire or a huge fire.
So the argument in the Westerling paper that higher temperatures and longer fire seasons ("...earlier spring snowmelt") are responsible for more large fires would seem to require that the same conditions would produce more fires overall.
They don't.
Back to the National Interagency Fire Center again: this table lists the number of fires and total acres burned (nationally - probably includes brush and grass fires) from 1960 to 2008. In 2008, 78,979 fires were started, but the peak year for fires, by more than 3 times, was 1981, with 249,370. And it isn't just a single year.
The authors of the Westerling paper, to determine if fires were larger now, divided their data into two periods: 1970-1986 and 1987-2004. From 1970 to 1982, there isn't a single year with less than 100,000 fires (the minimum was 108,398). From 1987 to 2008 there isn't a single year that has that many fires (the maximum is 96,385) and both the trend and 10 year moving averages over the entire period confirm that result.
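The trailing moving average used to confirm that result is simple to compute. Here's a short sketch, with a hypothetical series of annual fire counts standing in for the NIFC table (the real analysis would use the national counts from 1960 to 2008):

```python
# A 10-year trailing moving average over annual fire counts.
# The sample series is hypothetical; the real comparison would use
# the NIFC table of national fire counts, 1960-2008.

def moving_average(values, window=10):
    """Trailing moving average; one value per complete window."""
    return [sum(values[i - window:i]) / window
            for i in range(window, len(values) + 1)]

# Hypothetical annual fire counts, higher early and lower late,
# mimicking the pattern described in the text:
counts = [150_000, 140_000, 160_000, 130_000, 120_000,
          110_000, 100_000,  95_000,  90_000,  85_000,
           80_000,  82_000,  78_000,  75_000,  70_000]

avgs = moving_average(counts, window=10)
print(avgs[0], avgs[-1])  # the smoothed series declines over time
```

Smoothing over a 10-year window is what lets a trend claim survive single anomalous years like 1983-1984, which is exactly why the choice of period boundaries matters so much in the comparison above.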
So does global warming mean more forest fires? No - not if global warming is already occurring, as both I and the authors of the paper would claim. The authors don't claim "more fires", just "more big fires", but some explanation for the decrease in total number of fires over a period of warming would be necessary to salvage the assertion of longer fire seasons/more fire weather.
If "more" isn't true, how about "more bigger"? That's certainly true, as a look at the NIFC link will show. But is that caused by global warming? Some perspective is needed. In the 1870s and 1880s, there were fires over a million acres in WI, MN and MI. In 1910 (remember that date), the "Big Burn" - the largest wildfire in US history - burned over 3 million acres in WA, ID and MT. And in fact, "[f]rom a historical perspective it appears that current levels of area burned are low compared to the last 500 years".(Climate Change and Forest Fires, Flannigan, et al, 2000).
If that didn't raise some skepticism, let me repeat it more clearly: around 1910 and 500 years before, many more acres burned annually than during the exceptionally bad fire years from 2000 to 2007. What?! You might say - that's stupid! Back then there wasn't any Forest Service to fight fires, and for most of that time the west hadn't been settled by Europeans.
Exactly.
Something besides climate has a significant effect on the size of individual large fires, or the total acres burned in a year. But you won't find that in the Westerling paper, because it only correlates climate with large fires and doesn't really look at causality.
The Big Burn in 1910 was a watershed in US fire history (see, for example, Timothy Egan's recent book entitled The Big Burn for the politics and facts). The fledgling Forest Service performed heroically and firefighters died in that blaze, and Pinchot and Teddy Roosevelt (both out of office) were able to leverage those heroics against timber special interests to bring a more rational and sustainable forest policy to bear, principally through the growth of the US Forest Service. And before 1910, climate and acres burned correlated strongly nearly everywhere in the west. After 1910, that correlation remained in some places, and disappeared almost entirely in others, probably due in large part to fire suppression.
So let's look at the two intervals the Westerling authors used for comparison - 1970 to 1986 and 1987 to 2004 (essentially "non-global warming" and "global warming" - my terms). If you look again at the NIFC table, you'll see that near the tail end of the "non-global warming" period the years 1983 and 1984 had an exceptionally low number of both fires and acres burned - shifting the "global warming" period back a few years would have made some difference in the results.
But let's focus for a moment on one year - 1988 - and one fire - Yellowstone - to see another deficiency in the way the data was used. The 1988 Yellowstone fire was unique in several respects. The first was something called "fire return interval". For a variety of reasons (for example, a heavily burned area might have to grow back before it can burn again), fires occur at somewhat regular intervals in specific areas. In some forest types, fires, before fire suppression, occurred every 2 to 5 years on average. In Yellowstone and surrounding forest, the fire return interval was 250 to 300 years. 1.5 million acres burned in 1988.
During the rest of the 20th century, across 99 years, only 50,000 acres in Yellowstone burned in total ("...c. 1700 to 1987, intermediate to small areas burned. The near-absence of fires in the twentieth century prior to the large fires of 1988 is evident in the charcoal record." A 750-year fire history based on lake sediment records in central Yellowstone National Park, USA). The paper doesn't take into account fire return intervals, for example the possibility that a number of "300 year fires" or something similar aligned in their second period, when the effects of global warming were expected to occur.
More than being an outlier, the Yellowstone fire was managed differently than any large fire before. Beginning in the late 1960s, fire ecologists recognized, and fire managers were slowly convinced, that fire was a natural feature, in fact a necessary feature of forest ecosystems, and should be allowed to burn, rather than being aggressively suppressed. The Forest Service now calls this policy "Wildland fire use for resource benefit" (or just the shorter "wildland fire use"). Some authors, and most detractors of the policy, call it "let it burn". Which is initially what was done with the Yellowstone fire, and one of the reasons it grew so large.
Look at the recent changes in how western forests have been dealt with. During the 1960s and into the 1970s, logging, including large clear cuts, was rampant in the west. By 1976, logging on public lands produced about 6 billion cu ft of timber annually. By 2001, that number was down to 2 billion cu ft and still falling.
Concurrently, in 1964, Congress passed The Wilderness Act, which withdrew ever-increasing acreages from exploitation and nearly any human activity - growing from zero at passage to over 104 million acres today. In 2000, another 65 million acres of roadless areas were added.
Logging, as practiced in the 1960s or even today in many cases, is not a contributor to forest health generally. But clear cutting does remove huge amounts of fuel from forests and creates large clear areas (initially) that act as fire breaks. The first interval in the paper is proximate to heavy logging; the second interval is one of increasing regrowth of clearcut areas - more fuels.
In areas that are now wilderness or roadless, fire suppression was pursued more aggressively in the 1960s and 1970s. That would serve to depress burned acreages in the earlier interval's years.
Just as in Yellowstone, fire management region-wide shifted to a new paradigm, particularly in wilderness and roadless areas. Beginning in about 1982 and increasing in subsequent years, some fires (although probably never more than 5% of acres burned in any given year) were "let burn". But fire managers also attacked fires differently.
The priorities of the Forest Service increasingly shifted to 1) firefighter safety and 2) protection of private property in what's called the "wildland/urban interface" (that means pretty much any place there's a house). In addition, while some might consider wilderness "pristine", some wilderness had large areas of diseased or insect-killed trees - more fire-prone - or large fuel buildups from years of fire suppression or natural disturbance like windthrow. And it's difficult to put firefighters in remote areas, especially when safety is a higher priority than firefighting.
For all of those reasons, fire managers have shifted to more indirect attack of fires, using roads or existing burned areas as fire boundaries and letting more remote or wild areas simply burn - not only as official "let burn" prescribed fire, but as unofficial "let burn" firefighting tactics. As Stephen Pyne noted:
Recently, I had occasion to examine the history of fire on the Kaibab Plateau (and Grand Canyon National Park, USA) and could map the order-of-magnitude increase in burned area directly to reforms in policy, personnel, and practices. The program committed significant amounts of money and administrative attention to increasing the amount of burning on the land. Instead of suppressing new fires immediately, they have granted more room for fire to roam. Twice, fires left to burn ("wildland fire use") have blown up, once to 50,000 acres and again to 58,000 acres. Two decades ago, they would have been hit and held immediately (since modern record-keeping the largest burn was 6,000 acres, and 300 acres was considered nearly a fire of record in the park). Similarly, two prescribed fires have escaped, and yielded big burns, one causing the park to be evacuated and closed.
While the old strategy, aiming at fire exclusion, was by itself unsustainable, it is clear that choices about how to contain fire, and when and where to set fires, have altered the equations. They have done exactly what they were supposed to do. They boosted burned area. Of course, one case study is an anecdote, not a statistic, but until similar studies have examined the remaining public domain, it is impossible to blame global warming or extended fire seasons or a legacy of fuels buildup alone or together for the inflation of burned area.
If that isn't enough to question the validity of the remaining claim of the Westerling paper - more bigger fires due to climate change - here's a last detail, returning to the fire triangle's fuel leg.
As Pyne points out, because fire size is determined by firefighting policy to a large extent, neither global warming nor fuel buildups can be definitively called the cause of larger fires. But there are a couple of fundamental questions the Westerling paper ignores completely: Are all fires bad? If not, which fires are bad?
The implicit assumption in the "global warming/more fires" idea is that fire is bad, and as noted above, it's not. It follows then that more fire isn't necessarily bad either. Or good. It depends on what burns and how. In Yellowstone, the forests were largely over-mature lodgepole pine which were due for a high intensity stand replacement fire (a stand replacement fire is one in which nearly all trees are killed).
But in many western forests, fire has been excluded for as many as 20 normal fire cycles, and more fires would be good - if the forests were in a condition to allow normal burning. In forests that burn frequently, the health of the trees depends on frequent fire to eliminate competing brush and small trees and to recycle nutrients.
When fire cycles are missed, the large trees that should survive a fire are stressed, and worse, the fuels that would be kept to a minimum by frequent fire build up. What this kind of forest wants is a low intensity ground fire that trees in this kind of ecosystem have adapted to and can survive, and even need to stay healthy. What it gets, instead, is a high intensity crown fire that not only kills nearly everything, but sterilizes the soil so that any recovery takes longer. If climate change really led to more frequent fires, that would be a good thing for this ecosystem, and forests would eventually become more fire resistant - it would be a return to pre-Eurosettlement conditions. McKenzie, et al suggest that "... in ecosystems whose fire regimes have recently been altered by fire exclusion, climatic change may accelerate the restoration of historic fire regimes, thereby reducing threats to vulnerable species" (Climatic Change, Wildfire and Conservation (2004) - this is only cited as possible, and is counter to the gist of the entire paper, which is a much better look at fire and climate change than the Westerling paper).
That leads to the fundamental problem with the data in the Westerling paper - it's about acres burned, and acres burned is not really a meaningful statistic. For example, the Biscuit Fire in S OR/N CA in 2002 was a 500,000 acre fire - large by any standard. But the 500,000 acre number included within its boundaries about 300,000 acres that either didn't burn or burned lightly. What's important is how forests burn, not how many acres are within the fire boundary.
After all this, it may seem odd to state that my own opinion is that climate change, assuming regional models are correct and climate change produces more fire weather and longer fire seasons, will lead to more fires. As to why the number of fires has decreased so far with global warming - I have no explanation, other than it's probably some other factor besides climate. Fire weather continues to occur regularly throughout the western US, as it always has. Fires may even be larger, although neither the paper criticized here nor other (better in my opinion) papers have been convincing so far in linking that statistic to climate.
What is not a given is that more or even larger fires have to be consistently more destructive fires. Fires - especially large, human ignited fires - have occurred in every year for the last 10,000 or so years, and forests and wildlife both have adapted to and actually need fire, as much as they need water or nutrients. In a sequel to this, I hope to discuss the Tester and Wyden forest bills, which, if enacted, would go a long way towards returning fire to a beneficial feature of western US forests.
In brief then, most research linking warming and fire concludes that more fire weather and longer fire seasons will create more dry fuels which will ignite more easily and burn over larger areas, and to an extent these papers assert the dominance of climate change over fuels as determining future fire behavior. But implicit in that assertion is that the fuels are there to begin with. Here's a simple experiment you can try next time you're in the woods: hold a lighted match against a tree - the odds are pretty good it won't burn. Next hold a lighted match against some dry grass, and as you're running away from the resulting conflagration, watch as the fire spreads up the "fire ladder", from grass to light forbs to woody brush up the sides of trees and into the crowns. Remove the high density fuels, break the fire ladder, and the fire will stay on the ground and most trees (and by implication - the forest) will survive and actually be improved. And that's true no matter what future climate turns out to be.