One of the longest-running climate denial narratives has been that the thermometer record showing a steep rise in temperatures is flawed and unreliable because cities exist and are warm, while satellite data shows no warming to worry about.
They have it backwards, of course. The thermometer record is one of the most solid data sets around (it’s been standardized and officially measured for a very long time), while the satellite data isn’t even actually data, but a series of inferences that have repeatedly had to be corrected to show more warming than deniers would like to accept. Even a Koch-funded effort to examine the temperature data (the Berkeley Earth project) and try to prove the Urban Heat Island effect was the real reason for the perceived warming revealed that actually, nope, climate scientists were right all along!
Still, though, according to deniers, climate scientists are using unreliable thermometer readings to make the case for warming.
Keep that in mind as we turn to a recent climate-and-wildfires fact check of a Climate Realism post, in which Heartland frames its climate denial as a complaint about the news Google serves up when you search for ‘climate change.’ With the same stupid formula as always, Heartland’s H. Sterling Burnett wrote last week that “near the top of a Google news search for the phrase ‘climate change’ today turns up a story in The Hill claiming the media is failing to properly place the blame for wildfires on climate change.”
Apparently The Hill’s claim is false because some stories have indeed “blamed climate change for wildfires,” but per Burnett, “they are all wrong. Data do not show a significant increase in the number of wildfires or the acreage burned by them.” In fact, “Climate Realism has refuted more than 46 news stories” making the point that warming makes fires worse, because “the data does not support these stories’ claims.”
Well, about that data… As has been pointed out often over the many years Heartland has been lazily repeating itself, the fire data deniers use to claim the 1920s and 1930s saw far more acreage burned is wildly unreliable, and this year it was even removed from the government website where it used to be housed, replaced with a disclaimer that the old data is unreliable. Why? For starters, fire researcher John Abatzoglou told Jon Greenberg at PolitiFact that “some fires can be counted in triplicate as multiple agencies responding to the fire would count that fire in their summary statistics.” Record keeping was duplicative because each agency had its own methods, so the final tally would double- or triple-count fires, depending on how many different federal bodies were involved.
And just as with the temperature record, even other Koch-funded sources have acknowledged that the data is unreliable: Greenberg cites a 2018 article by the Cato Institute’s Randal O’Toole explaining why “the data before about 1955 are a lie.” In short, and as corroborated by other, more reliable sources, the Forest Service counted intentional fires set to control underbrush as wildfires, dramatically inflating the statistics.
So because different agencies kept records differently, and because intentionally set, managed fires were counted as wildfires, the data before 1983 is well known to be unreliable and has since been removed from the public website to prevent any further confusion.
Accordingly, PolitiFact rated Heartland’s claim “Mostly False.” So if climate denial is really about the integrity of the data, surely Heartland won’t recycle another 46 nearly identical blog posts relying on data that’s well known to be unreliable.
All while claiming that the data proven to be reliable is the part that’s supposedly suspect.
If it’s just about repeating a lie often enough to try to make it true, though, we eagerly await Heartland’s next “Mostly False” post.