Jon Cassidy has just written a piece for Human Events taking issue with a blog post of mine and a related article by Chris Mooney, both of which contend that conservatives have a bigger problem with factual accuracy than liberals do. The piece, meant to defend conservatives against the charge of having little concern for reality, is called: "PolitiFact bias: Does the GOP tell nine times more lies than left? Really?"
Does PolitiFact say Republicans lie nine times more? Really?
No, but I'll come back to that. I'm not going to delve into his assertions that Michele Bachmann's "death panels" claim has merit or that Obama robbed Medicare. Dozens of people have written about those examples, and Cassidy adds nothing that advances the argument. What makes his critique worth commenting on is that it is a rare case in which both left and right should agree on the basic facts.
The complaint people like me have had with the current state of our political discourse is that ideally we ought to all be able to agree on the facts but differ on how we respond in terms of policy. Here is a place where we should both agree: fact-checkers find conservatives more wrong, more often. Where we differ is in our interpretations: Cassidy sees this as proof of liberal bias, whereas I see it as evidence conservatives are more wrong, more often.
Even more interesting, this is a rare instance in which our "confirmation bias" ought to push us both in the same direction: Cassidy wants data showing bias, and I want data showing Republicans are wrong. So it is instructive to see how left and right deal with the same facts when our ideological "needs" are in sync.
One piece of evidence Mooney uses in his piece for The Nation, "Reality Bites Republicans", is a study by the University of Minnesota's Humphrey School of Public Affairs. The study found that from January 2010 through January 2011:
[W]hile the site fact-checked roughly as many statements by current or former Democratic elected officials as current or former Republican officeholders during this period (179 versus 191, respectively), Republicans were overwhelmingly more likely to draw a "false" or even "pants on fire" rating (the worst of all). Out of the ninety-eight politicians' statements that received these dismal ratings, seventy-four were made by Republicans or 76 percent. Sarah Palin and Michele Bachmann fared worst, with eight and seven PolitiFact slams, respectively.
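As a quick check, that percentage follows directly from the counts in the quoted passage:

$$\frac{74}{98} \approx 0.755 \approx 76\%$$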
Cassidy cites the study, too, but only in passing. What he chooses to focus on instead is a study by PolitiFactBias, a blog run by Jeff Dyberg and Bryan White, "independent bloggers who share a sense of outrage that PolitiFact often peddles outrageous slant as objective news." This is how Cassidy presents their study:
By one count, from the end of that partnership to the end of 2011, the national PolitiFact operation has issued 119 Pants on Fire ratings for Republican or conservative claims, and only 13 for liberal or Democratic claims.
In another tally, just of claims made by elected officials, Republicans lose 64-10 over the same three-year period.
Those numbers were compiled by Bryan White, who co-founded PolitiFactBias, a blog dedicated to chronicling examples of what he considers poor reasoning, sloppy research, or bias by PolitiFact.
It is those numbers Cassidy relies on for the title and the entire thrust of his piece. PolitiFact finds Republicans wrong nine times more often than Democrats! I'd love to say that, too, but those numbers don't even pass the smell test. UM's 76 percent is startling enough, but Cassidy had to go for nine times.
There's nothing wrong with what the guys at PolitiFactBias are doing. I write for blogs and am not above performing a little research of my own, but White's research, no matter how well intentioned, is not peer reviewed. It is not a legitimate academic study. It is compiled by a person for the explicit purpose of discrediting PolitiFact. And White is under no illusions about that. In an email, he wrote, "my research approach is ridiculously easy, so duplicating my results would be simple for virtually anyone."
White's study provides graphs of his results. In describing his methodology, he says he throws out "potentially ambiguous" categories and party-on-party claims, "reasoning that a contest between two Republicans or two Democrats makes the dilemma for biased journalists more complicated and thus less useful as a measurement of partisan bias." He acknowledges that "PolitiFact journalists may realize what the rating data say about their bias and make allowances to appear more fair." In short, he displays the sort of approach I would expect from an intellectually honest person.
But nowhere in the study could I find where PolitiFact finds Republicans wrong nine times more often. I asked White if he had any idea where the number came from. He wrote:
That 9x figure had me puzzled for a bit as well, since it is not a figure anywhere emphasized in my study. But I think I have the answer you're looking for. Cassidy looked purely at the disparity between the total number of supposed "Pants on Fire" statements in my "C" group since the end of the partnership with Congressional Quarterly… That was the group I hypothesized probably represented an exaggerated measure of bias since it includes email claims (predominantly from conservatives and predominantly far-fetched). I think the best way to look at the disparities is to examine them as a percentage of the total number of false statements by party. If you read the study, as you appear to have done, you'll know why.
White sent me his data, and the numbers do appear to match the figures for chain emails alone. Needless to say, Cassidy does not tell his readers about this selective number; he presents it as Pants on Fire ratings for Republican or conservative claims versus liberal or Democratic claims, period.
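If I'm reading this right, the arithmetic behind Cassidy's headline is nothing more than the ratio of the two totals he quotes, with the elected-officials tally yielding a considerably smaller multiple:

$$\frac{119}{13} \approx 9.2, \qquad \frac{64}{10} = 6.4$$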
Here's the thing: Cassidy didn't need the nine-times figure. He could have stuck with the UM study, but it wasn't sexy enough. He chose instead to highlight the research of a blog to get more extreme numbers. Finding even those numbers not extreme enough, he cherry-picked data to the point that the study's own author didn't recognize the result.
Cassidy was happy to let his readers assume he was referring to PolitiFact's findings as a whole, when in fact White found PolitiFact issued "Pants on Fire" ratings to conservative falsehoods 74 percent of the time, just two percentage points below the UM study's figure. (White thinks the near-match is a coincidence, since the two studies measure different things: "Perhaps the numbers match because the degree of bias is similar, but it's probably a stretch to credit me with accuracy based on the similarity of the numbers.")
When Cassidy and I wrote about PolitiFact's skew, I chose to reference the scholarly study. Cassidy preferred the study from a blog with an agenda similar to his own, and then misled his readers about what that study actually said. And then there is the other conservative, White, who, though a simple blogger doing his own research, has shown himself in this case to be more intellectually honest than the esteemed Human Events.
It is not conservatives like White I have a problem with, but Cassidy, who embodies the traits supporting my thesis that PolitiFact's findings have more to do with the modern right's uncomfortable relationship with facts than with PolitiFact's political leanings.
This also illustrates why I don't think one needs to show liberals are less prone to confirmation bias than conservatives to explain the right's embrace of wacko theories ranging from Obama's birth certificate to climate denial. We are all biased. The difference is that the left puts its faith in the scientific process, which works precisely because it corrects for those tendencies. We prefer non-affiliated outlets tasked with analysis over partisan outfits with a mission of pushing a particular agenda. Because we are more comfortable with ambiguity, we are content to make our case with a mere 76 percent, without having to manufacture a nine-times figure to make it more black and white.
Cross-posted at Skepticism Examiner