Many accuse fact checkers like PolitiFact and The Fact Checker of bias. Most of these accusations come from the right, the most prominent example being politifactbias.com. Conservatives don't focus as heavily on The Washington Post's Fact Checker, perhaps because its rulings are apparently more centrist than PolitiFact's, and because PolitiFact's rulings apparently favor Democrats at least a little bit [1].
We can use Malark-O-Meter's recent analysis of the 2012 election candidates' factuality to estimate the magnitude of liberal bias necessary to explain the differences observed between the two parties, and to quantify our uncertainty about the size of that bias.
The simplest way to do this is to re-interpret my findings as measuring the average liberal bias of the two fact checkers, assuming that there is no difference between the two tickets. The appropriate comparison here is what I call the collated ticket malarkey, which sums all statements that the members of a ticket make in each category, then calculates the malarkey score from the collated ticket. Using statistical simulation methods, I've estimated the probability distribution of the ratio of the collated malarkey scores of Rymney to Obiden.
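For readers curious about the mechanics, here is a minimal sketch of that kind of simulation in Python. The ruling counts, the six-category scheme, and the category weights below are all hypothetical placeholders rather than Malark-O-Meter's actual data or exact scoring formula; the point is only to show how resampling ruling proportions yields a distribution over the ratio of two collated malarkey scores.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical collated ruling counts per ticket, ordered from
# "True" to "Pants on Fire". Real counts would come from
# PolitiFact and The Fact Checker rulings.
rymney_counts = np.array([30, 40, 50, 40, 30, 10])
obiden_counts = np.array([40, 50, 50, 30, 20, 5])

# Assumed malarkey weight per category (0 = fully true,
# 1 = fully false); Malark-O-Meter's exact formula may differ.
weights = np.array([0.0, 0.2, 0.4, 0.6, 0.8, 1.0])

def simulate_scores(counts, n_draws=100_000):
    # Dirichlet resampling propagates our uncertainty about the
    # category proportions implied by the observed ruling counts.
    props = rng.dirichlet(counts + 1, size=n_draws)
    return props @ weights  # one malarkey score per draw

ratio = simulate_scores(rymney_counts) / simulate_scores(obiden_counts)
lo, mid, hi = np.percentile(ratio, [2.5, 50, 97.5])
print(f"ratio of malarkey scores: {mid:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```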
Here's a plot of that distribution, with the bounds of the 95% confidence interval labeled on either side of the mean ratio. The white line lies at equal malarkey scores between the two tickets.
(You'll have to check out the original post at Malark-O-blog to see the histogram.)
Interpreted as a true comparison of factuality, the probability distribution indicates that we can expect Rymney's statements to be, on average, 17% more full of malarkey than Obiden's, although we can be 95% confident that the comparison lies somewhere between 8% and 27% more red than blue malarkey.
Interpreted as an indicator of the average bias of PolitiFact and The Fact Checker, the probability distribution suggests that, if the two tickets spew equal amounts of malarkey, then the fact checkers on average rate the Democratic ticket's statements as somewhere between 8% and 27% more truthful than the Republican ticket's statements.
I'm going to speak against my subjective beliefs as a bleeding heart liberal and say that amount of bias isn't all that unrealistic, even if the bias is entirely subconscious.
If instead we believed, like a moderate conservative, that the true comparison is reversed - that is, if we believed that Obiden spews 17% more malarkey than Rymney - then the distribution suggests that the fact checkers' average bias is somewhere between 16% and 54% in the Democrats' favor, with a mean estimated bias of 34%. The bias must be roughly twice as large in this scenario because the rulings would have to both erase the true Democratic disadvantage and manufacture the apparent Republican one.
It seems unrealistic to me that PolitiFact and The Fact Checker are on average that biased against the Republican party, even subconsciously. So while I think it's likely that bias could inflate the difference between the Republicans and Democrats, I find it much less likely that bias has reversed the comparison between the two tickets. Of course, these beliefs are based on hunches. Unlike politifactbias.com's rhetoric and limited quantitative analysis, however, they are at least grounded in good estimates of the possible bias and of our uncertainty in it.
It isn't just conservatives who accuse PolitiFact and The Fact Checker of bias. Believe it or not, liberals do, too. Liberals accuse fact checkers of being too centrist in a supposedly misguided quest to appear fair. You can look to Rachel Maddow as a representative of this camp. Maddow's accusations, like politifactbias.com's, typically nitpick a few choice rulings (which is funny, because a lot of critics on both sides accuse PolitiFact and The Fact Checker of cherrypicking).
Such accusations amount to the suggestion that fact checkers artificially shrink the difference between the two parties, making the histogram that I showed above incorrectly hover close to a ratio of one. So how much centrist bias do the fact checkers have on average?
Well, let's assume for a moment that we don't know which party spews more malarkey. We just know that, as I've estimated, the fact checkers on average rule that one party spews somewhere between 1.08 and 1.27 times the malarkey that the other party spews. Now let's put on a Rachel Maddow wig or a Rush Limbaugh bald cap and fat suit to become true partisans who believe the other side is actually, say, 95% full of crap, while our side is only 5% full of crap. This belief implies a ratio of 19 to 1 (0.95 divided by 0.05) comparing the malarkey of the enemy to that of our preferred party. Already, it seems unrealistic. But let's continue.
Next, divide each bin in the histogram I showed above by 19, which is the "true" ratio according to the partisans. The result is a measure of the alleged centrist bias of the average fact checker (at least at PolitiFact or The Fact Checker). Get a load of the 95% confidence interval of this new distribution: it runs from about 6% to about 7%. That is, a partisan would conclude that PolitiFact and The Fact Checker are on average so centrist that their rulings shrink the difference between the two parties to a mere SIX PERCENT of what it "truly" is.
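To make that arithmetic concrete, here's a tiny follow-up sketch in the same hypothetical spirit. I stub the simulated ratio with a normal approximation matching the reported 1.08-1.27 interval; the real analysis would reuse the actual simulation draws.

```python
import numpy as np

rng = np.random.default_rng(0)
# Stand-in for the simulated ratio draws: a normal approximation
# whose 95% interval is roughly 1.08 to 1.27.
ratio = rng.normal(loc=1.17, scale=0.048, size=100_000)

true_ratio = 0.95 / 0.05  # the partisan's assumed 19-to-1 "truth"
centrist_bias = ratio / true_ratio
lo, hi = np.percentile(centrist_bias, [2.5, 97.5])
print(f"implied centrist shrinkage: {lo:.1%} to {hi:.1%}")
# Prints roughly 5.7% to 6.7% - about six percent of the
# partisan's "true" difference, as described above.
```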
I don't know about you, but I find this accusation as hard to swallow as, if not harder than, the accusation that there is minor partisan bias among fact checkers.
Then again, my belief that fact checkers on average get it about right is entirely subjective. Given the data we currently have, it is not possible to tell how much partisan bias, centrist bias, honest mistakes, and honest fact checking each contribute to the differences that I have estimated.
So what is the way forward? How can we create a system of fact checking that is less susceptible to accusations of bias, whether partisan or centrist? Here are my suggestions, which will require a lot of investment and time.
1. More fact checking organizations. We need more large-scale fact checking institutions that provide categorical rulings like The Fact Checker and PolitiFact. The more fact checker rulings we have access to, the more fact checker rulings we can analyze and combine into some (possibly weighted) average (see the sketch after this list).
2. More fact checkers. We need more fact checkers in each institution so that we can rate more statements. The more statements we can rate, the weaker selection bias will be because, after some point, you can't cherrypick anymore.
3. Blind fact checkers. After the statements are collected, they should be passed to people who do not see who made each statement. It will sometimes be possible to figure out who made a statement, particularly when it is egregious or repeated by a specific party or individual. Still, many statements that fact checkers examine can be stripped of information about the individuals or parties involved so that fact checkers can concentrate on the facts.
4. Embrace the partisans and centrists. There should be at least one institution that employs professional fact checkers who sit, according to some objective measure, at different points along the political dimensions that political scientists usually measure. So long as they are professional fact checkers and not simply politically motivated hacks, let these avowed partisans and centrists subconsciously cherrypick, waffle, and misrule to their heart's content. Then we can actually measure the amount of subconscious bias rather than make accusations based on scanty evidence and fact checker rulings that make our neck hairs bristle.
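As a toy illustration of the first suggestion, combining rulings across institutions could start as a simple weighted average of their malarkey scores. Every number here - the scores, the statement counts used as weights - is made up for illustration:

```python
# Hypothetical malarkey scores from two institutions, weighted by
# (say) how many statements each has rated. All numbers invented.
scores = {"PolitiFact": 0.52, "The Fact Checker": 0.47}
n_rated = {"PolitiFact": 600, "The Fact Checker": 250}

total = sum(n_rated.values())
combined = sum(scores[k] * n_rated[k] / total for k in scores)
print(f"combined malarkey score: {combined:.3f}")
```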
I hope that Malark-O-Meter will someday grow into an organization that can realize at least one of these recommendations.
[1] To see how PolitiFact and The Fact Checker disagree, and how PolitiFact is harder on Republicans, compare my PolitiFact- and Fact Checker-based malarkey scores (see the right sidebar), and read the press releases of two studies done by George Mason University's Center for Media and Public Affairs.