The NPR show "Weekend Edition Saturday" had a story this morning about the new lie detectors that the DoD plans to deploy in Afghanistan and Iraq. Scott Simon interviews Donald Krapohl, who is "special assistant to the director at the Defense Academy for Credibility Assessment at Fort Jackson in South Carolina."
I'm sure Mr. Krapohl is just doing his job, but unfortunately his job today was to gloss over the serious problems with these lie detectors and the plans the DoD has for them. I do wish Mr. Simon had done his job a bit better, though, as I complain about after the jump.
But before we get started, there's a difficult truth that we all need to accept: lie detectors don't work. I know they are as much a part of our American culture as apple pie. We see them in movies and read about them in novels. We love it when a whistleblower passes a lie detector test, and we love it when a lying jerk fails one. A machine that can objectively tell us who's lying and who's telling the truth would be so handy that it's very appealing to believe it works. And there's no shortage of people who'll tell you it does. But what's missing is scientific evidence.
There's a 2003 study by the National Research Council of the National Academy of Sciences, "The Polygraph and Lie Detection" (you can read the whole thing online; scroll down for the table of contents). Here's a quote from the executive summary:
Notwithstanding the limitations of the quality of the empirical research and the limited ability to generalize to real-world settings, we conclude that in populations of examinees such as those represented in the polygraph research literature, untrained in countermeasures, specific-incident polygraph tests can discriminate lying from truth telling at rates well above chance, though well below perfection.
"Well above chance, though well below perfection." You might think that sounds more positive than my assertion that they don't work. There is some measurable correlation between the people the lie detector says are lying and those who actually are. It's right more often than it's wrong. Isn't that useful? Page 5 of the report gives a great illustration of just how useless this is. But since you won't go read it, I'll paraphrase: if about 1 out of 1000 people you interrogate is a terrorist, then you can expect that the vast majority of people who fail the lie detector test are innocent, and, to add injury to injury, a significant fraction of the terrorists will pass the test.
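The arithmetic behind that paraphrase is easy to check yourself. Here's a minimal sketch using hypothetical accuracy numbers (an 80% detection rate and a 10% false-alarm rate are my illustrative assumptions, not figures from the NAS report), with the 1-in-1000 base rate:

```python
# Hypothetical operating characteristics -- illustrative assumptions only.
population = 1000
base_rate = 1 / 1000          # 1 terrorist per 1000 examinees
sensitivity = 0.80            # fraction of liars the test catches
false_positive_rate = 0.10    # fraction of truth tellers it wrongly flags

terrorists = population * base_rate            # 1 person
innocents = population - terrorists            # 999 people

caught = terrorists * sensitivity              # expected true positives
falsely_flagged = innocents * false_positive_rate  # expected false positives

total_failures = caught + falsely_flagged
fraction_innocent = falsely_flagged / total_failures

print(f"Expected test failures: {total_failures:.1f}")
print(f"Fraction of failures who are innocent: {fraction_innocent:.1%}")
print(f"Terrorists who pass anyway: {terrorists * (1 - sensitivity):.2f}")
```

Even with generous accuracy assumptions, over 99% of the people who fail are innocent, and a fifth of the terrorists walk away with a pass. That's the base-rate problem in a nutshell.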
With that background, let's turn to the NPR interview. Mr. Krapohl's characterization of the new polygraphs is not entirely at odds with the NAS study. He calls them imperfect, and when Mr. Simon calls them "lie detectors" he says that term is not really correct, and then starts calling them a screening test. When Mr. Simon asks if there is a "bias" built into it, Mr. Krapohl says they are "risk-averse" and "better at detecting liars than it does truth tellers." Let me explain what that means. In the example given above, the device has been adjusted to minimize the fraction of terrorists who pass the test -- a good thing -- but the unavoidable cost of that is to disproportionately increase the number of innocent people who fail it. Mr. Krapohl draws an analogy to the TB screening test that everyone takes, where a positive reaction only indicates that you might have TB. He says that a positive on his polygraph means you need more "scrutiny," but that a positive result is "not sufficient in and of itself to take any direct action."
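You can see the cost of that "risk-averse" tuning by comparing two operating points for the same test. The numbers below are my hypothetical assumptions (no real polygraph publishes them this way): pushing the detection rate up always drags the false-alarm rate up with it, so the pile of innocent people who fail grows much faster than the handful of extra liars caught.

```python
def expected_failures(population, base_rate, sensitivity, fpr):
    """Expected (innocent, guilty) examinees who fail the test."""
    guilty = population * base_rate
    innocent = population - guilty
    return innocent * fpr, guilty * sensitivity

# Two hypothetical operating points for the same imperfect test.
neutral = expected_failures(1000, 1 / 1000, sensitivity=0.80, fpr=0.10)
risk_averse = expected_failures(1000, 1 / 1000, sensitivity=0.95, fpr=0.25)

for name, (innocent_flagged, liars_caught) in [
    ("neutral", neutral),
    ("risk-averse", risk_averse),
]:
    print(f"{name}: {liars_caught:.2f} liars caught, "
          f"{innocent_flagged:.1f} innocents flagged")
```

With these assumed numbers, the risk-averse setting catches a fraction of one additional liar while flagging roughly 150 more innocent people. That tradeoff is inherent to any imperfect test applied at a low base rate, not a flaw unique to this device.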
"Scrutiny" was the word that really got to me.
And here is where I became unhappy with Mr. Simon. I feel that one of the duties of journalists is to ask the questions the listeners would ask if they were only better informed. Put it all together: the NAS report says we can expect most people who fail the test to be innocent; everything Mr. Krapohl says agrees with that; and the whole "risk-averse" thing means they've pushed the device toward an even larger fraction of innocent people among those who fail. How in the world could Mr. Simon not ask, "You say that people who fail this test will get more 'scrutiny.' Given that most of the people who fail the test will be innocent, what exactly do you mean by 'scrutiny'?"
Might "scrutiny" include indefinite detention? Rendition? Transportation to Guantanamo? Torture?
Look, I know soldiers at the site of a roadside bomb can't exactly be nice about it, but how exactly does this box help them? Here's the reality they are faced with: if the box says the guy is telling the truth, he's probably telling the truth; and if it says he's lying, he's probably telling the truth, but with a lower probability. How is this information useful? The inevitable outcome of using this shiny new toy is more innocent Afghans and Iraqis getting locked up, with no improvement to security for our troops.