Every few years, as we get into election season, we see more polls and much more commentary about the polls. So, from my vantage point as a poll watcher, I wanted to address a few things I see turning up frequently in the many comments here. Most of this material is readily available online, with some expert advice thrown in.
Are weekend polls accurate?
Gallup's Presidential approval tracker (a three-day rolling average, MoE plus or minus 3) has had a mini-bump for Obama lately, and on one occasion when I posted on it, the question of whether a weekend Gallup poll differs from a weekday poll came up, as it has over the years. This story, looking at weekend bias, dates back to 2006:
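For readers wondering where that "plus or minus 3" comes from: the margin of error falls out of the sample size. A minimal sketch (the interview counts below are assumed figures for illustration, not Gallup's published numbers):

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """95% margin of error for a proportion p from a simple random sample of n."""
    return z * math.sqrt(p * (1 - p) / n)

# A three-day rolling average pools interviews across days. Assuming roughly
# 1,000 interviews in the window, the MoE at p = 0.5 comes out near 3 points;
# pooling more days (say ~1,500 interviews) tightens it.
print(round(100 * margin_of_error(1000), 1))  # -> 3.1
print(round(100 * margin_of_error(1500), 1))  # -> 2.5
```

This is also why single-night subsamples of a tracker bounce around more than the headline number: slicing the sample shrinks n and widens the margin.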
Democrat Jim Webb has taken the lead over Sen. George Allen of Virginia, according to a pre-election poll released on Tuesday. Allen's polling consultant rejected the latest results: "Any survey conducted Fridays and Saturdays, everybody knows they're skewed toward Democrats." Similar claims have surfaced in news reports about polling data since at least as far back as 2000. What's so suspicious about weekend polls?...
While it's a common claim that weekend polls favor the Democrats, there isn't much hard evidence to support that idea. One of the best studies of this question was conducted by two polling experts at ABC News. Gary Langer and Daniel Merkle looked at the data from ABC's tracking polls for the last three presidential elections. They compared results from people reached on Sunday through Thursday with those reached on Friday and Saturday and found no difference. Among the Sunday-to-Thursday people polled in 2004, 49 percent supported Bush and 46 percent supported Kerry. Polls of the stay-at-home, Friday-to-Saturday crowd produced similar numbers—48 and 46.
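To see why a 49-versus-48 split counts as "no difference," consider a standard two-proportion z-test. The subgroup sample sizes were not reported, so the counts below are hypothetical, chosen only to show the scale involved:

```python
import math

def two_prop_z(p1, n1, p2, n2):
    """z statistic for the difference between two independent proportions."""
    p_pool = (p1 * n1 + p2 * n2) / (n1 + n2)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se

# Bush support: 49% among Sun-Thu respondents vs 48% among Fri-Sat.
# Assume (hypothetically) 2,000 weekday and 800 weekend interviews.
z = two_prop_z(0.49, 2000, 0.48, 800)
print(round(z, 2))  # well under the 1.96 threshold for significance at 95%
```

Even with generous sample sizes, a one-point gap sits comfortably inside ordinary sampling noise, which is the Langer-Merkle finding in a nutshell.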
Mark Blumenthal had posted on Gallup in 2008 (Day-of-Week Effect in Gallup Daily?), and at that time Gallup's Jeff Jones responded:
The possibility of a day of the week effect has come up in relation to prior Gallup tracking data, such as for the 1996 and 2000 elections. We carefully examined those data for evidence of such an effect, and did not find anything to suggest a systematic effect.
Mark updated his thoughts for us:
I have not seen any studies showing solid evidence of a weekend effect that would counter the Langer-Merkle findings, but that doesn't mean no such evidence exists, and it may say more about a lack of studies than a lack of evidence. It is possible that while interviews conducted over a weekend create no partisan skew, they do skew other important characteristics or attitudes in ways that pollsters' standard demographic weighting fails to correct.
—Mark Blumenthal
Is primary polling accurate?
Stats guru Harry J. Enten (I love the name of the blog: margin of error) looked at this past week's Iowa entries and liked what he saw:
The 12/29-30 Selzer & Co. poll found Mitt Romney leading with 24%, Rick Santorum in second with 21%, and Ron Paul in third with 18%. This poll was the only one to correctly forecast first, second, and third place. It was the most accurate in predicting the spread between Santorum and Paul, and second most accurate in estimating the spread between Romney and Santorum.
Overall, the 12/29-30 Selzer & Co. poll was the “most accurate” Iowa poll employing ARG’s Martin, Traugott, and Kennedy measure of pollster accuracy. This is not to say that Selzer & Co.’s full four-day (12/27-30) sample should not also be scored. The fact is that it was published as the “main” poll, but I think it’s necessary to point out that the two-day sample was quite accurate.
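The Martin-Traugott-Kennedy measure Enten references scores a poll by comparing its odds for one candidate over another to the odds in the actual result; zero is a perfect call. A minimal sketch, using the poll numbers quoted above and approximate certified caucus results (Romney and Santorum each finished near 24.5%; the figures below are rounded and used only for illustration):

```python
import math

def mtk_accuracy(poll_a, poll_b, result_a, result_b):
    """Martin-Traugott-Kennedy predictive accuracy: the log of the ratio of
    the poll's odds for candidate A over B to the actual-outcome odds.
    0 is a perfect call; the sign shows which candidate the poll overstated."""
    return math.log((poll_a / poll_b) / (result_a / result_b))

# Selzer & Co. 12/29-30: Romney 24, Santorum 21 (from the text above).
# Approximate certified results: Romney ~24.5, Santorum ~24.6.
a = mtk_accuracy(24, 21, 24.5, 24.6)
print(round(a, 3))  # small positive value: the poll slightly overstated Romney
```

A score this close to zero is what "most accurate" means under the measure: the poll's relative odds for the top two candidates nearly matched the outcome, even though the raw point spread missed Santorum's late climb.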
The Santorum surge was not a surprise... we wrote about it based on the Selzer/Des Moines Register poll's two-day sample. That isn't a pat on the back; it's to make a point: the polling isn't always right, but it's usually pretty good, and it's often more accurate than our predictions of what voters will do in a given state based on our biases.
And that means that those suggesting Romney can't win South Carolina, or that Santorum's surge will carry the day because of the evangelical vote, need to look at the polls and see whether that's supported by the data.
Do we have all the data?
We never do. Charlie Cook has made the point for some time that candidate-driven polling (at least for the well-funded national candidates) often has fresher, more detailed, and sometimes more accurate data than we have. That doesn't mean you should accept campaign-released polls at face value: the release might be the one bit of good news put out to drive a story while the campaign holds tight to a larger portion of bad news.
Nonetheless, at least pay attention to references to "internal polls" and match them against what we know. In Iowa, there were such references to Ron Paul's and Gingrich's numbers dropping, which proved to be the case.
At the same time, most media outlets prefer to look only at their own polls, at least on the day of release, so remember to look at them all, weigh their track records, see whom they polled, and go back to basics (see 20 Questions A Journalist Should Ask About Poll Results).
Do these things, and there will be far fewer surprises on election day.