This late in the game, there are a good number of states already in the bag and a lot of polls have come and gone. This seems like a good time to check on the performance of the various polling organizations so far in state level Democratic contests.
For the results on this table, I've consulted the final results from each state as well as polling data from Pollster.com and Real Clear Politics. I've only included primaries that were polled by at least two of the major pollsters. I've dropped results from pollsters who didn't poll a significant number of states (more than a couple), even when that meant bumping Gallup or CBS. And I've only included polls that happened in the last week before the election. Here are the results.
In the center of this table are the actual results for each state. For example, in Iowa Senator Obama won by a (rounded) 8% while in New Hampshire, Senator Clinton won by 3%. The values indicated for each pollster show their relative performance vs. the actual results. In Iowa, Strategic Vision said that Obama would win by 3, but he won by 8, so their prediction was 5 to the Clinton side of the line. In New Hampshire, Rasmussen predicted that Obama would win by 7, but Clinton won by 3, so they were 10 to the Obama side of the prediction. In several instances (including Iowa) multiple predictions fell on the same spot, so I've bumped a poll one space left or right just to make it visible. I've tried to randomize these moves so that everyone got treated fairly on a visual basis, and when it comes to the numeric analysis, I used the raw numbers, so this nudging has no effect.
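That signed-error arithmetic is easy to get turned around, so here's a minimal sketch of it. The function name and the sign convention (positive margins mean Obama ahead) are my own choices for illustration; the two examples are the Iowa and New Hampshire cases described above.

```python
# Sketch convention (my assumption): margins in percentage points,
# positive = Obama ahead, negative = Clinton ahead.

def signed_error(predicted_margin, actual_margin):
    """How far a poll's final margin fell from the actual result.

    Positive -> the poll was to the Obama side of the result;
    negative -> the poll was to the Clinton side.
    """
    return predicted_margin - actual_margin

# Iowa: Strategic Vision said Obama +3; Obama actually won by 8.
print(signed_error(3, 8))    # -5 -> 5 points to the Clinton side

# New Hampshire: Rasmussen said Obama +7; Clinton won by 3 (Obama -3).
print(signed_error(7, -3))   # 10 -> 10 points to the Obama side
```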
One thing that's immediately obvious is that the "polls show more votes for Obama than he gets" effect does not seem to exist when polls for all states are considered. If anything, more polls lean toward Clinton in the run-up to election day than toward Obama. This is aided by pollsters like Mason-Dixon, which has been "pro-Clinton" in its numbers for every state but one. Strategic Vision erred to the Clinton side of the line in every contest but New Jersey. Insider Advantage had numbers that were more pro-Clinton than the actual results in every state. Rasmussen and Survey USA both come in close to neutral, but still lean to the Clinton side. Among the pollsters examined, only Zogby tended to give a more pro-Obama number than the actual results.
Pollster | Avg. Err% | Leans Toward | States Called |
---|---|---|---|
IA | 10.1 | C + 10 | 6/6 |
M-D | 9.2 | C + 5 | 4/5 |
Ras. | 8.0 | C + 3 | 12/18 |
SUSA | 4.7 | C + 3 | 11/13 |
SV | 9.1 | C + 7 | 4/4 |
Zogby | 9.8 | O + 3 | 6/9 |
Comparing the average error between last prediction and actual results, the "lean" of the poll toward one candidate or the other, and the rate of prediction of winners gives some interesting results.
Insider Advantage leaned toward Clinton on every prediction and had the highest error rate in predicting final results. However, they still managed to pick the winner in all six contests for which they had data in the final week.
On the other hand, Rasmussen, while much more neutral when it comes to leaning toward a candidate, managed to miss in 6 of the 18 contests they called -- a 33% error rate in selecting the winner, and that's generous considering that among the states they called were such gimmes as Illinois and New York.
Only Zogby managed to match that final result for awfulness of prediction. In fact, as anyone who has watched the results this year might have noticed, Zogby has been miserable both on the numbers and the results. They've managed to miss in 1/3 of the primaries (to give them some credit, their correct picks aren't bolstered by New York or Illinois) and come close to Insider Advantage in the overall error percentage.
Survey USA has been close on the numbers in almost every case. One of their misses came in the close contest in Texas, where they called it Obama by one. The only real screw-up of the year for SUSA was their call for a big Clinton win in Missouri. Were it not for that call and (like everyone else) underestimating Obama's landslide in South Carolina, SUSA would be sporting an amazing batting average. The number of states where SUSA has been extremely close to the final result shows that in many cases their model of the electorate was dead on.
So, how does all this apply to the upcoming contests in Indiana and North Carolina?
North Carolina has been tracked by Survey USA, Insider Advantage, Zogby, Rasmussen, and Mason-Dixon. Here are the values as they stand and adjusted to reflect the "lean" indicated in previous contests.
Pollster | Actual | Adjusted |
---|---|---|
SUSA | Obama + 5 | Obama + 8 |
IA | Obama + 3 | Obama + 13 |
Zogby | Obama + 8 | Obama + 5 |
Ras | Obama + 9 | Obama + 12 |
M-D | Obama + 7 | Obama + 12 |
Avg. | Obama + 6 | Obama + 10 |
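The "adjusted" column is just each poll's current margin shifted by that pollster's historical lean from the earlier table. A minimal sketch of the arithmetic, using the same sign convention as before (positive = Obama; the dictionary names are mine):

```python
# Positive numbers favor Obama, negative favor Clinton (sketch convention).
# Historical leans from the pollster table: a lean toward Clinton means the
# poll understates Obama, so the adjustment adds it back toward Obama.
leans = {"SUSA": +3, "IA": +10, "Zogby": -3, "Ras": +3, "M-D": +5}

# Current North Carolina margins (Obama ahead with all five pollsters).
north_carolina = {"SUSA": 5, "IA": 3, "Zogby": 8, "Ras": 9, "M-D": 7}

adjusted = {p: m + leans[p] for p, m in north_carolina.items()}
print(adjusted)  # {'SUSA': 8, 'IA': 13, 'Zogby': 5, 'Ras': 12, 'M-D': 12}

avg = sum(adjusted.values()) / len(adjusted)
print(round(avg))  # 10 -> Obama + 10, matching the table's average
```

The same calculation with Indiana's margins produces the second table below.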
Indiana has fresh numbers from fewer sources, which is surprising considering the closeness of the contest. In any case, here are the actual and adjusted values.
Pollster | Actual | Adjusted |
---|---|---|
SUSA | Clinton + 12 | Clinton + 9 |
IA | Clinton + 4 | Obama + 6 |
Zogby | Obama + 2 | Clinton + 1 |
Avg. | Clinton + 5 | Clinton + 1 |
Do I believe the adjusted numbers? Well, since Insider Advantage and Zogby have the highest error percentages and Survey USA has the lowest, I'd personally give the Survey USA numbers extra weight. Call it Obama by 8 in North Carolina, and Clinton by 4 in Indiana.
Now I'm going to sit back and hope that SUSA used the same model for predicting Indiana that it used in Missouri. If it's any comfort, the SUSA/Zogby numbers on Indiana look very much like the Missouri prediction on Super Tuesday. So maybe this will be the second time SUSA took a hike into the weeds. Don't hesitate to check the numbers. Any time I'm allowed to do this much math in one sitting, it's an invitation to disaster.