Of the five people who got the Electoral College results right, three of us went further and provided vote predictions—me, Nate Silver and Drew Linzer. I noted a few weeks ago that based on those predictions, I was the most accurate of the lot. But votes were still being counted, so now that all of the battlegrounds have certified their votes, we can check the updated results. I've also added the three major poll aggregators—Huffington Post, Real Clear Politics and Talking Points Memo.
All three of us ended up a little less accurate than in the earlier comparison, but the toplines remain the same. I still came out ahead, with Silver just barely edging out Linzer. I also self-servingly highlighted the best predictor of each state and the national popular vote, to show how consistently better my predictions were. I rock!
(I also shaded red the RCP and TPM results for Florida—the two outfits called the state for Romney, the only flubbed calls on this table.)
Everyone on this list except me used algorithms to come up with the numbers, so it's interesting to see such a wide disparity—particularly between Huffpo, TPM and RCP, since all three used polling and nothing else. Of course, RCP ended up excluding much of PPP's body of work (among others), and that cost them given PPP's general accuracy this cycle. I'm not sure why RCP gets so much respect. They actually suck. TPM excluded internet polling, which was justified even if it did hurt their result accuracy.
As for me, the secret of my success was simple—I used TPM's numbers as my baseline. When I saw that early vote totals were coming in strong for Democrats, I realized that "likely voter" models were excluding Democrats who were turning out in real-world voting. So I shifted my predictions to the registered voter results where available, while throwing out bullshit pollsters like Rasmussen.
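That adjustment is simple enough to sketch in a few lines of code. The pollster names below are real, but the margins and sample labels are made up for illustration—this is my reading of the approach, not anyone's actual model:

```python
# A minimal sketch of the adjustment described above. Each poll is
# (pollster, Obama margin in points, sample type), where "RV" is a
# registered-voter sample and "LV" is a likely-voter sample.
# Margins here are illustrative, not real numbers.
polls = [
    ("PPP",        4.0, "RV"),
    ("SUSA",       3.0, "LV"),
    ("Rasmussen", -1.0, "LV"),
    ("Marist",     5.0, "RV"),
]

EXCLUDE = {"Rasmussen"}  # pollsters thrown out entirely

def adjusted_average(polls):
    """Drop excluded pollsters, then prefer registered-voter (RV)
    results where available; fall back to everything kept."""
    kept = [(name, margin, kind) for name, margin, kind in polls
            if name not in EXCLUDE]
    rv = [margin for _, margin, kind in kept if kind == "RV"]
    sample = rv if rv else [margin for _, margin, _ in kept]
    return sum(sample) / len(sample)

print(adjusted_average(polls))  # averages only the RV polls here
```

With these made-up numbers, the RV-only average lands higher than a straight average of everything—which is exactly the direction the shift pushed my predictions.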
That approach served me well almost everywhere. The national popular vote was easy, since there were plenty of registered voter numbers available. Florida polling was tight, but the Democratic early vote was strong where it needed to be strong. Iowa, Nevada, and Virginia early voting looked far better for Democrats than the polling suggested. Democratic early voting in North Carolina was solid, but the polls consistently showed that small Republican edge, and they were right. In Iowa, I discarded the bulk of the polling and threw in my lot with Ann Selzer's numbers. She is scary accurate in Iowa, and you probably won't go wrong betting on her.
My worst miss was Colorado, where the early vote actually looked shitty for us. Democrats won the early vote there in 2008 but lost it in 2012. That made me more pessimistic than the TPM aggregate, which ended up being the best numbers for that race. I expected pollsters to undercount Democratic support in Nevada, and I was rewarded for that assumption. Colorado may end up joining Nevada as "states that chronically underpoll Democratic support," likely because of both states' growing Latino populations.
My next worst miss was New Hampshire. The state didn't have early voting, so I pretty much went with the polling composite. Then there's Ohio, where the poll composite was actually the best. I went with PPP (O+5), SUSA (O+5), and NBC/Marist (O+6). The conservative pollsters had the race tied or Obama +1. They were all wrong, but the average ended up nailing the results.
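The Ohio arithmetic is worth spelling out, because it shows why the composite beat my picks. The three margins I used and the "tied or Obama +1" conservative numbers are from the text above; treating the conservative side as exactly 0 and +1 is my simplification:

```python
# Quick arithmetic on the Ohio numbers: my chosen polls vs. a
# composite that also includes the conservative pollsters.
my_polls = [5, 5, 6]   # PPP, SUSA, NBC/Marist (Obama margin, points)
conservative = [0, 1]  # "tied" and "Obama +1", simplified

my_avg = sum(my_polls) / len(my_polls)
composite = sum(my_polls + conservative) / len(my_polls + conservative)

print(round(my_avg, 1))     # 5.3 -- too bullish
print(round(composite, 1))  # 3.4 -- close to Obama's roughly 3-point win
```

Every individual poll missed, but the high and low misses canceled out in the average—the usual argument for aggregation over cherry-picking, even when the cherry-picking is informed.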
Next I'll do one of these looking at the Senate predictions.