What would I know about polling companies? Well, about 14 years ago I had the worst part-time job ever -- I conducted polling surveys over the phone. I even asked questions on behalf of Gallup. I know how they lie.
Back in college in 1992, I asked one of my best friends, an apolitical, low-information voter, who he was going to vote for. His response was, "I'm going to vote for the winner." These two paragraphs are connected.
I've been hollering for months that organizations like Gallup, Rasmussen, and others are biased in favor of Republicans. Yesterday, a great diary showed how Gallup and others manipulated their samples to get a desired result inconsistent with reality. And you may ask: so what? Here's the deal. There are millions of voters who will vote for whoever the polls say is "winning" in late October. And make no mistake about it: Gallup, Rasmussen, and others WILL manipulate their polling samples to try to get a desired result, and those poll numbers will then serve as a talking point in the echo chamber, generating millions of dollars of positive publicity for John McCain.
Inaccurate samples such as the ones used by Gallup are the most common way to manipulate poll numbers. If Democrats enjoy a nationwide 11-point party-ID advantage, then any fair poll must show an 11-point differential. The same standard should apply to every subgroup, including gender, religion, and race.
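As a rough illustration of why the sample's party mix matters: the 11-point gap is the figure cited above, but the within-party support numbers below are purely hypothetical assumptions for the sketch, not from any real poll.

```python
# Hypothetical illustration: how a poll's party-ID composition moves its topline.
# Candidate support within each party-ID group is ASSUMED for this sketch.
support_for_dem_candidate = {"Dem": 0.90, "Rep": 0.08, "Ind": 0.50}

def topline(sample_shares):
    """Weighted average of candidate support across party-ID groups."""
    return sum(share * support_for_dem_candidate[party]
               for party, share in sample_shares.items())

# A sample reflecting an 11-point Democratic party-ID edge (39 D / 28 R / 33 I).
fair_sample   = {"Dem": 0.39, "Rep": 0.28, "Ind": 0.33}
# A skewed sample with near-parity between the parties (33 D / 32 R / 35 I).
skewed_sample = {"Dem": 0.33, "Rep": 0.32, "Ind": 0.35}

print(f"fair sample:   {topline(fair_sample):.1%}")    # about 53.8%
print(f"skewed sample: {topline(skewed_sample):.1%}")  # about 49.8%
```

Even with every respondent answering honestly, shaving the Democratic share of the sample by a few points swings the published topline by roughly four points -- which is the whole game.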
I know how polling companies like Gallup manipulate political surveys. They use illegitimate samples to create a desired result in favor of the Republicans. And they are not the only ones; Rasmussen and Zogby are just as guilty. We need to keep calling them out on these shenanigans. But to do that, you need to understand how polls are manipulated in general.
First, understand that the people who ask the questions are following a script. They must read the questions word for word as phrased or they will be fired. So when a poll question is biased, there is little the surveyor can do. Now here are some of the most common biases.
INHERENT BIASES OF POLLS.
1. BS Samples - The diary cited in the opening does a great job of showing how a right-wing organization like Gallup can manipulate a sample to get a desired result. If they used a representative sample, Obama would be slightly ahead, and this is at the apex of the McCain bounce, which will inevitably recede in the next week. Democrats outnumber Republicans by 11 percentage points in this country. If a poll does not reflect this difference, then that poll is biased. Gallup's clearly does not.
2. English Proficiency v. 2nd Language - When I did surveys, we were instructed to hang up on people who didn't speak English proficiently enough. The survey companies are paid for a set number of completed surveys; they lose money on interpreters. So millions of Americans are never heard from in polls, especially Hispanic and Asian voters.
3. Work v. Home - This is perhaps the biggest bias that affects poll numbers, and perhaps the most overlooked. We were not allowed to conduct surveys of people while they were at work. If we reached someone's work number, we had to hang up immediately. And since Democrats are more likely to be working than Republicans, this too affects a poll's accuracy. This is a natural bias that no pollster has yet adjusted for.
4. Phone v. No Phone - Millions of Americans don't own phones. They tend to be poor. Polls automatically discount them. This is another bias that could be accounted for but isn't.
5. The Completed Survey - Many times a survey begins but never gets tallied. In order for a survey to be complete, a person must answer all questions in the survey. Very often someone starts a survey and hangs up partway through because of stupid questions. So people who have little tolerance for stupidity are under-represented in polls. However, it would be nearly impossible to calculate the effect of these lost respondents.
6. The Setup Questions - Often you see the results of a poll question. What you don't see are the questions that set up the published poll questions. Setup questions can steer a person toward a desired answer and thus manipulate results. For example, I could ask the question, "Do you support the death penalty?" and get a yes from 75% of respondents or a no from 75% of respondents based on the five questions I ask beforehand of the same sample of people. The problem is that we do not always know what questions were asked first. It's the equivalent of a push poll.
7. The Question's Wording - How a question is worded is vital. One word or one inference can change the entire question. Here, it's all about labels. For example, were I to ask, "Do you favor school choice or the status quo?" most people would say "school choice" because of the wording. However, if I asked, "Should we steal money from public schools and give it to wealthy children in elite private schools?" then most people would say no to one of the key components of "school choice."
8. The Interviewer - The tone of the interviewer's voice and voice inflections can often influence respondents. However, this won't affect a sample that much.
9. Cell Phone v. Land Line - I didn't have this issue 14 years ago, but it's real today: about 10% of Americans use cell phones and have no land line. Cell-phone-only users are not surveyed. They tend to be younger, and given that Obama does extraordinarily well with younger voters, his voters are being under-represented.
10. Lack of Reasonable Choices - Very often poll questions are ridiculous and offer choices that don't reflect people's real positions. The respondent will either hang up or just say anything to move the survey along. The choice of a ridiculous poll position then gets used by the Republican Party as "proof" that people agree with them when, in reality, they don't.
11. No Room for Explanation or Clarification - Similar to number 10, a respondent does not get a chance to clarify a position. For the bogus school choice question, if I responded, "Instead of your two choices, I support a strong public education system without voucher schemes," then my answer would not be recorded or counted. If I said, "I support the death penalty but only under the following circumstances ---" then my answer would be counted as "I support the death penalty."
12. Positioning of Choices - Biased polling companies often pair an extreme choice with an even more extreme one, so the first seems moderate by comparison. This is similar to the Overton window and is designed to get a desired result.
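The coverage gaps in items 2 through 4 and 9 all work the same way: some slice of the electorate never gets surveyed at all. A little arithmetic shows the effect. The 10% cell-only figure is the one cited in item 9; the support levels are hypothetical assumptions for the sketch.

```python
# Hypothetical illustration of coverage bias: what happens to a topline
# when an unreached subgroup (here, cell-phone-only voters) is excluded.
# The support figures are ASSUMPTIONS for this sketch, not real data.
cell_only_share   = 0.10   # ~10% of Americans, per the figure cited above
support_landline  = 0.50   # assumed Obama support among reachable voters
support_cell_only = 0.62   # assumed higher support among younger, cell-only voters

# What an honest, full-coverage survey would find:
true_support = ((1 - cell_only_share) * support_landline
                + cell_only_share * support_cell_only)
# What the poll actually reports, since cell-only voters are never called:
measured_support = support_landline

print(f"true support:    {true_support:.1%}")
print(f"poll would show: {measured_support:.1%}")
print(f"understated by:  {true_support - measured_support:.1%}")
```

Even a modest, invisible gap like this understates a candidate by over a point -- and items 2, 3, 4, and 9 all stack in the same direction.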
The credibility of all these polling companies needs to be called into question. They need to look at the population -- the age, race, gender, and party registration of the populace as of the day of the poll -- and use samples that accurately reflect those subsets within 1 percentage point. Any poll that doesn't do that needs to be shamed and criticized. Let's make sure they are reporting honest results, not manipulating them so that undecided voters vote for the person they perceive as "the winner."
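The fix demanded above is, in survey terms, just weighting the sample back to known population shares. A minimal sketch, using hypothetical numbers (the target shares echo the 11-point gap discussed earlier):

```python
# Sketch of post-stratification weighting: scale each respondent so the
# sample's party-ID mix matches the population. All numbers hypothetical.
population = {"Dem": 0.39, "Rep": 0.28, "Ind": 0.33}   # known target shares
sample     = {"Dem": 0.33, "Rep": 0.32, "Ind": 0.35}   # what the poll collected

# Each group's weight is simply target share / sample share.
weights = {group: population[group] / sample[group] for group in population}

# A Democratic respondent in this skewed sample counts for more than one
# person, a Republican for less, restoring the 11-point differential.
for group, w in weights.items():
    print(f"{group}: weight {w:.2f}")
```

Any pollster can do this; when a poll's published internals show a party mix far from registration figures with no such adjustment, that is the tell.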