One of the sites we've come to rely on for objective data on the ups and downs of campaigns and public opinion is pollster.com: great graphics if you haven't visited, plus unique analysis from Mark Blumenthal (aka the Mystery Pollster), Prof. Charles Franklin (he of the regression analysis curves), and guest pollsters from all over the political map. But an analysis site like this one is only as good as the data available to analyze, and that data is sometimes frustratingly incomplete. Anyone who has hunted for a poll's crosstabs only to find they were never released will appreciate the following post from Mark Blumenthal:
Why do so many pollsters disclose so little? A few continue to cite proprietary interests. Some release their data solely through their media sponsors, which in the past limited the space or airtime available for methodological details (limits now largely moot given the Internet sites now maintained by virtually all media outlets and pollsters). And while none say so publicly, my sense is that many withhold these details to avoid the nit-picking and second guessing that inevitably comes from unhappy partisans hoping to discredit the results.
Do pollsters have an ethical obligation to report methodological details about who they sampled? Absolutely (and more on that below), and as we have learned, most will disclose these details on request as per the ethical codes of the American Association for Public Opinion Research (AAPOR) and the National Council on Public Polls (NCPP). Regular readers will know that we have received prompt replies from many pollsters in response to such requests (some pertinent examples here, here, here and here).
The problem with my occasional ad hoc requests is that they arbitrarily single out particular pollsters, holding their work up to scrutiny (and potential criticism) while letting others off the hook. My post a few weeks back, for example, focused on results from Iowa polls conducted by the American Research Group (ARG) that seemed contrary to other polls. Yet as one alert reader commented, I made no mention of a recent Zogby poll with results consistent with ARG. And while tempting, speculating about details withheld from public view (as I did, incorrectly, in the first ARG post) is even less fair to the pollsters and our readers.
So I have come to this conclusion: Starting today we will begin to formally request answers to a limited but fundamental set of methodological questions for every public poll asking about the primary election released in, for now, a limited set of states: Iowa, New Hampshire, South Carolina, or for the nation as a whole. We are starting today with requests emailed to the Iowa pollsters and will work our way through the other early states and national polls over the next few weeks, expanding to other states as our time and resources allow. [bolded mine]
More on the goal, and a request from Blumenthal:
Our goal is to both collect this information and post it alongside the survey results on our poll summary pages, as a regular ongoing feature of Pollster.com. Obviously, some pollsters may choose to ignore some or all of our requests, but if they do our summary table will show it. We are starting with Iowa, followed by New Hampshire, South Carolina and the national surveys, in order to keep this task manageable and to determine the feasibility of making such requests for every survey we track...
What can you do? Frankly, we would appreciate your support. If you have a blog, please post something about the Pollster Disclosure Project and link back to this entry (and if you do, please send us an email so we can keep a list of supportive blogs). If not, we would appreciate supportive comments below. And of course, criticism or suggestions on what we might do differently are also always welcome.
The specific questions about demographics and methodology are below the fold.
The bottom line is that if we want better data to analyze, then we, the consumers of all things political, ought to support pollster.com in asking for it. And if we expect and appreciate the analysis done by pollster.com, Swing State Project, Open Left, Slate, Real Clear Politics, or any of the other sites that digest and analyze polling data, let's help make that data a bit more "open source" and transparent.
These are the questions (a rough sketch of the "share of the voting-age population" arithmetic follows the list):
- Describe the questions or procedures used to select or define likely voters or likely caucus goers (essentially the same questions I asked of pollsters just before the 2004 general election).
- The question that, as Gary Langer of ABC News puts it, "anyone producing a poll of 'likely voters' should be prepared to answer": What share of the voting-age population do they represent? (The specific information will vary from poll to poll; more details on that below.)
- We will ask pollsters to provide the results of demographic questions and key attribute measures among the likely primary voter samples. In other words, what is the composition of each primary voter sample (or subgroup) in terms of gender, age, race, etc.?
- What was the sample frame (random digit dial, registered voter list, listed telephone directory, etc.)? Did the sample frame include or exclude cell phones?
- What was the mode of interview (telephone using live interviewers, telephone using an automated, interactive voice response [IVR] methodology, in-person, Internet, mail-in)?
- And in the few instances where pollsters do not already provide it, what was the verbatim text of the trial heat vote question or questions?
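To make Langer's question concrete, here is a minimal sketch, in Python, of the two calculations a pollster would be disclosing: the incidence of "likely voters" among all voting-age adults contacted, and the demographic composition of the screened-in sample. Every number and field name here is hypothetical, invented purely for illustration; this is not any pollster's actual procedure.

```python
# A rough sketch of the disclosure arithmetic discussed above.
# All figures are hypothetical; a real pollster would substitute
# actual counts from their own screening questions.

from collections import Counter

# Hypothetical screening results: voting-age adults reached,
# and how many of them passed the likely-caucus-goer screen.
voting_age_contacts = 2500
likely_voters = 350

# Gary Langer's question: what share of the voting-age population
# does the "likely voter" sample represent?
incidence = likely_voters / voting_age_contacts
print(f"Likely voters as a share of voting-age adults: {incidence:.0%}")

# Hypothetical demographic records for the screened-in respondents,
# illustrating the composition breakdown pollsters are asked to report.
sample = [
    {"gender": "F", "age": "50-64"},
    {"gender": "M", "age": "65+"},
    {"gender": "F", "age": "30-49"},
    # ... one record per likely-voter respondent ...
]

for attribute in ("gender", "age"):
    counts = Counter(record[attribute] for record in sample)
    total = sum(counts.values())
    breakdown = ", ".join(f"{k}: {v / total:.0%}" for k, v in counts.items())
    print(f"{attribute}: {breakdown}")
```

With these made-up numbers the incidence works out to 14 percent; whether a figure like that is plausible depends on historical turnout for the contest in question, which is exactly why Blumenthal wants it on the record next to each poll's results.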