We have all seen them: emails and letters telling us that we have been "selected" to
fill out a limited-participation survey, along with a request for donations. The
surveys present vague statements about various hot-button issues and offer four or
five answers ranging from "Not at all concerned" to "Very concerned." The "surveys"
are clearly designed by marketing specialists to impress you with the actions the
organization performs and to affirm your feel-good impressions of the organization.
They are NOT serious attempts to gauge your interest or concern, nor are they
designed to guide the organization during this critical period.
Before I went on disability 11 years ago, I worked for 18+ years in an academic think
tank doing demographics and sociology research. As the computer systems "guru," I
guided its growth from a site with one mainframe and two personal computers to a
power-computing facility with more than 30 machines, onsite and online. I did it all
in coordination with the research director, serving as site administrator and general
go-to person for all the programmers and researchers. The research was broad, varied,
and international in scope; I also administered the preparation of public-use files
for the DHHS/NIH Long-Term Care Survey. The management methodology was highly toxic,
but the research was done professionally and thoroughly.
That said, the difference between marketing-driven "surveys" and actual research
surveys is night and day. If organizations such as the ACLU, PFAW, and FFRF really
want to find out what their supporters are interested in, and how and where the
supporters want them to focus their efforts, they should do real research surveys,
using much more appropriate methods that allow the relative importance of issues to
be analyzed.
First of all, every question needs a response that lets the taker opt out of that
question. Ideally there should be separate responses for "refused" and "don't
know/not applicable." And the more specific, narrower topics should be placed in a
comparison grid that lets the taker indicate the relative weights they would assign
to each topic in direct comparison with the others -- for example, a 0-10 preference
line for each topic.
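To make the point concrete, here is a minimal sketch (in Python, with invented topic
names and ratings -- nothing here comes from any real survey) of how such responses
might be coded: distinct codes keep "refused" separate from "don't know/not
applicable," and a common 0-10 line for every topic lets the ratings be compared
directly against one another.

```python
# Hypothetical response coding for a research-grade survey. The topic
# names and ratings below are invented for illustration.

REFUSED = -1      # taker declined to answer this question
DONT_KNOW = -2    # don't know / not applicable

# Each respondent rates every topic on the same 0-10 preference line,
# so topics can be weighed directly against each other.
responses = {
    "respondent_1": {"topic_a": 9, "topic_b": 3, "topic_c": REFUSED},
    "respondent_2": {"topic_a": 7, "topic_b": 5, "topic_c": 2},
    "respondent_3": {"topic_a": DONT_KNOW, "topic_b": 8, "topic_c": 4},
}

def mean_rating(topic):
    """Average the valid 0-10 ratings, excluding the missing-data codes."""
    vals = [r[topic] for r in responses.values() if r[topic] >= 0]
    return sum(vals) / len(vals) if vals else None

for topic in ("topic_a", "topic_b", "topic_c"):
    print(topic, mean_rating(topic))
```

Because "refused" and "don't know" are coded separately, an analyst can report
non-response and inapplicability as distinct findings instead of lumping them into
one hole in the data.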
And the surveys should not include a solicitation letter for membership or donations.
It is unfortunate that marketing research shows the "limited participation survey"
method to be quite effective at getting prospective donors to respond and pony up
money. One of these surveys I received just last week came in a big envelope
prominently marked "Do Not Bend" on the outside. I would expect this for a survey
form designed to be computer-scanned, with results collected from a "mark sense"
form. I would also expect the mailing to include a similar large envelope for
returning the form unbent. That only a standard business envelope was included (for
returning the form folded in the usual ways) proved it was a marketing ploy rather
than a serious survey.
Finally, there are plenty of software packages out there that can analyze survey
results (mostly some form of cluster analysis), but these solicitations show only the
hallmarks of mailing-list or database collection. That gives the lie to internal
statements that your data will "only be used in aggregate" and divorced from any
personally identifying information. [For crying out loud, the survey form has my name
and address printed right at the top -- not an "anonymous" identifier showing the
proper methodology and intent.]
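For a sense of what that kind of analysis involves, here is a toy sketch of plain
k-means clustering -- one common form of cluster analysis -- run on invented,
anonymized 0-10 ratings. Real survey packages do far more, but the point stands: the
input is anonymous response vectors, not names and addresses.

```python
# Toy k-means cluster analysis of survey responses, stdlib only.
# All ratings are invented; respondents are anonymous vectors, as
# proper survey methodology would have them.
import math
import random

def kmeans(points, k, iters=20, seed=0):
    """Group points into k clusters by iteratively reassigning each
    point to its nearest center and recomputing the centers."""
    rng = random.Random(seed)
    centers = rng.sample(points, k)
    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for p in points:
            nearest = min(range(k), key=lambda c: math.dist(p, centers[c]))
            groups[nearest].append(p)
        centers = [
            tuple(sum(coord) / len(g) for coord in zip(*g)) if g else centers[i]
            for i, g in enumerate(groups)
        ]
    return centers, groups

# Each row: one anonymous respondent's 0-10 ratings on two topics.
ratings = [(9, 1), (8, 2), (9, 2), (1, 9), (2, 8), (1, 8)]
centers, groups = kmeans(ratings, k=2)
```

On this made-up data the two cluster centers land near the two obvious camps
(high on one topic, low on the other) -- the kind of aggregate pattern a real
research survey would report, with no individual identities involved.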
These "surveys" are nothing more than unscientific opinion polls. No better that a
social media tick vote that litters the internet these days. When I get one of these
things I don't bother giving it more than a glance before consigning it to the
"circular file" that sits on the floor next to my desk.