
View Diary: Doctors Like Public Option, Especially As A Choice (126 comments)


  •  I defer (2+ / 0-)
    Recommended by:
    HudsonValleyMark, math4barack

    with all seriousness. I only took those courses; I didn't teach statistics, and I use this info to help people understand applications. There is also some difference between using p values on 'hard' data, which doesn't depend on variables such as whether someone responds to a question or not, and using them on survey data - many people who read the NEJM are used to looking at data in different ways.

    It has always been my understanding that if you have a low response rate on any survey that depends on voluntary participation (and I think 42% would be considered a low response rate), then the study has much lower validity, regardless of the demographic similarities between the groups.

    I seem to be having some problems trying to express what I'm trying to get at, so I apologize.

    Diversity may be the hardest thing for a society to live with, and perhaps the most dangerous thing for a society to be without - W S Coffin

    by stitchmd on Mon Sep 14, 2009 at 08:18:47 PM PDT

    [ Parent ]

    •  Dude, didn't mean to hammer you, but yeah (1+ / 0-)
      Recommended by:
      HudsonValleyMark

      I could be an SOB in the classroom at times, but always tried to layer it with some humor. And I appreciate your comments.

      The thing we have to remember here is that damned near all social science surveys depend upon voluntary participation. (Can you imagine the responses if we held a gun to the participant's head?) A lot of what folks in the physical sciences look at isn't as soft. The key really becomes how representative that body of respondents is of the general population of interest. The PI has to decide whether the respondents really reflect what the population in question looks like.
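
      A rough sketch of that kind of representativeness check, in Python (the age brackets, counts, and population shares are all invented here, and a chi-square goodness-of-fit test is just one of several ways a PI might look at it):

```python
# Compare the demographic mix of respondents to known population benchmarks.
# All numbers below are hypothetical, purely for illustration.
from scipy.stats import chisquare

# Hypothetical respondent counts by age bracket (1,000 respondents total)
observed = [180, 310, 290, 220]             # <35, 35-49, 50-64, 65+

# Hypothetical population shares for the same brackets
population_share = [0.20, 0.30, 0.30, 0.20]
expected = [share * sum(observed) for share in population_share]

stat, pvalue = chisquare(f_obs=observed, f_exp=expected)
print(f"chi-square = {stat:.2f}, p = {pvalue:.3f}")
# A large p-value says the respondents look like the population on these
# measured traits -- it does not rule out bias on unmeasured ones.
```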

      •  see, here's the thing (0+ / 0-)

        my undergrad degree was in the 'hard' science of biology/molecular genetics, and then I enrolled in some graduate poli sci courses where I had to study statistics and study design for the social sciences, so I have had exposure to the differences in how the scientific method is applied in 'hard' and 'soft' sciences. All of this was before I went to med school. It has given me an interesting perspective on interpreting data.

        I certainly understand how samples for these kinds of surveys are chosen, and I understand that these are often dependent on voluntary participation. But don't you need a certain percentage of participation for such a study to have validity? I seem to remember pretty clearly that if you don't have a certain level of participation, then you cannot reliably extrapolate your responses to the broader population your sample is designed to represent, because there are inherent differences between responders and non-responders. Have I misunderstood this all this time? It seemed to be a very big issue in the courses I took.

        Diversity may be the hardest thing for a society to live with, and perhaps the most dangerous thing for a society to be without - W S Coffin

        by stitchmd on Mon Sep 14, 2009 at 08:47:57 PM PDT

        [ Parent ]

        •  "Depends upon sample size," is the simple answer. (1+ / 0-)
          Recommended by:
          DemFromCT

          Geez....this reminds me of conversations with my son, who was a physics major and grad student. Talk about esoteric.......

          I should also add that some of my friends in the physical sciences are guilty of using statistical techniques when the data doesn't meet the assumptions of the technique. And, yes, some social science colleagues have stretched credulity from time to time.
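
          To put numbers on "depends upon sample size": here's a quick sketch of the standard margin-of-error formula for an estimated proportion, assuming a simple random sample and the worst case of a 50-50 split:

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """Approximate 95% margin of error for a proportion from a simple random sample."""
    return z * math.sqrt(p * (1 - p) / n)

for n in (100, 400, 1000, 2000):
    print(f"n = {n:5d}  ->  +/- {margin_of_error(n) * 100:.1f} points")
# n =   100  ->  +/- 9.8 points
# n =   400  ->  +/- 4.9 points
# n =  1000  ->  +/- 3.1 points
# n =  2000  ->  +/- 2.2 points
```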

          •  clinical trials are often (0+ / 0-)

            small differences between large groups that are tough to match properly. But imagine a poll of Smith v Jones where Smith had 73 and Jones had 27 (the difference here between public option pref and private alone). Slice and dice the poll all you want. Jones is losing, unless you accidentally polled Smith family members and no one else.

            For perspective, in politics, 55-45 is a landslide. In health reform, lowering cost and improving access are tied as top issues. So what's 73-27? It ain't chopped liver.
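
            Back-of-the-envelope in Python, with a made-up sample size (substitute the survey's actual n):

```python
import math

n = 1000       # hypothetical number of respondents
p_hat = 0.73   # observed share on the 73 side of the split
se = math.sqrt(p_hat * (1 - p_hat) / n)   # standard error of a proportion
ci_low, ci_high = p_hat - 1.96 * se, p_hat + 1.96 * se
print(f"95% CI for 73% at n={n}: {ci_low:.1%} to {ci_high:.1%}")
# 95% CI for 73% at n=1000: 70.2% to 75.8% -- nowhere near an even split,
# so slicing and dicing the poll won't make Jones a winner.
```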

            "Politics is the art of looking for trouble, finding it everywhere, diagnosing it incorrectly and applying the wrong remedies." - Groucho Marx

            by Greg Dworkin on Tue Sep 15, 2009 at 05:11:43 AM PDT

            [ Parent ]

        •  it has little to do with the response rate (3+ / 0-)
          Recommended by:
          DemFromCT, rhp, stitchmd

          For instance, we know that incentives can increase the response rate and increase bias simultaneously.

          It would be a non-starter to suggest that we should disregard the results of this survey because its response rate was 'only' in the 40s. It would also be a non-starter to argue that the poll should be discounted because it was administered by mail (although that would be a greater concern if there were no attempt to follow up with initial non-respondents).

          It's true that non-response bias is possible even when the measured demographics of the respondents are similar to those of the non-respondents, as they generally are in this case. It would be possible even if the response rate were substantially higher. That's an additional source of possible error, and good researchers do bust their guts trying to minimize it and to evaluate it. However, it isn't a reason to disregard the results.
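
          A toy simulation of that point (all numbers invented): the respondents match the full sample almost exactly on a measured demographic, yet the estimate is biased, because willingness to respond is tied to the opinion itself rather than to the demographic.

```python
import random

random.seed(0)
N = 100_000

population = []
for _ in range(N):
    age_group = random.choice(["younger", "older"])    # measured demographic
    supports = random.random() < 0.60                  # true support is 60%
    # Response propensity depends on the opinion, not on the demographic:
    responds = random.random() < (0.50 if supports else 0.34)
    population.append((age_group, supports, responds))

respondents = [p for p in population if p[2]]
resp_rate = len(respondents) / N
support_est = sum(p[1] for p in respondents) / len(respondents)
older_resp = sum(p[0] == "older" for p in respondents) / len(respondents)
older_pop = sum(p[0] == "older" for p in population) / N

print(f"response rate: {resp_rate:.0%}")                       # ~44%
print(f"'older' share, responders vs everyone: {older_resp:.0%} vs {older_pop:.0%}")
print(f"estimated support: {support_est:.0%} (truth is 60%)")  # overshoots
# Raising the response rate with incentives would not necessarily fix this
# if the incentive also changes who chooses to respond.
```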

          •  thank you (0+ / 0-)

            you said it better than I did.

            "Politics is the art of looking for trouble, finding it everywhere, diagnosing it incorrectly and applying the wrong remedies." - Groucho Marx

            by Greg Dworkin on Tue Sep 15, 2009 at 05:12:14 AM PDT

            [ Parent ]

          •  ....and there are some fairly standard techniques (1+ / 0-)
            Recommended by:
            HudsonValleyMark

            in mail surveys to encourage greater response rates. For example: send out a letter saying a survey is coming; send out the survey; send out a reminder; after a couple of weeks, send out a second copy of the instrument (if it has not been returned); send out another reminder; ad nauseam. And, yes, those things work.
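
            Sketched as a timeline (the intervals here are hypothetical; the actual study protocol would set the dates and number of contacts):

```python
from datetime import date, timedelta

def mailing_schedule(start):
    """Multiple-contact plan for a mail survey; offsets are illustrative only."""
    steps = [
        ("pre-notification letter", 0),
        ("survey packet", 7),
        ("reminder postcard", 14),
        ("replacement survey to non-respondents", 28),
        ("final reminder", 42),
    ]
    return [(label, start + timedelta(days=offset)) for label, offset in steps]

for label, when in mailing_schedule(date(2009, 9, 1)):
    print(f"{when}: {label}")
```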
