Few phrases have been written or uttered more often than "data-driven instruction" or "evidence-based decision making" since No Child Left Behind (NCLB) codified "scientifically-based" practices in 2001.

The accountability era that began in the early 1980s intensified a near mania for data in the U.S. that can be traced back to the first few decades of the twentieth century and the promises associated with quantifying student learning and teacher quality through standardized testing. Although the past century and the more recent thirty-year cycle of accountability based on standards and standardized testing have not delivered on those promises (see Hout & Elliott, 2011), "No Excuses" Reformers, including the current Department of Education headed by Secretary of Education Arne Duncan, remain steadfast in pursuing better tests based on better standards.

The "No Excuses" Reform movement is driven by bully politics that ironically includes an unbending faith in data and evidence while simultaneously ignoring (or misreading) the data and the evidence. One stark example of that pattern was exposed by Jack Hassard at Anthony Cody's Living in Dialogue blog:

"In May, 2012, the National Council on Teacher Quality (NCTQ) issued a report entitled: What Teacher Education Programs Teach About K - 12 Assessment. Anthony Cody mentioned this study in a recent post entitled Payola Policy: NCTQ Prepares its Hit on Schools of Education.

"The title intrigued me, so I went over to the NCTQ website, and read and studied the report which is about what education courses teach about assessment. This post is my review of the NCTQ study, and I hope after you finish reading the post you will realize how bogus reports like these are, given the quality of research that professors of education have been doing for decades. The study reviewed here would never have been published in a reputable journal of research in education, not only in the U.S., but in any other country in the world. I'll make it clear why I make this claim."

From NCLB to Race to the Top to international PISA comparisons to cyclical handwringing about NAEP and SAT scores to growing calls for value-added methods for evaluating teachers, we have ample evidence to conclude that we have historically failed, and are currently failing, data in education and education reform.

Failing Data in Education Reform

In the NCTQ report discredited by Hassard, the claim that teachers are not properly prepared to function efficiently in a data-driven education world is paradoxical evidence that "No Excuses" Reformers are themselves data-challenged.

"Data can be an immensely powerful asset, if used in the right way," explains Jonathan Gray in "What data can and cannot do," adding: "But as users and advocates of this potent and intoxicating stuff we should strive to keep our expectations of it proportional to the opportunity it represents." [1]

Further, Gray offers five cautions about data: (1) "Data is not a force unto itself," (2) "Data is not a perfect reflection of the world," (3) "Data does not speak for itself," (4) "Data is not power," and (5) "Interpreting data is not easy."

Here, then, let's consider how Gray's warnings can inform our awareness of the increasing misuse of data in education reform:

Data in education should not be a force unto itself. In other words, education reform should move away from teaching to the test and the relentless quantification of both teaching and learning. Currently, "No Excuses" Reformers are seeking to intensify both, thus making data "a force unto itself," decontextualized and corrosive.

Data in education, specifically quantified data, are not perfect reflections of learning and teaching. Standardized test scores are approximations, at best, of teaching and learning; and standardized test scores remain powerfully correlated with factors beyond the control of schools or teachers. As Gray notes: "Data is often incomplete, imperfect, inaccurate or outdated. It is more like a shadow cast on the wall, generated by fallible human beings, refracted through layers of bureaucracy and official process." Thus, increasing testing and the high-stakes associated with those tests for students and teachers are profound misuses of data.

Data in education, specifically quantified data, should not speak for themselves. Each year, SAT data are released, and despite the College Board's own caution not to rank and compare states by that data, journalists and politicians across the U.S. allow SAT data to speak for themselves—ignoring disproportionate populations of test takers from state to state, ignoring the high correlations between SAT data and students' parental income and levels of education. Again, as Gray warns: "In many ways official datasets resemble official texts: we need to learn how to read and interpret them critically, to read between the lines, to notice what is absent or omitted, to understand the gravity and implications of different figures, and so on. We should not imagine that anyone can easily understand any dataset, any more than we would think that anyone can easily read any policy document or academic article."

Education data should not be power. For too long, quantified data in education have been used as political baseball bats to bludgeon public schools and more recently public school teachers. As well, quantified data have for too long been used by teachers to bludgeon students. The misuse of data shifts the locus of power outside any agents in the education process—teachers and students—and places that agency in the mirage of objectivity. In effect, quantified data are the antithesis of human agency.

Interpreting education data is complex, and better left to people with expertise and experience in education, measurement, and research. Several years ago, Gerald Bracey recognized the increased misuse of data by the inexpert, especially as that data were being misreported in the media (See Yettick, 2009, and Molnar, 2001). Bracey recognized that think tanks were snookering the public and swaying education reform policy (See Welner, et al., 2010). Data must not be interpreted without context, and data always pose problems with causation and correlation. Politicians, think tank advocates, journalists, and billionaires may have many strong qualities and areas of expertise, but that doesn't mean any of them understand educational data. (I highly recommend the Shanker Blog and School Finance 101 as models of careful and expert considerations of educational data.)

Gray ends his piece with optimism: "I'm sure as time goes by we'll have a more balanced, critical appreciation of the value of data, and its role within our information environment."

I struggle with such optimism in the context of education reform, however, because, as Anthony Cody has explained, current "No Excuses" Reformers are trapped in the blinders of groupthink that reinforce their self-righteous zeal.

Educators and researchers are therefore under more pressure than ever to use our own expertise with evidence to counter that zeal with the complex and powerful weight of the data: "We should strive to cultivate a critical literacy with respect to our subject matter," as Gray suggests.

[1] Throughout Gray's article "data" is followed by singular verbs, which is nonstandard, but I will retain the published wording in all quotes while conforming to standard usages of "data" as plural in my original sentences.

Originally posted to plthomasEdD on Mon Jun 04, 2012 at 10:49 AM PDT.

Also republished by Education Alternatives.

  •  Foundation Effect (6+ / 0-)

    Another factor undermining excellence in educational data evaluations is the unprecedented influence of foundations such as the Gates, Broad, and Walton foundations. These foundations have a world view that is at times countered by good research, but clever researchers know that this kind of research is not a good career move. Today, those trying to build a career in education realize they must support the implementation of the Common Core standards. They must support the idea that new testing technology, in conjunction with standards mandated from a realm of greed for power, will lead to improved pedagogy. Public education in America, and by extension democracy itself, is in real jeopardy. Thank you for your most informative post.

  •  A few years ago (2+ / 0-)
    Recommended by:
    elfling, mrkvica

    we teachers were given handouts of the school's data, ostensibly to study it to look for patterns. Part of 'data driven instruction' is that teachers actually see the data so they can drive instruction with it.

    The facilitator of this exercise, a lousy English teacher turned data coach, wanted us to discover that the English department, under his direction, had made amazing gains, so that we would buy into his plans to expand his coaching domain to math and science.

    Unfortunately, he handed the data to a bunch of science teachers who actually knew how to analyze data, and the amazing gains he expected us to find amounted to little more than statistical noise.

    We did, however, find a very interesting pattern, and called him over to discuss it: girls in female teachers' math classes did much better than girls in classes taught by men. He caught the implication immediately. "This would lead to segregating classes by gender! We can't do that! We're a public school!" and he walked off.

    "The problems of incompetent, corrupt, corporatist government are incompetence, corruption and corporatism, not government." Jerome a Paris

    by Orinoco on Mon Jun 04, 2012 at 12:19:46 PM PDT

    •  Or the male (1+ / 0-)

      math teachers could change . . . oh never mind. . . :)

      •  The administratively approved interpretation (0+ / 0-)

        of 'the data' was that teachers would identify individual students who were on a borderline of moving into a higher test bracket (we used far below basic, below basic, basic, proficient and advanced) and somehow give those few kids encouragement to do better on the tests.

        Since the pattern we found wasn't in line with the approved interpretation, it wasn't even discussed. My general impression was that female math teachers collaborated with each other a lot more than the male math teachers did, but I don't know why that would selectively affect female students. And we never got time to talk about it, figure it out, and maybe use our discoveries to improve instruction.

        "The problems of incompetent, corrupt, corporatist government are incompetence, corruption and corporatism, not government." Jerome a Paris

        by Orinoco on Tue Jun 05, 2012 at 12:32:12 PM PDT

        [ Parent ]

        •  I believe I've read (0+ / 0-)

          -- there's something back in the foggy depths of my memory -- that girls learn better with female teachers because the male teachers don't call on them, pay as much attention to them, etc.

          Or was it that the boys are ignored by the female teachers?

          Or both?

          But none of that matters in the land where testing is king.

  •  I can't help but point out (3+ / 0-)
    Recommended by:
    mrkvica, tultican, moira977

    That if student test score data over three years is valid for evaluating teachers, it's probably valid for Secretaries of Education too.

    No one has the patience to realize that changes we make to kindergarten are not reflected in high school graduation rates until 13 years later... that's two reform generations later!

    Fry, don't be a hero! It's not covered by our health plan!

    by elfling on Mon Jun 04, 2012 at 12:31:46 PM PDT

    •  If these tests are good enough (0+ / 0-)

      to retain a 3rd grader or tell a 10th grader he won't graduate, everything should be measured by the tests.

      Of course, they aren't but . . .
