School & District Management Opinion

AERA continued: The Teachings of Russ Whitehurst

By Eduwonkette — April 02, 2008 3 min read

In a talk last Thursday entitled, “Seven Things I’ve Learned About Education Research and Policy, Plus or Minus Two,” Russ Whitehurst, the Director of the Institute of Education Sciences, summarized what he’s learned about education policymaking during his seven years at IES.

1) The research community is oriented towards understanding, while the policy community is oriented towards action.

Researchers are often upset that their work is not defined as “policy relevant” (and thus not included in IES’s funding priorities). But they usually haven’t thought about what’s actionable in their own research. Whitehurst gave the example of a study in which the researcher coded classroom interactions between teachers and children of different ability levels, and found meaningful differences. When asked what the policy implications were, the researcher looked at him like a deer in headlights.

Said Whitehurst, “I’m not suggesting there’s anything wrong with that kind of scholarship,” but this quest for understanding rather than prescription is perplexing to policymakers. Researchers often defend research oriented towards understanding by appealing to the long arc of science. In response, Whitehurst argued that education is not a discipline in the sense that neuroscience is a discipline. Rather, education is more like transportation, in that it presents a series of problems that need to be solved.

2) Researchers operate under the logic of disconfirmation, while policymakers operate under the logic of confirmation, especially once they’ve committed to a course of action.

Once policymakers have signed on to an initiative, they are not looking for evidence that they’ve committed to the wrong program. Using NCLB as an example, Whitehurst explained that once NCLB was passed, the Department of Education necessarily had to shift from being a buyer to a seller of education policy. A series of complex policy decisions had to be made – e.g., establishing subgroup sizes and the percentage of students who could sit for alternate assessments. Whitehurst’s point was that the sweet spot of policymaking is where people are uncertain and uncommitted. Policymakers like to have the weight of research behind them, and it’s most effective to offer advice before they’ve publicly committed themselves on the issue.

3) Much research that’s treated as relevant by policymakers shouldn’t be, because it’s too methodologically weak to be taken seriously.

Whitehurst lamented that he’s had to provide assessments of research to major newspapers, though the research “was so weakly done that it’s a shame that anyone had to spend time thinking about it.” Unfortunately, Whitehurst said, a report put out by a think tank is given the same weight as an article published in Science. Whitehurst argued that as long as policymakers have to worry that what they’re reading is a political document rather than a research document, the relationship between educational research and educational policymaking will be troubled.

4) Demonstrating that popular programs don’t work is risky business.

No good evaluation goes unpunished, Whitehurst quipped. He provided the example of the Upward Bound evaluation, which found no effects of Upward Bound on college-going, and discussed the subsequent shutdown of a randomized trial designed to further evaluate Upward Bound.

5) The combination of high stakes for policymakers and high uncertainty about what they can do generates unreasonable expectations for educational research.

While medical research has invested millions of dollars in the search for an AIDS vaccine, it has so far been unsuccessful. The medical community is willing to accept that research and progress take time, but there’s no similar understanding that identifying solutions in educational research takes time, too.

This was probably the best session I attended at AERA. His points weren’t particularly novel, but Whitehurst pulled them together coherently, if sometimes naively. For example, he worried that policymakers see research articles as political documents rather than research documents – but isn’t the choice of research questions and outcome variables political to begin with? I also expected more fireworks from the audience about funding priorities – only a few years ago, many researchers were stomping mad that their research was ineligible for funding.


The opinions expressed in eduwonkette are strictly those of the author(s) and do not reflect the opinions or endorsement of Editorial Projects in Education, or any of its publications.