It can take thousands of experiments to find a successful intervention; the majority of research studies, be they from the Institute of Education Sciences or Monsanto’s biotechnology labs, do not produce positive results.
“Often we do a lot of research and we’re all working really hard, and we keep coming up with zeros,” said Bridget T. Long, vice-chairman of the National Board for Education Sciences and a Harvard University education and economics professor, at a board meeting last week.
Critics of IES have voiced frustration that relatively few studies have produced strong positive results, but that doesn’t mean all those negative or null findings have to be useless. The NBES is looking for ways to design future research so that useful information can more easily be gleaned from well-designed studies that showed no or negative effects.
Among other things, this means tagging more individual characteristics of an intervention and its context to home in on how interventions work with different groups of students, schools, or education policies. For example, in a study examining how well 8th-grade students performed using different math curricula, analysts might identify specific teaching and administrative practices, materials, and parent-involvement programs, as well as differences in how the intervention affected students in different grades or from different backgrounds. They might find that while no curriculum produced striking overall improvements, curricula that incorporated certain professional development or types of materials did significantly improve performance for certain students at specific grades.
“Not all interventions work in all places for all people,” said NBES member Adam Gamoran, the director of the Wisconsin Center for Education Research at the University of Wisconsin-Madison. “We need to know under what circumstances for what populations does this work?”
Fellow board member Anthony S. Bryk, president of the Carnegie Foundation for the Advancement of Teaching, called for more iterative research on interventions, in which researchers “trial quickly, fail quickly, learn quickly and retry quickly.”
Carnegie has been exploring a similar approach, using 90-day research cycles in ongoing studies of remedial math classes.
A version of this news article first appeared in the Inside School Research blog.