Research Effort Aims to Bury 'Nothing Works' Image
Now that an independent panel has given the federal What Works Clearinghouse a thumbs up for the rigor of its methods for reviewing education research, clearinghouse operators are taking another step to make their reviews more useful to the policymakers and practitioners for whom they’re intended.
At a Dec. 12 forum held here, Mathematica Policy Research, the Princeton, N.J.-based research firm that runs the clearinghouse, and the federal officials who help oversee it invited researchers and members of the public to share ideas on charting a course for the next phase of their mission: making the clearinghouse’s work more relevant.
Launched by the U.S. Department of Education’s Institute of Education Sciences in 2002, the What Works Clearinghouse was originally intended to be a Consumer Reports-style Web resource where educators could find reliable information on “what works” in schools. But early on, it developed a reputation as the “nothing works” clearinghouse because few reviews were posted on its Web site and even fewer pointed to promising strategies for improving schools.
“You can’t spend $30 million of the public purse and have something that is referred to repeatedly in the media as ‘nothing works,’ ” said Grover J. “Russ” Whitehurst, who is widely credited with having spearheaded the clearinghouse’s launch during his term as the IES director, which ended last month.
Now the director of the Brown Center on Education Policy at the Brookings Institution here, Mr. Whitehurst said project delays resulted in part from disagreements over procedures for screening studies, legal threats from program developers whose work got low ratings from the clearinghouse, congressional lobbying that was critical of the clearinghouse, and a dearth of well-executed studies on which to base its reviews.
“I was naive coming in about how much was out there and how much was good research,” he told meeting-goers. “But there has to be a place for something like the clearinghouse for the future of education reform in this country.”
Practice Guides Popular
Over the last two years, the clearinghouse has picked up the pace of its work, publishing increasing numbers of reviews with “positive” findings, and producing new products, such as practice guides, that are targeted to practitioners. According to Jill Constantine, the deputy director of the clearinghouse, its Web site now gets 50,000 to 60,000 “hits” a month and offers 100 research reports, seven practice guides, and a series of new quick reviews, which vet studies that have been spotlighted in the news media.
Mark R. Dynarski, the clearinghouse director, noted that the seven practice guides have been downloaded “more times than the entire 100 reports.”
“Educators are voting with their feet—or clicks,” he said.
The dilemma now is how to respond to the demands for usable research while at the same time maintaining review standards, which tend to favor strict experimental studies over most other forms of research. The practice guides, in comparison, cast a wider net for studies in an effort to offer practitioners “promising practices” they can put into place now until more definitive proof emerges.
“I’m always grateful we took the high road to set and stick to standards,” said Kay Dickersin, an epidemiology professor at Baltimore’s Johns Hopkins University and the director of the U.S. Cochrane Center. The latter group, to which Ms. Dickersin was referring, is one of 12 centers around the world that assists the Cochrane Collaboration, a 15-year-old effort to synthesize studies in medicine and give practitioners reliable, research-based advice.
“People can be harmed by wrong or insufficient data,” she added.
Different Models Eyed
The clearinghouse was patterned largely after the Cochrane Collaboration, but it was one of several models for synthesizing and sharing research that were discussed at last week’s forum.
Susan Bodilly, the education director for the RAND Corp. of Santa Monica, Calif., described her research group’s Promising Practices Network, which examines the evidence on programs and policies aimed at improving children’s lives.
Rather than try to provide unassailable evidence to the field, however, the network applies a slightly lower bar for evidence, offering what Ms. Bodilly described as “what is proven, what is promising, and what on the radar screen is coming up.” The suggestions on what interventions the network should review come from practitioners in those fields.
From the National Cancer Institute, Michelle Bennett, the deputy director of the institute’s center for cancer research, discussed her federal agency’s 4-year-old “translational research” program, an effort aimed at transforming the research that comes out of laboratories, clinical practice, and epidemiological studies into tools and practices that practitioners can use to treat or prevent cancer.
Yet where most such efforts fall short, said Ms. Bodilly, is in providing advice for practitioners on how to put programs in place and sustain them over time. “That’s the missing ingredient in this approach,” she added.
Just bringing answers to the educators is not enough to bring about changes in practice, added James H. “Torch” Lytle, a professor of practice at the University of Pennsylvania in Philadelphia and a former Trenton, N.J., school superintendent.
“We know hand washing reduces infections in hospitals,” he told the group. “Yet infection control continues to be a problem in hospitals.”
“If we can’t get hospital staff to do something as simple as washing hands,” he asked, how can teachers be expected to enact far more sophisticated changes in their own practice?
Vol. 28, Issue 16