Web Sites Gauge Proof of Whether Programs Work
A federal research center has added another Web site to the lengthening list of similar ventures that distill what the research says on “what works” to improve the achievement of students in grades K-12.
The new clearinghouse, called the Best Evidence Encyclopedia, or BEE, was launched earlier this fall by the Center for Data-Driven Reform in Education at Johns Hopkins University in Baltimore. It adds to a growing collection of Consumer Reports-style Web sites that vet the research evidence on promising educational programs and practices. At least eight such sites exist now, the best known of which is the U.S. Department of Education’s What Works Clearinghouse.
To a degree, the Web sites grow out of the Education Department’s ongoing campaign to transform education into an evidence-based field, much like medicine.
A growing number of consumer-friendly Web sites that synthesize the research on "what works" in education and the social sciences are popping up, including:
Best Evidence Encyclopedia distills evidence from several different research groups on promising practices and interventions in education. It’s produced by the federally funded Center for Data-Driven Reform in Education, at Johns Hopkins University in Baltimore.
Blueprints for Violence Prevention at the University of Colorado in Boulder identifies programs found to be effective in reducing violent crime, bullying, delinquency, and substance abuse.
Child Trends What Works site summarizes both the research and the professional wisdom on promising programs aimed at improving the lives of children and youths.
Comprehensive School Reform Quality Center at the Washington-based American Institutes for Research (AIR) posts research reviews on schoolwide improvement efforts and educational service providers.
International Campbell Collaboration systematically reviews research evidence from around the world on social, behavioral, and educational interventions.
Poverty Action Lab at the Massachusetts Institute of Technology scientifically evaluates anti-poverty programs in developing countries.
Promising Practices Network is operated by the RAND Corp., a think tank based in Santa Monica, Calif. It highlights research-based programs shown to be effective at improving outcomes for children, youths, and families.
Social Programs That Work, produced by the Washington-based Coalition for Evidence-Based Policy, is a listing of interventions in education and various social science areas that have been evaluated through randomized studies.
What Works Clearinghouse, funded by the U.S. Department of Education and run by the AIR, contains research reviews and effectiveness ratings on a wide range of educational programs and practices.
“I can imagine that, in a few years—whether it’s the BEE or something like it—there being something that will be consulted continually and begin to matter in the kinds of choices that schools and policymakers make,” said Robert E. Slavin, the director of the Center for Data-Driven Reform in Education, or CDDRE. “That would be a radical change.”
His clearinghouse has so far gathered reviews on promising programs in elementary mathematics, reading, educational technology, comprehensive school reform, and other areas. Unlike the What Works Clearinghouse, though, the BEE seeks to be a one-stop shop for practitioners, posting evaluations by other researchers as well as those that Mr. Slavin and his colleagues produce.
“Nobody who is actually looking for a math program is going to look at What Works, and then the CSRQ [Comprehensive School Reform Quality] Center, and then somewhere else,” said Mr. Slavin, referring to other organizations whose reviews are included on the new Web site. “Educators and policymakers want all this in one place so they can make a fair comparison.”
Whether all the new Web sites will ultimately complement one another—or compete—is an open question, said Gerald E. Sroufe, the director of government relations for the Washington-based American Educational Research Association.
A strength of the Best Evidence site, said Jon Baron, the executive director of the Washington-based Coalition for Evidence-Based Policy, is that its format makes it easy for practitioners to quickly find the three or four programs in a particular area with the strongest research base.
“From my standpoint, that’s the most important use for a site like this,” Mr. Baron said.
But the Best Evidence Encyclopedia also uses a slightly different research standard from that of the What Works Clearinghouse.
Both efforts put a high value on randomized studies—in other words, those that randomly assign participants to either control or experimental groups—as well as other types of comparisons. The difference is that the BEE also favors studies that test students in the control and experimental groups on content that both groups have had a chance to learn, Mr. Slavin said. He said his site also includes older studies in the evaluation mix—in contrast to What Works’ practice of using studies no more than 20 years old.
“In the late 1970s and early ’80s, there was a lot of interest in randomized studies and a lot of them were being done,” he said.
In fact, all the online research-review databases set slightly different standards for what counts as evidence of effectiveness.
On one end of the scale, Social Programs That Work, the site that Mr. Baron’s organization operates, screens out educational programs that have not proved their mettle through randomized studies. At the other end, another What Works site—this one produced by the nonprofit Washington-based research organization Child Trends—highlights programs recommended by experienced practitioners as well as those with research track records.
“It does create confusion that there are a bunch of these sites,” Mr. Baron said, “especially if they use different criteria.”
When Mr. Slavin’s center was launched in 2004 with a five-year, $10 million grant from the Education Department’s Institute of Education Sciences, its main mission was to help districts improve learning by analyzing their own student-achievement data. Center researchers had no plans to create their own evaluation database, Mr. Slavin said.
But delays in getting the What Works Clearinghouse up to speed prompted the center to step in with its own user-friendly research syntheses for its partner districts. (“‘One Stop’ Research Shop Seen as Slow to Yield Views That Educators Can Use,” Sept. 27, 2006.)
“The key idea is not just to try to help districts understand their own data,” Mr. Slavin said, “but also to respond to the problems in practice that those reviews identify.”
Vol. 26, Issue 11, Pages 5, 14