School & District Management

If at First You Succeed, Try, Try Again: Challenges in Repeating Research Results

By Sarah D. Sparks — September 03, 2015

As the What Works Clearinghouse often shows, it’s tough to find a clear, positive effect from an education intervention. So when it happens, it’s easy to get excited and start looking for ways to implement the findings ASAP.

It’s probably best to take a step back.

In the most comprehensive attempt to date to reproduce social science research (including education research), a team of dozens of researchers repeated 100 published social science experiments. In the analysis, published online in the journal Science, researchers were able to redo most of the studies, but more than 60 percent of the experiments showed weaker evidence the second time around, even though the researchers used the studies’ original materials and strong methodological safeguards.

“Trying to figure out how to make interventions work better than standard practice is very, very hard,” said Brian Nosek, who leads the University of Virginia’s Reproducibility Project, which conducted the analysis.

Nosek and colleagues collected studies from the 2008 volumes of three major social science journals. The reviewed studies, by 270 authors worldwide, used many different methods and sample sizes, and 95 percent of the articles reported significant results.

Stronger Effects Needed?

Researchers generally consider an effect statistically significant if a result at least that large would arise by chance less than 5 percent of the time, otherwise known as a p-value of less than .05. But the Reproducibility Project researchers found that studies with p-values close to .05 were much less likely to replicate than those with p-values closer to zero.
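
To see why that pattern makes statistical sense, consider a minimal simulation sketch. This is an illustration under assumed numbers, not the project’s actual method: suppose half of all studied effects are real and half are not. An original result that barely clears p = .05 is more likely to be a false positive, so an independent replication of it should succeed less often than a replication of a result with a p-value near zero.

```python
# A minimal simulation sketch, not the Reproducibility Project's method.
# Assumed numbers: half the studies test a real effect (Cohen's d = 0.8),
# half test a null effect, with 30 subjects per group in every experiment.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n = 30            # assumed per-group sample size
trials = 20_000   # number of simulated original/replication study pairs

def p_value(effect):
    """Run one two-group experiment; return its two-sided t-test p-value."""
    treatment = rng.normal(effect, 1.0, n)
    control = rng.normal(0.0, 1.0, n)
    return stats.ttest_ind(treatment, control).pvalue

# Each simulated study pair shares one true effect: real (0.8) or null (0.0).
effects = rng.choice([0.0, 0.8], size=trials)
originals = np.array([p_value(d) for d in effects])
replications = np.array([p_value(d) for d in effects])

# Among originals that barely reached significance versus those far below
# .05, how often does the independent replication also reach p < .05?
for label, mask in [("p in [.01, .05)", (originals >= 0.01) & (originals < 0.05)),
                    ("p < .005", originals < 0.005)]:
    rate = (replications[mask] < 0.05).mean()
    print(f"original {label}: replication succeeds {rate:.0%} of the time")
```

In runs like this, the barely significant bin contains a larger share of false positives, so its replication rate comes out noticeably lower; the size of the gap depends entirely on the assumed effect size, sample size, and base rate of real effects.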

“A failure to replicate a result one time does not indicate a study is wrong,” Nosek said. “We don’t have any definitive conclusions about any of the individual studies. It was intended to be a very shallow and wide look.”

Rather, the findings suggest that education researchers and other social scientists may need to look for stronger evidence to begin with when analyzing their data. More-experienced researchers were no more likely than less-experienced teams to have their studies successfully reproduced, the project found, but those with more-transparent data had a better chance of having their findings repeated.

Building the Structure for Future Research

Just organizing and preserving information from a study can go a long way toward helping, said Mallory Kidwell, a co-author of the analysis and coordinator of the nonprofit Center for Open Science in Charlottesville, Va. Many in the education research field are pushing to make data publicly available (with support from as high a power as the White House), but some researchers have been slow to move away from the view that data should be closely held, she said.

“There are just so many small tweaks that may seem apparent to an original researcher, versus someone who might not have as extensive a background but is still very interested in your work,” Kidwell said. When preparing to replicate the studies, she said, “We definitely got a fair bit of ‘It’s on someone else’s laptop,’ or ‘This researcher has left academia, so what you have is what you have.’”

But universities and journals also must change professional incentives that reduce transparency in research, Nosek said. “As a researcher, I need to get positive, innovative, clean results as often as possible to be published,” Nosek said. “All of those will inflate the beauty of the evidence, at the cost of its reproducibility. We’re simultaneously trying to shift incentives so that what is good for science and what is good for me as a researcher are in alignment.”

More than 500 journals are now evaluating guidelines to encourage more transparency in published studies, and the Reproducibility Project offers a free tool to help researchers organize and share their work.

A version of this news article first appeared in the Inside School Research blog.