Curriculum | What the Research Says

Picking ‘Evidence-Based’ Programs: 5 Mistakes for Educators to Avoid

By Sarah D. Sparks — January 19, 2024 6 min read

With hundreds of education programs and interventions purporting to be “evidence based,” and dozens of websites and clearinghouses that compile thousands of studies and evaluations, it’s easy for practitioners to become overwhelmed by the sheer amount of information to sift through to figure out what will best help their students.

It’s understandable that teachers, as well as district and school leaders, get frustrated when studies and clearinghouses seem to come to different conclusions about a given program.

But hoping for a single seal of approval is “not realistic,” said Jonathan Jacobson, the branch chief for the What Works Clearinghouse, the National Library of Education, and ERIC at the National Center for Education Evaluation and Regional Assistance.


The sometimes conflicting information is akin to the range of reviews consumers encounter for all kinds of products, Jacobson said.

“When I’m looking to buy a new car, and I read different magazines that have reviewed the car, the results or characterizations are not going to be identical, even though the vehicle might be the same,” Jacobson said. “That doesn’t mean that [evidence-based decision-making] isn’t scientific, but it’s not as simple as solving a math problem where everyone should get a single right answer.”

With that in mind, here are five pitfalls for educators to avoid when using research to choose evidence-based programs.

1. Equating research quality with program quality

Research clearinghouses have become a popular way for educators and leaders to review a lot of studies on a subject quickly, but Jacobson warned, “there’s a distinction between the quality of the research and the quality of the program or the practice or the intervention.”

“Just because studies are high-quality, that doesn’t mean they will show favorable results” for a product or program, he said.

A randomized controlled trial, considered the “gold standard” in research, might show that an intervention worked or didn’t, but it won’t necessarily give insight into why that happened.

Educators shouldn’t “assum[e] that the What Works Clearinghouse endorses a particular program or product simply because we report and confirm favorable findings,” Jacobson said, referring to the U.S. Education Department’s education research review site. “We’re not endorsing that; we are simply characterizing the strength of the research and reporting whatever findings we were able to confirm with our standards or procedures. Educators need to make their own decisions.”

2. Taking ‘no effect’ for a conclusive answer

When a study finds that an intervention has “no effect,” it means that students who used the intervention performed as well as similar students who did not. This comparison group might be using a different intervention, or just whatever they would normally do in school, often dubbed “business as usual.”

Finding no effect “doesn’t necessarily mean that is a bad finding,” said Erin Pollard, an education research analyst with IES’s knowledge utilization division. “If you have a very expensive program as your business-as-usual, and you’re [testing] a less expensive program and there’s no difference, that could be a good thing.”
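To make the comparison concrete, here is a hypothetical sketch of how a “no effect” finding arises. All the scores and group sizes below are invented for illustration; they are not from any study the article discusses. The sketch uses Welch’s t-statistic, a standard way to compare two independent groups.

```python
# Hypothetical illustration of a "no effect" finding: students using a
# new intervention and a "business as usual" comparison group end up
# scoring about the same. All numbers are invented for this sketch.
from statistics import mean, stdev
import math

intervention = [72, 75, 68, 80, 74, 77, 71, 76]       # scores with new program
business_as_usual = [73, 74, 70, 78, 75, 76, 72, 74]  # scores with usual instruction

def welch_t(a, b):
    """Welch's t-statistic for two independent samples."""
    var_a, var_b = stdev(a) ** 2, stdev(b) ** 2
    return (mean(a) - mean(b)) / math.sqrt(var_a / len(a) + var_b / len(b))

t = welch_t(intervention, business_as_usual)
print(f"mean difference: {mean(intervention) - mean(business_as_usual):+.2f}")
print(f"t-statistic: {t:.2f}")  # a small |t| means no detectable difference
```

A tiny t-statistic like the one here is what gets reported as “no effect.” As Pollard notes, that can be good news: if the cheaper program in such a comparison performs the same as an expensive one, the cheaper program wins on cost.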

3. Looking only at the summary (or rating)

It can be tempting to just skim the abstract of an evaluation study to determine whether the program was effective—or, similarly, to look only at a clearinghouse’s overview of an intervention—but looking only at a summary can give educators the wrong idea about a program, researchers said.

That’s because evaluations often measure several different outcomes in a variety of different areas, from specific academic subjects to social-emotional outcomes like motivation. The What Works Clearinghouse alone can include more than 70 different outcomes in its reports. It’s critical to dig into these findings.

“Sometimes users will take the high-level summary, like the evidence tier of the whole study and intervention, but not look at the individual outcome domains,” IES research scientist Betsy Wolf said. “It might be that the study has evidence of a positive finding [in the summary], but not in the outcome domain that people really care about for that intervention.”

For example, Pollard pointed to one What Works Clearinghouse review of a study on a pre-kindergarten reading intervention. The summary shows a “promising” rating for having at least one positive result in a rigorous study.

An intervention rating in the What Works Clearinghouse includes several different measures.

“People might look at it and say, ‘Well, we should do this reading intervention?’” Pollard said. “Well, you can, but actually the effects were on math.”

The study had assessed the children on early word identification, but also early number skills and two tasks measuring self-regulation, such as a “head, shoulders, knees and toes” game. Of all of those, students participating in the intervention performed better than the control group only in math.

Even within a given area, it’s important to look at the details of how an outcome is being measured. Wolf noted that researchers—and the WWC—have expanded the methods they use and ways they look at things like a program’s effects on teachers.

“In the past we had teacher practice, we had teacher retention and turnover—very observable things—but we didn’t have outcomes like how teachers were perceiving their jobs or their self-efficacy. So we have that now,” Wolf said.

4. Focusing too much on effect size

To ensure that a study can isolate the effects of, say, a reading program, researchers try to tightly control how it is put into practice. They may give several days of teacher training or provide their own tutors. They may create and use particular lesson plans and manipulatives. All of these implementation pieces can take time and money.

“One concern is that people might pick an intervention that has the largest effect size, and not think about the resources needed for implementing that,” said Liz Eisner, IES’s associate commissioner for knowledge utilization.

When choosing an intervention, Eisner advised teachers and leaders to balance its benefits with the effort of implementation: the materials, staff, and training, as well as how much of the intervention students need to see results. A cheap, 5-minute intervention that boosts background knowledge before a new lesson and a yearlong, intensive tutoring program can both benefit students, but they each require very different levels of investment from schools.
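Eisner’s point can be sketched with some invented numbers. An effect size (here, Cohen’s d, the standardized difference between group means) tells you how big a program’s impact was, but not what it cost to get it. In this hypothetical example, the intensive program has the larger effect size, yet the cheap routine delivers far more effect per dollar. Every score and cost below is made up for illustration.

```python
# Hypothetical sketch: judging two invented interventions by effect size
# alone vs. by effect size relative to per-student implementation cost.
# Cohen's d = difference in group means / pooled standard deviation.
from statistics import mean, stdev
import math

def cohens_d(treatment, control):
    """Cohen's d: standardized mean difference between two groups."""
    n1, n2 = len(treatment), len(control)
    pooled_sd = math.sqrt(((n1 - 1) * stdev(treatment) ** 2 +
                           (n2 - 1) * stdev(control) ** 2) / (n1 + n2 - 2))
    return (mean(treatment) - mean(control)) / pooled_sd

control = [70, 72, 68, 74, 71, 69, 73, 70]      # business-as-usual scores
tutoring = [72, 74, 70, 76, 73, 71, 75, 72]     # intensive, expensive program
five_minute = [71, 73, 69, 75, 72, 70, 74, 71]  # cheap, short daily routine

costs = {"tutoring": 2500, "five_minute": 50}   # invented per-student costs ($)

for name, scores in [("tutoring", tutoring), ("five_minute", five_minute)]:
    d = cohens_d(scores, control)
    print(f"{name}: d = {d:.2f}, d per $100 spent = {100 * d / costs[name]:.3f}")
```

Ranked by effect size alone, the tutoring program looks better; ranked by effect per dollar, the five-minute routine does. Neither ranking is “right” on its own, which is exactly why Eisner cautions against picking the biggest effect size without weighing implementation resources.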

5. Forgetting whom the program serves

The most “evidence based” program in the world won’t be much use if it doesn’t help the students an educator needs to serve. Eisner said educators should remember to review the demographics, grade levels, and other characteristics of students in a study’s treatment and control groups.

“For example,” she said, “if a school that is trying to improve math achievement serves mostly English-learner students, but the intervention of interest has not been studied in schools with many ELs, the staff may decide that they want to identify a different intervention that has some empirical evidence of positive outcomes with EL students.”

That goes for students in the comparison group, too, Jacobson said.

“It’s important to understand what was the intervention being compared against. For which population of students? For which outcomes?” he said. “It’s really important, I think, for decisionmakers to be thoughtful about what applies to them.”
