Opinion

Why Evidence-Backed Programs Might Fall Short in Your School (And What To Do About It)

How closely a program’s implementation matches its plan is important, though perhaps not as important as you think
By Heather C. Hill — May 25, 2021

Editor’s Note: This is part of a continuing series on the practical takeaways from research.

As funds from the American Rescue Plan start to arrive in schools and districts, many educators will find themselves tasked with quickly implementing new programs to meet the needs of students with pandemic-related unfinished learning. These programs include high-dosage tutoring, summer learning experiences, and school year curricula with linked professional development.

Many schools will choose such programs based on their evidence of success, perhaps using academic reviews or websites (like the late Robert Slavin’s evidenceforessa.org) that report specific programs’ effects on student achievement. Yet most school leaders also know that implementing a program does not automatically yield the promised bump in student achievement.

It’s a conundrum: Why do research syntheses find large effects on student learning when schools far less often realize those gains?

One reason is that the effect sizes promised by meta-analyses—studies that statistically combine the results of multiple evaluations—can be overly large: Small pilot programs are likely to make up the bulk of studies in meta-analyses, and they tend to yield larger effect sizes, which are then not replicated under real-world conditions. Another reason is that implementation—the degree to which a program or practice is carried out with fidelity—can sharply affect program outcomes.

With Anna Erickson, a researcher at the University of Michigan, I recently examined 72 studies of math, science, English/language arts, and social-emotional-learning programs, comparing the extent of program implementation to the likelihood of positive student outcomes. As expected, when schools did not adopt or maintain program elements, evaluations saw few positive impacts on students. This echoes findings from studies of individual programs, where positive effects often occur only in schools and classrooms with higher levels of fidelity.

However, in good news for schools, our analysis found that both moderate- and high-fidelity implementation tended to produce positive effects of similar size. Perfect need not be the enemy of the good when it comes to the implementation of new programs.


But low fidelity doesn’t cut it. Many district and school administrators will therefore still want to work toward better implementation. But how? Qualitative studies of implementation failure suggest addressing four root causes:

  • Will—that is, whether teachers decide to embrace new practices or materials or to shelve them. It’s worth noting that in some cases, teachers’ decisions to forgo new materials make sense in light of their students’ needs.
  • Skill—whether teachers have the knowledge and expertise to implement a given program. For instance, case studies suggest STEM teachers who lack strong content knowledge may provide disorganized and error-filled instruction when using science, technology, engineering, or math curriculum materials that focus on concepts and build in student inquiry.
  • Organizational capacity—whether an organization has in place tools, routines, and relationships that enable implementation. Capacity encompasses a range of factors, from the quality of connections between central-office staff and principals to the ability to overcome seemingly simple logistics challenges, like distributing curriculum materials to students.
  • Contexts and coherence—whether the program will work given its particulars and the local setting’s needs, strengths, and weaknesses. This includes whether the program aligns with existing instructional guidance as represented in school priorities, pacing guides, and tested content and also whether the program reflects students’ interests, strengths, and culture.

While most post-mortems of failed programs point to one of these four categories as the culprit, research on how to proactively address them is more primitive. However, several studies suggest ways school leaders can plan for a successful implementation.

School leadership is key to increasing teachers’ willingness to take up new programs and practices. When principals and instructional leaders signal commitment to a program and follow up with set-asides of resources and time to learn about the program, teachers are more likely to follow their lead.

Allowing teachers to help shape program adoption can also increase their commitment to the program and potentially help avoid a bad fit between the program and local context. Program “touch points” with teachers after initial program training—for instance, a brief workshop a few weeks after implementation begins—can also boost teacher will, program fidelity, and student performance.

Teacher skill in using a new program can be enhanced by concrete lesson materials: curricula, routines, or assessments. Professional development and coaching on practices specific to the new program also seem to build teacher skill and interest.

Enhancing organizational capacity to support implementation may be a heavier lift. Ideally, districts thinking about adopting a program or practice would assess their “readiness”: leader and teacher commitment to the program, earmarked resources for implementation, and the extent of conflict with existing programs or other instructional guidance. District leaders would either resolve these issues or find alternative programs. But little empirical evidence exists for how to do so.

Addressing context and coherence can also be tricky. Adapting programs to better fit local contexts is a promising practice but with a caveat: Teachers must implement with fidelity before adapting the program. James Kim of the Harvard Graduate School of Education and colleagues found such a result in a randomized study. They assigned a group of teachers first to a year-long, fidelity-oriented implementation of a reading program, then to a year of adaptation. During the adaptation year, teachers did things like better matching books to students’ reading level and adding a parent-outreach component. Student outcomes for this adaptation group outstripped those of both teachers who continued to implement the program with fidelity and those who adapted the program without first implementing with fidelity.

Finally, districts must take steps to protect the program following initial implementation, including continued professional development and mentoring in the program for new teachers. With average teacher turnover around 20 percent per year, getting new teachers up to speed on a particular program can both boost overall implementation and keep veteran teachers engaged.

Protecting the program also means preventing it from being “crowded out” by other initiatives. Too often, teachers report their work to learn a particular math or ELA program comes undone when their school or district adopts a conflicting classroom-observation rubric, pacing guide, or even a competing curriculum.

A program vetted by evidence is important for results, but so is implementation. With carefully crafted implementation plans, schools should see greater returns on their American Rescue Plan investments.
