School & District Management Opinion

We Must Raise the Bar for Evidence in Education

By Carly Robinson & Todd Rogers — October 30, 2019 5 min read

Those looking for what works in education will find no shortage of advice. Educators, hoping to improve student outcomes, eagerly embrace recommendations telling them to “Cater to each child’s learning style!” and “Give students awards for positive behaviors!” But many such intuitive, popular “best practices” may not, in fact, be what is best for students, even though their proponents stamp them “evidence based.”

Educators who prioritize evidence-based practices are fighting an uphill battle; the standards of proof for what constitutes “evidence” in schools—and education more widely—are often exceedingly low. For instance, the popular notions that we should be teaching to students’ learning styles or providing students with attendance awards were both rooted in observational evidence. Both practices have now been debunked, but over 75 percent of educators still endorse learning styles, and many schools say they use awards to recognize excellent student attendance.

We should not be surprised when people hold incorrect notions about what research says works—education research is littered with published papers on a range of practical topics that either fail to replicate previous findings or report massively inflated effects.

Educational policymakers and practitioners need to understand how study designs and research practices influence the reproducibility and credibility of a study’s findings. This is easier said than done, but there are a couple of initial indicators that suggest a research finding is “real” and worth implementing.

First, to disentangle whether a practice causes improvement or is merely associated with it, we need to use research methods that can reliably identify causal relationships. And the best way to determine whether a practice causes an outcome is to conduct a randomized controlled trial (or “RCT,” meaning participants were randomly assigned to being exposed to the practice under study or not being exposed to it).

Second, policymakers and practitioners evaluating research studies should have more confidence in studies where the same findings have been observed multiple times in different settings with large samples. Many educational practices are based on single research studies with small sample sizes. We can learn from small, one-off studies. But when it comes to adopting practices, we recommend those that have been evaluated in studies with large sample sizes and reproducible results.

Finally, we can have much more faith in a study’s findings when they are preregistered. That is, researchers publicly post exactly what their hypotheses are and exactly how they will evaluate each one before they have examined their data. This helps limit flexible research practices, making it less likely that researchers will find statistically significant results by chance.

Some will lament that large RCTs are too expensive, slow, or difficult to implement, and that preregistering studies is not feasible because educational research is messy and unpredictable. Yet several studies conducted in the past few years show that large-scale, reproducible, preregistered RCTs of changes in educational practice are feasible—and that their results can inform work on the ground.

In our own work, we have spent the last six years studying how to reduce student absenteeism. Through two large-scale, preregistered RCTs (one with more than 28,000 K-12 students and another with almost 11,000 K-5 students), our research team found that sending parents mailings several times over the school year—with personalized attendance information that dynamically targets key parental misbeliefs—consistently reduces chronic absenteeism by 10 to 15 percent. This research led to the creation of InClassToday, a program that partners with districts around the country to help them reduce student absenteeism by implementing this research-backed intervention.

Another practice that educators can—and, strong evidence suggests, should—take up comes from a study conducted by Peter Bergman and Eric Chan. They randomly assigned parents of more than 1,000 middle and high school students in 22 schools to receive automated, frequent text-message updates about their child’s missed assignments and grades. The study, which has now been replicated multiple times, found that this strategy led to a 28 percent reduction in course failures, a 12 percent increase in specific class attendance, and a 1.5 percentage-point increase in student retention. Providing parents with information that helps them monitor their child’s academic progress can have meaningful impacts on student success.

Finally, a study of learning mindsets led by David Yeager explored for whom growth mindset interventions are most effective. In a nationally representative sample of more than 12,000 students, adolescents assigned to complete a “growth mindset” intervention—which taught that intellectual abilities can be developed—earned higher GPAs (a modest but real 0.05 grade points) in core classes at the end of 9th grade. The authors preregistered their prediction that the intervention would most help low-achieving students. Consistent with this, low-achieving students showed larger effects: their GPAs in core classes rose by 0.10 grade points at the end of 9th grade, and they were 11 percentage points less likely to earn a D or F average in one of those classes. This large-scale, preregistered study replicating prior findings provides evidence that mindset interventions can improve outcomes for struggling students.

Each of these relatively low-cost and easy-to-implement interventions has a modest but real impact on student outcomes. Holding educational research to higher standards of evidence will very likely mean that the effect sizes reported will be smaller. But they will reflect reality.

Expectations about how much impact interventions tend to have need to be massively recalibrated since more-rigorous research with larger sample sizes tends to find smaller effect sizes. If an educational intervention’s outcomes seem too good to be true, they probably are. (A recent working paper by Matt Kraft of Brown University provides a helpful overview of how we might think about effect sizes in education.)

There do not appear to be single “silver bullets” that will easily equalize and accelerate educational outcomes. The reality is that educational gains will come from a combination of many well-supported, evidence-based practices and communities of caring adults helping kids.

We hope educational policymakers and practitioners will start proactively looking for practices that meet these standards of evidence and work to widely implement them. By doing so, we can move steadily toward greater educational success for all students.

A version of this article appeared in the October 30, 2019 edition of Education Week as Raising the Bar for Evidence in Education

