
Did One Program Really Cost Students 276 Years of Learning? (Spoiler: No.)

You should be skeptical about how education research is translated
By John F. Pane & Matthew D. Baird — July 3, 2019

Researchers may follow a rigorous scientific process to determine the effects of an education program or policy, but the rigor is undermined when flawed methods are used to interpret the results. Consider the data reported by one study that led to the startling conclusion that students finished a year of schooling with much less knowledge than when they started—seemingly, they would have been better off skipping school. However, as we will show, this conclusion doesn’t tell the whole—or the most accurate—story.

In studies of education initiatives, researchers typically report numeric measures of impact as standardized effect sizes, an abstract statistical scale that can be hard for both researchers and non-researchers to interpret. Educators and policymakers want to be able to assess the implications for their own practice and decisionmaking. For example, should a program that produces a standardized effect of 0.13 be expanded or shut down? The answer is not plainly evident.
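For readers unfamiliar with the scale: in its most common form, a standardized effect size is the difference between the treatment and comparison groups’ mean scores, divided by the pooled standard deviation of those scores (Cohen’s d). A minimal sketch with hypothetical data:

```python
# Minimal sketch of a standardized effect size (Cohen's d): the mean
# difference between groups divided by the pooled standard deviation.
# The scores below are hypothetical.
import numpy as np

def cohens_d(treatment, comparison):
    nt, nc = len(treatment), len(comparison)
    pooled_var = ((nt - 1) * np.var(treatment, ddof=1) +
                  (nc - 1) * np.var(comparison, ddof=1)) / (nt + nc - 2)
    return (np.mean(treatment) - np.mean(comparison)) / np.sqrt(pooled_var)

rng = np.random.default_rng(0)
treated = rng.normal(513, 100, size=400)   # hypothetical scale scores
controls = rng.normal(500, 100, size=400)
print(cohens_d(treated, controls))         # roughly 0.13, up to sampling noise
```

The output is a unitless number, which is precisely the interpretability problem: 0.13 standard deviations means little to most audiences.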

The research field increasingly recognizes that we need metrics that everyone can understand. Translating the result into years of learning has become a popular approach. A program with a standardized effect of 0.13 might be said to impart 1.22 years of learning in just one school year. On the surface, this seems much easier for non-researchers to interpret. Unfortunately, though, such a metric has major flaws.
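Part of the metric’s appeal is that the underlying arithmetic is simple. In one common variant (a sketch of the general approach, not the exact method of any particular study; the 0.59 annual gain is our illustrative assumption), the effect is divided by the typical one-year gain of comparison students on the same test:

```python
# Sketch of a years-of-learning translation, assuming the common
# approach of dividing the effect by comparison students' typical
# annual gain in SD units. The 0.59 gain is an illustrative assumption.
effect = 0.13
annual_gain = 0.59                   # assumed one-year gain, in SDs
extra_years = effect / annual_gain   # ~0.22 extra "years" of learning
print(1 + extra_years)               # ~1.22 years of learning in one year
```

Everything rides on that denominator, which must be imported from a test publisher’s norm tables.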

The good news is that other options are available. This year, we examined years of learning and three other options in “Translating Standardized Effects of Education Programs Into More Interpretable Metrics.” The others were: benchmarking against other known effects, percentile changes, and the likelihood of scoring above a reference value. Using these metrics, an effect of 0.13 might be said to be one-fifth as large as the black/white achievement gap, to move a median student from the 50th to the 55th percentile, or to increase the number of students scoring proficient by 3 percent, respectively. Our research concluded that percentile changes—the method used by the What Works Clearinghouse to calculate its improvement index—performed best. Years of learning performed worst.
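To make the contrast concrete, here is how two of those alternatives are computed under standard normality assumptions. The achievement-gap size and the proficiency baseline below are our illustrative assumptions, so the last figure will not exactly reproduce the study’s 3 percent:

```python
# Sketch of alternative translations for an effect of d = 0.13,
# assuming normally distributed scores. The 0.65 SD gap and the 40%
# proficiency baseline are illustrative assumptions, not study figures.
from scipy.stats import norm

d = 0.13

# Benchmarking: the effect as a fraction of a known gap.
print(d / 0.65)                  # ~0.20, i.e. one-fifth of the gap

# Percentile change (the WWC improvement index): where the median
# comparison student would land in the treatment distribution.
print(norm.cdf(d) * 100)         # ~55.2, i.e. 50th -> 55th percentile

# Likelihood of clearing a proficiency cut score.
cut = norm.ppf(1 - 0.40)         # cut score, in SD units
print(1 - norm.cdf(cut - d))     # ~0.45, up from the 40% baseline
```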

The following example illustrates some of the flaws of years of learning. In our study this year, personalized learning’s effect on 11th grade reading (−0.20) translated into +37 years of learning using one calculation method and −276 years using another method. These numbers are both unbelievable and confusing.

The average reader might misinterpret that positive 37 as showing benefits, but the translation actually means that students lost as much learning in one year as comparison students would take 37 years to lose; the sign flips to positive because comparison students’ scores at that grade also decline slightly from year to year. The other calculation suggests that these students would need to repeat 11th grade in a comparison school, year after year, for 276 years to reverse the damage of a single year of treatment.

Of course, years-of-learning translations do not always perform that poorly. The most stable estimate we found in our study was for personalized learning’s effect on 6th grade reading (0.10), which translated to between five and 15 weeks of learning. That’s still a factor-of-three difference. If the analyst simply reports a gain of 15 weeks of learning, readers probably won’t realize the calculation is so unreliable.
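The instability is easy to see once the arithmetic is written out: the translation divides the effect by the comparison group’s annual gain, and in the upper grades that denominator sits close to zero, so small disagreements between norm tables swing the result from enormous and positive to enormous and negative. The gain values below are reverse-engineered to reproduce the figures above; they are illustrative, not actual norm-table entries:

```python
# Why years-of-learning translations blow up: the denominator (the
# comparison group's annual gain) is near zero in upper grades. The
# gains below are illustrative, chosen to reproduce the text's figures.

def years_of_learning(effect_sd, annual_gain_sd):
    return effect_sd / annual_gain_sd

effect_11th = -0.20                               # 11th grade reading

# Two norm tables that disagree only slightly about a near-zero gain:
print(years_of_learning(effect_11th, -0.0054))    # ~ +37 years
print(years_of_learning(effect_11th, 0.000725))   # ~ -276 years

# 6th grade annual gains are far from zero, so the translation is
# merely unstable rather than absurd (36-week school year assumed):
effect_6th = 0.10
for gain in (0.72, 0.24):                         # illustrative gains
    print(36 * years_of_learning(effect_6th, gain))   # ~5 and ~15 weeks
```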

The flaws of a years-of-learning measure are evident beyond our own work. Now, back to that study that seemed to suggest students would have been better off skipping school for a year: In 2015, CREDO estimated the effects of attending an online charter school. The reported effect sizes show the schools undoubtedly performed poorly, but the study’s translations of results into years of learning were highly improbable. Nationally, the effect on student learning in mathematics was translated to −180 days of learning. Recognizing that a typical school year is 180 days, The Washington Post made the logical interpretation that “it’s as if the students did not attend school at all.” We used a method with fewer assumptions to calculate a loss of 128 days of the 180-day school year—still a discouraging result, but one that acknowledges that some learning probably did take place.

The same CREDO study reported that in Florida the effect of attending an online charter school translated into about −325 days of learning. By that estimate, students not only didn’t learn anything, they apparently unlearned material they already knew when they started the school year—almost another year’s worth. Would those students have been better off skipping school for the year and losing only 180 days?

Such unreliable calculations can lead to poor investment decisions. For example, in “Reimagining Learning,” the NewSchools Venture Fund translated effects from our 2015 study of personalized learning into 122 extra days of learning per year. They then projected returns on investment from that figure, concluding that a “big bet” $4 billion investment in innovative schools would accrue educational benefits worth $8 billion to $20 billion.

Using what we consider to be better methods with fewer assumptions, we calculated the impact to be only 25 extra days of learning per year. That is about one-fifth as large as the NewSchools Venture Fund’s 122 days. Our estimate suggests the $4 billion investment might break even at best, and possibly produce as little as $1.6 billion in benefits. Now that investment does not look so enticing.
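The back-of-the-envelope arithmetic behind that conclusion is straightforward; the proportional-scaling assumption is ours:

```python
# Scaling NewSchools' projected benefits by the ratio of the two
# days-of-learning estimates (proportional scaling is our assumption).
ratio = 25 / 122                       # ~0.20
print(8e9 * ratio, 20e9 * ratio)       # ~$1.6B to ~$4.1B in benefits
# Against the $4 billion cost: break even at best, likely a net loss.
```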

But, to be clear, we don’t trust our numbers, either. Nobody should trust a translation that can report effects as outlandish as 276 years of learning and that is so sensitive to the method of calculation. How can anyone be sure the person carrying out the translation didn’t handpick a calculation favorable to his or her agenda? And there are additional, more technical problems with this type of translation.

With better alternatives available, the practice of translating standardized effect sizes into years of learning or other units of time should be halted. And practitioners and policymakers should be highly suspicious of research results reported using this flawed metric.
