

Rick Hess Straight Up

Education policy maven Rick Hess of the American Enterprise Institute think tank offers straight talk on matters of policy, politics, research, and reform.

College & Workforce Readiness Opinion

Is Online Credit Recovery a Sham?

By Rick Hess — September 26, 2019

Is credit recovery creating fraudulent graduation rates and manufacturing an illusion of learning? As credit-recovery programs have grown in popularity and importance, questions have mounted about their impact on participating students and on the integrity of a high school diploma. Invaluable analyses, like those by AEI’s Nat Malkus or Fordham’s Adam Tyner, have helped move these questions to the front burner. In that light, I was more than a little interested when Jeremy Noonan, a doctoral student at the University of Leicester and former classroom teacher, reached out with some thoughts and research regarding online credit recovery. I thought his concerns and experiences well worth sharing, so here you go:

“Recent work by Nat Malkus has given us a detailed picture of the extent of schools’ reliance on credit-recovery programs, specifically of the online variety. “Credit recovery” generally refers to the opportunities schools provide for students to earn credits that—whether due to failure, prolonged illness, etc.—they did not earn on time. It is thus a way for students to get back on track for graduation. While credit recovery has been practiced for decades (think summer school!), it is the rapid, widespread proliferation of online credit recovery (OCR), in place of traditional strategies like retaking a course, that has spurred questions about the validity of these credits. As Nat Malkus has framed it, are they real second chances or merely second-rate?

A curious fact about the growth of OCR is the absence of any research base on the suitability of online courses for the learners these programs largely serve—at-risk kids who struggled to learn the subject in the first place. Prior to 2019, there was just one academic study comparing the efficacy of OCR for student learning with credit recovery in a traditional classroom—one!—and it found that OCR students did worse on a standardized algebra exam.

That OCR has grown nationwide in the absence of any research demonstrating its efficacy for student achievement raises questions about schools’ commitment to “data-driven” decisionmaking. My own story suggests that schools may not know whether OCR is good for student learning because of willful neglect; that is, they do not want to know.

From my first day in an OCR classroom, I felt troubled that this was not an environment set up to support learning. And the more I saw, the more concerned I became. Eventually, I laid out my concerns in a letter to my superintendent, which catalyzed a districtwide meeting of everyone involved in our OCR program. The meeting agenda charged us to resolve the issues I had raised and to do so with a view toward raising students’ scores on Georgia’s End of Course exams.

But as the meeting drew to a close, the scores had not once been mentioned, so I raised my hand: “Wait a minute, we haven’t discussed raising their test scores. We have to know where we are starting from. Does anyone know their scores?” When no one could answer, I thought they were feigning ignorance—out of embarrassment perhaps. The administrator leading the meeting stifled further inquiry, saying, “Raising test scores is nice, but what’s most important is keeping our high graduation rates.” In other words, increased student learning—which would have required raising expectations—was seen as a threat to high graduation rates!

Later, I expressed my incredulity to my own assistant principal: How could no one know the test scores? She shared how she stumbled across the OCR students’ End of Course exam scores while exploring the district’s new data-management system. A folder labeled “Unknown” piqued her curiosity. In it, she found the files with the scores! No one in the district had ever bothered to look; they were tucked safely away.

The test scores were “Unknown” because no one cared to know. All that mattered was that the OCR program was a boon to the district’s graduation rates (which had increased by 13 points the previous year).

I soon discovered that there was more to their ignorance than apathy. After securing my principal’s permission to access my own school’s scores, I was allowed to present them at the district’s next credit-recovery meeting. They were dismal—over 70 percent of students scored at the lowest level (which, I later learned, was consistent with statewide OCR scores). I naively expected a receptive audience; instead, my colleagues shouted me down—"What’s wrong with you, buddy? Do you have some kind of axe to grind?"—and left the meeting angrily. So much for the ideal of neutral, impersonal “data-driven” decisionmaking!

In an era in which this ideal of “data-driven,” “evidence-based” decisionmaking is regarded as a professional standard for school administrators, the absence of research evidence on OCR is startling. The implementation of OCR has far outpaced the research, which is only beginning to catch up. A study published in 2019 looked at credit-recovery outcomes over a four-year period in a large urban school district. It found that while online coursetaking was positively associated with credits earned and high school graduation (predictably!), there was an increasingly negative association between the number of school years in which students took an online course and their performance on math and reading exams. Specifically, students who took OCR courses in all four years of high school were set back in math and reading progress by the equivalent of a full year!

Such findings should prompt school leaders to re-evaluate their reliance on online courses for credit recovery, but will they? An abundance of research done in the early 2000s on online learning—years before the OCR explosion—found that online environments were generally ineffective for student learning, suggesting that OCR courses would NOT “work.” Yet that has stopped few from implementing OCR in their schools.

That’s because these decisions are driven by a single data point: graduation rates. The efforts to boost rates via bogus online courses are depicted vividly in Slate’s investigation The Big Shortcut. But are higher graduation rates, by themselves, sufficient evidence to justify these programs? The research suggests—as the administrator running that meeting believed—a trade-off between higher rates and student achievement. What if more students are graduating but learning less and leaving ill-equipped for life after graduation? Are those trade-offs worth more students having diplomas? These are questions data alone cannot answer.”

Campbell’s Law is relentless. Pressure schools to raise graduation rates, and they’ll raise graduation rates . . . somehow or other. There’s good reason to fear that credit recovery has become part of that “or other.” Here’s hoping that advocates, researchers, parents, practitioners, and policymakers start asking what exactly is going on before—rather than after—credit recovery turns into one more case of “How the heck did we let THAT happen?!”

The opinions expressed in Rick Hess Straight Up are strictly those of the author(s) and do not reflect the opinions or endorsement of Editorial Projects in Education, or any of its publications.