This post was originally posted on the High School & Beyond blog.
As states press hard to ensure that all students graduate from high school ready for college or good jobs, many are hobbled by the very accountability systems they designed to leverage improvement, according to a report released Monday.
The new study, by Achieve, argues that in reporting K-12 performance to the public, states often aren’t including factors that matter the most in college readiness, such as the proportion of students who are completing rigorous high school courses, how well students are accumulating credits toward graduation, and whether they’re earning college credit while in high school.
Achieve, which works with states on standards and accountability, has been tracking the 50 states’ college-readiness policies in a series of reports for a decade, but shifted its analysis this year. Instead of focusing on what policies states adopt, it chose to examine how students are actually performing, state to state, on key indicators that correlate with the chance of college success.
“Policy alone is insufficient,” the report says. “Implementation of policy at all levels—state, district, school, and classroom—matters ... how do states—and their citizens—know whether their policies are having the intended impact?”
The report argues that when key student-performance indicators are absent from states' accountability systems, states face less pressure to monitor and improve them. If states don't measure the factors that most affect students' chances of success in jobs or college, they also lack the information they need to see whether their policies are working, Achieve argues. (Earlier this year, the organization designed and released sample accountability reports that include the factors it sees as most important.)
Not everyone will agree with Achieve’s choice of indicators in this study. For instance, the group classifies only the SAT, ACT, PARCC, Smarter Balanced, and New York State’s Regents exams as college-readiness assessments. The 19 states that didn’t give one of those tests end up in the “missing data” category for that indicator, regardless of the level of rigor on their high school tests.
Comparing states on other indicators, the study turns up a good deal of variation in what states report: 15 states report the proportion of students completing a college-readiness curriculum, seven report the proportion of students who are "on track" to graduate, and 22 report the proportion earning college credit while in high school. This map shows how few states report to the public that they offer a college-and-career-readiness course of study.
Achieve also found that not every state publicly reports its test-score data broken down by academic subject and subgroup. Federal law requires states to do that, but Achieve’s finding suggests that a few states might choose not to include in their own state reporting systems the same data they send to federal officials.
A look at Achieve’s compilation of student performance across states on PARCC and Smarter Balanced, which made their debut in 2014-15, and on the SAT and ACT in states that mandated those exams, offers a reminder of how far most states are from demonstrating that their students are ready for the rigors of college, at least according to these kinds of measures. (For another resource on state test scores from 2014-15, you can also consult EdWeek’s big test-score database, which reports each state’s 2014-15 scores alongside the scores from its most recent previous assessments.)
The Achieve report shows that in states that gave PARCC or Smarter Balanced, or required the SAT or ACT, the proportion of students who reach college-readiness benchmarks varies widely, but rarely goes over 60 percent (and there are lots of figures in the 30s and 40s).
Graduation-rate data, while trumpeted nationally for improvements in recent years, still show that many states have a long way to go before they can say that all their students leave with diplomas. (Exactly what those diplomas mean is a whole other can of worms, as previous reports have shown.)