You all know that I frequently lament the way that discussion of schooling has increasingly become a matter of reading and math scores. I’m not anti-testing and I do think that reading and math scores convey a lot of useful information. But I’m troubled by how completely reading and math scores have become a proxy for school quality, teacher performance, and student learning. When I say things like this, though, one of the responses I inevitably get is, “Well, what else do you want us to look at?”
I offered one response last week, when I wrote about some of the STEM-related measures we devised for the U.S. Chamber of Commerce’s Leaders & Laggards report. There, we looked at how states are doing when it comes to the share of high school graduates who pass AP exams in STEM subjects. I’m also a huge fan of pushing to get creative about metrics that capture the share of students who reach some level of accomplishment in music, master a second language, pass IB or AP coursework in various subjects, or pass a citizenship exam. But, today, I just want to point to three measures from Leaders & Laggards that help to offer a more holistic take on the quality of a state’s school system.
One measure is “return on investment” (ROI), which tries to determine how much return taxpayers are getting for each dollar the state spends. Applying the approach we pioneered in an earlier Leaders & Laggards report several years back, we calculated a state’s ROI by dividing a composite measure of NAEP achievement scores by per-pupil expenditure (adjusted for the cost of living). Five states that posted A’s in academic achievement on NAEP also posted A’s when it came to ROI--suggesting that they’re faring relatively well on reading and math, and doing so in a cost-effective manner. The five are New Hampshire, Massachusetts, Minnesota, Colorado, and Washington state.
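The arithmetic behind the ROI measure can be sketched in a few lines. This is an illustrative reconstruction, not the report's actual code: the composite score, spending figures, and cost-of-living index used below are made-up numbers, and the real report's weighting of NAEP components is not specified here.

```python
def roi(naep_composite, per_pupil_spending, cost_of_living_index):
    """Achievement per cost-adjusted dollar spent (illustrative sketch).

    naep_composite: a state's composite NAEP achievement score
    per_pupil_spending: nominal per-pupil expenditure, in dollars
    cost_of_living_index: 1.0 = national average cost of living
    """
    # Adjust spending for cost of living, then divide achievement by dollars.
    adjusted_spending = per_pupil_spending / cost_of_living_index
    return naep_composite / adjusted_spending

# Hypothetical example: two states with identical achievement but different
# spending. The lower-spending state shows a higher return per dollar.
state_a = roi(naep_composite=250, per_pupil_spending=10_000, cost_of_living_index=1.0)
state_b = roi(naep_composite=250, per_pupil_spending=14_000, cost_of_living_index=1.0)
assert state_a > state_b
```

The cost-of-living adjustment matters: without it, a high-cost state would look wasteful simply because its dollars buy less.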
A second measure is the quality and transparency of state data systems. If you believe that parents and taxpayers have a right to know what’s really going on, this is a prerequisite. On this count, the Data Quality Campaign has done terrific work. In L&L, we calculated how many of the 10 DQC recommendations states have actually completed. The recommendations include things like “building state data repositories” and “creating reports using longitudinal statistics to guide systemwide improvements.” Seventeen states earned A’s for having completed at least 80% of the recommended actions, while just six states had completed less than 50% of them. Arkansas and Delaware led the pack, posting 100% completion rates.
Other things equal, I believe giving parents more ability to exercise educational choice is a good thing, and I think this is true above and beyond other measures of performance or quality. So I was pleased that we introduced a new “parent power” metric in the report this year. The measure was based on the share of students attending schools of choice, the strength of charter laws in each state (including, of course, the strength of the authorizing and quality control system), and a gauge of parent influence on policy. Topping the category was Washington, DC, while Montana brought up the rear.
We can debate which of these measures are the “right” ones or which ones are most important. That said, I do know that we need to talk about educational quality in ways that are more robust and more connected to the real world. That certainly means paying more systematic attention to things like transparency, cost-effectiveness, and parental choice. Doing so can help address a common concern, which is that middle-class and affluent communities often feel like school reform isn’t about them and their kids. Reading and math performance are crucial and will remain so, but focusing only on proficiency in these areas is a recipe for narrowing the constituency for school improvement. When educational improvement becomes a broader discussion of excellence, cost-effectiveness, transparency, and more, everyone has a more personal stake in it. And that is the recipe for broadening the coalition for reform.