Last week, the National Alliance for Public Charter Schools issued “The Health of the Public Charter School Movement: A State-By-State Analysis.” Authored by Todd Ziebarth and Louann Bierlein Palmer, the report is offered as the “first comprehensive attempt” to explore the link between charter school performance and the Alliance’s ranking of state charter school laws.
The report examines 26 states where charters account for at least one percent of public K-12 enrollment and that participated in CREDO’s 2013-14 evaluation of charter school performance. The report features 11 indicators that measure dimensions like reading and math performance, geographic distribution, the ethnic makeup of charter enrollment, the number of new charters opened, the number of charters closed, and the extent of certain “innovative” practices. The top five finishers, in order, were Washington DC, Louisiana, Michigan, New Jersey, and New York. Of the 26 states ranked, Nevada brought up the rear, with Oregon close on its heels.
The report is carefully and thoughtfully done. These kinds of efforts are always fraught because the data you really want is never available, every measure is simpler than you'd ideally like, you inevitably tick people off, and the whole thing ultimately rests on a series of subjective determinations. For that reason, these reports are best and most useful when they're painfully clear about the rationale, the metrics, and the role of human judgment. On that score, Ziebarth and Bierlein Palmer do a stellar job. They clearly and succinctly explain all of this up front. The result is a good and useful exercise.
I quite like what Ziebarth and Bierlein Palmer have done. They’ve created a substantial contribution, and one that I hope (and trust) they’ll be replicating in the years ahead. It offers a serious framework for discussing the relative merits of state charter school systems in a more robust fashion. At the same time, it’s clear that there is a lot of room for improvement over time. One complaint that’s been raised, but that I don’t share, is that some of the data points are a couple years old. I’m okay with that, because doing the best you can with the data you can get is part of the deal if you’re doing this kind of work. And, since I think this kind of work is worth doing, c’est la vie...
That said, I do have three particular reservations/suggestions.
The first is that I’m always disappointed when charter rankings do nothing to examine the red tape and compliance burdens that are being imposed on a state’s charter schools. Organizations from the World Bank to the U.S. Chamber of Commerce rank countries and states in terms of their openness to entrepreneurial activity. They measure how long it takes to open a new enterprise, how burdensome it is, and what costs are imposed along the way. Alongside measures of quality control like school closures and reading/math performance, I’d love to see the Alliance start to shed some light on this score.
The second is my discomfort with the notion of model "innovations." The report awards points if a larger share of a given state's charters employs one of six practices: extended day, extended year, year-round calendar, school-to-work, independent study, and offering higher education courses. Now, I think these practices are all fine. And if I thought they worked consistently, or if just adopting them meant that kids would be better served, I guess I'd be fine with this. But I don't believe that's the case. Alternatively, if charter schooling were about dictating a particular model of delivery, they'd be appropriate. But that's not the deal. Now, if this category were renamed something like "the National Alliance's preferred school practices," I'd be fine with it. But I get real uncomfortable with the notion that the National Alliance is issuing a report that judges the "innovativeness" of a state's charter schools based on how many comply with some punchlist of preferred practices. That can encourage subtle (and not-so-subtle) pressure on schools to adopt these practices and on authorizers to demand them, which strikes me as more consistent with comprehensive school reform or the School Improvement Grant model than with the charter school bargain.
The third is my ongoing worry that we're defining school performance solely in terms of reading and math scores (and, occasionally, graduation rates). This is not how parents think about school quality. It's particularly bizarre for it to be the whole conversation about schools that are purposefully given the room to pursue their own path. Reading and math performance is a fine and useful measure, but charters ought to be leading the way in helping us think about other dimensions of performance: the share of students mastering world languages, passing AP exams, receiving IB diplomas, excelling on their state's science assessments, or what have you.
Now, let's be clear. These concerns are hardly unique to this report. Ziebarth and Bierlein Palmer have crafted a good and valuable contribution. But I hope next year's version will rethink the innovation question and start taking some baby steps on the other stuff.
The opinions expressed in Rick Hess Straight Up are strictly those of the author(s) and do not reflect the opinions or endorsement of Editorial Projects in Education, or any of its publications.