Opinion

A Tale of Two Test Scores

By Dave Powell — February 26, 2016

I want to tell you a story about a school and its test scores. It begins with this headline: “Math Scores Lagging.”

I picked up my local newspaper recently to find this story being told about one of our local schools. The school in question is a bilingual community charter school not run by a for-profit corporation and not started by a parent or group of parents pursuing a rigid ideological agenda. It is not a school that selects students by lottery or pushes out the kids least likely to do well on tests. It’s not a school founded on the assumption that putting college pennants up in every classroom and having kids wear business suits to school on Friday will somehow miraculously prepare them to be college-educated professionals someday. It’s definitely not a school where kids are banished to the Calm Down Chair for Not Doing What’s On Their Paper or failing to be in a Ready To Succeed position. It’s none of these things.

It is, instead, a school that was always designed to take all comers and, specifically, to address a community need: this school is situated in a place with a high population of migrant students whose native language is Spanish. It is a school full of students whose parents wanted more from their local public schools—specifically, more of an emphasis on culture, less on testing. It’s all written in the charter: this school was built to bring people of different backgrounds together, an increasingly hard thing to do in today’s rapidly resegregating world.

And it does. But that’s only half the mission. The other half, in spite of the charter, is set by the state. That half can be summarized pretty simply: kids have to pass their tests. Because. That’s why.

So here’s where the story would seem to get simpler but actually becomes much more complicated. If kids pass the right tests, then there’s no problem. If they don’t, though, then the school opens itself to charges that its academic program is lacking, that public money is being wasted, that children in the local public schools are being robbed of their educations by self-interested parents who don’t really support public education. Our local superintendent likes to remind the public that our state’s charter school law is very generous to charter schools, which is both problematic and true; on the other hand, he also has to be reminded periodically that the kids who attend the charter school, which is a public school in every sense of the words, are members of this community too. They are as entitled as anyone else is to a high quality education supported by every adult in the community, including the superintendent and school board.

So: are they meeting their “achievement” goals? Let’s see. Here are the scores posted by the school’s students on two recent exams, as reported in our local newspaper. One is the state test and the other is a test put out by an organization known by its initials: NWEA. See the difference?


  • State Exam in Reading: 53% advanced or proficient; 47% basic or below.
  • NWEA Exam in Reading: 68% advanced or proficient; 32% basic or below.
  • State Exam in Math: 31% advanced or proficient; 69% basic or below.
  • NWEA Exam in Math: 77% advanced or proficient; 23% basic or below.

Guess which test results the school is being judged by. If you guessed “State Exam,” well, then, you win the prize. Now ask yourself: what’s going on here?

Clearly the numbers don’t add up. Here are two different tests, allegedly assessing the same thing, that provide wildly different results. Is one test harder than the other one? If so, is it because that test sets a “higher standard” for student learning, indicating a deficiency on the part of the school and its students? Or could it be because that test is less valid and less reliable as an instrument of assessment?

I have no idea. I really don’t, because we’re not really allowed to know what’s actually on the state test. What I do know is that the school reacted to this news in a way that was disappointing, to say the least: it blamed the discrepancy in scores on Common Core. See, the charter school has attempted to implement the Common Core standards in classes taught in English (remember, the school’s actual mission is to promote bilingual education and emphasize culture, not unilaterally push for “achievement” based on test scores), and chose NWEA’s assessments as the instruments it would use to evaluate that. If, indeed, NWEA’s exams are aligned to Common Core, and if the school has been teaching to the Common Core standards, then this should actually be a success story; it should show not that two-thirds of students “failed” their math exam (the 69 percent scoring basic or below on the state test), but that three-quarters of them actually passed (the 77 percent scoring proficient or advanced on the NWEA exam).

Let me put it another way: this story would have looked a lot different in the newspaper if the headline had been “State Test Validity Lagging,” and if the rest of the story had focused both on the state’s profound lack of transparency about its curriculum and assessment strategies and on the school’s efforts to address it. We know next to nothing about how the state’s tests are made, or the extent to which they align with its new “Pennsylvania Core” curriculum—which is loosely based on Common Core, but had to be differentiated from “ObamaCore” by our most recent Republican governor.

What all this reveals, I think, is a couple of things. First, it reveals that we have become reflexively predisposed to look for failure instead of success. It’s one thing to blame the reporter for this; she hears it, no doubt, from all sides, and may have decided for herself that the charter school is bad for our community. She also may think she found a smoking gun in these test scores, which is what reporters are trained to do. If she didn’t feel that way, she probably would have written the article very differently.

What’s more surprising, though, is that school officials seemed all too ready to blame themselves and the curriculum they chose for the problem. In effect, they said: “We made the mistake of adopting Common Core instead of the state’s standards, and made the problem worse by adopting a test that actually assessed the curriculum we chose. We should have adopted their standards and their test.” But the opposite is true: the state encourages teaching and testing in the dark, and our kids and their teachers deserve better than that. The last thing the school should do is give in to this pressure and let the state decide if it is successful or not. That’s the whole point of having a charter in the first place.

Can it do that? Of course it can. There is always the risk that its charter will be revoked by an unfriendly school board if test scores continue to lag, but the school is already compiling evidence to suggest that the state’s definition of achievement is fundamentally flawed and may not apply to this school anyway. Putting aside the fact that many people have, regrettably, appropriated the idea of chartering for other reasons, charter schools are supposed to use their flexibility to help us decide what kinds of schools we would like to have. Where they don’t do that, they should be closed; where they do, they should broadcast their successes and make sure those successes are well known and understood. I don’t mean they need to “scale up,” either; sometimes a community school just fits right into its community, and needs to be satisfied with that. You don’t have to take over the world to make a difference in it.

It’s okay for schools, like students, to be who they are. That goes for traditional public schools as well as charters. It’s also a professional responsibility for school leaders to know what their school’s identity is and to find the most effective ways to communicate that to the public. Maybe the state has no idea how to say whether the local bilingual charter school is doing a good job of fulfilling its mission. Why, then, would we let the state give a test to tell us if that’s true or not?

The opinions expressed in The K-12 Contrarian are strictly those of the author(s) and do not reflect the opinions or endorsement of Editorial Projects in Education, or any of its publications.