Reading & Literacy

Avoiding ‘MalPISAnce’: Caveats on International Test Rankings

By Liana Loewus — December 03, 2013

I’m sure I’m not the first to wish you this but ... Happy PISA Day, everyone!

As you partake in this triennial international celebration of 15-year-olds’ problem-solving skills (which I assume you kicked off this morning by reading the Ed Week article on the assessment results), here are some things to keep in mind:

• Be wary of the “rankings.” It’s tempting, in looking at the raw data, to say that the U.S. ranks 24th in reading. But in truth, the United States ranks somewhere between 19th and 31st in reading, because there is a range in which scores do not differ to a statistically significant degree. Countries that fall within that range should be described as on par with each other or within the same performance category. The charts in the NCES report (and those in our story) do a good job of noting these distinctions. (For a rough sense of how such a range arises, see the sketch after this list.)

• Be careful saying X country did better than Y country. This is related to the point above. You’ll need to know what constitutes a statistically significant difference before saying for certain whether one education system outperformed another. The NCES report offers this information only from the U.S. perspective.

• Do not confuse correlation with causation. This is a point we’ve made repeatedly with national and international test results, but it’s worth reiterating here. Since PISA was released this morning, I’ve gotten dozens of emails from advocacy groups saying the results bolster or repudiate whatever policies they support or eschew (the Common Core State Standards, early childhood ed, high-stakes testing, poverty-reduction efforts, teacher preparation, etc.). But, as several sources point out in my story, the results do not explain why particular countries performed a certain way, only that they did. Even the experts most skilled at talking about PISA test scores can quickly fall into the causation trap, so please take this caveat to heart. (My colleague Stephen Sawchuk recently wrote about the misuse of NAEP test scores, quoting a researcher who calls it “misnaepery.” Any ideas for PISA? MisPISAzation? MalPISAnce?)

• Don’t forget TIMSS. PISA, as you know, is just one of several prominent international assessments (though it does tend to get the most attention). And TIMSS, which came out last year, told a fairly different story—it showed the U.S. scoring better than the global average in math and science, and it had 4th graders improving in math. So maybe U.S. students are improving more in the early grades and hitting a wall in high school ... or maybe all of these results should be taken with a grain of salt.
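
For readers who want to see how a rank range like “19th to 31st” can come about, here is a minimal sketch in Python. The country names, scores, and standard errors below are invented, and the comparison rule (calling two averages distinguishable only when the gap exceeds roughly 1.96 standard errors of the difference) is a simplified stand-in for the procedure NCES actually uses, not the official method.

# A minimal sketch with made-up numbers and a simplified comparison rule:
# treat two countries' average scores as distinguishable only when the gap
# exceeds about 1.96 standard errors of the difference, then report the best
# and worst rank a country could hold once statistical ties are counted.
import math

# Hypothetical average scores and standard errors on a PISA-like scale.
scores = {
    "Country A": (536, 3.0),
    "Country B": (524, 2.8),
    "Country C": (512, 3.5),
    "United States": (498, 3.7),
    "Country D": (496, 3.2),
    "Country E": (490, 4.1),
    "Country F": (483, 3.0),
}

def differ(a, b, z=1.96):
    """True if two (mean, standard error) pairs differ at roughly the 95% level."""
    (m1, se1), (m2, se2) = a, b
    return abs(m1 - m2) > z * math.hypot(se1, se2)

def rank_range(target):
    """Best and worst rank for `target`, counting statistically tied countries."""
    mine = scores[target]
    clearly_above = sum(
        1 for name, other in scores.items()
        if name != target and other[0] > mine[0] and differ(mine, other)
    )
    above_or_tied = sum(
        1 for name, other in scores.items()
        if name != target and (other[0] > mine[0] or not differ(mine, other))
    )
    return clearly_above + 1, above_or_tied + 1

print(rank_range("United States"))  # prints (4, 6) with these invented numbers

Run with Python 3, this prints (4, 6): the hypothetical “United States” here could reasonably be described as anywhere from 4th to 6th, which is the same kind of spread behind the 19th-to-31st figure above.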

A version of this news article first appeared in the Curriculum Matters blog.