Rick Hess Straight Up

Education policy maven Rick Hess of the American Enterprise Institute think tank offers straight talk on matters of policy, politics, research, and reform.

PISA’s Problem-Solving Results Are Interesting, But That’s It

By Rick Hess — April 02, 2014

Yesterday, PISA released its newest report on the results of a “first-of-its-kind” assessment that sought to measure the “creative problem-solving skills” of 15-year-olds. U.S. students scored above average, though they fared worse than students in ten of the “44 countries and economies” (now there’s an awkward phrase). Thankfully, the exercise hasn’t occasioned the same spasm of hyperventilation that greeted the release of PISA’s math, science, and reading results a few months back. This time, the muted reaction meant that the hectoring by PISA Overlord Andreas Schleicher was pleasantly dialed down.

The test defined creative problem-solving as the ability to “understand and resolve problem situations where a method of solution is not immediately obvious.” The report explains, “In modern societies, all of life is problem solving.” Ahh, I see. Thing is, I’m lying when I say that. The whole thing is pretty murky to me. When I survey the questions, they look like a grab bag of cool stuff, and I’m not confident that the folks who boldly cite these findings really know what any of it means. The test is still interesting, and I’m hugely thankful for any assessment that gets us past reading, math, and science and into thinking skills--we just need to be really humble about how much sense we can make out of the findings.

You can see sample problems here or here. They’re kind of creative and interesting--but I’m not sure what they’re actually tapping into, whether I trust the grading heuristics, or how much confidence I have that PISA’s psychometricians and test developers have got this “right.” This really isn’t a knock on these highly trained experts. It’s just noting that we’ve seen various standardized assessments breathlessly unveiled over the course of the past century, and many have later been deemed misguided, goofy, or problematic (we needn’t go back to early IQ tests for examples, just to the previous 21st-century revision of the SAT or most 21st-century state tests).

Three more general thoughts.

One, it’s useful to keep in mind how tiny the sample size is from a given nation. For instance, the tested U.S. population totaled 1,273 students in 162 schools. Now, is this enough to get a valid sample? Sure. But it’s 1,273 students out of 50 million, which means you want to be cautious about how precise you think the results are. This is why PISA rankings don’t really mean all that much, as Tom Loveless has pointed out: there are sizable standard errors baked into the estimate for any given country.
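To put a rough number on that caution, here’s a back-of-the-envelope sketch (mine, not PISA’s) of the sampling uncertainty. It assumes a simple random sample and PISA’s nominal score scale, which has a mean near 500 and a standard deviation near 100; PISA’s actual clustered, school-based sampling would make the true uncertainty larger, not smaller.

```python
import math

# Rough sampling-uncertainty check, assuming a simple random sample
# and PISA's nominal scale (mean ~500, standard deviation ~100).
n = 1273                 # U.S. students tested
sd = 100                 # approximate PISA score standard deviation
se = sd / math.sqrt(n)   # standard error of the national mean
margin = 1.96 * se       # half-width of a 95% confidence interval

print(f"standard error ~ {se:.1f} points")      # roughly 2.8
print(f"95% margin ~ +/- {margin:.1f} points")  # roughly 5.5

# PISA samples whole schools, and that clustering inflates the true
# standard errors beyond this -- one reason small gaps between
# countries' rankings shouldn't be over-read.
```

Even under these generous assumptions, a national average carries a margin of error of several points, comparable to the gaps that often separate adjacent countries in these rankings.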

Two, as I’ve noted before, international test score comparisons suffer from the same banal problems that bedevil simple NCLB-style comparisons. PISA results say nothing about the value schools are adding; they merely provide simple cross-sectional snapshots of achievement. Imagining that these results can tell us how well schools are “teaching problem-solving” poses the exact same problem as using NCLB-style tests to conclude that schools in a bucolic, leafy suburb are teaching math or reading “better” than those in an impoverished community rife with broken families. The participating “nations and economies” vary in thousands of ways. They have different lifestyles, cultures, economies, political regimes, religious traditions, health care systems, diets, norms, school calendars, school facilities, educational resources, teaching populations, and so forth. Imagining that one can determine which of these variables are responsible for how well 15-year-olds “solve problems” reflects a breathtaking hubris.

Three, it’s interesting to see everyone wrestle with the attempt to cram these results into their pre-baked narratives. Those who usually use lousy U.S. PISA results to make the urgent case for doing X were notably silent yesterday, while you can see the anti-testers who usually pooh-pooh PISA struggling to decide whether they should embrace these results. Meanwhile, the fact that Singapore, Korea, and Japan topped the charts has everyone scratching their heads, especially since these are nations that routinely send delegations over to study American schooling and ask all of us how they can do a better job nurturing creativity, imagination, and entrepreneurship in their schools. It’s possible that the test is measuring something other than creative problem-solving, that we’ve all severely misjudged these nations, that we’re wrong about what spurs creativity, or that the measurement is unreliable.

All interesting questions and well worth asking, so long as no one imagines that any of this ought to be read as offering a roadmap towards particular recommendations or reform.

The opinions expressed in Rick Hess Straight Up are strictly those of the author(s) and do not reflect the opinions or endorsement of Editorial Projects in Education, or any of its publications.