Assessment

NAEP Stagnation: Do Shifting Demographics Explain It?

By Liana Loewus — May 12, 2014

UPDATED

Last week, I reported that 12th grade NAEP scores in math and reading have remained unchanged since the last assessment in 2009. They’ve gone down in reading since the test was first administered in 1992, and up in math since 2005, the first year available for comparison.

An official with the National Center for Education Statistics cited changing demographics as a potential contributor to the lack of progress over the last 20 years—more English-learners and students with special needs taking the test, more Hispanics and fewer white students in the population, more students graduating.

Tom Loveless, a senior fellow at the Brookings Institution’s Brown Center on Education Policy, wrote in an email that “the changing demographics—including keeping kids in school who once dropped out before 12th grade—is undoubtedly a factor.”

Jill Barshay of The Hechinger Report argues that that explanation doesn’t hold water. She wrote:

When you look at top-achieving students in the top 75th and 90th percentiles, their scores are FLAT. ... High-achieving students aren’t improving at all. So you can’t blame the infusion of more low-performing students in the testing pool for the disappointing test scores. Even if we hadn’t introduced a greater number of weaker students into the mix, the scores of our high school students would still be stagnant.

Here’s one of the charts she is referring to (I’ll stick with reading):

Initially, I agreed with the conclusion that, since 1992, scores for high-performing students have essentially flatlined and that this has contributed to the overall stagnation.

But as Morgan S. Polikoff, an assistant professor at the University of Southern California’s Rossier School of Education, pointed out to me, it doesn’t make sense to compare percentile scores over time. (Or, as my colleague Stephen Sawchuk would say, to compare them is “misNAEPery.”) Percentile groups are relative to whoever is taking the test, not absolute. Flat or lower scores at the 90th percentile don’t necessarily mean the top performers weren’t doing any better; an influx of lower performers could have pulled the percentile cut points down, so it wouldn’t take as high a score to count as a high performer. Charts like the one above comparing percentiles are “misleading,” Polikoff said.

It’s more helpful to look at score changes “in terms of race or socioeconomic status or whatever variables you’ve got in groups, so you’re comparing like to like,” he said. Even doing so “you’re still not really comparing like to like because even within those groups you’re going to have changes in the pool. So it’s always fuzzy.”
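To see the distinction concretely, here is a minimal simulation with made-up numbers (the means, spreads, and group sizes are hypothetical, not NAEP figures). It adds a block of lower-scoring test-takers to an otherwise improving pool, then compares the overall mean, the 90th-percentile score, and the like-to-like group mean:

```python
# Illustrative sketch only: all numbers here (means, spreads, group sizes) are
# hypothetical, not NAEP data. It shows why comparing percentile scores across
# years can mislead when the pool of test-takers changes.
import numpy as np

rng = np.random.default_rng(0)

# A 1992-like pool: 10,000 test-takers.
pool_1992 = rng.normal(290, 35, 10_000)

# A 2013-like pool: the same kind of students gain 5 points (hypothetically),
# but 2,500 lower-scoring students who once dropped out before 12th grade now
# take the test as well.
persisters_2013 = rng.normal(295, 35, 10_000)
newcomers_2013 = rng.normal(255, 35, 2_500)
pool_2013 = np.concatenate([persisters_2013, newcomers_2013])

print(f"1992 pool: mean {pool_1992.mean():.1f}, "
      f"90th percentile {np.percentile(pool_1992, 90):.1f}")
print(f"2013 pool: mean {pool_2013.mean():.1f}, "
      f"90th percentile {np.percentile(pool_2013, 90):.1f}")
print(f"2013 persisters only (like-to-like): mean {persisters_2013.mean():.1f}")

# The combined 2013 pool's mean drops and its 90th-percentile score stays
# roughly flat, even though the comparable ("persisting") group improved.
# The 90th percentile now refers to a different slice of a different pool.
```

In this simulated 2013 pool, the overall figures look flat or worse even though the comparable group gained 5 points, which is exactly the ambiguity Polikoff describes.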

However, looking at the score changes over time by race doesn’t offer too much clarity either.

In reading, between 1992 and 2013, black students’ average scores went down a statistically significant 5 points. Scores for white, Hispanic, and Asian/Pacific Islander students were all flat.

The other explanation commonly offered for why 12th grade performance has stagnated, while 4th and 8th grade performance has climbed, is that high school seniors just don’t take the test seriously.

During a media call on May 6 to discuss the new results for 12th graders, Cornelia Orr of NCES called this an “urban myth.”

“There’s no evidence students are blowing off this test,” she said.

I know a few high school teachers who might beg to differ.

So what’s the upshot here? Why aren’t 12th grade students doing any better than they did 20 years ago? Perhaps the Facebook relationship status says it best: “It’s complicated.”

Correction: An earlier version of this post incorrectly characterized the trajectory of high-performing students.

A version of this news article first appeared in the Curriculum Matters blog.