Assessment Opinion

What NAEP Long-Term Trend Scores Tell Us About NCLB

By Diane Ravitch — May 05, 2009 3 min read

Dear Deborah,

I watched with some amusement as the media tried to figure out how to report the latest results from the National Assessment of Educational Progress (NAEP). Margaret Spellings said that the results vindicated the success of No Child Left Behind. The story by Sam Dillon of The New York Times reported that the achievement gaps—which the law was designed to eliminate—remained unchanged, and the headline of the story was “‘No Child’ Law Is Not Closing a Racial Gap.”

So which is it? Were the results heartening or not? I’ll try to parse them here for the benefit of our readers and perhaps to kick off a renewed consideration of NCLB. Readers can make their own judgments by reading the report here.

Most people (and reporters) do not realize that there are actually two different versions of NAEP. There is Main NAEP, the tests that are given every other year to measure national and state achievement in reading and math, along with occasional tests in other subjects, such as science, history, civics, economics, writing, etc. Main NAEP’s tests of reading and mathematics are based on frameworks that periodically are revised, to reflect changes in the field.

And then there is Long-Term Trend NAEP, which is given less frequently and which tests more or less the same reading and math questions and concepts that have been tested since the early 1970s, with only minor revisions to remove obsolete references (such as outmoded technology).

Another difference is that Main NAEP tests students in grades 4 and 8, while Long-Term Trend NAEP tests students at ages 9, 13, and 17.

The Long-Term Trend results from the 2008 tests were released last week; the previous administration was in 2004, and the one before that in 1999. LTT reading scores for 9-year-olds, 13-year-olds, and 17-year-olds were up significantly, which is why Spellings felt vindicated.

But the scores for 9-year-olds rose by less than they had over the previous five years, so the rate of progress seems to have slowed. As for 13-year-olds, their scores have risen back to where they were in 1992; that’s progress, but only in the sense of recovering lost ground. And while reading scores went up for the 17-year-olds, they are still not as high as they were in the late 1980s and early 1990s. Again, better to see the scores going up rather than down, but we don’t seem to have made any real breakthroughs.

In mathematics, the story was similar. A significant gain for 9-year-olds, but not as large as the gain posted pre-NCLB. A significant gain for 13-year-olds, but not as large as the one posted pre-NCLB. No change for 17-year-olds, whose scores have hardly changed since 1973 (even though many low-performing students have dropped out by this age).

As for the racial gaps, they narrowed more pre-NCLB than post-NCLB in every age group.

So Spellings is right; the scores are moving in the right direction. But since the passage of NCLB and its implementation, the rate of improvement on the federal tests has slowed. Perhaps there are other strategies that would improve academic achievement with greater consistency.

By the way, you might be interested in reading my debate with John Chubb in Education Next about the future of NCLB. It was just posted.


The following was added on 5-6-09:
P.S. Thank you to a reader for pointing out that I did not clarify the dates of the tests that I was comparing. The results that were released last week were for the tests given in 2008. The previous tests were given in 2004, and before that in 1999. Most of the pre- and post-NCLB comparisons that I make compare the results across these two periods: 1999-2004 and 2004-2008.

Thanks, and sorry for the error,

The opinions expressed in Bridging Differences are strictly those of the author(s) and do not reflect the opinions or endorsement of Editorial Projects in Education, or any of its publications.