Published in Print: November 4, 2015, as NAEP Score Drop Spurs Speculation

Drop in U.S. Math, Reading Scores Prompts Blame Game

Blame Laid on Economy, Demography, Standards


With U.S. students' math and reading scores showing statistically significant declines on a national test for the first time in more than two decades, advocates on all sides have begun pointing fingers.

The Common Core State Standards, frequent testing, the economy, and demographic changes have all become targets. But researchers say such explanations should be viewed skeptically—both because the test scores cannot establish causation and because the drop will not necessarily mark the start of a long-term trend.

"Politically, [the drop] is going to be a problem for the common core, but as an educational researcher, it's unfair to say the common core had anything to do with these scores going down," said Tom Loveless, a nonresident senior fellow at the Brookings Institution who researches trends in achievement tests. "If [the scores] went up, it would be unfair to say it had anything to do with them going up. You just can't tell that from NAEP data."

Still, the results were a surprise to some, since scores on the National Assessment of Educational Progress had been showing an upward trend over the past two decades.

Proficiency Over Time

While the percentage of students scoring at “proficient” or higher levels on the National Assessment of Educational Progress mostly dipped from 2013 to 2015, the latest results are still significantly higher than 1996 levels.

"This isn't a pattern that we saw coming," Peggy G. Carr, the acting commissioner of the National Center for Education Statistics, which administers NAEP, said in a call with news media. "It was an unexpected downturn."

A nationally representative group of about 600,000 students took the test in 2015. Results from the test, known as "the nation's report card," are released about every two years, and serve as a barometer for how U.S. students are performing academically.

The results show that between 2013 and 2015, the average score for 4th grade math declined by about 1 point, to 240 on a 500-point scale, which constitutes a statistically significant decrease. In 8th grade math, the average went down about 2 points, to 282. (Various NAEP representations of scaled scores differ by 1 point due to rounding.)

Score Breakdown

Fourth grade reading scores remained unchanged statistically from 2013. The 8th grade reading scores went down about 2 points, to 265.

The scores are still much higher than they were in the 1990s.

"It's a one-time test. ... There was a lot going on in this country around testing and transition" when NAEP was given between January and March of this year, said Chris Minnich, the executive director of the Council of Chief State School Officers. "We need to make sure we don't overreact to one data point. We were sure not to do that two years ago when we saw the data uptick."

NAEP also reports scores based on achievement levels: "basic," "proficient," and "advanced." Proficient indicates that students are successful with challenging content at their respective grade levels.

Just 40 percent of 4th graders were at or above the proficient level in math this year. That's down from 42 percent in 2013. In 8th grade math, 33 percent of students were deemed proficient or better.

In reading, 36 percent of 4th graders and 34 percent of 8th graders were at the proficient mark or higher.

The percentage of students scoring in the lowest level, below basic, increased for 4th and 8th grade math as well as for 8th grade reading from 2013.

William J. Bushaw, the executive director of the National Assessment Governing Board, which sets policy for NAEP, noted that "curricular uncertainty"—likely a nod to the curriculum changes many districts are making to meet the Common Core State Standards—may be a factor in the drop in scores.

"The majority of our schools are undergoing significant changes in how and what we teach our students," he said. "It's not unusual when you see lots of different things happening in classrooms to see a decline before you see improvement."

U.S. Secretary of Education Arne Duncan said in a separate media call about the results that this sort of "implementation dip" is fairly common. He pointed to Massachusetts, which saw a temporary drop in test scores in the early 2000s after changing its standards. "This is the ultimate long-term play," he said.

But Brookings' Loveless dismissed the idea of a post-implementation drop being the norm. "I don't buy it," he said. "If that whole theory is right, we should have seen it two years ago," when the common standards were also being implemented. Further, he noted, states and districts have made curriculum changes many times over the past two decades—and they haven't always resulted in score drops.

Advocates are offering plenty of other interpretations for the declines as well.

Randi Weingarten, the president of the American Federation of Teachers, said in a statement that the focus on "test scores and their consequences" in classrooms has hindered student learning. The National Education Policy Center, a nonprofit research center at the University of Colorado, issued a press release noting that, while a cause for the drop in scores cannot be pinpointed, the decrease was "bad news" for those who've advocated for what it calls "no excuses approaches" that close failing schools and hold teachers accountable for student test scores.

Cuts to education funding since the Great Recession have hurt student growth, according to a statement from Daniel A. Domenech, the executive director of AASA, The School Superintendents Association.

Meanwhile, Michael Petrilli, the president of the Thomas B. Fordham Institute, wrote initially that the economy would be to blame if NAEP scores dropped but revoked that interpretation after digging into the state data. Instead, he asked onlookers to be patient, "to wait for more sophisticated analyses to emerge, and to wait until 2017 to see if these numbers are a one-time blip or the beginning of a disturbing trend."

Some have said changing demographics influenced the NAEP declines. But as Commissioner Carr pointed out, the demographic makeup of students has not changed much in the past two years.

As always, researchers offer the caveat about "misNAEPery"—the misuse of NAEP scores to draw causal conclusions.

"NAEP is very good at telling us what's happening," said Jack Buckley, the former NCES commissioner and now the senior vice president for research at the College Board. "And it's not very good at telling us why."

Buckley also noted that, while the changes in scaled scores are statistically significant, they represent very little change in what students actually know and are able to do.

"People boxed themselves in when they took credit for the small increases that were statistically but not substantively or educationally significant," he said. "They had no choice but to make claims about the small declines."
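Buckley's distinction between statistical and practical significance follows from basic sampling arithmetic: with samples as large as NAEP's, even a 1-point shift on a 500-point scale can clear the significance threshold while representing a negligible effect size. The sketch below illustrates that logic with assumed, round-number inputs (sample size and standard deviation are illustrative guesses, not NAEP's published figures, and NAEP's actual tests use sampling weights and plausible values):

```python
import math

# Illustrative only: the sample size and standard deviation below are
# assumptions for demonstration, not NAEP's published values, and NAEP's
# real significance tests account for its complex sampling design.
n = 100_000        # assumed students sampled per grade/subject
sd = 30.0          # assumed standard deviation on the 500-point scale
diff = 1.0         # a 1-point drop in the average score

# Standard error of the difference between two independent sample means.
se = math.sqrt(sd**2 / n + sd**2 / n)
z = diff / se      # test statistic: |z| > 1.96 means significant at p < .05
d = diff / sd      # effect size (Cohen's d): under 0.2 is "small"

print(f"z = {z:.1f}, Cohen's d = {d:.3f}")
```

With these inputs the z-statistic is well above 1.96 (statistically significant) while the effect size is around 0.03, far below the conventional 0.2 cutoff for even a "small" effect—which is Buckley's point about changes that are statistically but not educationally significant.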

Achievement Gaps Remain

Racial and ethnic achievement gaps, for the most part, have persisted, according to the NAEP data.

In 4th and 8th grade reading, as well as 8th grade math, there were no significant changes in such achievement gaps from 2013 to 2015.

One score gap did shrink: that between black and white students in 4th grade math. But it did so because white students' scores declined in 4th grade math, and black students' scores held steady.

White, black, and Hispanic students' scores all declined in both reading and math for 8th grade.

The 2015 results were also broken down for all 50 states, the District of Columbia, the U.S. Department of Defense schools, and 21 urban districts.

The outcomes were mixed across the states, though they were disappointing overall.

The District of Columbia and Mississippi were the only two places to show increases in both math and reading for 4th grade.

The District of Columbia saw some of the biggest improvements during the previous test administration as well. However, it continues to perform below the national average in both math and reading. In fact, in both 8th grade math and reading, the District of Columbia scored lower than all 50 states.

Sixteen states saw declines in 4th grade math scores. Other than Mississippi and the District of Columbia, only the Department of Defense schools had increases in average scores. In all other states and jurisdictions, the average scores were not statistically different from 2013.

Not a single state had an increase in 8th grade math scores, and 22 states had declines in that subject at the 8th grade level.

West Virginia was the only state with an increase in 8th grade reading between 2013 and 2015, while eight states declined. All other states' 8th grade reading scores remained statistically unchanged from one test administration to the next.

Of the 21 urban districts assessed, only three saw multiple instances of score increases: the District of Columbia, Miami-Dade County, Fla., and Chicago.

NAEP was created as an independent indicator of achievement. But it is also meant to test what's being taught in classrooms.

A recent study by the American Institutes for Research found that the NAEP items have "reasonable" overlap with the common-core standards, which 43 states and the District of Columbia are now implementing.

"There's learning going on that isn't being picked up by NAEP at this point," said Fran Stancavage, a managing researcher for the study, which was commissioned by the independent NAEP Validity Studies Panel. Ideally, the two would be more in line than they are now, Stancavage said.


The governing board periodically makes adjustments to the test and may review it against the common standards for relevancy, said NAGB's Bushaw.

Minnich of the CCSSO said the national test should be aligned to college- and career-ready expectations for students, which now set a consistent bar across states. But he also emphasized that the results should not be overblown. "In two years, if we see the same thing, we need to make sure we're making some adjustments," he said. "But at this point, we don't have a trend. We have one data point."

Results from two international tests—the Program for International Student Assessment and the Trends in International Mathematics and Science Study—will both be out in about a year. "If those drop, maybe there's something real happening here," said Loveless of Brookings.

Vol. 35, Issue 11, Pages 1,12-13



Clarification: An earlier version of this story miscast the National Education Policy Center’s stance on the drop in NAEP scores. The center said the decrease was “bad news” for school improvement approaches that seek to close failing schools and hold teachers accountable for student test scores.
