Twelfth graders’ math and reading scores on the National Assessment of Educational Progress were released yesterday morning, and the results were somewhat disheartening. Scores declined in math and were flat in reading compared with two years ago. And the average scores for the lowest performers—those in the 10th and 25th percentiles—dropped significantly.
When the scores for 4th and 8th grades were released in October and showed declines as well, there were all sorts of theories about where to place the blame. The Common Core State Standards took some heat, as did high-stakes testing policies. Then-U.S. Secretary of Education Arne Duncan said the declines were likely due to an “implementation dip,” in which curriculum changes associated with the common core were causing a temporary downturn.
But so far, the most recent 12th grade scores haven’t caused the same kind of finger-pointing.
In a statement, U.S. Secretary of Education John B. King Jr. acknowledged that teacher practices have changed in recent years, and simply said, “we need to be patient—but not passive—in continuing to pursue the goal of preparing all students for success after high school.”
American Federation of Teachers President Randi Weingarten said in October that the 4th and 8th grade results showed that “the strategy of testing and sanctioning ... does not work.” But when it came to this round of 12th grade scores, she chose not to comment.
Neal McCluskey, the director of the Cato Institute’s Center for Educational Freedom, who has generally objected to the federal government’s role in incentivizing states to adopt the standards, had a fairly tempered response to the new scores. “Do these results prove that Common Core is either impotent, or worse, a negative force? Certainly not,” he wrote. “But all these scores do undermine any proclamations of proven Core effectiveness.”
The declines, particularly among the low performers, provide “evidence that is corroborated by all kinds of other evidence that we need to do a better job of supporting in particular our most vulnerable kids,” said Daria Hall, the vice president for government affairs and communications at The Education Trust.
12th Graders Are Mostly ‘Pre-Core’
According to Jack Buckley, the former NCES commissioner and now the senior vice president for research at the College Board, people are less likely to say these recent scores offer any definitive judgment on the common-core standards “because for 12th graders, the bulk of their educational experience is pre-core.”
The students who took the test this round are too old, he said in an interview. “This is not a referendum on them.” However, four years from now, he said, the 12th grade scores are likely to get “a lot more attention.”
The common standards were adopted in 2010, but many states didn’t truly switch over to them for several years. In addition, many individual school districts chose to implement the standards in just a few grades at a time, and often they started with the youngest grades. So it’s unclear how much exposure 12th graders really had to the common core.
Plus, this round of 12th grade scores didn’t break down results by state, so it’s impossible to disaggregate by common-core adopters and non-adopters.
There’s also the possibility that people aren’t getting up in arms about the NAEP data because they don’t put much stock in the 12th grade scores to begin with. Many people believe that high school seniors don’t take the test seriously. As Buckley said, “It’s hard to rule out the fact that some kids in particular on hard problems say they don’t want to put in the effort on this.” On a test like the SAT, though, he said, students have much more incentive to power through each problem.
But Peggy G. Carr, the acting commissioner of the National Center for Education Statistics, which administers NAEP, said that based on data about omitted questions and other factors, motivation doesn’t appear to be much of a concern. And it wouldn’t explain a drop in scores anyway. “Students are not interacting with this assessment any differently than they have in the past,” she said.
Another theory on why fewer people are looking to lay blame for the NAEP declines? Perhaps people are now heeding warnings about committing “misnaepery”—that is, attempting to use NAEP data to explain cause-and-effect relationships. As assessment experts (and we here at EdWeek) often caution, NAEP data can’t tell you which policies or instructional interventions are working and which aren’t. Any such interpretations are speculative, at best.
A version of this news article first appeared in the Curriculum Matters blog.