What’s in a name? A lot, apparently.
Members of the panel that sets policy for the National Assessment of Educational Progress—better known as the Nation’s Report Card—on Saturday approved small but significant changes to the test’s description of what constitutes “advanced,” “proficient,” and “basic” performance.
From now on, they'll be preceded by the word NAEP, as in "NAEP advanced," "NAEP proficient," and "NAEP basic," and references to performance in a grade will be stricken and replaced with references to performance on the NAEP assessment.
The impetus behind these revisions is to improve public understanding about what NAEP’s achievement levels mean; educators, parents, and the media often misunderstand NAEP proficiency to mean grade-level work, when it’s generally considered somewhat more difficult than that.
The rewording may seem awfully minor to the uninitiated. But there’s a deeper subtext behind the changes, and that’s why this is worth noting.
For nearly 30 years, some advocates have criticized the NAEP proficient bar as too high and therefore misleading for the public, and this latest go-around has opened up that debate once again.
In the public comments that accompanied this change, some commenters urged the board to overhaul the achievement level system entirely and others commended the National Assessment Governing Board for sticking to its guns.
A History of Achievement Levels
To fully appreciate this wording change, you need a little history lesson first. NAEP originally put the achievement levels and descriptors into place around 1990. Before that, the exam results were reported as a big list of scale scores, and there was no attempt to translate them for the lay person.
This was obviously difficult for most people to engage with, a problem because at that time people were hungry for more school-performance information. "A Nation At Risk," the report that sounded a national alarm over education, had come out just seven years earlier, and the first push of "reform" efforts was well underway.
The creation of the achievement levels was quite controversial. Both the wording of the achievement level descriptions and the technical "standards setting" itself (the process of setting the cutoff points for "basic," "proficient," and "advanced" performance) created a lot of consternation and were variously attacked and defended.
While never universally embraced, the achievement levels eventually gained currency through the 1990s and became a keystone of NAEP—especially among the media (cough) which naturally found reporting the results using achievement levels to be easier for readers than reporting out scale scores.
Then we come to 2002 and the advent of the No Child Left Behind Act. Under that law, every state had to develop its own tests and cutoff scores and passing levels. Many of them chose to use the word “proficient” in their own systems.
I’m pretty sure you know what I’m going to say next: People began to get all of these terms, tests, and policies hopelessly confused.
Sometimes, people who conflate NAEP proficiency with grade-level performance are making innocent mistakes, and sometimes they're doing it to make an advocacy argument. (Read this story for background on all the depressing ways people tend to misuse NAEP results.)
In any case, from the beginning some edu-folks have argued that the NAEP proficient bar is really more aspirational than realistic. They say it’s actively misleading parents and the public and warping education policy unproductively. The group currently leading this charge is the Superintendents Roundtable, which has pointed to the fact that even top-performing countries would have many students not considered proficient under NAEP’s current levels.
The Roundtable has lobbied for NAGB to describe the proficient level on NAEP as “extremely demanding” and the basic level as “roughly analogous to grade level.”
Studies of NAEP and state tests find that NAEP’s idea of proficiency does tend to be higher than the states'—but states have narrowed this gap significantly in recent years.
New Questions Ahead?
That brings us to the current action.
The change approved by NAGB at its Nov. 17 meeting is actually part of a larger document laying out its policy for examining and reviewing the cutoff scores for the achievement levels. (The board is required to do this by federal law.) NAGB wanted to refresh the document to reflect technical evolution in standards setting.
That document was out for public comment, and from the 70-plus responses that came back, it was clear that some observers are worried that the door could now be open for the board to lower the bar, while others think doing so would paint a better picture of student achievement.
To its credit, the board also hosted a panel discussion with various perspectives on this topic. Marc Tucker, who has long studied workforce preparedness, for example, said he thought NAEP proficiency should be aligned to the level of work students need to succeed in the first year of community college—a lower threshold than the current one. On the other hand, David Driscoll, a former Massachusetts superintendent, cited that state's work to align its expectations with NAEP as a factor in its improved student performance.
But let's make one thing really clear: There are no current proposals to change NAEP standards. While the board will begin revising its reading and math test frameworks over the next few years, the board's intent is to maintain the test's current cut scores and preserve the "trend line," said Andrew Ho, a testing expert at Harvard University and the chair of NAGB's committee on standards, design, and methodology.
The board could do more to help the public understand what proficient performance looks like, beyond the somewhat verbose definition it currently uses, he said.
“The most common question we get asked is, ‘What does proficient mean?’ and the answer is this paragraph. But the paragraph is somehow deeply unsatisfying,” Ho noted in remarks to the board. “It’s also overwhelming and intimidating and dense.”
And that, it appears, is what the board will try to do: contextualize the test scores better, possibly by examining the landscape of other tests or perhaps by drawing on student work samples.
To conclude, I’ll leave you with a sampling of the public comments that accompanied NAGB’s draft.
The Council of the Great City Schools: “For decades ... NAEP has been, and should remain, the standard for these terms. Application of these terms from assessment-to-assessment have been made relative to NAEP definitions—even if they have not been faithfully applied. Changing the terminology suggests that NAEP should no longer be the standard upon which we understand student achievement.”
The Education Trust: “If the revision of the Achievement Level Policy results in lower expectations for what it means to be ‘proficient’ or ‘advanced’ without solid justification for these changes, it could harm students across the country, with the highest risks for students who are already underserved in our schools.”
AASA, The School Superintendents Association: “The original achievement levels were developed in a rushed process, and resulted in levels that continue to confuse educators, citizens, and policymakers. The levels have been described as ‘wishful thinking’ more than ‘reasonable’ or ‘common sense,’ and the latest research linking NAEP’s benchmarks to international assessments reveals that the majority of students in most nations cannot clear NAEP’s proficiency bar.”
The Superintendents Roundtable: "In the Roundtable's judgement, the modifications are in no way responsive to the major criticisms that have been leveled at the NAEP benchmarks over the years. To retreat behind the claim that the proficient benchmark is an aspirational standard is deceptive and evasive."
Emily Maurek, teacher: “America’s children have been made out to be ‘failing’ when they score below Proficient, when in reality the passing mark is out of reach and always will be. The National Center for Education Statistics has clearly stated that ‘proficient’ is not synonymous with grade level performance. But when a metric is so clearly misused, misunderstood, and abused, it is clearly time for an immediate restructuring. That time is now.”
Jack Jennings, former congressional aide and Center on Education Policy president: “Those levels, instead, have led to confusion in the news media and among teachers, parents, and the general public. It appears that very high aspirations ruled their development, instead of realistic conclusions based on sound data.”
A version of this news article first appeared in the Curriculum Matters blog.