Faced with persistent apathy among high school seniors toward the National Assessment of Educational Progress, the board that oversees the federal test is considering potentially significant changes aimed at making NAEP more understandable and relevant to the public.
Those steps would involve revamping the test’s structure and the way it is promoted to students. Also under consideration is releasing certain test results for individual schools and students—feedback that NAEP, which is focused primarily on national, state, and demographic trends, does not now offer.
Worries about lackluster student participation on the assessment known as “the nation’s report card” go back years. But such concerns have gained new urgency as the participation rate among 12th graders and their schools has dwindled to its lowest point ever.
That indifference lingers at a time when President Bush is calling for an expansion of NAEP by proposing that states be required to administer its tests in reading and mathematics to a sample of their 12th graders. Currently, states are only required to participate in NAEP at the 4th and 8th grade levels, while the 12th grade test is voluntary. (“Bush Backs Requiring NAEP In 12th Grade,” April 14, 2004.)
Mr. Bush has proposed boosting NAEP’s $95 million annual budget by $22.5 million in fiscal 2006 to pay for the expanded 12th grade testing. That step would provide state-by-state data on student performance for high school seniors. Only a national sample exists for that grade now.
With those objectives in mind, members of the National Assessment Governing Board, which sets policy for NAEP, on March 4 heard recommendations from an advisory committee that has studied ways to increase 12th grade participation on the test. The recommendations were presented at the governing board’s quarterly meeting, held here in the Texas capital.
“Principals, teachers, and students know little to nothing about NAEP, its mission and purpose,” said board member David W. Gordon, the superintendent of California’s Sacramento County Office of Education, who served on the advisory committee. “A lot of what we put out to people by way of encouragement [to take the test] is really apologetic. … That has to change.”
The recommendations will be studied by separate governing board committees over the coming months, and then could be considered by the entire 26-member board.
One potentially sharp departure from current NAEP policy is a recommendation to give individual students and schools some form of feedback on their performance on the exam. NAEP now produces test results only at the state and national levels, and on a few occasions for some school districts, but not for individual students and schools.
Out of Obscurity
Not allowing students and schools to see their NAEP scores creates a disincentive to take the assessment, or to take it seriously, the committee suggested. That was also a finding of StandardsWork Inc., a Washington consulting firm hired to study ways to improve seniors’ participation.
One option offered by the committee would be to give students passwords to a secure Internet site, from which they could learn their test scores.
Mark D. Musick, the president of the Atlanta-based Southern Regional Education Board, praised the governing board for considering changes to NAEP, though he added that implementing many of them would be tricky. “This is worth trying,” Mr. Musick, a former governing board chairman, said in an interview. “I hope technology makes it possible to do it.”
There could be consequences for doing nothing, he said. The participation rate for high school seniors and their schools on the 2002 NAEP dipped to 55 percent, its lowest point ever. From 1988 to 2000, that proportion hovered around 65 percent. Such poor participation puts “the credibility of NAEP at risk,” the committee warned.
Questions remain, however, about the legality of releasing school and student information. The law that governs NAEP says the federal government may not use the assessment to “rank, compare, or otherwise evaluate individual students or teachers.” Another provision says that “all personally identifiable information” about students and schools must remain confidential.
Peggy G. Carr, an associate commissioner of the National Center for Education Statistics, the arm of the Department of Education that administers NAEP, said one legal option would be to have students and schools seeking access to test results complete public-records requests, which would obligate the NCES to release that information. Because NAEP gives different sets of questions to different students, the NCES would most likely have to provide feedback on students’ success on individual questions, relative to that of other students quizzed on the same items, rather than an overall test score.
As a carrot for high school seniors, the committee suggested giving them material rewards for taking part, such as food, educational materials, or a chance for a scholarship. But students’ tastes can be hard to predict. Committee members said they had heard stories of test-takers “littering the hallways” with the current certificates of appreciation awarded for NAEP participation.
The StandardsWork study said NAEP would have more appeal to students and schools if it were promoted as a public service to the nation, or as a matter of school or student pride.
“We need to make a much more compelling case to students,” said Mr. Musick, who described the message as “ ‘Do your best for your country’—and look, ‘Here’s how you did.’ ”
The board is also considering a fundamental overhaul of the 12th grade exam by having it focus on evaluating the preparedness of high school seniors for college, the workforce, and the military. That new emphasis would mesh with the goals of an increasing number of policymakers and organizations across the country, most recently the National Governors Association, that are showing a keen interest in high school improvement. (“Summit Fuels Push to Improve High Schools,” March 9, 2005.)
The advisory committee also suggested giving students the 12th grade NAEP in the fall, rather than in the spring, as is now the case, so that students would be more likely to participate and take it seriously before they’re overcome by “senioritis.”
Moving the test to the fall of 12th grade, however, would raise questions about whether NAEP was truly testing students’ knowledge through graduation, as opposed to 11th grade, said Gerald E. Sroufe, a senior adviser for the American Educational Research Association in Washington.
While he commended the governing board for exploring ways to make NAEP more relevant, Mr. Sroufe noted that part of the test’s appeal is its independence from assessments administered by states, which are the focus of hours of preparation by students and schools.
“NAEP is not a high-stakes test. It is an indication of national progress in education,” Mr. Sroufe said. “That’s its value, and that’s what we should hold on to.”
A version of this article appeared in the March 16, 2005 edition of Education Week as Board Studies Release of Individual NAEP Results