Research Council Pledges Help in Setting NAEP Levels
When the National Research Council called the process of setting achievement levels for the nation's report card "fundamentally flawed" in a recent report, it provided only sketchy details on how to make the system better.
"I hope you sense our frustration," Mark D. Musick, the chairman of the National Assessment Governing Board, told NRC researchers at the start of the board's quarterly meeting here. Board members want to know exactly what they should do to correct the problems with the National Assessment of Educational Progress, he said.
By the end of the members' two-day meeting, the Washington-based NRC had agreed to tell them. The testing experts assembled by the research council promised to do a follow-up study in which they will outline definitive steps that NAGB can take to fix the testing system it oversees.
In its review of the national assessment, the panel convened by the NRC said the achievement levels are too rigorous and inconsistent, and don't match others in similar large-scale tests. The federally sponsored NAEP is the only national sampling of students in core subject areas. ("Panel Assails Assessment Calculations," Sept. 30, 1998.)
"Right now, judgments are based on items in isolation," James W. Pellegrino, the Vanderbilt University professor who headed the NRC panel, told NAGB during the board's Nov. 20-21 meeting.
A better way to set levels may be "to look at sets of items and performance on them rather than rely on judgments" about individual items, Mr. Pellegrino said.
For example, Lauress L. Wise, a member of the NRC panel and the president of the Human Resources Research Organization in Alexandria, Va., cited the possibility of ordering items according to their degree of difficulty and then encouraging experts to determine which point should be the cutoff for each NAEP achievement level: "basic," "proficient," and "advanced."
But neither Mr. Pellegrino nor Mr. Wise specifically endorsed that method.
Mr. Musick said board members wanted the criticism to be accompanied by proposed solutions that were backed up with research.
NRC officials said they would incorporate those solutions into the new study they promised.
The governing board itself will review the achievement levels for a congressionally mandated report. The board agreed to file that report by June 30, three months ahead of schedule.
Two leading testing concerns have won $112 million in federal contracts to continue their work on NAEP.
The Department of Education renewed contracts with the Educational Testing Service and with Westat to do everything from preparing the questions that appear on the exams to writing the reports explaining the results.
The department made a "concerted effort" to create competition for the contracts, but the two incumbents had such an advantage in personnel and other resources that no other bidders submitted proposals, said Peggy G. Carr, the associate commissioner in the department's National Center for Education Statistics.
"It's very complicated science," Ms. Carr told NAGB members. "It's very hard" for other companies to draft competitive proposals.
The contract with the ETS, which has worked on the assessment since 1983, will cover the next five years. The Princeton, N.J., nonprofit test developer will subcontract work to the American Institutes for Research; the Center for Research on Evaluation, Standards, and Student Testing at the University of California, Los Angeles; Aspen Systems; and National Computer Systems.
The team of contractors will write test questions, score the tests, and analyze and report the results. The five-year contract is for $76 million, Ms. Carr said.
Westat, a Rockville, Md., company, won a four-year, $38 million contract to collect data for other NAEP research and sample questions.
In the eight years since the governing board began offering NAEP exams that report statewide results, it has never had more than 42 states participate in the tests. For 2000, the board and the NCES are aggressively recruiting all states to join the sampling of the mathematics and science achievement of 4th, 8th, and 12th graders.
"We want to have a solid math/science baseline for the year 2000 that all states participate in," said William T. Randall, a former Colorado commissioner of education and a former NAGB chairman who is leading the "AllStates 2000" campaign.
This year, 49 states expressed interest in giving the assessment, which tests selected samples of students in each participating state, but only 35 actually did so.
Most dropouts occur, Mr. Randall said, when states are unable to recruit enough districts to take part, leaving them without a statistically significant sample of students.
Mr. Randall said his committee is helping states explain to districts the importance of collecting data on student achievement. He said his biggest challenge will be South Dakota, one of the smallest-enrollment states, with only 135,000 students, which has never fielded a student sample for the biennial set of tests.
"We're going to build an infrastructure that we hope people will say: 'We're part of something important,' " Mr. Randall said.
That attitude, he added, may build momentum for raising state participation throughout the coming decade.
--David J. Hoff
Vol. 18, Issue 14, Page 24