NAGB Will Keep Achievement Levels--For Time Being
The board that runs the federal government's student assessment is standing by the way it defines test-performance standards, but says it will investigate alternatives in the face of criticisms from prominent researchers.
The National Assessment Governing Board will continue to use its standards-setting process. But to appease critics from the National Research Council, it will try other methods to define what students need to know to merit the labels "advanced," "proficient," or "basic" on the National Assessment of Educational Progress.
"There are those who would wait for the 'perfect' model for setting standards," the governing board says in a seven-page paper it approved unanimously at a June 23 meeting in Ann Arbor, Mich. "The board is not willing to discard a well-established method until there is good, solid evidence that a better method is available for the purpose of large-scale national assessment."
The governing board--known as NAGB--will research new methods to define achievement levels over the next year and will pilot-test the most promising ones, it said in the congressionally mandated report. But the 15-member board, which is made up of state and local policymakers, researchers, business leaders, principals, and a teacher, has no plans to abandon its current approach--called the modified-Angoff method--until a new one is identified.
A panel of experts called the current practice "fundamentally flawed" and said the governing board should replace it in time for the exams to be given in 2002. The performance standards the board sets, which spell out how well students perform, are often too rigorous and don't reflect student achievement on other assessments, the NRC panel said. ("Panel Assails Assessment Calculations," Sept. 30, 1999.)
By contrast, the NAGB report--which Congress ordered after the research council released its report last fall--calls the Angoff method "state of the art" because other test developers use it. The method's standards-setting procedures, it adds, "are the very best available."
No Better Way?
Despite the differences, one member of the NRC panel said the NAGB response is a step toward improving the way standards are set for the assessment, which tests a national sampling of students in core subjects.
"For the most part, it's a very constructive report," said Lauress L. Wise, a member of the NRC panel and the president of the Human Resources Research Organization, an Alexandria, Va., nonprofit group that conducts education and training research. "It's a schedule that seems realistic and ... looks at what the alternatives are."
Under the modified-Angoff method, NAGB hires panels of educators to evaluate each test question and to estimate the probability that a student performing at each of the three achievement levels could answer it correctly. Students who fail to reach the "basic" level are reported as "below basic."
The problem with the Angoff approach, the NRC panel said, is that the task is too subjective and can be confusing and overwhelming for those who evaluate the questions. The upshot is that the achievement levels are more demanding than those used on other tests, such as the Advanced Placement exams high school students take to earn college credit.
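The arithmetic at the heart of an Angoff procedure can be sketched simply: each panelist estimates, question by question, the probability that a borderline student at a given level would answer correctly, and those judgments are averaged into a cut score. The sketch below uses invented ratings for illustration; real NAEP standard-setting involves multiple rounds of discussion and feedback not shown here.

```python
# Minimal sketch of the core Angoff cut-score arithmetic.
# Ratings are invented: each panelist estimates, for each question, the
# probability that a borderline "proficient" student answers it correctly.

ratings = {
    "panelist_1": [0.80, 0.55, 0.30, 0.65],
    "panelist_2": [0.75, 0.60, 0.25, 0.70],
    "panelist_3": [0.85, 0.50, 0.35, 0.60],
}

def angoff_cut_score(ratings: dict[str, list[float]]) -> float:
    """Sum each panelist's probabilities (their expected raw score for a
    borderline student), then average those sums across panelists."""
    per_panelist = [sum(probs) for probs in ratings.values()]
    return sum(per_panelist) / len(per_panelist)

cut = angoff_cut_score(ratings)
print(round(cut, 2))  # the cut score, as an expected number correct out of 4
```

The NRC panel's subjectivity criticism targets the input step here: the per-question probability estimates, not the averaging that follows.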
But NAGB members say they sought out other methods in 1996 and didn't find any that were more reliable. What's more, their interviews with the educators who rate test questions suggest the raters feel capable of judging them.
'A Thorough Review'
But in its report to Congress, NAGB says it will try again to find a new standards-setting method.
The board promised to conduct "a thorough review of its policies and practices" over the next nine months. Any changes from the review won't affect NAEP standards-setting until the mathematics exam scheduled to be given in 2004, the report says, because the board needs to plan ahead to conduct the assessments.
NAGB also will explore data from standards-setting methods used in other exams. It will pilot-test the most promising approaches and compare the results against what the Angoff method yields.
"We're not wedded to the modified-Angoff method if we find something one should have more confidence in," said Mark D. Musick, the NAGB chairman and the president of the Southern Regional Education Board in Atlanta. "We don't yet know what that is."
While the review and pilot-testing proceed, the achievement levels will continue to be considered developmental, as required by 1994 legislation that allowed their use after criticism from education researchers and the General Accounting Office, the investigative arm of Congress.
The 1994 law gives the national commissioner of education statistics the power to lift that tag.
Congress is scheduled to review the testing program and consider reauthorizing NAGB's authority over it in the next year.
Vol. 18, Issue 42, Page 24