Washington--Citing doubts about the validity of the process, educators here last week urged the governing board of the National Assessment of Educational Progress to put off until at least 1992 its plan to set the first national standards for student achievement.
Under the plan, adopted by the board last May, the panel is expected to report the results of the 1990 mathematics assessment--the first to include state-by-state data--by comparing student performance against agreed-upon standards for what students at the “basic,” “proficient,” and “advanced” levels of achievement in grades 4, 8, and 12 should know and be able to do.
But at a public forum on the issue, held here last week, officials of several key groups, including the National Council of Teachers of Mathematics and the Educational Testing Service, said the process the panel used for setting the standards was too rushed and might be technically flawed. They urged the board to take more time to ensure that the standards represent a national consensus of what levels of achievement should be.
“I think you can do a better job in 1992 than you are able to do now,” said Gregory R. Anrig, president of the ETS, which operates NAEP under contract to the Education Department. “You will do an even better job in 1994, and an even better job in 1996. I want you to get off to a good start.”
“The danger is,” he added, “if you move too fast in the wrong way, you’ll lose what you’re trying to accomplish.”
But board members, acknowledging that the procedure was less than ideal, responded that a delay might be equally costly. Because of the time it takes to develop new assessments, said Chester E. Finn Jr., professor of education and public policy at Vanderbilt University, the board may not be able to report results from an entirely new standards-setting process until 1995 at the earliest.
“I agree it would be good to take time to do things well,” he said. “But I am also mindful of the adage, ‘The perfect is the enemy of the good.’”

“If we do not get baseline data until 1995,” Mr. Finn added, “we may be sacrificing something else--the sense of urgency for national improvement.”
Richard A. Boyd, the panel’s chairman, said the board would consider the statements from the public hearing, along with a report from researchers hired to evaluate the process, before voting later this month on whether to go ahead with the project. The board has already decided to postpone including the standards in the report on the math assessment, which is scheduled to be released in June, Mr. Boyd said.
The proposal to set three standards for student achievement was aimed at making NAEP results more useful to the public and policymakers, board members said. In the past, they noted, NAEP has simply reported how students performed in math, reading, writing, and other subjects, without determining whether the results were “good enough.”
The plan would also, said Mr. Finn, the panel’s former chairman, help “flesh out” the national education goals set by President Bush and the nation’s governors.
Because of the link to the national goals, said Gordon M. Ambach, executive director of the Council of Chief State School Officers, the standards-setting process is likely to receive intense scrutiny from educators and policymakers.
“There is a lot of discussion that these levels will be the standards for performance across the nation,” he said. “In setting the levels, it is terribly important there be acceptance in the way it was done, and true belief that this was an authentic process.”
Under the procedure, a group of 63 educators, business leaders, and public officials met last summer in Vermont to analyze questions on the 1990 math test to determine if students at each level of proficiency should be able to answer them correctly. Most of the group then reconvened in Washington in September to discuss their results, and a smaller group then compiled the results and wrote a description of student abilities at each achievement level. (See Education Week, Sept. 5, 1990.)
But several witnesses at the hearing here suggested that the procedure raised questions about its validity. For one thing, noted Mr. Ambach, the standards were applied to a test that had already been administered.
“The assessment, as originally designed, did not contemplate levels of proficiency set,” he said. “It was in fact a retrospective process. The question is whether they were applied successfully, in a way we all deem credible.”
In addition, said Mary Harley Kruter, a member of the group that met in Vermont, her panel had too little time to make reasonable judgments about what students at the three levels of achievement should know and be able to do.
“We were uncomfortable that we did not do the best job we could do,” said Ms. Kruter, a project director for the Mathematical Sciences Education Board of the National Research Council. “It was a rushed process.”

James D. Gates, executive director of the math teachers’ group, also said the board sought too little input from math educators. As a result, he said, “there is not sufficient consensus” on the standards.
To develop such a consensus, Mr. Anrig suggested, the NAEP governing board should start now and “expand the standards-setting process for 1992 so that it will be nationally publicized and will incorporate the judgments of hundreds of teachers and the comments of possibly thousands of others concerned about education.”
But Jeanne Allen, an education-policy analyst for the Heritage Foundation, said teachers are less concerned about being informed about the board’s actions than they are about receiving tools to help them improve schools.
“What I want to know is, when are we getting around to telling the American people how their children perform?” she asked.
A version of this article appeared in the January 16, 1991 edition of Education Week as “NAEP Board Urged To Delay Standards-Setting Plan.”