E.D. Consortium Seeks To Spur Alternative Assessments
In a move to help spur the development of alternative methods of student assessment, the Education Department last week announced that it is forming a consortium of states interested in creating such methods.
The new State Alternative Assessment Exchange—led by the department's Center for Research on Evaluation, Standards, and Student Testing and the Council of Chief State School Officers—will enable states to pool resources to cooperatively develop new forms of assessment, department officials said.
Under the plan, participating states will create a database that allows them to share test items. At the same time, the center is also expected to help states develop guidelines for evaluating the new measures.
Christopher T. Cross, the department's assistant secretary for educational research and improvement, said in an interview that the consortium represents an appropriate federal response to the critical problem of improving assessment. The Office of Educational Research and Improvement is contributing $150,000 this year toward the project, he noted.
"Federal dollars are wonderful leverage for states," he said. "We're facilitating—serving as an 'honest broker.' That's all we're trying to do."
If the project is successful, said Eva L. Baker, co-director of the assessment center, which is based at the University of California at Los Angeles, it will advance the state of the art at a lower cost than if states moved to develop new forms of assessment on their own.
"We hope we can circumvent making the same mistakes that were already made," she said, "and we're trying to control the fairly large costs of developing new measures" of student performance.
State officials last week praised the plan and suggested it would help them as they overhaul their assessment systems.
"The more states buy into this, the faster we'll move into more performance assessment," said Jack Foster, education aide to Gov. Wallace G. Wilkinson of Kentucky.
Led by a chorus of critics of conventional multiple-choice tests, a growing number of states have moved to create new forms of assessment that measure students' abilities to perform tasks, such as conducting science experiments and writing essays.
A number of researchers have cautioned, however, that states may be moving too fast to replace their tests. They have warned that the new measures have not yet proved reliable gauges of student abilities on a large scale. (See Education Week, Sept. 12, 1990.)
Nelson Smith, director of the Office of Programs for the Improvement of Practice at the OERI, said the consortium was aimed at bringing the research arm's resources to bear on the issue.
"We had been casting about, looking for different approaches for dealing with state needs," Mr. Smith said. "Again and again, [we heard that] assessment was the linchpin of a lot of state issues."
Department officials agreed, he added, that the best way to assist them would be to link them and allow them to share what they had already learned, and at the same time make available the assessment center's technical expertise.
"Joining practitioners' wisdom with research knowledge is an ideal way to proceed rationally with a high degree of reliability toward measures of performance that are generalizable and consistent over time," Mr. Smith said.
Mr. Foster of Kentucky noted that the consortium would get under way too late to help that state launch its new assessment system. But, he said, it could help officials refine their system over time.
"There is no way we'll come out with what will be the very best final answer," he said.
Although the center will work with the states to evaluate the test items contributed to the database, it is not expected to confer a "Good Housekeeping seal of approval" on them, according to Ms. Baker.
"That's not a good idea," she said. "We would have to make judgments in a decontextualized way."
Mr. Cross also rejected the notion that the pool of test items could form the core of a national examination. The project is aimed at facilitating development work, not at creating a new test, he said.
"Performance assessment is still new, there's still a lot to be learned," the assistant secretary said. "We're a long way from putting this into something that does capitalize on the investments people are making in this area."
Robert E. Gabrys, chief of program assessment, evaluation, and instructional support for the Maryland Department of Education, said the exchange would allow states to agree on common characteristics of performance, while permitting them to design their own curricula.
"It picks up a heavy coordination role and doesn't threaten states' rights," he said.
Vol. 10, Issue 25