Washington--Saying that shifts in reading instruction demand changes in assessment, the governing board of the National Assessment of Educational Progress has adopted, in principle, a dramatic new blueprint for its 1992 reading test.
Acting at its quarterly meeting this month, the board approved a revised version of a proposal drawn up by a panel led by the Council of Chief State School Officers. It is expected to be further refined before it is sent next month to the Educational Testing Service, NAEP's contractor.
“The tide is shifting,” said Phyllis W. Aldrich, chairman of the board’s reading committee. “Since 1971, reading has changed in definition. How we test it has got to change.”
The closely watched plan, which will provide the framework for the first state-by-state comparisons of student-achievement data in the subject, calls for substantially increasing the proportion of open-ended items that ask students to write their own responses, rather than choose from among multiple-choice responses.
It would also use longer passages of authentic text material, and measure performance on three types of reading.
In addition, it would add an unusual special study of students’ oral fluency, as well as a pilot assessment to collect portfolios of students’ classroom work in the subject.
The proposal represents a substantial shift from past assessments, but Ms. Aldrich said NAEP would conduct a “bridge study” to link performance in 1992 with the results from previous reading tests.
The plan would also cost considerably more to administer than past tests, which relied primarily on multiple-choice questions, acknowledged Chester E. Finn Jr., the board’s chairman. But, he said, “at some point, this board has to say to the Administration and Congress, ‘If you want a proper assessment, this is the way to do it, and this is what it costs.’”
‘Everything We Can’
The blueprint adopted this month represents a slight modification of the plan unveiled last month by the CCSSO consensus-planning committee. (See Education Week, Feb. 14, 1990.)
Based on consultations with officials from the National Center for Education Statistics and the ETS, the committee agreed to cut back some proposals the officials said might be infeasible given cost constraints and the one-hour time period allotted for testing.
For example, the panel agreed to reduce from 60 percent to 40 percent the amount of testing time students would spend on open-ended questions, and to cut out many questions aimed at discerning students’ reading strategies and backgrounds.
The panel also agreed to report a single composite score, in addition to scores on the three reading scales.
Ms. Aldrich said the innovations could be expanded in future NAEP tests.
“We want to do everything we can,” she said. “We may not be able to do it all at one time.”
In other action, the board agreed to modify its plan to establish national goals for student performance in the subject areas it tests.
Under the new version, which is expected to receive final approval in May, the board would set at least two standards for performance in each grade level, instead of the single standard proposed in the original plan.
The revised plan is “very close” to what President Bush and the nation’s governors envisioned in the national performance goals announced last month, according to Michael Cohen, director of education programs for the National Governors’ Association.
“When there are goals set, whoever does it,” he told the governing board, “it should have the effect of moving the entire level and distribution of performance.”
“What the governors did not want to do,” he added, “is to have the effect of moving those just below the standard, and ignoring those well below or well above it.”
Herbert J. Walberg, chairman of the NAEP board’s technical-methodology committee, noted that the board and the governors continue to differ over how to set standards for performance.
In their statement, the governors and President Bush said that student performance “will increase significantly in every quartile.” The board’s plan, on the other hand, would establish standards for “essential” or “advanced” performance, as well as a possible third standard for “proficient” performance.
“What the governors are proposing is easy to do,” said Mr. Walberg, professor of education at the University of Illinois at Chicago. “It’s a statistical procedure. What we are proposing requires human judgment.”
“I don’t see why we can’t do both,” he added.
Timetable for Expansion
Mr. Cohen also said he agreed with board members that the current NAEP timetable may be inadequate to measure progress toward national student-achievement goals.
Unless the Congress acts soon to expand the assessment, noted Mark D. Musick, president of the Southern Regional Education Board, NAEP will be unable to present a full state-by-state “report card” until at least 1997.
Under legislation passed in 1988, NAEP is authorized to conduct a pilot state-level assessment in 8th-grade mathematics in 1990, and in 4th- and 8th-grade math and 4th-grade reading in 1992.
“There is a general recognition we have a long way to go before we have state-by-state achievement data,” Mr. Cohen said. “I don’t think any of them would be happy waiting that long.”
A version of this article appeared in the March 14, 1990 edition of Education Week as NAEP Board Adopts Blueprint for 1992 Reading Test