Board Ponders New Format To Make NAEP More Cost-Effective, Useful
Pushed by a tight budget, policymakers are considering changes in the format of the National Assessment of Educational Progress that they hope will make it more cost-effective and more useful to educators and the public.
In a move that would likely spark controversy, however, NAEP may buck the view held by many educators that performance-based test items assess students' learning more accurately than multiple-choice questions. Among other changes, the draft for a redesigned NAEP calls for a greater proportion of multiple-choice questions because they are less expensive to develop, administer, and score than items that require students to perform an experiment or write out an answer.
The National Assessment Governing Board, which sets policy for the congressionally mandated test, this month began reviewing the draft of the new design of the only nationwide ongoing test of U.S. students' academic achievement.
The board is also considering a more predictable schedule for administering the assessment and the use of longer tests with more questions. The changes would not affect the content of NAEP, or "the nation's report card."
The board discussed the preliminary recommendations at its quarterly meeting this month in St. Louis. Still months from final approval, the plan offers the first extended look at modifications the board is considering. The changes would take effect in 2000.
NAEP has tested what students know and can do since 1969. The tests are given in various academic subjects to a sampling of students in grades 4, 8, and 12.
The governing board is a 26-member independent panel created by Congress. A six-member work group of the board drafted the tentative recommendations.
Tight finances have been a driving force behind the long look at a new way to run the NAEP program, said Mark Musick, the chairman of the work group and the president of the Southern Regional Education Board in Atlanta. NAEP's budget was frozen this year at the previous year's level, $32 million, he said, adding that the governing board was grateful to get even that.
"We just don't see more money coming down the pike," Mr. Musick said last week. "We have made the assumption that we're going to have to plan for the years ahead on basically the same amount of money that we have now."
At the same time, the national assessment faces pressures from both legislative mandates and consumer complaints to deliver data about student achievement in more subjects, more frequently, more dependably, and in a more timely fashion.
Mix of Questions
Perhaps the work group's most controversial potential recommendation is one that could change the mix of multiple-choice and performance-based questions.
The assessment now devotes about half the testing time on a given exam to each type of question, although this year's science assessment allots about 80 percent of test time to performance items.
Many educators prefer such items because they better simulate problem-solving in the real world. But performance questions are more expensive; students must be provided with a science kit in order to perform experiments, for example.
The work group said NAEP might offer performance items to just a subsample of the students tested. Or, the national assessment might conduct a special study using performance items and report the results separately.
Marilyn A. Whirry, a board member, said that at the meeting this month she spoke out against abandoning the performance-based items.
"You can ask a child to memorize many, many things, and he can do it," said Ms. Whirry, who is a high school English teacher in Manhattan Beach, Calif., and an instructor at Loyola Marymount University in Los Angeles. "But can he apply any of these things? Will he remember it the following year?"
In its draft document, the work group also suggests that six NAEP subjects--reading, writing, mathematics, science, U.S. history, and geography--be tested more often than the other subjects: arts, civics, foreign languages, world history, and economics.
The plan also calls for the assessment to be given every year, rather than in alternating years as it is now.
Two tests out of the core six would be given each year, a cycle that would yield results from a given test every three years. The other subjects would be tested once every five years. Under the proposal, the state-level NAEP exams would be limited to the six core subjects and would follow the three-year cycle, allowing states to plan for NAEP results on a fixed schedule for the first time.
To make frequent testing of more subjects possible, other suggested changes include:
- Scaling back the amount of data collected and reported. Comprehensive testing and reporting, with large student sample sizes, would be done every 10 years. In the intervening years, the student samples and the reports would be smaller and less detailed.
- Increasing the number of questions each student answers and doubling the test time to about two hours, to reduce reliance on complex and costly statistical procedures.
- Reviewing the relevance of the separate and smaller "long-term trend" version of the NAEP, which has been given since the 1970s.
After receiving input from fellow board members, the work group is expected to present the recommendations in March at the board's next meeting. Final action could come at the May meeting.
Meanwhile, this year's program to allow districts to give the assessment at their own expense and obtain test results for individual schools has largely fizzled.
Four districts applied to the Department of Education and were accepted: Atlanta; Fairfax County, Va.; Milwaukee; and Philadelphia. However, as of last week, all but Milwaukee had dropped out, said Archie Lapointe, the director of the center for assessment at the Educational Testing Service in Princeton, N.J., which conducts NAEP.
The districts, he said, were dependent on federal funding to be able to take part.
The ongoing federal budget impasse, however, meant they could not count on the money by the time they had to commit to participate this month.
Vol. 15, Issue 18