Panel Offers Blueprint for Expanded NAEP
A group studying how to alter the National Assessment of Educational Progress to allow for state-by-state comparisons of student achievement last week released two reports outlining how the new assessment should be conducted.
The panel, established last fall by the Council of Chief State School Officers, recommended to the assessment's contractor, the Educational Testing Service, what skills and content should be tested and how the results should be reported.
Legislation pending in the Congress would allow NAEP to test an expanded sample of 8th graders in mathematics in 1990.
The first of the two reports issued last week provides a "blueprint for that assessment in terms of what it should measure," according to Wilmer S. Cody, chairman of the 18-member group and former superintendent of the Montgomery County (Md.) Public Schools.
Rather than represent the "least common denominator" of state programs, the panel proposed, the test should include as much of the content of state curricula as possible, as well as what scholars and practitioners recommend and what was covered in prior NAEP assessments.
By being inclusive, the report states, the assessment can both help states improve their programs and provide a link with past assessments.
Such a strategy is unlikely to lead to drastic changes in state programs or to a national curriculum, as some critics have feared, Mr. Cody said.
In addition to placing greater emphasis on statistics and algebra, the proposed test would include items that represent "some shifts" from current practice in many states, he said, but it would "not represent a major reform."
States can still decide for themselves, he added, what they should teach in their own schools.
"This is not a curriculum guide for what states should be doing,'' he said.
States will receive information on whether their students had an "opportunity to learn" particular items on the assessment, he noted. "If a state scores low on one item, it may be because their students have not been taught that," he said. "It is still up to the state whether it wants to put that item into its curriculum."
The group Mr. Cody headed was created with a $572,000 grant from the Education Department and the National Science Foundation. (See Education Week, Sept. 9, 1987.)
Its other report issued last week outlined ways the results of the expanded assessment should be reported.
That issue has been controversial among state officials, who fear that students' scores could be used to make inappropriate comparisons between states.
To avoid that outcome, the panel recommended that the assessment include data on state demographic variables and student backgrounds, so that states can compare themselves with states with similar characteristics, or with neighboring states.
But the panel, bowing to what Mr. Cody called the "inevitable," also proposed that the assessment include a single average score for each state.
Some future Secretary of Education may want to include the single indicator of student achievement in a state-by-state comparison, Mr. Cody said. But, he noted, a single score is "of very little use" for state officials.
The group recommended that the assessment also include data that are more useful to state policymakers, such as how a state's students performed on particular items, how regions within a state compared, and how different demographic groups compared.
"If it is going to happen,'' Mr. Cody said, "there should be enough descriptive information so that state officials know where and how to target improvements.''--RR