Expert Panel Recommends Continuing State-Level NAEP--But Only as Trial

By Robert Rothman — April 17, 1991 4 min read

Washington--The Congress should extend the state-level National Assessment of Educational Progress in 1994, but only on a trial basis, a panel of experts has concluded.

In a preliminary report issued this month, the Congressionally mandated panel, appointed by the National Academy of Education, found that the 1990 trial assessment--the first to be conducted at the state level--worked well. It recommended that the scores be released as scheduled in June.

However, the panel pointed out, the 1990 assessment, and another trial scheduled for 1992, would provide information on only two subjects and two grade levels.

Rather than expand the state-level assessment into a full-blown program, the panel argued, the Congress in 1994 should permit testing at grades 4, 8, and 12 in three subjects.

“As yet, we haven’t had any trials in grades 4 and 12,” said Robert L. Linn, a professor of education at the University of Colorado at Boulder and a co-chairman of the panel. “By 1992, we won’t have had any trials at grade 12.”

“We think it would be unwise to generalize what you can do at grade 8 to grade 12,” he added. “High-school seniors are not as compliant as 8th graders in many ways.”

The report also urged caution in lifting the prohibition against the use of NAEP data at the school-building and district level, and asked the National Center for Education Statistics to arrange for an independent evaluation of the National Assessment Governing Board’s procedure for setting standards on the 1990 mathematics assessment.

Emerson J. Elliott, acting commissioner of the NCES, said the report contains “a couple of things that should give people pause.”

The Administration and the Congress “should think through how things are going on,” Mr. Elliott said.

Experiment Not Flawed

Roy E. Truby, executive director of the NAGB, applauded the report, and said he understood the panel’s caution.

But, he said, “if by 1994 we aren’t finding major flaws, we shouldn’t have to keep coming back and getting authorization [to add] one subject and one grade. It should be part of the regular NAEP.”

A Congressionally mandated project, NAEP has since 1969 tested a national sample of students in reading, writing, mathematics, science, and other subjects.

In 1988, the Congress authorized a limited expansion of the assessment to permit the first-ever state-by-state comparisons of student-achievement data. The legislation authorized a state-level test in 8th-grade math in 1990 and in 4th- and 8th-grade math and 4th-grade reading in 1992.

The law also required an independent evaluation of the “feasibility and validity of [state] assessments and the fairness and accuracy of the data they produce.”

Although the evaluation, conducted by the National Academy of Education under a grant from the NCES, is not expected to be complete until October, after the data are released, the center asked the panel to prepare an interim report to guide the Administration and the Congress as they consider whether to continue the state-level assessment in 1994, according to Mr. Elliott.

In its report, the 18-member panel concluded that its investigation found “no signs that the experiment is flawed, that major redirection is necessary, or that the trial state assessment be terminated.”

However, it recommended that future state assessments provide sufficient funds to test a sample of private-school students, who were excluded from the 1990 round. The report states that the inclusion of such students in the results would “better reflect educational achievement” and would improve comparisons between states and between states and the nation.

The panel cautioned, however, that policymakers should not use the data from the state-level assessment to draw conclusions about state policies or practices.

“If you see one state doing very well in comparison with its neighbor, education policies may be one way the states differ,” Mr. Linn said. “But there may be a host of other factors.”

In its October report, he said, the panel will examine how the results are interpreted.

‘A Corrosive Effect’

The panel also criticized the process the NAGB used last fall to set the standards on the math assessment. Under that process, groups of educators and lay officials met to determine the “basic,” “proficient,” and “advanced” levels of achievement at each grade level.

Although the board has taken steps to ensure that the levels are valid, the NAEP panel urged that the validation process be reviewed before the levels are used and reported.

“Since the panel believes that the use of inadequately developed achievement levels could have a corrosive effect on state participation in the future, as well as on the credibility of NAEP more generally,” the report stated, “the panel will monitor the validation studies.”

The panel also recommended that policymakers carefully review the “policy, technical, logistical, and cost factors” associated with lifting the prohibition against the use of NAEP data at the district and building levels. Although some officials, including the NAGB, have urged allowing such use, the panel warned that such comparisons could destroy NAEP’s value as a monitor of student achievement.

“If it’s administered to all kids,” Mr. Linn said, “it could undermine what NAEP’s all about.”

A version of this article appeared in the April 17, 1991 edition of Education Week as Expert Panel Recommends Continuing State-Level NAEP--But Only as Trial