NAEP Weighing 1st Standards for Reading, Writing
WASHINGTON--Continuing in its controversial effort to set standards for student performance, the National Assessment Governing Board is considering proposals for the first-ever national standards in reading and writing.
The panel set off a storm last year when, for the first time, it reported results on the National Assessment of Educational Progress mathematics test against standards for "basic," "proficient," and "advanced" levels of achievement.
A number of technical experts sharply criticized the process the board used to set the math standards, and questioned the validity and reliability of the results.
But board members, saying that they have addressed those criticisms, are expanding the process to include other subjects. Late last week, they were expected to receive a report outlining proposed standards for the 1992 NAEP reading and writing tests.
"I'm convinced setting achievement levels is appropriate for the national assessment," Mark D. Musick, the chairman of the NAEP governing board, said in an interview. "I think what we are doing is important and correct."
Mr. Musick declined to comment on the proposed reading and writing standards until after receiving the report. The board is not expected to act on them until March.
But Roy E. Truby, the executive director of the board, said he was "very satisfied" with the standards-setting process in reading and writing thus far.
"We should come up with reasonable, defensible standards," he said. "I think we will."
New Reporting Policy
A Congressionally mandated project, NAEP has since 1969 tested a national sample of students in a variety of subjects. Its reports are generally well publicized and are considered the best information available on the state of student achievement nationally.
But in 1990, the governing board set out to change the method of reporting the results to make them more useful to the public and to policymakers.
In contrast to the traditional method of simply reporting how students performed on the assessment, the new policy called for reporting how students performed against agreed-upon standards for performance in grades 4, 8, and 12.
The panel first applied the new method in preparing its report on the 1990 NAEP math assessment. It found that fewer than 20 percent of students performed at the proficient level on the assessment.
That report, however, unleashed a hail of criticism from technical experts, including a panel of reviewers that had been hired by the board to evaluate the process.
A technical-review panel from the National Academy of Education, the National Center for Education Statistics, and the General Accounting Office also criticized the board's method of setting standards for the 1990 math test, and warned that the process should not be expanded unless it was substantially changed.
Mr. Musick said the board has taken a number of steps to improve the process, including contracting with American College Testing to conduct the standards-setting effort for the 1992 assessment.
Mel Webb, the project director for the Iowa City-based testing firm, said the new process represented a major improvement over the 1990 attempt.
"We were able to benefit from all the criticism aimed at the 1990 process," he said. "We built into our design features that will allow us to overcome the shortcomings of the 1990 effort."
To set the standards in reading, the A.C.T. convened 62 people--including 35 teachers and 10 other educators--for four days in St. Louis in August. The panelists examined each of the items on the 1992 NAEP reading assessment and made judgments as to how many students at the basic, proficient, and advanced levels of performance would be able to answer them correctly.
To set the writing standards, the A.C.T. convened 66 people for four days in St. Louis in July. Those panelists evaluated writing prompts and actual student papers, and determined how students at each level would perform on the prompts.
The panelists also came up with written descriptions of performance at each of the achievement levels, which are expected to be edited to reflect comments from public hearings held this fall on the proposed standards.
The final versions, along with statistical data that compare performance at each level with performance on NAEP's traditional scale scores, are expected to be presented to the governing board of the national assessment in March.
The preliminary descriptions suggest that, for example, at the 4th grade:
- Basic reading performance should include determining what a text is about and connecting material to personal experiences; proficient performance should include summarizing a text and using information to draw a conclusion; and advanced performance should include explaining an author's intent and describing similarities and differences in characters.
- Basic writing should state a central idea with some supporting details; proficient writing should provide enough detail to communicate the purpose to the intended audience; and advanced writing should elaborate on the idea with descriptive and supportive details.
Mr. Webb of the A.C.T. said that, in response to concerns raised about the 1990 effort, the firm is conducting a number of studies to gauge the technical quality of the standards set by the panelists.
In one study, researchers from the firm found that two separate sets of panelists came up with essentially the same judgments.
"There is no standard for judging reliability in this kind of work," Mr. Webb said. "Some would interpret that as very reliable. Others would interpret it as somewhat unreliable."
He also said that researchers are conducting studies to determine the validity of the standards, and that those results will be available by next spring. In addition to the A.C.T., the National Academy of Education is conducting a series of validity studies on the project, Mr. Webb said.
But Mr. Webb cautioned that the research will not show definitively whether the standards are valid or not.
"Validity is not an either-or thing," he said. "It's an ongoing process. There are degrees of validity."
"They will say, 'The process has a certain amount of validity,'" Mr. Webb said. "The professional community will have to say whether that's adequate or not."
Mr. Truby, the executive director of the NAEP governing board, added that, while technical soundness is important, the standards-setting process is ultimately one of judgment. And on that score, he said he feels confident about what the A.C.T. has done.
"Judgments have always been made about NAEP data," Mr. Truby said. "We think this is a much better way to make judgments."
Vol. 12, Issue 12