Kentucky To Link Assessment Results to NAEP's Scores
WASHINGTON--In an unprecedented experiment, Kentucky is attempting to link the results of its new assessment program with those of the National Assessment of Educational Progress.
In a report expected to be released this year, Kentucky officials are hoping to show how students performed on the state assessment according to NAEP's scale, which indicates the number of students who performed at the "basic," "proficient," and "advanced" levels of achievement.
The project is aimed at providing a check on the state assessment by showing parents how the results compare with national results, according to Edward Reidy, the associate state commissioner of education for assessment and accountability services.
"How can we assure people in Kentucky that we are not conning them? That's the problem,'' he said at a meeting of the National Assessment Governing Board here this month. "When we say a youngster is proficient, [we want to assure them that the youngster] is proficient not just for a few tests in Kentucky, but in the United States of America.''
In addition to serving Kentucky's needs, the project may also provide evidence of the feasibility of linking the results of different tests.
Such evidence is essential, researchers at the meeting said. While proposals for national assessment systems have called for linking a variety of assessments to common standards, they noted, such coordination may be difficult to achieve.
"Right now, it's possible to give a recipe for how you go about doing it,'' said Robert J. Mislevy, a research scientist at the Educational Testing Service. "What we don't have is examples of things being done to use as prototypes.''
Mandated by Legislature
The Kentucky project, which is being conducted jointly with the Education Department, is one of two efforts under way to link NAEP results with those of another test. Later this year, the department is expected to release a report comparing U.S. student performance on NAEP with the performance of students from 20 nations on a test of mathematics and science achievement.
The department is also scheduled to release the results of its second trial state-level NAEP assessments. Those reports will show how students in 44 states and territories and the District of Columbia performed in reading in the 4th grade and math in the 4th and 8th grades, with the goal of enabling states to compare their results with those from other states and the nation as a whole.
But the Kentucky experiment will represent the first attempt to compare an entire statewide assessment to NAEP.
It came about because the Kentucky legislature mandated linking the new assessment program to NAEP when it created the program as part of the state's landmark school-reform law, according to Mr. Reidy.
In carrying it out, the state also had to adhere to the N.A.G.B.'s policy, which requires any test linking with NAEP to be of equal reliability and similar content. Mr. Reidy estimated that the Kentucky assessment, which was built using the state's "valued outcomes" and NAEP's frameworks, shares about 80 percent of its content with the national examination.
Over the next few months, researchers from the E.T.S., which operates NAEP under contract to the Education Department, will use statistical techniques to show how students who took the Kentucky assessment in reading, writing, and math in the 4th, 8th, and 12th grades would have performed on the NAEP scale.
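The article does not say which statistical techniques the E.T.S. researchers will use. One standard method for projecting scores from one test onto another's scale is equipercentile linking, which matches percentile ranks across the two score distributions. The sketch below is illustrative only; the function name, score scales, and simulated data are all assumptions, not the actual E.T.S. procedure.

```python
import numpy as np

def equipercentile_link(scores_a, scores_b, x):
    """Map a score x on test A onto the scale of test B by matching
    percentile ranks (a simple, unsmoothed equipercentile link)."""
    # Percentile rank of x within the test-A score distribution.
    p = np.mean(np.asarray(scores_a) <= x)
    # Test-B score with the same percentile rank.
    return float(np.quantile(scores_b, p))

# Hypothetical score distributions, for illustration only.
rng = np.random.default_rng(0)
state_scores = rng.normal(250, 30, 5000)  # assumed state-assessment scale
naep_scores = rng.normal(260, 35, 5000)   # assumed NAEP-like scale

# A state score near its distribution's center maps to a score near
# the center of the NAEP-like distribution.
linked = equipercentile_link(state_scores, naep_scores, 250)
```

In practice such links are trustworthy only under the conditions Mr. Linn describes below: the two tests must measure similar content and be administered to comparable populations in comparable ways.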
If the project is successful, Mr. Reidy said, officials may expand it to include science and social studies in 1994 and 1996.
But, he added, "That's a big if."
A 'Warning Sticker'
Researchers at the meeting said they shared Mr. Reidy's caution.
The statistical correlation may not yield a precise equivalence between the two results, warned Robert L. Linn, a co-director of the National Center for Research on Evaluation, Standards, and Student Testing at the University of Colorado at Boulder.
The degree to which the results can be compared, Mr. Linn explained, depends on the extent to which the tests measure similar things and were administered in similar ways.
Mr. Mislevy said the governing board could issue a policy outlining how to interpret the results of the linking projects. But he warned that such a policy would only be as useful as manuals explaining the proper use of ladders.
"You can write manuals, and put warning stickers on the ladders,'' he said. "But there are still going to be people who do stupid things with them.''
Vol. 12, Issue 25