The groundbreaking battery of tests that Kentucky officials have used to measure the success of their statewide school-reform experiment is “seriously flawed” and has produced misleading results, according to a panel of testing experts.
The tests, which assess achievement in mathematics, reading, science, social studies, and writing, are not reliable enough to use in determining which schools deserve cash rewards and which should be penalized with state supervision, the panel concluded. Further, it found, the open-ended tests may not be a very good gauge of student achievement.
The five-member panel, commissioned last year by the legislature, released its findings to Kentucky lawmakers late last month.
“The question we had to answer was, ‘Are the intended uses of the tests appropriate?’” said Daniel M. Koretz, the co-director of the education-policy-research program at the Urban Institute in Washington and a panel member. “And the answer is an unambiguous ‘No.’”
‘Too Much, Too Soon’
State education officials agreed with many of the recommendations contained in the panel’s report. But they argued that the pessimistic tone of the report’s summary contrasts sharply with their own optimism that the system is improving.
“We take strong exception to their finding that gains in scores have been exaggerated,” said Thomas C. Boysen, who responded to the report on his next-to-last day as Kentucky’s commissioner of education.
“Their report is too narrow psychometrically because it doesn’t look at the impact on instruction, and they were not quantitative,” he said. “If a group of this caliber is going to say something is flawed, it should be able to say how much.”
Kentucky officials have made adjustments in the tests since they began administering them to the state’s 4th, 8th, and 12th graders in 1992. The state school board is expected to consider further changes at its meeting this month, and the national panel acknowledged that state officials are moving in the right direction.
However, the state still may have bitten off more than it can chew, according to the group.
The panel concluded that while no state could have flawlessly implemented so many wholesale changes, Kentucky tried to do “too much, too fast,” and therefore has spent a lot of time trying to patch up problems. And despite the willingness of state officials to address concerns about the tests, panel members agreed that many significant corrections are necessary.
Process Over Content?
The Kentucky Instructional Results Information System--the testing program’s formal name--is at the heart of Kentucky’s 1990 education-reform act, which replaced the state’s entire education system. A departure from standardized multiple-choice exams, KIRIS is a battery of essays, physical tasks, and collected classwork intended to better represent student accomplishment and improvement. The tests have no multiple-choice questions.
Beyond providing a better snapshot of student work, the tests are also meant to be a catalyst for transforming classroom instruction, as teachers spend more time on reading, writing, problem-solving, and group activities. To underscore the link between testing and teaching, the state this year distributed $26 million to teachers in schools that met improvement targets on the tests. (See Education Week, 4/26/95.)
Beginning next year, the law will allow the state to intervene in schools with declining test scores.
Critics in the state have argued that the tests move away from emphasizing basic skills and elevate process over content--a view shared by the panel, which recommended incorporating multiple-choice items into the tests.
On larger issues of how the tests were implemented, members of the review panel concluded that performance standards used to gauge test scores are too narrow and unreliable, scoring of portfolios is too subjective and inconsistent, efforts to equate assessments from one year to the next are problematic, and student gains on the tests do not match changes in performance on other standardized tests.
A ‘Doable’ System
State officials agreed with 10 of the panel’s 12 recommendations. Education department officials were already at work on most of the items and said they will explore a recommendation to stop teachers from scoring portfolios from their own schools.
The department disagreed with the recommendation that portfolio scores should not be included in the tests, and some lawmakers also registered their support for portfolios.
Members of the legislature’s joint oversight committee, which commissioned the report, said they were pleased that it addressed many teachers’ and critics’ complaints about the assessment system. The group, composed largely of legislative leaders, promised to push ahead with the tests while considering changes.
“We asked if the panel believed that high-stakes accountability and rewards are appropriate, and they said yes,” said David Karem, the Senate majority leader. “We asked if the system we developed is doable, and they said yes.”
A version of this article appeared in the July 12, 1995 edition of Education Week as Ky. Student Assessments Called ‘Seriously Flawed’