Dispute Over Ky. Test Section Sparks Broader Debate

May 28, 1997

Recommendations from the Kentucky attorney general's office about the state's evaluation of a handful of schools have raised broader questions about the ratings of all the schools in the state.

The office weighed in after several schools appealed a state decision related to a student exam used to gauge school performance. State education officials decided this year that the results from one portion of the 1996 test, which had already been administered, would not count in a school's final score--one that can determine whether teachers get reward money for improving student performance or a school winds up on a "crisis" list.

“We cannot use results that are technically unreliable, but the hearing officer thought that it was unfair of us to change the rules in the middle of the game,” said Jim Parks, a spokesman for the Kentucky Department of Education. “We knew it was going to be a tough call and knew it was going to be contested.”

Siding with 12 of the 13 schools that appealed, Ann M. Sheadel, the chief hearing officer for the attorney general’s office, called the state’s action “fundamentally unfair,” and said the schools should not be held “accountable for the mistakes and problems in an assessment system.”

Ms. Sheadel’s recommendations will go before the state school board at its June 10 meeting. If schools disagree with the board’s decision, they can appeal to a state circuit court.

Reliability Issues

The dispute stems from a section of an exam given annually to Kentucky students to assess their performance and that of their schools. The section in question, called the performance event, is an open-ended, project-based question that requires students to work in small groups to solve a problem and then provide individual answers.

For example, Mr. Parks said, students might be asked to plan a road trip across Kentucky, figure their expenses on items such as food and gas, calculate their mileage, and arrange to hit various tourist attractions along the way.

The schools challenging the state’s decision were those that fell just short of meeting their performance-improvement goals, meaning they would have received a monetary reward or been moved off the “in crisis” list had their scores on the performance event been calculated in the final results.

Mandated by the 1990 Kentucky Education Reform Act, the high-stakes test has been administered annually since 1992. Two years of test data are averaged to determine whether schools have met their goals. Following the 1993 and 1994 tests, state officials felt the scoring of the performance events was reliable enough to issue the rewards and the sanctions.

But for security reasons, officials changed the performance events for the 1996 test and then ran the new questions through a "reliability analysis" to see whether they were as difficult as those that had previously appeared on the test.

“Our reliability scores were all over the place,” Mr. Parks said.

The problems that Kentucky educators face are not unlike those that testing experts around the country are confronting.

“The simpler the questions, the more reliable the instrument,” Mr. Parks said. “The more complex skills you try to measure, the less reliable the instrument. We’re trying to find some balance.”

‘Good Instructional Strategy’

The state is now field-testing revised versions of performance-event questions. While the questions will continue to appear on the annual test, they won’t count again until 1999, Mr. Parks said.

He also noted that the performance event is only a small portion of the entire assessment--just 6 percent of the total score. The test also includes other open-ended questions, as well as a portfolio section.

School administrators complained that because teachers spent time teaching students to tackle the problem-solving activities found in the performance event, they didn’t have enough time to adequately prepare students for other parts of the test.

Ms. Sheadel agreed. “If those skills are not going to be tested ... such allocation of time may be unwise,” the hearing officer said, “and a school and its students might be better served, for testing purposes, in allocating time to teach the skills that actually will be included in the test.”

But Mr. Parks said the skills students need to complete the performance events should be taught regardless of the test’s contents.

“Having students work in groups and work on projects is good instructional strategy,” he said.

Mr. Parks added that the schools appealing the test change represent only a small minority of the state's 1,400 schools.

The hearing officer has yet to rule on appeals from several other schools, but those complaints don’t focus on the performance events, Mr. Parks said.