North Carolina education officials last week ordered a major audit of the state’s testing and accountability program to determine the soundness of the system after problems emerged over interim scoring measures for the state’s end-of-grade math exam.
The scores used to determine whether students demonstrated proficiency on the test were set too low, resulting in unexpectedly high passing rates for the state’s elementary and middle school students. The miscalculation reduces the number of students who would have been held back a grade under the state’s tough new promotion requirements.
“We are concerned that the reputation of this program has been tarnished by recent problems, and we are committed to taking every step necessary to correct any problems identified in an audit of this program,” state schools Superintendent Michael E. Ward and state board Chairman Philip J. Kirk Jr., said in a statement.
The audit, which will be conducted with the assistance of the Southern Regional Education Board, will review the technical standard-setting process and examine the school rating system.
Some of the state’s 119 districts alerted state officials this month that as many as 95 percent of their students had scored in the “proficient” range in mathematics, despite officials’ contention that the test was more rigorous than in previous years. The test, given in math and reading in grades 3-8, was restructured this year to match new state standards.
“Our goal was to make the test as rigorous and challenging as is reasonable,” said Mildred G. Bazemore, who heads the testing section of the North Carolina Department of Public Instruction. “But we are now aware that the standards were pretty lenient,” she said. “We erred on the side of the kids.”
A new policy is being phased in this school year—beginning with the 5th grade—to limit the promotion of students who do not perform at grade level on the tests. Many districts have already begun requiring all students to prove they have mastered the material for their respective grade levels in order to advance to the next grade.
The cutoff scores used to determine whether students are proficient in math were based on a field test of the new questions conducted as part of the state’s regular exam last year. But the benchmarks did not take into account that students might try harder once strict consequences were attached to the test, Ms. Bazemore said.
“The field-tested questions were embedded in the [regular] tests last year, which would make one believe that we were getting pretty good data” about student-performance expectations, she said. “But the projections that were made could not take into consideration that once the tests had high stakes for the students, they were motivated to perform well.”
Some districts realized the scoring system was faulty as soon as they received the results.
“We saw unusually high scores,” said Patricia O. Hester, the director of instruction and accountability for the 22,000-student Johnston County district, south of Raleigh. “We’re not saying that our kids did not do better. We knew they would improve, but in an amount that would be believable.”
About 95 percent of the students in the county who took the tests scored in the proficient range, about 10 percentage points higher than last year, Ms. Hester said. By comparison, the county saw an improvement of about 1 percentage point on the reading test.
State officials are advising districts to consider factors other than the test scores when making decisions about promoting students. The state will adjust the scoring standards for next year.
In the meantime, the glitch will not affect the performance ratings given to schools. Nor will bonuses for staff members at schools that meet or exceed state expectations be affected, because the rewards are based on average scale scores instead of achievement levels.
A version of this article appeared in the May 30, 2001 edition of Education Week as Testing Glitch Prompts N. Carolina To Order System Audit