Researchers Blame Test Revisions for Falling Ill. Reading Scores
Illinois educators have been scratching their heads over four years of mysterious declines in students' scores on the state's 10-year-old reading tests.
Now, two of the researchers who helped design the assessment say the fault for the declining scores lies with the tests themselves.
"Simply put, we think the reading scores are wrong," P. David Pearson and Timothy Shanahan write in an article scheduled to appear this month in the Illinois Reading Council Journal.
The dip in scores on the reading portion of the Illinois Goals Assessment Program, or IGAP, has been a puzzle because it has come as scores on other parts of the test were rising. Also, some large districts, such as Chicago and Springfield, have seen their students' reading scores on other standardized tests go up while their state scores were dropping.
"Even when we'd see small dips in scores or we were treading water, what we were seeing on the IGAP was huge Niagara Falls drops," said Thomas Kerins, who oversees testing programs for the 14,800-student Springfield district.
The state tests, given to students in grades 3, 6, 8, and 10, are important because districts whose students score poorly over a period of years are placed on academic warning lists and can eventually be dissolved by the state.
What's at Stake?
Beginning in 1995, state education officials launched a series of studies to determine the cause of the declining scores. The studies found that, although the tests had some flaws, the achievement declines were real.
In their analysis, however, Mr. Pearson and Mr. Shanahan point out that the scores headed southward only after 1993, when the state began reporting the test results for individual students rather than for districts.
One of the problems with that change, they say, was that it altered the conditions under which the test was scored.
Because Illinois, like many other states, wants to see how students do over the long run, each year's version of the test has to be made comparable to the previous year's. To do that, test administrators give a smaller subset of students two tests--the current one and the previous year's version--and the results are compared.
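The comparison described above can be sketched in miniature. The article does not specify which equating method Illinois used, so the following is a hedged illustration of linear equating, one common approach, with invented scores: the equating sample's performance on the old form sets the scale, so if those students slack off on the form that "doesn't count," every score on the new form gets mapped lower.

```python
from statistics import mean, stdev

def linear_equate(score, old_scores, new_scores):
    """Map a score on the new test form onto the old form's scale
    by matching the equating sample's mean and standard deviation.
    (Illustrative only -- not the actual IGAP procedure.)"""
    slope = stdev(old_scores) / stdev(new_scores)
    return mean(old_scores) + slope * (score - mean(new_scores))

# Hypothetical equating sample: the same students take both forms.
old = [60, 65, 70, 75, 80]   # their scores on last year's form
new = [62, 67, 72, 77, 82]   # their scores on this year's form

# A score of 72 on the new form maps to 70.0 on last year's scale.
print(linear_equate(72, old, new))
```

If the equating sample underperforms on the low-stakes old form, `mean(old_scores)` falls, dragging down every equated score; that is the mechanism Mr. Shanahan describes below.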
For most test-takers, explained Mr. Shanahan, the director of the University of Illinois at Chicago's Center for Literacy, the stakes got higher with the switch to reporting individual test results.
"All of a sudden, kids' names go on the test, and Mom and Dad know how you do on the test," he said.
But for the subset of students taking the older tests, the stakes stayed the same, he said. They are told, "'Look, no one cares how you do,'" Mr. Shanahan said. If those students don't try as hard as a result, he said, then students taking the regular test have to get more questions right in order to score as high.
"It's like a tractor pull that gets heavier every year," he said.
State officials, though, said that the "equating" tests are given days after the regular tests and that students may not even be aware the results don't count.
"Basically, we have looked at the equating, scaling, scoring, and linking issues up one side and down the other, and we have used the best people in the country, and none of them found a problem," said Eunice Greer, the division director for standards and assessment for the Illinois state school board.
What is more, state officials said, the issue could become moot as Illinois moves next year to a new assessment--one that was designed to yield individual results from the start.
But Springfield's Mr. Kerins disagreed. "It's certainly not a dead issue to the principals whose phone numbers I have in front of me and who want to be called the minute last March's scores are available," he said.
Vol. 17, Issue 43, Page 12