Remember that study last week that showed how district-mandated tests take far more time away from students’ and teachers’ schedules than state tests do? It turns out that it had a major error.
As soon as “The Student & The Stopwatch” came out Feb. 5, critics pounced on it for posting far too low a figure for the amount of time students have to spend on state tests in Illinois. You can see some of these attacks in the comments section of the blog post I wrote about it.
You might recall that the report examined a dozen urban districts, and some surrounding suburban districts, focusing on their relative burdens of state testing time and district-imposed testing time.
In the face of a flood of critical feedback about the times posted for Illinois state tests, however, Teach Plus, the organization that put the report together, went back and corrected its data. When I pressed them for an explanation of the mistake, all a spokeswoman could tell me was that “the report was a year in the making; the state test time we were going from was wrong, although districts had verified it as correct in multiple rounds of consultation.”
The corrections have the predictable domino effect of changing the findings specific to some of the grade levels Teach Plus studied, as well as the average testing times for urban districts, and for the comparison of urban and suburban districts.
Teach Plus officials said that the corrections in the revised report are indicated with an asterisk anywhere they appear, with the phrase “Data revised for Illinois state test.” But when I looked through it to see exactly where the revisions affected results, there was at least one pretty big shift that was not marked with an asterisk.
Check, for instance, page 4 of the report, where Teach Plus lists its “key findings.” One of the most startling and newsworthy findings—that urban districts vary in their testing times by as much as fivefold—turns out not to be true with the updated figures, but it’s not flagged with an asterisk to indicate a change, either. The revised report shows that some urban districts test not five times more, but 3.3 times more than others.
Teach Plus reports the central revision to the testing times on pages 9 and 10, where it revises the time for the 3rd and 7th grade Illinois state tests from 2 hours to 4.5 hours. It leaves the time for district-mandated tests the same, at 3.1 hours. The change in the state testing time doesn’t affect Chicago’s position in the urban rankings at either grade level; it’s still the district with the lowest testing times.
But you can see that the change in the state data affects the overall finding of differences in urban testing times at the 3rd grade level: there’s that 3.3-times-as-much figure on page 9—again without an asterisk—where before it had been five times as much.
The revisions also affect the chart on page 11, which shows the variation in the amount of testing time in high-testing versus low-testing urban districts. With Illinois’ testing times corrected, that spread is smaller: 120 hours’ difference in the original report, now updated to 105 hours.
Another of the striking findings now looks far less striking, as well. In the original report, on page 11, Teach Plus contrasted how much cumulative time a student would spend on testing by 8th grade in a high-testing district versus a low-testing district. It used Denver and Chicago as opposite ends of that spectrum, with a child in Denver spending 159.4 hours on tests, compared with his peer in Chicago, who would have spent 38.8 hours. That Chicago figure is now 53.8 hours—still a big spread, but a far cry from the original figures.
Suffice it to say, if this is a report that interested you the first time it came out, I urge you to go back through it with a fine-toothed comb. It remains to be seen whether the revisions will satisfy the sharp-eyed critics who pointed out the errors to begin with. I note that some of the commenters on my blog post put state testing times higher than the Teach Plus revisions show.
Let’s see what happens now.
A version of this news article first appeared in the Curriculum Matters blog.