The U.S. Department of Education is revising its recent analysis of the second year of the controversial School Improvement Grant program, after it became clear that an outside contractor charged with crunching the data erroneously left out too many schools that should have been included in the mix.
[UPDATE: (Dec. 12): The contractor in question was the American Institutes for Research (AIR), which has conducted other SIG research for the department. And the analysis cost $28,300 overall.]
Those who took a close look at the data will remember that a large number of schools were left out. The analysis, which was released just a couple of weeks ago, excluded about half of the schools that entered the newly revamped SIG program in its first year (the 2010-11 school year) and about a third of the schools that started in the second year (the 2011-12 school year).
At the time, the department gave a host of reasons for the exclusions. For instance, the agency said, number crunchers took out schools in states that changed their assessments, schools that merged with other schools, and schools where proficiency rates were missing.
But, apparently, in some cases, the contractors were a little overzealous in deciding which schools to toss out, according to an email sent to reporters Wednesday by Cameron French, a department spokesman. For instance, they decided to leave out all SIG schools in states that had changed either their high school or elementary school assessment, instead of just the schools affected by the switch.
Overall, the analysis showed that about two-thirds of schools improved, while the remaining third saw stagnant student performance (or even slipped backward). It's unclear whether the do-over will significantly change those conclusions.
The department is pulling the SIG data from its website, for now, in “an effort to be cautious and ensure accuracy,” French wrote. The department is hoping to re-release an updated analysis, with more schools included, in January.
Importantly, the change doesn't have any impact on the actual school- and district-level data, for every school in the country, that were released along with the SIG results. Some outside edu-experts have tried to do their own analyses of how SIG schools fared using those data, but they've had a tough time.
And even before the department made this announcement, we noted that there were lots of unanswered questions in its SIG summary data. Read more here.