Politics K-12®

Ed. Dept. Analysis Paints Mixed Picture of SIG Program

By Alyson Klein — November 19, 2012 7 min read

Two-thirds of chronically underperforming schools that tapped into a big new infusion of cash under the federal School Improvement Grant program made gains in math or reading, but another third saw student achievement decline in their first academic year, according to an analysis by the U.S. Department of Education.

Slightly more than a quarter of the schools in the program had seen student progress slip before they got the grant, then posted gains after receiving SIG funding, the analysis found.

But slightly more than a quarter of the schools in the program already were beginning to show improvement before they got SIG dollars—only to see student achievement dip afterwards. (Check out page 4 of the analysis for details on this.)

Overall, the analysis paints a mixed picture of the first year of the supercharged SIG program, which received $3 billion under the American Recovery and Reinvestment Act, the largest federal investment in school turnaround in history. The program, which requires schools to take dramatic steps such as closing schools, removing staff, and extending the school day, has been the subject of significant controversy, all the way from district central offices to Capitol Hill.

Upon releasing portions of the analysis late last week, U.S. Secretary of Education Arne Duncan cautioned against reading too much into the data, which only covers the changes in student achievement from the 2009-10 to the 2010-11 school year. That represents the first year of the new version of the program, which, in addition to the infusion of federal stimulus money, calls for schools to use one of four controversial school improvement models. SIG grants cover three academic years.

“I think it’s way, way too early to draw any conclusions,” Duncan said in an interview. He spoke publicly about the data Friday in a speech to the Council of Chief State School Officers, who were meeting in Savannah, Ga. “We’re in this for the long haul. One year of gains isn’t success. One year of declines isn’t failure.”

But during this year’s election campaign, President Barack Obama cited the achievement gains under the school turnaround program in his debates with Mitt Romney, his GOP opponent, telling the voters that “schools that were having a terrible time” were “finally starting to make progress.”

Looking at the Data

Elementary schools generally did better than middle and high schools, the analysis found. (High schools make up a significant chunk of the program’s first cohort.) Rural schools—which some policymakers cautioned would not be able to implement the stringent turnaround models required by the program—performed about as well as their suburban and urban counterparts.

Here’s a more detailed breakdown of what the department found:

• Out of 731 schools that received funding in the first year of the program, 25 percent posted double-digit gains in math, and 15 percent posted double-digit gains in reading. Forty percent posted single-digit gains in math, and 49 percent posted single-digit gains in reading. Twenty-eight percent saw a single-digit dip in math, 29 percent in reading. Another 6 percent saw a double-digit decline in math, 8 percent in reading.

• In some cases, schools that got SIG money were already improving, then fell off once they got the grant. Twenty-six percent of schools in the program were on a trajectory to improve their math scores but declined once they entered the SIG program, while 28 percent of schools where math scores had been slipping began to improve after getting the grant. In reading, 28 percent of schools that had been showing gains before SIG actually lost ground once they got the grant, while a smaller share, 25 percent, had seen reading scores slip before the grant and began to improve once they got the funding.

• Elementary schools were more likely to see double-digit growth and less likely to see declines in achievement, particularly in reading. Twenty percent of elementary schools posted double-digit gains in the subject, compared with 6 percent of middle schools and 15 percent of high schools.

• Rural schools, in the department’s view, did about as well as their urban and suburban counterparts. For instance, 19 percent of rural schools posted double-digit gains in reading achievement, compared with 15 percent of urban schools, 14 percent of suburban schools, and 15 percent of schools in towns.

Left Unanswered

While the analysis paints a broad-strokes portrait of overall student achievement, it leaves unanswered some major questions. For one thing, the data isn’t broken out by which schools used which of the four improvement models, so it’s tough to say which model is most effective.

The data also doesn’t include graduation rates or any information on discipline and school climate—two important indicators of school turnaround. Duncan said in an interview he was particularly interested in seeing that data.

And it doesn’t include a breakdown of which states made the best use of the program—states took radically different approaches to distributing the funds, as this report by the Center for American Progress found.

The department is aiming to release more complete data in January. (Right now, researchers are “scrubbing it” to make sure nothing can be traced back to individual students.)

Implications for Policy

The data tracks closely with what Terry Holliday, the education commissioner in Kentucky, is seeing in his home state. He said he’s witnessed gains in about two-thirds of his schools, especially where there is a strong building leader.

But he added, “We just have some schools that aren’t making the move.” He said in those schools, the state may have to take more aggressive action, such as taking over more site councils, which have a lot of authority over school decisions.

The SIG program is on shaky footing on Capitol Hill. Some Senate Democrats have supported allowing states to come up with their own turnaround remedies. And House Republicans have tried more than once to eliminate the program. It’s unclear how—or whether—the department’s analysis will impact those discussions, particularly as some lawmakers look to trim spending in the wake of discussions over the looming fiscal cliff. (For more on the cliff, see this story.)

U.S. Sen. Tom Harkin, D-Iowa, the chairman of the panels that oversee K-12 funding and policy, has fought to preserve SIG’s funding, even as he has pushed to add more models to its menu of options.

“This data is an important preliminary assessment of the first full year of implementation,” he said in a statement in response to my queries about the data. “While it shows the effectiveness of the models for some schools, it also presents an opportunity to dig deeper into what is happening in SIG schools, and to explore other possible models that may address the needs of all students.”

Analysts, Researchers Split

Researchers and analysts who have studied SIG and turnarounds offered radically different interpretations of what the early results might mean when it comes to the program’s effectiveness.

Diane Stark Rentner, the deputy director of the Center on Education Policy, in Washington, said the data looked rosier than she expected.

“I’m surprised that the numbers were so positive. I would have thought we’d see more stagnation,” she said. “What we found in our research was that schools were focusing on climate” in the first year of the program and saving their “achievement-focused efforts for the second and third year.”

SIG schools have gone through “a lot of turmoil and churn,” including finding new principals and teachers, she noted. While those steps can lead to achievement gains down the road, they might have an impact on initial data. (Check out the Center’s research on SIG here.)

But Andrew Smarick, a partner at Bellwether Education in Washington, who has written about school turnarounds, called the data “heartbreaking.”

“We spent several billion dollars, and more than a third of schools went backward,” said Smarick, who recently was the deputy commissioner in the New Jersey Department of Education, and served in the federal Education Department under President George W. Bush.

Smarick said that, in his experience, schools are most likely to post gains in the beginning years of a turnaround. The trouble, he said, is sustaining the turnaround. If schools in the program “couldn’t even see a bump in year one, what is that going to tell us about future years? This just shows hope is not a strategy.”

And Robin Lake, the director of the Center on Reinventing Public Education, at the University of Washington in Seattle, said it’s pretty tough to tell if SIG schools are living up to the program’s promises because the department never laid out a clear vision for what success should look like.

“They’ve talked about bold, dramatic change, but never really defined it,” she said. It’s unclear, she said, whether schools will be able to sustain the gains they’ve made after the first year of the program. Policymakers should think carefully about whether the SIG models are the best use of scarce federal funding for improving schools, she added. (Lake’s center has also looked closely at the SIG program. Its report is here.)

Separately, an April 2012 study comparing California schools that got SIG grants with schools that just missed qualifying found that the funding appeared to make a difference. More on that here. And check out this story on the program.