By guest blogger Jackie Zubrzycki
The Council for Opportunity in Education has submitted a Request for Correction to a 2009 study from Mathematica that suggested that Upward Bound, a program that helps low-income students and students whose parents do not have bachelor’s degrees pursue college, did not have a significant positive impact on its participants.
The council is a nonprofit group that advocates on behalf of Upward Bound and other programs that work to expand college opportunities for disadvantaged students. It says that, among other design flaws, one unrepresentative program was given disproportionate weight in the study's analysis, resulting in the "ineffective" designation, and that if those data were removed, the study would show a positive impact. The Washington-based group also says that although U.S. Department of Education officials were aware of the study's flaws, the study remained publicly available and the flaws were never addressed.
But though the COE is just releasing this request now, these concerns have been circulating for years, says Grover J. "Russ" Whitehurst, a senior fellow at the Brookings Institution who headed the Institute of Education Sciences at the time of the study, and they have already been addressed. The IES did not commission the study, but it conducted a review of the report before it was released in 2009. Whitehurst said that "the reviewers thought that the issues that had been raised were real issues." They agreed that the case the COE points to was unduly weighted but decided that this "did not undermine the bottom-line conclusions." Reviewers also looked at a separate analysis of the Upward Bound data conducted by Margaret Cahalan, who was in the Education Department's office of planning, evaluation and policy development at the time and made some of the same criticisms as the COE report, and "didn't find it persuasive," Whitehurst said. Cahalan is now a senior research scientist at the COE's Pell Institute.
Mathematica, the Princeton, N.J.-based policy research group that produced the study, agreed that the advocacy group's concerns aren't new, and said that the report's findings are still valid. "Mathematica stands behind the findings of its report, which went through multiple peer reviews and was approved and released by the U.S. Department of Education," Mathematica's communications director Joanne Pfleiderer wrote in an email.
The COE says the study has contributed to stagnant funding for Upward Bound and is frequently cited by the program's detractors. But according to Maureen Hoyler, the executive vice president at COE, "The biggest negative impact is that it suggests that you can't impact college-going by working intensively with low-income students." Ms. Hoyler says the COE hopes the request for correction will result in the study's removal from the Education Department's website and that "it sheds light on national evaluations on programs for low-income people, especially students. I hope everyone learns from it, and that it helps them do better evaluation."
Whitehurst, however, adds that the COE's advocacy led to the cancellation of a rigorous, randomized evaluation of Upward Bound that was to begin in 2006 and might have helped settle some of the uncertainty around the program's effectiveness. He referred to this as a "low point" for federal program evaluation. "The programs need to be rigorously evaluated. The COE seems to think rigorous evaluation places programs at risk, but it only does so if they're ineffective or not willing to change." He said the Mathematica report indicated that Upward Bound might be more effective if "it focused on students who were younger in high school and less likely to go to college—high-risk kids."
In other research news, a study that questions the idea that ‘stereotype threat’ contributes to the gender gap in math ability is featured over on Curriculum Matters. Check it out!
A version of this news article first appeared in the Inside School Research blog.