Remember Voltaire’s character Pangloss from Candide, the guy who saw the world through rose-colored glasses? For Pangloss, everything was fine and dandy, despite resounding evidence to the contrary. Education Sector, the Washington, DC-based think tank, created an index in his honor tied to the No Child Left Behind Act (NCLB).
The Pangloss Index
For the second year in a row, Education Sector’s Pangloss Index ranked states according to their own reports to the U.S. Department of Education about their progress under NCLB. Accountability measures reported include student proficiency, adequate yearly progress (AYP), graduation rates, teacher and paraprofessional qualifications, access to professional development, and school safety.
A high ranking on the Pangloss Index (think Iowa and Wisconsin) means the state reported that it’s doing quite well on those NCLB measures. A low ranking (think New Mexico and Hawaii) means the state reported that it still has a ways to go in meeting NCLB’s goals.
But NCLB gives states flexibility in defining and measuring those very indicators of progress, and that is exactly what the report’s author, Kevin Carey, sought to expose. Carey compared states’ Pangloss Index rankings against uniformly calculated external indicators of progress and found some interesting results. Those outside sources included the National Assessment of Educational Progress, the Urban Institute’s independent studies of states’ graduation rates, and the Brookings Institution’s figures on highly qualified teachers.
Findings
• Some states’ Pangloss Index rankings made sense when set against external indicators. Connecticut scored well on the Pangloss Index and also did well on achievement measures from other sources. The District of Columbia was last on the Pangloss Index and also ranked very low on indicators from other sources. In other words, these states’ reported data were in line with what external indicators suggested.
• For other states, however, there was a mismatch between the Pangloss Index and external indicators. Iowa and Wisconsin, for example, tied for first place on the Pangloss Index, meaning their NCLB progress reports suggest they’re doing extremely well, but other sources don’t rank them nearly so high.
• Massachusetts, on the other hand, consistently a top performer on achievement measures, ranked 46th on the Pangloss Index. According to Carey, the discrepancy shows that the state is very tough on itself when it comes to achievement.
• Alabama stood out because of its dramatic ‘improvement’ on the Pangloss Index this year. The state went from 22nd place in 2006 to 5th place in 2007. Should Alabamians jump for joy? No, says Carey. The state department of education changed the way it calculates AYP and—lo and behold—saw a 40 percent increase in the number of districts that made AYP.
What People Are Saying About the Pangloss Index
So what’s going on here? Carey thinks that the Pangloss Index highlights how states are choosing to present themselves, not how they are actually performing. Some herald the study as yet another sign that accountability under NCLB isn’t working.
The Fordham Foundation’s Coby Loup says in The Gadfly that the report’s findings point to the need to scrap NCLB:
Unfortunately, due to a combination of limited foresight and inevitable political compromise, lawmakers just aren't very good at closing loopholes. And government agencies, such as the Department of Education, are even worse at it. Congress needs to put down the pens and pull out the scissors.
Lisa Schiff, writing for BeyondChron.org, agrees that the Pangloss Index should raise some alarms:
Although Carey’s research is quite critical of NCLB, it should be more generally read as a cautionary tale for both policy makers and supporters. The moral of the story is that standards should be applied equally to all states and tools such as peer review and precision about requirements and definitions are essential.
Others dismiss the Pangloss Index and the conclusions drawn from it as misleading and biased. Not surprisingly, the biggest critics come from Iowa and Wisconsin.
Thomas J. Mertz, blogger for Advocates for Madison Public Schools, blasts the report and claims:
If you peek behind the curtain you will see that [the Pangloss Index] is in fact a lazy and useless piece of garbage intended only to fan the flames of panic among those inclined to believe the worst about public education and ‘educrats’… The whole thing is based on the absurd assumption that all positive data is wrong and all negative data is correct. Therefore, states that report good things get a high (bad) rating for ‘gaming’ the system and states that report bad things get a low (good) rating for being honest and accountable.
Jay Bullock, a Milwaukee public school teacher and blogger, lambasts the Pangloss Index on his blog, Teachable Moments:
…unlike the fictional Pangloss, Wisconsin really isn't facing insurmountable evidence to the contrary. When you look at census data and national standardized test scores, you can confirm, outside of what we self-report, that Wisconsin's schools are pretty good… It seems likely that there will be another version of the Pangloss Index next year. And if Wisconsin continues to do a good job educating our children, expect to be told that we're gaming the system.
Tom Deeter, an assessment consultant with the Iowa Department of Education, claims in a regional op-ed that the Hawkeye State is merely following the letter of the law:
[Suggesting that Iowa is ‘gaming’ the system] is really not fair, because we are using the flexibility that we are finding in the law to protect our schools and give them an opportunity to improve.
Defending his report and the premise of the Pangloss Index on Education Sector’s blog The Quick and the Ed, Kevin Carey fires back at these critiques:
…if you let individuals or organizations define how they'll be publicly evaluated, this is what you get. People are people, and few are going to be reliably objective about owning up to their successes and failures in a neutral way, particularly when the stakes are high. That’s really what the Pangloss Index is all about.
What do you think?
Is state manipulation of education data a problem? How much flexibility should states have in determining definitions and formulas? How do you know if schools are improving?