It’s been treated like a smoking gun. News that a Houston high school altered its dropout figures—and that 14 other schools there filed incorrect dropout reports—has fueled a frenzy of speculation in recent months about the district and its former superintendent, Rod Paige, now the U.S. secretary of education.
To critics of President Bush’s education policies, the revelations call into question the achievements of a district that served as a model for the No Child Left Behind Act of 2001, the federal law that aims to hold schools more accountable for student results. One typical editorial called the disputed data “The Lie Behind No Child Left Behind.”
Almost lost in the politically charged debate over what to make of the Houston case is a larger story about the way Texas reports dropouts for school accountability purposes. Critics charge that lax oversight, combined with a tracking procedure that exempts many students who leave school from being counted as dropouts, produces rates so low as to be almost meaningless.
“You can’t knock the schools in Texas for following the Texas rules, or even for consistently fudging what those numbers are, because that’s the system,” said Glynn Ligon, the president of Evaluation Software Publishing, an Austin-based company that consults with state and local education systems. “If the department of public safety said that 99 percent of drivers on Texas highways are following the speed limits, that’s just as believable as our [reported] dropouts.”
True, Houston has had its own problems reporting data, which district officials say they’re working to fix. But it’s not the only Texas district where some schools, using the state’s coding methods, report dropout counts that many observers see as unrealistic.
An analysis of state data by Education Week found 108 high schools throughout Texas where 70 percent or more of the students are considered at risk of academic failure. Of those schools, about half claimed a dropout rate of 1 percent or less. Further examination shows that at many of the schools, student enrollments dwindled by 30 percent or more from 9th to 12th grade.
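The mismatch is easy to see with back-of-the-envelope arithmetic: if only 1 percent of students dropped out each year, and dropping out were the only source of attrition, a class would shrink by roughly 3 percent over the three year-to-year transitions from 9th to 12th grade, nowhere near the 30 percent declines observed. A minimal sketch of that calculation (illustrative only; real enrollment counts also reflect transfers and grade retention):

```python
# Cumulative enrollment loss implied by a 1% annual dropout rate across
# the three year-to-year transitions from 9th to 12th grade.
# Simplifying assumption: dropping out is the only source of attrition.
annual_dropout_rate = 0.01
transitions = 3  # 9th->10th, 10th->11th, 11th->12th

retained = (1 - annual_dropout_rate) ** transitions
implied_loss = 1 - retained

print(f"Implied cumulative loss: {implied_loss:.1%}")  # about 3.0%
```

Even allowing for out-transfers, a 3 percent implied loss sits awkwardly beside enrollment declines of 30 percent or more.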
Texas as a whole, with 4.1 million public school students, reports an annual dropout rate of 1 percent.
Some experts say Texas illustrates a nationwide problem around efforts to account for students who wind up leaving school. Few states report figures as low as those in Texas, but many use methods that yield rates that seem out of whack when compared with other data. Maine, to give one example, reports a dropout rate of 3 percent, but a graduation rate of 87 percent.
“The reason why everyone is so focused on Houston is that the secretary of education came from there, and the president has cited the success of Houston in his policy efforts,” said Jay P. Greene, a senior fellow at the Manhattan Institute, a New York City-based think tank that released a national study of graduation rates last week. “The truth is that the misreporting of dropout and graduation statistics is a national phenomenon.”
The dropout-reporting system in Texas has long been a source of debate there, but major changes have been slow in coming. This year, an attempt to link school accountability to a more streamlined definition of dropouts failed in the legislature. State lawmakers also recently revoked a short-lived rule requiring outside audits of each district’s dropout reports.
State education officials pledge to address the issue. As the Texas Education Agency drafts new accountability procedures—in part to jibe better with new federal requirements—it plans to include measures of school performance other than the state’s current definition of a dropout, said Criss Cloudt, the agency’s associate commissioner for accountability and data quality.
But, she added, coming up with a fair and accurate alternative is easier said than done. “If you get 20 people and put them in a room and ask them to agree on a definition of a dropout,” she said, “you will likely get 20 different answers.”
At the heart of the controversy are the state’s “leaver codes.” Each year, the TEA requires every public secondary school to account for students who have left their campuses since the previous year. In reports filed with the state agency, schools assign each of those students one of 30 codes explaining where he or she went. (“Tracking Texas High School Students,” this issue.)
Twenty of those codes are not counted as dropping out—the most obvious being “student graduated” and “student died.” Also not considered dropouts, however, are students who entered a General Educational Development program, or who returned to their home countries.
State officials say they created the codes to get a better handle on why students leave school. But in doing so, they add, they didn’t want to penalize schools in special circumstances or schools doing what might be in the best interest of some students.
“It’s very important that schools see this system as being fair to everyone,” Ms. Cloudt explained.
Schools near the Mexican border, she noted, have higher student mobility because of immigration. In some cases, schools might believe that a GED program is a student’s best chance to earn a high school credential. Without the list of such exemptions, those schools would be put at a disadvantage in the state accountability system, which rates schools and districts based, in part, on their dropout rates.
But some observers contend that the codes don’t hold schools accountable for many students who should be considered dropouts. The method doesn’t, for example, count as dropouts students who fail the state’s high school exit exam, so long as they complete their districts’ other graduation requirements.
“On the one hand, the state does not give them a diploma,” said Maria Robledo Montecel, who heads up the Intercultural Development Research Association, a San Antonio-based policy group. “And on the other, the state does not count them as dropouts. I think that flies in the face of accounting for all students.”
Another charge is that the exemptions practically invite schools to fudge the data. Local administrators, the argument goes, have a big incentive to count as many missing students as possible as having left for reasons that don’t affect their schools’ performance ratings.
Students who move out of state or opt to be home-schooled don’t count as dropouts. Yet such claims are hard to verify. State guidelines list documents for schools to collect to back up the reasons they give—such as GED-registration forms—but critics of the coding system suspect that, in practice, schools aren’t following them.
“You have all these ways of gaming the system,” said state Rep. Rick Noriega, a Democrat who represents part of Houston. “You then have clerks ask students: ‘You’re transferring to another school, aren’t you?’ or ‘Aren’t you intending to get a GED?’ ”
The Houston case shows the pitfalls of the coding system.
It began in February, when an assistant principal at Sharpstown High School told reporters from local television station KHOU that the school had falsely reported having had no dropouts. At the school, which serves largely students from low-income families, enrollment shrinks by more than half from the 9th to the 12th grade. (“Principals at Center of Press for Results,” this issue.)
The coverage prompted Rep. Noriega to call for a state probe. Investigators from the Texas Education Agency pored over records from 16 Houston schools—including Sharpstown—that had reported especially low dropout rates. They found that all but one of those schools had either assigned students the wrong leaver codes, or had failed to properly gather the documents to back up their choices of codes.
In many cases, schools had merely written that a student intended to work toward a GED, without including any statement from the student to that effect. As one of the investigators wrote in an e-mail back to the TEA: “If it was permissible for school officials to declare intent for a student, they could state anything they please, and we would be obliged to accept their word as verification.”
Ron Rowell, the state official who oversaw the inquiry, agreed the records were a mess. “Time after time, there were similar issues—poor documentation, or very sloppy documentation, or inconsistencies,” he said in an interview.
A separate investigation, just of Sharpstown High, paints a similar picture. Asked by district leaders to look into the matter, the Houston law firm of Rusty Hardin & Associates reported that while leaver codes had been altered at the school, the bigger problem was a “complete breakdown in the chain of command” at the campus.
Even administrators at Sharpstown doubted the zero-dropout figure, says the firm’s report. And yet, contends the document, all they did was “ask for repeated assurances that the data was supported by proper backup documentation from a single administrative clerk. ...”
That documentation included handwritten notes stating, “Dad thinks she will get a GED” and “Student has been ill and is home-schooling.”
Based on its findings, the state downgraded the accountability ratings of 15 Houston schools, including Sharpstown. But the TEA’s interim chief, Robert Scott, who took over after Commissioner Felipe Alanis resigned this past summer, gave a six-month reprieve to the district as a whole. Houston has until February to clean up its reporting procedures or it could lose its current overall rating of “acceptable.” (“Houston Escapes Lowered Rating Over Dropout Errors,” Sept. 3, 2002.)
Houston officials say they’ve long known that they had “data integrity” problems, but say that the Sharpstown High affair has intensified efforts to address them.
Abelardo Saavedra, the executive deputy superintendent of the 210,000-student Houston Independent School District, said new checks and balances have been put in place. Data specialists now review each school’s numbers on a monthly basis, he said. The system also plans an annual review of dropout records at each campus.
Meanwhile, district leaders have tied the performance bonuses of administrators to the accuracy of the data they submit. And, at the state’s insistence, the entire leadership of Sharpstown High has been removed.
“We need to hold a professional person accountable, should we encounter problems,” Mr. Saavedra said.
Mr. Paige himself addressed the issue at a forum on No Child Left Behind held in New York City on Sept. 16. Countering claims that Houston’s data problems raise questions about the new federal law, the education secretary said the real lesson is the need for adequate oversight of data quality wherever schools are held accountable for results.
“There should be an efficient monitoring system that should deal with the issue,” he added. “And we’d expect that to happen in Houston, or any other place in the world where that happens.”
Missing Red Flags
While Houston works to tighten up its internal controls, many observers say the Sharpstown incident points up larger deficiencies in the state’s monitoring procedures. Why, they ask, did it take a whistle-blower to expose data irregularities at a high-poverty school that had reported such negligible dropout figures to the state?
“It should have been, in my opinion, caught by TEA a long time ago, based on the unusual patterns in the data,” said Chrys Dougherty, the director of research at the National Center for Educational Accountability, an Austin-based group that researches school performance data from around the country.
The TEA does run checks to see if students listed as having transferred to another district in Texas actually showed up. When districts report large numbers of such transfers who can’t be found, the agency checks the school system’s paper files for proper documentation. The TEA also has audited districts with a propensity for using certain leaver codes, like the one for out-of-state transfers.
But some experts say that’s not enough. To catch a case like Sharpstown requires comparing schools’ actual dropout rates with those of similar schools—something the TEA hasn’t done, at least not to date. The Texas state auditor’s office recognized such weaknesses in a report last spring when it wrote that the way the TEA identifies systems for audits “may be outdated and may not be based on the most effective selection system.”
It’s a gap that soon will be filled. State legislation passed this year calls on the TEA to design new statistical tools to identify districts most at risk for data-quality problems. Ms. Cloudt said her division plans to zero in on schools whose reported dropout rates fall far outside the norm.
Still, she predicts most of the anomalies found would be the result of human error, not outright deception. Even in Houston, only at Sharpstown High have investigators charged that numbers were intentionally fudged; the other 14 schools there that were identified by the state essentially got into trouble for poor record-keeping.
“My experience has been that I rarely run into cases where districts are misrepresenting the information deliberately,” Ms. Cloudt said.
Others, though, point out that the TEA has limited resources to follow up on suspicious cases. The office that checks to see if leaver codes are properly documented has four people—a number that isn’t likely to grow, given that state budget cuts just downsized the whole agency from 860 employees to 660. Texas has some 1,650 high schools.
“They do a noble job,” Virginia Carmichael, an official with the state auditor’s office who spent a year reviewing the TEA’s data-quality procedures, said of the state officials. “But they just don’t have enough time.”
As an additional check, Ms. Carmichael’s agency has recommended that districts be required to have their own outside auditors review their dropout data when the auditors go over their financial records. Although state lawmakers approved just such a mandate in 2001, they rescinded the law this year, after districts complained that it posed too great a burden.
Carol Smith, another official at the state auditor’s office, still thinks it’s important to do external audits in addition to statistical checks to flag suspect districts and schools. Just as public companies hire third parties to review their books, schools should be subjected to similar scrutiny, she said.
“TEA can implement the best risk-assessment model in existence, but it is only as good as the data that feeds into it,” she said. “The periodic audits would also provide TEA with the assurance that it can rely on the data it is using to analyze risk.”
Rep. Noriega, the legislator who first called for the Houston investigation, thinks he has a simpler solution: Get rid of the state’s current method of counting dropouts. “The one thing that is clear is that the leaver code system that the state uses is riddled with loopholes,” he said. “And it does allow for abuses.”
Graduation Rates Eyed
The lawmaker introduced a measure this year that would have tied the state’s school accountability ratings to a much-pared-down definition. In essence, any missing student who had not died, transferred to another district, or graduated would have been counted as a dropout. But the language was rejected.
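The difference between the two counting rules can be made concrete with a small sketch. The code names and exemption lists below are illustrative stand-ins, not the TEA’s actual leaver codes; the point is only that shrinking the exemption list changes who gets counted:

```python
# Hypothetical sketch of two dropout-counting rules. The reason labels and
# exemption sets here are invented for illustration, not the TEA's real codes.

# A broad exemption list, in the spirit of the state's 20 non-dropout codes.
BROAD_EXEMPT = {
    "graduated", "died", "entered_ged_program", "returned_home_country",
    "moved_out_of_state", "home_schooled", "transferred_in_state",
}

# A pared-down list, in the spirit of the rejected proposal:
# only death, in-state transfer, or graduation excuses a missing student.
NARROW_EXEMPT = {"graduated", "died", "transferred_in_state"}

def dropout_count(leaver_reasons, exempt):
    """Count leavers whose recorded reason is not on the exempt list."""
    return sum(1 for reason in leaver_reasons if reason not in exempt)

leavers = ["entered_ged_program", "moved_out_of_state", "unknown", "graduated"]
print(dropout_count(leavers, BROAD_EXEMPT))   # 1 (only "unknown" counts)
print(dropout_count(leavers, NARROW_EXEMPT))  # 3
```

The same four departing students yield one dropout under the broad rule and three under the narrow one, which is why the choice of exemptions, and the documentation behind them, matters so much.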
The state has begun to emphasize other measures of school success. Last year, the TEA started calculating districtwide dropout rates with a method used by the National Center for Education Statistics. For now, those calculations are just for reporting purposes. But critics of the state’s own dropout definition at least like that the NCES method doesn’t exempt students in GED programs.
Eventually, Ms. Cloudt says, the TEA plans to use high school completion rates, instead of dropout rates, to hold schools accountable. The Texas state school board called for as much in a resolution last spring.
“The use of a completion rate in an accountability system is much more consistent with the public’s perception of a successful high school,” Ms. Cloudt said. “It’s how many students did they actually graduate.”
But, she cautions, the change won’t come overnight. Coming up with a definition of a graduate raises its own set of questions, such as whether to count students who graduate after five years, or just after four, she explains.
And even once something is settled on, all of the state’s districts will have to retool their data-collection procedures.
“We probably are not going to be able to make those changes in our data system until 2007,” Ms. Cloudt said. “So we have to live with our current definition for a while.”
Research Associate Jennifer Park contributed to this report.
Coverage of urban education is supported in part by a grant from the George Gund Foundation.