A new tracking system in Texas has produced the most detailed report yet on where the state’s students go when they don’t return to their schools, though the picture in some districts remains fuzzy.
Texas schools have long had to report their dropout rates to the state, but traditionally they haven’t had to show what happens to the students who leave for other reasons. Starting with the 1997-98 school year, though, the Texas Education Agency began requiring schools to account for each student in grades 7-12 who had been on their campuses in the previous academic year but who failed to return in the fall.
The results, included in a report released this month, show that statewide: 18.5 percent of the “nonreturning” students had transferred to another school in the state; 7.3 percent transferred out of state; 5.8 percent dropped out; 3.1 percent moved to a private school or began to be home-schooled; and another 1.6 percent left to pursue a General Educational Development program.
The most cited reason, however, was graduation, which accounted for 43.8 percent of the students who failed to return in the fall.
Texas education groups are hoping the new information will serve as a guide for policymakers.
“Often we develop policies for issues when we really don’t have good information on which to base those policies,” said Karen Soehnge, the assistant director for governmental relations for the Texas Association of School Administrators. “Our policies are never going to be truly effective until we have good data.”
Mix-Up in Austin
But the system’s inaugural year wasn’t without its glitches. Statewide, the effort still could not account for 55,700, or 12.7 percent, of the 439,000 students in grades 7-12 who, in fall 1997, did not return to the schools they attended the year before. But state officials say an 87 percent success rate isn’t bad for such a new program.
“I thought the districts did pretty well, considering that this was the first year of the data reporting,” said Criss Cloudt, the TEA’s associate commissioner for policy planning and research. “We were not expecting a zero percent for unaccounted-for students. Now, that doesn’t mean we don’t want them to do better.”
Some of the unaccounted-for cases, she added, likely stem from difficulties in matching up names. For example, she said, a student’s nickname might be used on one record, while the formal name appears on another.
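The matching problem Ms. Cloudt describes can be illustrated with a minimal sketch. The nickname table and the student records below are invented for illustration; this is not the TEA's actual matching logic.

```python
# A student recorded as "Bill" one year and "William" the next will not
# match on an exact string comparison. Expanding known nicknames to their
# formal forms before comparing recovers such matches.
# All names and the nickname table here are hypothetical.

NICKNAMES = {"bill": "william", "bob": "robert", "liz": "elizabeth"}

def canonical(name: str) -> str:
    """Lower-case a first name and expand a known nickname to its formal form."""
    name = name.strip().lower()
    return NICKNAMES.get(name, name)

last_year = {("Bill", "Smith"), ("Maria", "Garcia")}
this_year = {("William", "Smith"), ("Jose", "Lopez")}

def canon(records):
    return {(canonical(first), last.lower()) for first, last in records}

# Exact matching misses Bill/William Smith entirely...
exact_matches = {(f.lower(), l.lower()) for f, l in last_year} & \
                {(f.lower(), l.lower()) for f, l in this_year}

# ...while canonicalizing first names recovers the match.
fuzzy_matches = canon(last_year) & canon(this_year)

print(exact_matches)  # set()
print(fuzzy_matches)  # {('william', 'smith')}
```

Real record-linkage systems go further (misspellings, transposed fields, birth dates as tiebreakers), but the sketch shows why exact name comparison alone leaves students unaccounted for.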
Even so, the results for some districts left more room for improvement than others. According to the state results, 4,369 students from the 76,900-student Fort Worth Independent School District went unaccounted for last year, representing more than 40 percent of the system’s nonreturning students in grades 7-12. About 1,400 students from the 16,100-student Waco school system also went unaccounted for, representing more than half of that district’s nonreturning junior high and high school students.
The 76,600-student Austin Independent School District failed to provide any data on students who neither dropped out nor graduated. District administrators say they collected the information but failed to include it on the computer disk the district sent to the state.
“It was a lot of small errors put together, and things weren’t communicated clearly enough,” said Susan Kemp, Austin’s interim director of accountability. “We had the data, and it should have been submitted, but it wasn’t.”
Ms. Kemp said the Austin district would release its own report on the data missing from the state analysis in the coming weeks.
The mix-up comes at a bad time for Austin. The Travis County prosecutor is investigating the system’s reported dropout rates as part of an inquiry into broader accusations that Austin’s student-performance data may have been tampered with to improve its standing in the state’s school accountability system. (“Austin District Charged With Test Tampering,” April 14, 1999).
Ahead of the Curve
Meanwhile, officials of the Texas Education Agency are mulling the use of sanctions to encourage districts to provide even better information. In the future, for instance, all unaccounted-for students might be counted as dropouts, Ms. Cloudt said. The state accountability system uses dropout rates as one factor in rating schools from “low performing” to “exemplary.”
Despite the gaps this year, many observers say Texas is on the leading edge of states attempting to monitor their students’ movements more closely.
In 1991, Texas began assigning personal identification numbers to each student in the state, allowing the TEA to more easily compare information between districts and with data from other programs, such as the division that administers the GED test.
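A stable statewide identifier turns this cross-referencing into a simple key lookup rather than an error-prone name match. The sketch below uses invented IDs and records; it only illustrates the idea, not the TEA's systems.

```python
# Hypothetical illustration: with a statewide student ID, a district's
# list of nonreturning students can be joined directly against records
# from another program, such as the GED-testing division.
# All IDs and records here are invented.

nonreturning = {
    "TX0001": "did not return fall 1997",
    "TX0002": "did not return fall 1997",
}
ged_enrollees = {"TX0001"}  # IDs reported by the GED program

# Any nonreturning student whose ID appears in the GED data is accounted for,
# with no name matching required.
accounted_for = {sid for sid in nonreturning if sid in ged_enrollees}
still_unknown = set(nonreturning) - accounted_for

print(accounted_for)  # {'TX0001'}
print(still_unknown)  # {'TX0002'}
```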
“It’s very important that we know what our success rates are with kids,” said Kathy Christie, a policy analyst for the Denver-based Education Commission of the States, “and Texas has a student-records-management system which many states do not.”
Attempts to build such tracking systems can, however, become controversial if the plan is perceived as a threat to privacy, Ms. Christie added.
For example, she said, “in Colorado, if you were going to talk about creating a new database of student records, you would have a real difficult time getting it into policy, because it doesn’t sit well with many people.”
A version of this article appeared in the May 26, 1999 edition of Education Week