The Dropout Dilemma

By Debra Viadero — February 07, 2001
Dropout numbers are notoriously unreliable at a time when they are more important than ever.

Imagine this scenario: John Doe, a 17-year-old student at Big City High School, informs administrators there that he is quitting school to enroll in an adult education program across town. He plans to earn a high-school-equivalency diploma.

Is John Doe a dropout?

According to the federal government, yes. But, in many states, the answer is no. Bean counters in Florida, Indiana, South Carolina, and a handful of other states would call John Doe a full-fledged student because he is pursuing an alternative diploma. And in many places, whether the boy ever sets foot in his newly assigned school is beside the point: he still wouldn’t show up on the dropout rolls.

That hypothetical example illustrates just some of the reasons why dropout numbers are notoriously unreliable. States, school districts, and federal researchers may all use different methods and different definitions to tally up how many students have dropped out of school. And the results can vary enormously. Depending on the methodology used, for example, Texas’ high school dropout rate for the 1998-99 school year ranged from as low as 2.2 percent to an alarming 36.6 percent, according to a new report from that state’s Legislative Budget Board.

But at a time when schools are ratcheting up high school graduation requirements, keeping accurate—or, at least, agreed-upon—counts of dropouts is becoming increasingly important.

“We’re just entering an era when there’s going to be more accountability in making kids meet standards to actually get a diploma. There’s a danger that will mean more kids dropping out, and we need to know if that is happening,” says Phillip Kaufman, a senior research associate at MPR Associates, an education research and consulting firm in Berkeley, Calif. “Without better data, we’ll just have ongoing debate about it.”

Keeping dropout numbers low is critical because, without a high school diploma, young adults’ prospects in the labor market are bleak and getting bleaker. Between 1979 and 1996, the real earnings of 25- to 34-year-old male dropouts fell by 28 percent, researchers report. Earnings for female dropouts declined 7 percent.

And experts take little comfort from statistics showing skyrocketing numbers of students earning the General Educational Development certificate, the most popular second-chance diploma. That’s because newer studies suggest that in the job market, a GED can’t make up for the lack of a regular high school diploma. The research shows that lower-skilled GED holders get a small income boost but that better-skilled students who get a GED tend to earn only as much as they did before passing the test: about $11,000 a year in 1995 dollars.

“GED holders aren’t the labor-market equivalent of regular high school graduates,” says John H. Tyler, an assistant professor of education, economics, and public policy at Brown University. He conducted the new GED studies with Harvard University researcher Richard J. Murnane. “They’re much more like high school dropouts.”

Ironically, the accountability movement that is fueling the need for more accurate dropout reports could be helping to skew the numbers, too. A growing number of states, in a well-meaning effort to hold schools and districts accountable for raising test scores and for keeping youngsters in school at the same time, have begun to include dropout figures in their accountability models.

But researchers say that those efforts, in some cases, may have prompted schools to misreport their dropout numbers or to shuttle students off to alternative-diploma programs where they won’t be counted in dropout calculations. In Texas, for example, where schools and educators face heavy pressure to raise test scores while keeping dropout figures low, state school officials have cited at least four districts because of concerns about the accuracy of their dropout data. (“Testing System in Texas Yet To Get Final Grade,” May 31, 2000.)

Controversy over the extent of the dropout problem in Texas, in fact, has helped fan an ongoing debate over whether the test-score gains the state posted from 1995 to 2000, while President Bush was its governor, should be viewed with a more jaundiced eye. Critics contend that the state’s high-stakes testing system could be forcing more students out of school than state dropout figures indicate.

Methodology Counts

Experts say dropout numbers are particularly vulnerable to manipulation because of the lack of agreement on the best way to keep count.

“Any time there are questions in the data, then the data is much more amenable to getting it to say what you want for a particular political agenda,” Tyler of Brown University says.

Newspaper editorial writers in Michigan voiced suspicions last year, for example, when state-reported graduation rates for Detroit rose from 29.7 percent the previous year to 83.4 percent—only to be revised downward to 67.6 percent a few months later.

The rate reports came amid debate over a ballot initiative, which the governor opposed, that would have provided private school vouchers to parents in districts graduating fewer than two-thirds of their students.

“The fact that the revised number is now 67.6 percent—just barely above the cutoff point—looks questionably convenient for opponents who would like to undermine Detroiters’ support for vouchers,” the Detroit News wrote in an editorial last April.

State officials attributed the numbers to differences between the state and the Detroit school district in the way graduation and dropout rates are calculated. In the end, whatever the reason, the voucher initiative failed on the November ballot.

“Detroit had its own system for doing things. Our numbers were pretty much always different from theirs, and it never came under the limelight until the statewide voucher proposal came into existence,” says Brad Wurfel, a spokesman for the Michigan education department, which is now revising its methods for calculating graduation rates.

At the national level, the federal government itself uses several different methods for counting dropouts. The National Center for Education Statistics, the U.S. Department of Education’s primary data-collection arm, reports state-by-state rates for the proportion of students who leave school without a diploma each year. Known as “event” rates, those annual calculations yield the lowest numbers of any dropout calculations. They also tend to be the preferred method used by states and school districts for reporting dropout statistics.
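
In rough terms, the event calculation divides one year’s leavers by that year’s enrollment. Here is a minimal sketch, with invented figures, of that arithmetic:

```python
# Minimal sketch of an annual "event" dropout rate (all figures invented).
# The denominator is everyone enrolled in grades 7-12 that year; the
# numerator counts only students who left without a diploma during the year.
enrolled_grades_7_12 = 40_000
left_this_year_without_diploma = 2_000

event_rate = left_this_year_without_diploma / enrolled_grades_7_12
print(f"Event rate: {event_rate:.1%}")  # -> Event rate: 5.0%
```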

But some experts contend that such annual rates can be misleading because the public thinks of dropouts in terms of students who enter 7th or 9th grade and then, sometime over the next five or six years, quit.

The latest federal figures, gathered for the 1997-98 school year, range from a low of 2.7 percent in North Dakota to a high of 11.6 percent in Louisiana.

Participation a Problem

Rates are reported, though, for just 38 states—only those that agree to use the NCES dropout definition. “The growth [in state participation] has been steady, but we’re still not at 100 percent yet,” says Lee M. Hoffman, the program manager for the survey, known as the Common Core of Data.

Reasons for not joining the survey vary.

A handful of states, such as Florida, Indiana, and South Carolina, don’t consider GED holders or people pursuing a GED to be dropouts. Washington state does not take part because state law requires counting dropouts only in grades 9-12, while federal data gatherers require statistics for students in grades 7-12. And Texas has run into conflict with the federal reporting requirements because students who quit school, enroll in an alternative education program, and then drop out again are only counted as dropouts once by state officials.

But the most common reason for nonparticipation is that the federal and state reporting calendars don’t jibe. While states typically collect statistics based on their fiscal years, which usually start July 1, the federal government follows a calendar starting Oct. 1, when its fiscal year begins.

Without 100 percent participation, comparing rates among states can be a futile exercise.

Kaufman of MPR Associates, who presented a paper on federal dropout calculations last month at a conference held at Harvard University, says the state-by-state survey may also underestimate dropout numbers because it relies on self-reported data from states and districts that lack the means to track every student who disappears from their rolls. Few schools, for instance, would check up on a student such as John Doe to see if he ever showed up at his alternative program.

“If your numbers are based on the assumption that local schools can tell you the disposition of every one of their kids, then it’s never going to work,” Kaufman says.

To improve accuracy, a growing number of states, including Texas, Florida, and Louisiana, have started electronically tracking students as they move from school to school. But those tracking systems can follow students only within a state; students who leave for another state or country are lost to statisticians. Errors also occur when school employees enter the wrong student-identification codes in the records.

Federal researchers also draw on U.S. Census Bureau surveys for annual rates and other dropout calculations. One of those figures is the “status” dropout rate, which is the proportion of 16- to 24-year-olds who are not in school and do not have a high school diploma. That rate, reported to be 11.2 percent last year, is typically higher than the event rate, which was about 5 percent nationally last year, because the status rate is based on a broader population.
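
A second minimal sketch, again with invented counts (scaled to mirror the rates cited above), shows how the status calculation sweeps in dropouts from many earlier cohorts rather than a single year:

```python
# Sketch of a "status" dropout rate (all figures invented, scaled to mirror
# the rates in the article). Unlike the single-year event rate, the status
# rate covers a whole age band, counting everyone 16 to 24 who is out of
# school with no diploma, no matter when they dropped out.
population_16_to_24 = 33_000_000
out_of_school_no_diploma = 3_700_000  # dropouts accumulated across cohorts

status_rate = out_of_school_no_diploma / population_16_to_24
print(f"Status rate: {status_rate:.1%}")  # -> Status rate: 11.2%
```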

According to Kaufman, the status rate may also undercount dropouts, especially members of minority groups, because of sampling errors.

Education Department statisticians also rely on Census Bureau data to measure high school completion and graduation rates—the “yang” to dropout statistics’ “yin.” Those numbers suggest some confusing discrepancies. They show that high school completion rose over the past decade while graduation rates have actually decreased.

Experts speculate that the differences are due in part to data-collection problems and in part to growing numbers of students earning GEDs, which are treated in the completion surveys as high school credentials.
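
A toy example, with invented cohort numbers, shows how those two trends can move in opposite directions at once when GEDs count toward completion but not graduation:

```python
# Invented cohorts of 100 young adults a decade apart, illustrating how a
# rise in GEDs can lift the completion rate (diplomas plus GEDs) even as
# the graduation rate (regular diplomas only) declines.
cohort = 100
decades = {"earlier decade": (82, 4),  # (regular diplomas, GEDs)
           "later decade": (79, 9)}

for label, (diplomas, geds) in decades.items():
    graduation = diplomas / cohort
    completion = (diplomas + geds) / cohort
    print(f"{label}: graduation {graduation:.0%}, completion {completion:.0%}")
# earlier decade: graduation 82%, completion 86%
# later decade: graduation 79%, completion 88%
```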

The Gold Standard

To Kaufman and other experts, longitudinal studies that follow representative groups of individual students, tracking what happens to them over a long period of time, are the gold standard for dropout research. The NCES uses several such surveys, including High School and Beyond and the National Education Longitudinal Study, to look at dropout and completion rates.

But longitudinal studies are costly, and the lag between data-gathering cycles can be as much as 10 years. “Between years, you really don’t know what’s going on,” Kaufman says. “To actually get good estimates on who’s dropping out and why, you really do have to track individual kids.”

Part of the problem at the federal level, he contends, is an imbalance in the resources that go into dropout data-collection efforts compared with the amount spent measuring academic achievement through the National Assessment of Educational Progress, or NAEP.

“The federal government spends $45 million on NAEP and less than a million on dropout statistics every year, and that seems out of whack to me,” says Kaufman, who spent four years himself as a statistician at the NCES.

Given the shortcomings of federal and state data on dropouts, Robert W. Balfanz, a research scientist at the Center for the Social Organization of Schools, based at Johns Hopkins University in Baltimore, suggests taking a look at the problem in yet another way. In a paper presented at the Harvard conference, he and co-author Nettie E. Legters make a case for measuring the “promoting power” or attrition rate for schools. (“Dropout Studies Target ‘Pockets of Problems,’” Jan. 24, 2001.)

“Since there’s no common agreement at the state level, let alone at the school level, on how to report this, you have to rely on kind of proxy measures,” he says.

Balfanz and Legters measure a school’s “promoting power” by comparing enrollment rates for 9th grade with enrollment rates for 12th grade four years later. Their method produces estimates much higher than conventional dropout rates. In the 34 cities the researchers studied, nearly half the schools graduated fewer than 50 percent of entering 9th graders in four years.
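
In rough terms, with invented enrollment counts, the measure reduces to a single ratio:

```python
# Sketch of Balfanz and Legters' "promoting power" measure (figures
# invented): the ratio of a school's 12th grade enrollment to its 9th grade
# enrollment four years earlier. Values far below 1.0 flag heavy attrition.
ninth_grade_enrollment = 620    # entering 9th graders in the base year
twelfth_grade_enrollment = 270  # 12th graders four years later

promoting_power = twelfth_grade_enrollment / ninth_grade_enrollment
print(f"Promoting power: {promoting_power:.0%}")  # -> Promoting power: 44%
```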

The federal Education Department used a similar method in the early 1980s in its so-called “wall chart,” which provided statistics and graded states on indicators of educational progress. But researchers say the method was controversial then, and it’s still controversial now.

“If you could nail those students to the floor for four years, that might work, but kids move, families move. So much can happen over four years,” says Robert E. Jones, a research analyst for the Oregon Department of Education.

But Balfanz argues that “promoting power” can point to problems in particular schools or districts, enabling policymakers to target resources where they are needed most. Moreover, Balfanz adds, enrollment figures are generally easier to get and less questionable than dropout estimates.

The Johns Hopkins researchers found some corroboration for their argument in Baltimore, where they rated all nine of the city’s nonselective high schools as low in “promoting power.” In that city, enrollment got progressively smaller at each grade level in all those schools, suggesting that the disappearing students were not merely transferring to other schools. Data on GED certificates at the time also led researchers to conclude that few students were leaving school for alternative education programs.

Until more accurate dropout counts come along, policymakers, educators, and the public may have to look at several different numbers to get a clear picture of what happens to students in their schools. “In economics, we’ve been educated to not look at just one number,” MPR’s Kaufman says. “All of us intuitively know that no one number can tell us everything that’s going on.”

The Research section is underwritten by a grant from the Spencer Foundation.

A version of this article appeared in the February 07, 2001 edition of Education Week as The Dropout Dilemma
