Faulty Data Leads to Mix-Up in High School Rankings

By Christina A. Samuels — May 11, 2012 1 min read

Jeff Horn, the principal at Green Valley High School in Henderson, Nev., knows his school is good. He was a little less certain, however, that it was the 13th best in the country, as the recent “Best High Schools” report released by U.S. News and World Report had placed it.

For one thing, the rankings presumed that his school had 477 students, 111 teachers, and a 100 percent passing rate on Advanced Placement tests. In actuality, the school has about 2,850 students, a student-teacher ratio that is closer to 24 to 1, and an AP pass rate of about 64 percent, according to an article in the Las Vegas Sun.

The mistake has now been traced back to erroneous data provided by the Nevada education department to the U.S. Department of Education, data that had not been questioned until now, according to a follow-up article in the Sun. The error calls into question not only the validity of the rankings, but also the validity of other research that relies on the federally maintained Common Core of Data, which tracks information about students and staff in every public school, district, and state education agency in the country. The Common Core of Data is used as part of the U.S. News ranking system.

From the article (which is really worth the read):

The mess-up originated inside the single-story Nevada Department of Education building—a world steeped in statistics, reflecting the national push to evaluate students, teachers, principals, schools and entire states on the weight of cold data. Presumably accurate data.

One day in September 2010, with a click of a button, Julian Montoya, the deputy director of accountability, sent the data to the federal Department of Education database. It's called the Common Core of Data and is used extensively by academics, think tanks and educators.

The data were filled with errors that, Nevada officials believe, were created by a computer program that collects data from school districts and forwards it to the federal government. ...

At the federal level, two safeguards, meant to catch errors, failed. One failure—during an "edit check"—appeared to be a programming issue that the federal government is still investigating, said Marilyn Seastrom, the chief statistician and acting deputy director with the National Center for Education Statistics, the statistical unit for the U.S. Department of Education.

A second check, done by the department with help from the U.S. Census Bureau, failed because of confusion among the staff over earlier errors that had been caught and ensuing confusion over which of those errors had been corrected. Suffice to say, it's a long, complicated story.

How many schools and districts might be affected by similarly poor data collection? It’s not possible to say yet, though another article on the mix-up suggests that at least six other Nevada schools have flawed data. None of them made the high-profile list, however.

As of Friday afternoon, Green Valley still maintained its position as the 13th-best high school in the country, and the Common Core of Data still reflected incorrect information for the school. As soon as I hear back from those involved, I’ll update this post or write a new one.

A postscript: the Nevada case is garnering national attention, but in a recent blog entry, the website DCist brought to light some other unusual data, this time involving high schools in the District of Columbia. In this case, the question isn’t the numbers themselves, but how they were weighted. The weighting may have ranked one high school above others in the city that produce more graduates.

In D.C., Benjamin Banneker High School in Northwest came in first (and 700th nationwide), followed by Calvin Coolidge High School in Takoma (1,455th nationwide). Banneker, a magnet school, is certainly very good—according to recent DCPS stats, it had a 100 percent graduation rate in 2011—but Coolidge is an interesting choice for the city's second-best school, even according to U.S. News' own rankings. According to the methodology used by the magazine—student-to-teacher ratio, college-readiness index, math proficiency and reading proficiency—Woodrow Wilson High School, School Without Walls, and Ellington School of the Arts all seem to be better choices. (Even by DCPS standards, all three graduate more students than Coolidge.) Still, none of those was even ranked, either nationally or in D.C. A person we spoke to with knowledge of these issues explained that Coolidge may have been weighted differently because it has a higher proportion of economically disadvantaged students than, say, School Without Walls.


A version of this news article first appeared in the District Dossier blog.