
Districts Push Back Over Cheating Probe

By Christina A. Samuels — April 03, 2012

A newspaper investigation that turned up unusual test-score fluctuations in about 200 school districts in a nationwide sample of 14,700 has revived a debate about cheating on standardized tests—and prompted immediate pushback from some of the districts flagged by the analysis. They contend that the newspaper’s methodology was flawed.

The Atlanta Journal-Constitution article looked at test scores in about 69,000 schools around the country. The reporters requested average reading and mathematics results for state exams given in grades 3-8 from 50 states and the District of Columbia, as well as the count of students tested for each school, grade, and subject in those jurisdictions.

The newspaper did not have access to student-level data. Instead, it created “classes” of all the test-takers in a given grade in each school—comparing, for example, all the 3rd grade test-takers with all the 4th grade test-takers in the same school the following year. If the 4th graders in a “class” scored markedly higher or lower on state standardized tests than they had the previous year, that school was flagged. In some schools, the article said, the scores varied so widely that it was nearly impossible to attribute the variation to chance.
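
The article does not spell out the precise statistical test behind the flags. As a rough illustration only, the sketch below screens cohort pairs the way such an analysis might: compute each school’s year-over-year change in mean score, then flag changes that sit far out in the statewide distribution of changes. The function name, data layout, and three-standard-deviation cutoff are all assumptions for illustration, not the Journal-Constitution’s actual method.

```python
import statistics

def flag_unusual_cohorts(cohort_pairs, threshold=3.0):
    """Flag cohorts whose year-over-year change in mean score is
    extreme relative to all cohorts statewide.

    cohort_pairs -- list of (school, prior_year_mean, current_year_mean)
    threshold    -- hypothetical cutoff, in standard deviations
    """
    changes = [current - prior for _, prior, current in cohort_pairs]
    center = statistics.mean(changes)
    spread = statistics.stdev(changes)

    flagged = []
    for (school, _, _), change in zip(cohort_pairs, changes):
        z = (change - center) / spread
        if abs(z) >= threshold:  # too extreme to chalk up to chance
            flagged.append((school, round(z, 2)))
    return flagged
```

Under a screen like this, a district would draw scrutiny when many of its schools were flagged across grades, subjects, and years, rather than for any single outlier.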

In the March 25 article, the reporters said that the fluctuations did not prove there was cheating in those schools, a point reiterated by Kevin Riley, the editor of the Journal-Constitution, in an interview with Education Week.

“What we’ve really done is something that points out suspicious scores and says, ‘This warrants further investigation,’ ” Mr. Riley said. He also noted that the “vast majority” of educators are working in districts where no suspicious variations were found.

The story did note that similar test-score fluctuations were seen in the Atlanta district, which was the center of a recent high-profile cheating scandal. The Journal-Constitution’s extensive investigation into its 50,000-student hometown district eventually prompted a state probe, which found evidence of adult-led cheating on the 2009 Georgia state test at 44 of the 56 schools examined. (“Report Details ‘Culture of Cheating’ in Atlanta Schools,” July 13, 2011.)

The responses from many districts and education groups, some of which were released preemptively a few days before the article appeared, indicated that they saw themselves as being accused of cheating based on methodology they considered severely flawed.

The 78,000-student Nashville, Tenn., district said the schools flagged there were campuses with high rates of student mobility, making it hard to measure one cohort of students against another. The district also said that scores for students in special education taking modified assessments, measured on a 200-to-400 scale, were averaged in with the scores of students in regular education classes, which are scored on a 600-to-900 scale. One Nashville middle school was flagged in the original article for an unusual fluctuation, but the district said that, because of those statistical problems, the newspaper’s calculation was 48 scale-score points below the school’s true score. A reference to that school was later taken out of the online version of the story.
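
The arithmetic behind Nashville’s complaint is easy to reproduce. The numbers below are invented purely to illustrate the scale-mixing problem the district described; they are not Nashville’s data. Averaging modified-assessment scores reported on a 200-to-400 scale into regular scores reported on a 600-to-900 scale drags the combined mean down, so a change in how many modified-scale scores happen to be included can masquerade as a large score swing.

```python
# Hypothetical scores illustrating the scale-mixing problem; these
# values are invented, not Nashville's actual data.
regular = [720, 680, 750, 700]   # regular state exam, 600-to-900 scale
modified = [310, 290]            # modified assessment, 200-to-400 scale

regular_mean = sum(regular) / len(regular)
mixed_mean = sum(regular + modified) / (len(regular) + len(modified))

print(f"regular-only mean: {regular_mean:.1f}")  # 712.5
print(f"mixed mean:        {mixed_mean:.1f}")    # 575.0
# Adding or dropping a handful of modified-scale scores moves the
# combined mean by dozens of points with no change in achievement.
```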

More Analysis Needed?

The newspaper analysis also flagged some schools in the Houston Independent School District. The 203,000-student district responded to the story by noting that it has had confirmed cases of cheating. But the district also said it takes a vigorous stance against testing impropriety.

Houston school officials also took exception to the Atlanta newspaper’s methods, saying that, although test-score variations can be a “useful statistical tool,” such analyses tend to flag schools with large changes in their student enrollments, or schools that serve special populations. For example, an alternative school with short-term placements was flagged, as were two “overflow” schools that serve as crowding-relief campuses.

Mr. Riley, the newspaper’s editor, said the paper is taking the districts’ objections seriously and plans follow-up stories. He also noted, however, that the Atlanta district had hammered the paper’s investigative methods during the earlier probe, and those methods were eventually proved correct. “What we need now is courageous people who will dig into this without fear,” he said.

Jaxk H. Reeves, an associate professor at the University of Georgia, in Athens, and the director of the Statistical Consulting Center there, worked with the newspaper on its analysis. In an interview, he said student mobility, the explanation districts have coalesced around to discount the newspaper’s results, is not as important as they suggest. That’s because even though schools may not serve the same students from year to year, they tend to serve the same types of students, in terms of demographics and achievement. The newspaper also adjusted for mobility, for example, by excluding “classes” whose student counts varied by more than 25 percent from one year to the next.
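
Of those adjustments, only the 25 percent enrollment screen is spelled out in the article; how the change was computed is not. One minimal reading of the rule might look like the check below, where the function name and the choice of the prior year as the baseline are assumptions:

```python
def passes_mobility_screen(n_prior, n_current, max_change=0.25):
    """Keep a cohort pair only if the count of tested students changed
    by no more than 25 percent year over year; the prior-year baseline
    is an assumption, since the article does not specify one."""
    return abs(n_current - n_prior) / n_prior <= max_change
```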

If mobility were the sole reason for the variation, Mr. Reeves said, then more schools in urban districts, where mobility is often high, should have been flagged.

“I do believe if a district is being flagged a lot, they should look at individual schools,” Mr. Reeves said.

Gary J. Miron, a professor of evaluation, measurement, and research at Western Michigan University, in Kalamazoo, has emerged as a critic of the newspaper’s work. A week before the report’s release, he evaluated some of the data for Ohio districts for a separate story published in the Dayton Daily News, which is owned by the same newspaper group that owns the Journal-Constitution. He said he identified weaknesses in the research, but was told that the paper would be moving forward with publication.

Mr. Miron said the reporters took an “adequate” first step at identifying irregularities, but only a first step. What is also needed, he said, is student-level data and then erasure-analysis data from the testing companies.

“Throughout the reporting, they imply cheating,” Mr. Miron said, noting that there are a number of explanations beyond cheating that could account for the fluctuations. “They should finish their analysis.”

A version of this article appeared in the April 04, 2012 edition of Education Week as Test-Cheating Probe Spawns Questions Over Its Methodology
