
Districts Push Back Over Cheating Probe

By Christina A. Samuels — April 3, 2012

A newspaper investigation that turned up unusual test-score fluctuations in about 200 school districts in a nationwide sample of 14,700 has revived a debate about cheating on standardized tests—and prompted immediate pushback from some of the districts flagged by the analysis. They contend that the newspaper’s methodology was flawed.

The Atlanta Journal-Constitution article looked at test scores in about 69,000 schools around the country. The reporters requested average reading and mathematics results for state exams given in grades 3-8 from 50 states and the District of Columbia, as well as the count of students tested for each school, grade, and subject in those jurisdictions.

The newspaper did not have access to student-level data. Instead, it created “classes” of all the test-takers in a given grade in each school—for example, comparing all the 3rd grade test-takers in a school with all the 4th grade test-takers in the same school the next year. If a “class” scored unusually higher or lower on state standardized tests than it had the previous year, that school was flagged. In some schools, the scores varied so widely that it was nearly impossible to attribute the variation to chance, the article said.
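The newspaper’s actual code was not published, but the cohort-comparison test the article describes can be sketched in a few lines. The sketch below is an illustration under stated assumptions, not the Journal-Constitution’s method: the z-score framing and the threshold of 3 standard deviations are hypothetical, while the 25 percent enrollment filter mirrors an adjustment the paper reportedly made (described later in this story).

```python
from dataclasses import dataclass

@dataclass
class Cohort:
    """Mean score and head count for one grade in one school in one year."""
    mean_score: float
    n_students: int

def flag_unusual_change(prior: Cohort, current: Cohort,
                        statewide_mean_gain: float,
                        statewide_sd_gain: float,
                        z_threshold: float = 3.0,
                        max_enrollment_change: float = 0.25) -> bool:
    """Return True if a school-grade 'class' gained or lost far more than
    comparable cohorts statewide. All parameter values here are assumptions."""
    # Skip cohorts whose head count swung sharply between years; the two
    # groups are likely not comparable (mobility, rezoning, overflow schools).
    change = abs(current.n_students - prior.n_students) / prior.n_students
    if change > max_enrollment_change:
        return False
    # Standardize this cohort's year-over-year gain against the statewide
    # distribution of gains; extreme values are improbable by chance alone.
    gain = current.mean_score - prior.mean_score
    z = (gain - statewide_mean_gain) / statewide_sd_gain
    return abs(z) >= z_threshold
```

A flag under this kind of logic indicates only that a swing is improbable as chance, a caveat the reporters themselves make below.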

In the March 25 article, the reporters said that the fluctuations did not prove there was cheating in those schools, a point reiterated by Kevin Riley, the editor of the Journal-Constitution, in an interview with Education Week.

“What we’ve really done is something that points out suspicious scores and says, ‘This warrants further investigation,’ ” Mr. Riley said. He also noted that the “vast majority” of educators are working in districts where no suspicious variations were found.

The story did note that similar test-score fluctuations were seen in the Atlanta district, which was the center of a recent high-profile cheating scandal. The Journal-Constitution’s extensive investigation into its 50,000-student hometown district eventually prompted a state probe, which found evidence of adult-led cheating on the 2009 Georgia state test at 44 of the 56 schools examined. (“Report Details ‘Culture of Cheating’ in Atlanta Schools,” July 13, 2011.)

The responses from many districts and education groups, some of which were released preemptively a few days before the article appeared, indicated that they saw themselves as being accused of cheating based on methodology they considered severely flawed.

The 78,000-student Nashville, Tenn., district said the schools flagged there were campuses with high rates of student mobility, making it hard to measure one cohort of students against another. The district also said that scores for students in special education taking modified assessments, which are reported on a 200-to-400 scale, were averaged in with the scores of students in regular education classes, which are reported on a 600-to-900 scale. One Nashville middle school was cited in the original article as showing an unusual fluctuation, but the district said that, because of those statistical problems, the newspaper’s calculated average for the school was 48 scale-score points lower than the true score. A reference to that school was later removed from the online version of the story.
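The scale-mixing objection is simple arithmetic: pooling scores reported on a 200-to-400 scale with scores reported on a 600-to-900 scale drags the combined average down whenever any modified assessments are present. A hypothetical illustration (the head counts and scores below are invented for the example, not Nashville’s figures):

```python
# Invented figures, for illustration only.
regular  = [750] * 90   # 90 students on the regular 600-to-900 scale
modified = [300] * 10   # 10 students on the modified 200-to-400 scale

# A naive pooled mean mixes the two incompatible scales...
pooled_mean  = (sum(regular) + sum(modified)) / (len(regular) + len(modified))
# ...and lands well below the regular-scale mean, even though no
# regular-education score changed at all.
regular_mean = sum(regular) / len(regular)

print(pooled_mean, regular_mean)   # 705.0 750.0
```

Because the share of students taking the modified assessment can shift from year to year, a school’s pooled average can swing without any change in underlying performance, which is the distortion the district says produced the flag.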

More Analysis Needed?

The newspaper analysis also flagged some schools in the Houston Independent School District. The 203,000-student district responded to the story by noting that it has had confirmed cases of cheating. But the district also said it takes a vigorous stance against testing impropriety.

Houston school officials also took exception to the Atlanta newspaper’s methods, saying that, although test-score variations can be a “useful statistical tool,” such analyses tend to flag schools with large changes in their student enrollments, or schools that serve special populations. For example, an alternative school with short-term placements was flagged, as were two “overflow” schools that serve as crowding-relief campuses.

Mr. Riley, the newspaper’s editor, said the paper is taking the districts’ objections seriously and plans follow-up stories. He also noted, however, that the Atlanta district had similarly hammered the paper’s investigative methods during the earlier probe, and those methods were eventually proved correct. “What we need now is courageous people who will dig into this without fear,” he said.

Jaxk H. Reeves, an associate professor at the University of Georgia, in Athens, and the director of the Statistical Consulting Center there, worked with the newspaper on its analysis. In an interview, he said student mobility, which many districts have seized on to discount the newspaper’s results, is not as important as they suggest. That’s because even though schools may not serve the same students from year to year, they tend to serve the same types of students in terms of demographics and achievement. The newspaper also made some adjustments for mobility, for example, excluding “classes” whose student numbers varied by more than 25 percent from one year to the next.

If mobility were the sole reason for the variation, Mr. Reeves said, then more schools in urban districts, where mobility is often high, should have been flagged.

“I do believe if a district is being flagged a lot, they should look at individual schools,” Mr. Reeves said.

Gary J. Miron, a professor of evaluation, measurement, and research at Western Michigan University, in Kalamazoo, has emerged as a critic of the newspaper’s work. A week before the report’s release, he evaluated some of the data for Ohio districts for a separate story published in the Dayton Daily News, which is owned by the same newspaper group that owns the Journal-Constitution. He said he identified weaknesses in the research, but was told that the paper would be moving forward with publication.

Mr. Miron said the reporters made an “adequate” first step at identifying irregularities, but it was only a first step. What is also needed is student-level data and then erasure-analysis data from the testing companies.

“Throughout the reporting, they imply cheating,” Mr. Miron said, noting that a number of explanations beyond cheating could account for the fluctuations. “They should finish their analysis.”

A version of this article appeared in the April 04, 2012 edition of Education Week as Test-Cheating Probe Spawns Questions Over Its Methodology
