Assessment

Errors on Tests in Nevada and Georgia Cost Publisher Harcourt

By Caroline Hendrie — September 04, 2002 4 min read

Nevada has imposed steep penalties on Harcourt Educational Measurement for errors in administering statewide exams, and Georgia is poised to do the same, following the kind of scoring glitches that have plagued state-sponsored testing programs in recent years.

A divided Nevada state school board agreed last week to a settlement that requires the San Antonio-based company to pay penalties totaling $425,000 because of a mistake that threw off the scores of nearly 31,000 high school students who took the state’s high school exit exam in mathematics last spring. Harcourt also has agreed to change its quality-control procedures to prevent similar errors and to pick up any summer school costs incurred by the 736 10th and 11th graders who were mistakenly told they had failed the test.

“It’s very unfortunate,” Jack W. McLaughlin, Nevada’s superintendent of public instruction, said last week of the scoring error. “On the other hand, ... they’ve really gone the extra mile to ensure that it won’t happen again.”

Georgia officials, meanwhile, have decided not to pay Harcourt for this year’s administration of state-mandated, nationally normed tests because of scoring problems that tainted the results, said Cathy Henson, the chairwoman of the state school board. State officials concluded last month that the results of the Stanford Achievement Test-9th Edition, given last spring to some 340,000 students, were no longer worth trying to salvage.

This marks the second consecutive year of problems for Harcourt in the Peach State: The company was late in delivering the scores from the spring 2001 administration of the Stanford-9, prompting state officials to insert penalty clauses in this year’s contract. That $552,000 contract expired June 30, and the board had already decided not to renew it before this year’s problem surfaced, said Harcourt spokesman Rick Blake. He declined to comment on whether Harcourt expected to be paid for the contract.

Harcourt is not alone in facing difficulties in carrying out its contracts to run standardized testing programs for states. Errors leading to stiff penalties for several major testing contractors have cropped up with increasing frequency in recent years as states have ramped up their testing. (“States Face Limited Choices in Assessment Market,” March 8, 2000.)

Problems Could Worsen

Some experts predict such problems will get worse as states push to comply with new federal requirements for annual student testing in grades 3-8 and one high school grade.

In an Aug. 26 statement about the settlement in Nevada, Harcourt President Dean Nafziger said the payment to the state would come in the form of “cash, teacher instruction, library books, and materials.” Nevada’s testing director, Paul M. La Marca, said that $275,000 of the penalty would be in the form of cash discounts on the contract.

Nevada’s $2.36 million, 18-month contract with Harcourt to administer the high school exam extends through next June. The company also has a $2.5 million contract covering the same period to write and administer standards-based state tests for grades 3-8, state officials said.

The Nevada state board approved the $425,000 agreement with Harcourt Aug. 26, but also stipulated that another “significant” error by the company would result in the contract’s immediate termination, Mr. McLaughlin said. Some board members denounced the company for failing to provide greater personal compensation for the students who were erroneously told they had flunked, he added.

The mistake stemmed from an improperly formatted computer file and occurred during the process used to “equate” the results from the April 2002 administration of the math test to those from tests given previously, according to Mr. La Marca.

Results Deemed Too Late

Georgia education officials suspected a problem after seeing sharp spikes and drop-offs in students’ average scores in certain subjects between 2001 and 2002, Ms. Henson said.

Harcourt officials told the state that the company might be able to correct the problem, given more time. But after consulting with an independent panel of testing experts, state board members concluded that even if valid results were eventually made available, they would be too late to be useful, Ms. Henson said. Some districts use the results to make decisions about student placements and merit pay for employees, and the state wanted to use the results to help see how Georgia students stack up against their peers nationwide.

But the state also administers what are called criterion-referenced tests—ones, in this instance, that are aligned with its academic standards—and it is those that are used in making certain major decisions under the state’s accountability system, such as which schools are placed on the list of those needing improvement.

Harcourt officials said the raw scores from the 2002 Georgia tests were correct, but that the problem arose during the equating process, which is designed to allow scores to be compared from one year to the next.
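To give a sense of what equating involves, the sketch below shows one simple, textbook approach (linear, or “mean-sigma,” equating) for placing scores from a new test form onto the scale of an earlier form. It is only an illustration with made-up numbers, not Harcourt’s actual procedure, which the company has not described in detail.

```python
# A minimal sketch of linear (mean-sigma) equating -- an illustration only,
# not Harcourt's actual procedure. Scores from a new test form are mapped onto
# the scale of an earlier form by matching the two forms' means and standard
# deviations, so that a given score carries the same meaning year to year.
from statistics import mean, stdev

def linear_equate(new_scores, old_scores, x):
    """Map a raw score x earned on the new form onto the old form's scale."""
    m_new, s_new = mean(new_scores), stdev(new_scores)
    m_old, s_old = mean(old_scores), stdev(old_scores)
    return m_old + (s_old / s_new) * (x - m_new)

# Hypothetical score distributions: the new form ran slightly harder,
# so a raw 30 on it corresponds to a somewhat higher score on the old scale.
old_form = [28, 35, 40, 44, 50, 55, 61]
new_form = [25, 31, 37, 42, 47, 52, 58]
print(round(linear_equate(new_form, old_form, 30), 1))
```

An error at this stage, such as reading an improperly formatted data file, would leave the raw scores intact but throw off every converted score built from them, which is consistent with how officials in both states described the problems.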
