SAT Glitches Prompt Broader Testing Worries
On the heels of revealing that it had mistakenly reported lower scores for 4,000 students—and higher scores for 600—who took the SAT in October, the College Board admitted last week that a batch of 1,600 answer sheets was not rechecked as part of a corrective effort launched when the first round of errors was discovered.
“It’s been very embarrassing,” College Board spokeswoman Chiara Coletti said.
Ms. Coletti said Pearson Educational Measurement, the Iowa City, Iowa-based testing company that scored the SAT answer sheets, had been asked to put in “more-rigorous quality controls” to remove any glitches that had led to the inaccurate reporting.
But watchdogs of the testing industry—dominated by CTB/McGraw-Hill, Harcourt Assessment, and Pearson—warn that errors could become all too common as standardized testing in schools multiplies under federal and state mandates, increasing pressure on the companies that develop, administer, and score such tests.
A recent report from Education Sector, a Washington think tank, pointed to the competitive pressures in the testing industry, exacerbated by short deadlines, a dearth of testing experts, and lax state oversight. ("U.S. Should Do More to Aid States in Developing Tests, Report Says," Feb. 1, 2006.)
“All testing companies are overwhelmed by the burdens of writing, scoring, and reporting vastly more tests than in the past under No Child Left Behind, and the SAT problem is symptomatic of the burdens that the industry faces in trying to test vastly greater numbers of students under very tight timelines and under highly competitive conditions,” said Thomas Toch, a co-director of Education Sector.
Every testing company faces a steady stream of problems, he said, and while the errors are relatively small as a percentage of the number of tests they handle, they nevertheless persist.
Robert Schaeffer, the public education director of the Cambridge, Mass.-based FairTest, said errors in reporting scores on tests like the SAT can have a profound effect on a student’s life. Although colleges say that the nation’s most widely used college-entrance exam is just one component for admission, several institutions announced changes in admissions decisions after receiving the corrected scores.
Answer Sheets Rescanned
For example, at Rutgers University in New Jersey, spokeswoman Sandra Lanman said seven students were admitted after their scores changed, while two more received scholarships.
As standardized testing multiplies under federal and state mandates, critics of such tests warn that errors in scoring will increase because of mounting pressure on testing companies. Among the glitches reported in recent years:
2004: Scoring mistakes by the Educational Testing Service on its teacher-licensing test cause 27,000 test takers to receive the wrong scores, including 4,100 men and women who are told they have failed when they had actually passed the exam. ETS has agreed to create an $11.1 million fund to pay damages to the teachers who were given incorrect scores.
2002: Nevada officials report that 736 sophomores and juniors who had been told they failed the mathematics portion of the state test had in fact passed it. Harcourt Assessment, the testing company, says the problem stemmed from a Harcourt computer programmer’s miscalculation of the cut score: students were required to answer 42 questions correctly when only 41 were needed to pass.
2001: The Georgia education department is forced to postpone indefinitely the release of results from the Stanford Achievement Test-9th Edition given in grades 3, 5, and 8, as a result of a technical error on a special test form prepared for the state by Harcourt.
2000: More than 8,000 Minnesota high school students are mistakenly told they have failed a state exam. As a result of that mistake, NCS Pearson, the predecessor of Pearson Educational Measurement, has to pay a $7 million settlement.
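The Nevada case above is, at bottom, an off-by-one error: a pass/fail decision coded against the wrong cut score. A minimal sketch of how such a bug plays out—using hypothetical names and logic, not Harcourt’s actual scoring code—shows how a one-point slip in the threshold flips results for every student sitting exactly at the boundary:

```python
# Hypothetical illustration of the Nevada cut-score error.
# The intended passing threshold was 41 correct answers;
# the scoring program instead required 42.

INTENDED_CUT = 41   # correct threshold: 41 right answers pass
MISCODED_CUT = 42   # threshold actually used by the buggy program

def passes(correct_answers: int, cut_score: int) -> bool:
    """Return True if the student meets or exceeds the cut score."""
    return correct_answers >= cut_score

# A student with exactly 41 correct answers sits on the boundary:
print(passes(41, INTENDED_CUT))  # True  -- should have passed
print(passes(41, MISCODED_CUT))  # False -- was wrongly told they failed
```

Every student scoring exactly 41—reportedly 736 of them—was misclassified, while students above or below the boundary were unaffected, which is why such errors can go unnoticed until someone audits the threshold itself.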
“The more these tests are misused for high-stakes educational decisions,” Mr. Schaeffer said, “the more these errors will have life-altering impacts.”
The College Board rushed to emphasize that the wrong scores were reported for less than 1 percent of the total 495,000 students who took the SAT in October. In about 83 percent of the corrected cases, scores changed by 10 to 40 points, while only 5 percent of the affected students saw their scores change by 100 points or more, out of a possible total score of 2400.
Ms. Coletti urged students who got the wrong scores to look at the issue with a broader perspective.
“The College Board is 106 years old, and the SAT has existed since 1926. This has never happened before on the SAT reasoning test. Our record has been very good, and we hope students will recognize that,” she said.
Pearson said the errors could be related to abnormally high moisture content, perhaps as a result of the weather, which caused the answer sheets to expand. Also, spokesman David Hakensen said, some answer ovals had been marked lightly or incompletely by test-takers and were not readable by the scanner.
Since the errors were discovered, he said, Pearson has rescanned all 495,000 answer sheets from the October exam, as well as those from administrations of the test in November, December, and January. Altogether, 1.5 million answer sheets were scanned.
Pearson is now creating a software program that will look for any evidence of paper expansion and plans to give answer sheets more time to acclimate in its scanning facility in Austin, Texas.
Ms. Coletti said the College Board plans to continue its relationship with the company. “Pearson does scanning for most prominent tests in the United States,” she said. “It would be difficult for us to go into better hands.”
Observers said the SAT reporting problem should prod both the College Board and Pearson to invest in better quality control.
“The SAT and ACT [admissions exam] are widely accepted, but when something like this happens, it shakes the confidence of students to a great extent,” said Jon Zeitlin, the general manager for SAT and ACT programs for Kaplan Inc., a New York City-based test-preparation company.
“A lot of our students are angry,” he said. “They are wondering how many other times this has happened before.”
In a survey conducted by Kaplan immediately after the errors surfaced, students railed against the College Board. Samir Hashmi, a senior at Paramus High School in Paramus, N.J., told Kaplan that he found it “ridiculous” that the College Board had “messed up something this important.”
Mr. Hashmi, whose score rose by 20 points after the papers were rechecked, said that “such a small difference means a lot when it comes to applying for college and financial aid.”
Mr. Schaeffer of FairTest said the foul-up could lead more people to question the use of the SAT in admissions decisions.
“This is possibly the straw that broke the camel’s back, in terms of credibility,” he said, pointing to the wide news coverage of the snafu.
Already, more than 700 U.S. colleges do not require applicants to submit SAT or ACT scores.
Mount Holyoke College in South Hadley, Mass., made the SAT optional for applicants in 2001. The college has been studying the effects of the policy with a grant from the New York City-based Andrew W. Mellon Foundation, said Joanna V. Creighton, the college’s president. Early results show just a one-tenth-of-a-point difference between the grade point averages of students who submit scores and those who don’t.
But Mr. Zeitlin of Kaplan said that while some colleges have made the SAT optional, many others are increasing their focus on standardized tests.
The vast majority, he said, are putting more weight on standardized-test scores than ever before because perceived grade inflation at the high school level necessitates a common yardstick like the SAT.
“The SATs are here to stay and ever more important,” he said.
Vol. 25, Issue 28, Pages 8-9