Study: Test-Preparation Courses Raise Scores Only Slightly

By Mary Ann Zehr — April 04, 2001

Students and their parents shouldn’t expect scores on college-entrance exams to improve significantly through test-preparation courses or tutors, a study suggests.

The study, published in the winter issue of Chance, a magazine of the American Statistical Association in Alexandria, Va., calls into question some companies’ claims that their coaching raises students’ scores by an average of more than 100 points on the SAT, for example, said Derek C. Briggs, the study’s author.

Mr. Briggs, a doctoral student in education at the University of California, Berkeley, found that while students who have taken either the SAT or the ACT once and want to take it again can raise their scores with special test preparation, they likely will do so by only a very small amount.

“ ‘A gain is different from an effect’ is essentially the message” of the study, Mr. Briggs said in an interview last week. He challenged test-preparation companies to conduct more rigorous analyses of the effects of their services on test-takers’ scores than they have so far.

Seppy Basili, the vice president for learning and assessment for a leading test-preparation company, Kaplan Inc., based in New York City, criticized Mr. Briggs’ study for not drawing a distinction between comprehensive, time-intensive courses and short-term sessions that may last for less than a day.

“You can’t really lump all test preparation together,” Mr. Basili said. “We don’t feel his study applies to Kaplan students or comprehensive programs like ours.”

From Tutors to Videos

The study analyzes the scores of students who took the PSAT—the preliminary test that serves as practice for the SAT—and then took either the SAT or the ACT. It compares the scores of students who participated in test preparation with those of test-takers who did not, taking into consideration various student characteristics, such as family income level.

Mr. Briggs based his analysis on information from a database of students called the National Education Longitudinal Study of 1988, which tracks a nationally representative sample of 16,500 U.S. students from the 8th grade through high school and beyond.

The study used responses to a question included in the survey about whether students had prepared for the SAT or the ACT by taking a special course in school, taking a course offered by a commercial service, receiving private one-on-one tutoring, or studying from test-preparation books, videotapes, or computer programs.

The type of test preparation students choose may have an impact on their scores, the study found. For example, students with private tutors improved their scores on the math section of the SAT, which is scored on an 800-point scale, by an average of 19 points more than students who didn’t have private tutors. A commercial class had a similar effect. By contrast, preparing for the SAT with a video showed no effect on math scores, and the verbal scores of students who had used videos actually went down.

Mr. Briggs includes a caveat even with those findings—the fact that students who participate in test preparation for the SAT or the ACT tend to be more affluent, more motivated, and generally more academically ready to take the tests than students who do not.

“This pattern of differences suggests that an analysis restricted to test-score changes will overestimate the effect of coaching,” he writes.

After controlling for such differences, Mr. Briggs concludes that the average test-preparation boost on the math section of the SAT is 14 to 15 points, and 6 to 8 points on the verbal section, which also uses an 800-point scale. He adds that the effect of test preparation on students’ ACT scores is similar.
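To make the “gain versus effect” distinction concrete, the sketch below simulates the kind of selection bias Mr. Briggs describes. The numbers and the single “advantage” covariate are invented for illustration, not taken from the study; the point is only that a raw gain comparison can overstate a coaching effect when coached students are already better positioned, while adjusting for the covariate pulls the estimate back toward the truth.

```python
# Hypothetical simulation (not from the study). "Advantage" stands in
# for family income, motivation, and academic readiness, which affect
# both who gets coaching and how much scores rise anyway.
import numpy as np

rng = np.random.default_rng(0)
n = 10_000

advantage = rng.normal(0.0, 1.0, n)

# More-advantaged students are more likely to get coaching.
coached = (rng.normal(0.0, 1.0, n) + advantage > 0.5).astype(float)

# True coaching effect is 15 points; advantage also raises gains directly.
true_effect = 15.0
gain = true_effect * coached + 30.0 * advantage + rng.normal(0.0, 20.0, n)

# Naive "gain" comparison: coached students' mean gain minus uncoached.
naive = gain[coached == 1].mean() - gain[coached == 0].mean()

# Adjusted "effect": regress gain on coaching plus the covariate.
X = np.column_stack([np.ones(n), coached, advantage])
beta, *_ = np.linalg.lstsq(X, gain, rcond=None)

print(f"naive gain difference: {naive:6.1f} points")   # far above 15
print(f"adjusted effect:       {beta[1]:6.1f} points") # close to 15
```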

Mr. Basili of Kaplan said that Mr. Briggs’ findings don’t contradict claims that his company has made. A 1995 study commissioned by Kaplan found that students who had spent 36 hours in a Kaplan course averaged a 120-point improvement on the SAT over what they had scored on the PSAT, he said.

Mr. Briggs said that such studies by commercial companies are based on surveys of their own students and lack the important ingredient of a control group of students who didn’t participate in test preparation.

Kaplan and other commercial companies haven’t conducted studies with a control group of students in part because almost all students now use some kind of test-preparation services and it’s difficult to find any who don’t, said Mr. Basili.

He added that the data used in Mr. Briggs’ study was collected at a time—in the early 1990s—when test-preparation services were less popular than they are now and students also were less likely to admit they used them.

Stephen P. Klein, a senior research scientist who specializes in educational assessment for the RAND Corp., a think tank based in Santa Monica, Calif., characterized Mr. Briggs’ study as an “important contribution” because it was produced by an independent party and “adjusts for a lot of factors that are important to adjust for.”

A version of this article appeared in the April 04, 2001 edition of Education Week as Study: Test-Preparation Courses Raise Scores Only Slightly
