Study: Test-Preparation Courses Raise Scores Only Slightly

By Mary Ann Zehr — April 04, 2001

Students and their parents shouldn’t expect scores on college-entrance exams to improve significantly through test-preparation courses or tutors, a study suggests.

The study, published in the winter issue of Chance, a magazine of the American Statistical Association in Alexandria, Va., calls into question claims by some companies that their coaching services raise students’ scores by an average of more than 100 points on the SAT, said Derek C. Briggs, the study’s author.

Mr. Briggs, a doctoral student in education at the University of California, Berkeley, found that students who have taken either the SAT or the ACT once and want to take it again can raise their scores with special test preparation, but likely by only a very small amount.

“ ‘A gain is different from an effect’ is essentially the message” of the study, Mr. Briggs said in an interview last week. He challenged test-preparation companies to conduct more rigorous analyses of the effects of their services on test-takers’ scores than they have so far.

Seppy Basili, the vice president for learning and assessment for a leading test-preparation company, Kaplan Inc., based in New York City, criticized Mr. Briggs’ study for not drawing a distinction between comprehensive, time-intensive courses and short-term sessions that may last for less than a day.

“You can’t really lump all test preparation together,” Mr. Basili said. “We don’t feel his study applies to Kaplan students or comprehensive programs like ours.”

From Tutors to Videos

The study analyzes the scores of students who took the PSAT—the preliminary test that serves as practice for the SAT—and then took either the SAT or the ACT. It compares the scores of students who participated in test preparation with those of test-takers who did not, taking into consideration various student characteristics, such as family income level.

Mr. Briggs based his analysis on information from a database of students called the National Education Longitudinal Study of 1988, which tracked a representative national sample of 16,500 U.S. students from the 8th grade through high school and beyond.

The study used responses to a question included in the survey about whether students had prepared for the SAT or the ACT by taking a special course in school, taking a course offered by a commercial service, receiving private one-on-one tutoring, or studying from test-preparation books, videotapes, or computer programs.

The type of test preparation students choose may have an impact on their scores, the study found. For example, students with private tutors improved their scores on the math section of the SAT by an average of 19 points more, on an 800-point scale, than students who didn’t have private tutors. A commercial class had a similar effect. By contrast, using a video to prepare for the SAT showed no effect on math scores, and the verbal scores of students who used videos actually declined.

Mr. Briggs includes a caveat even with those findings—the fact that students who participate in test preparation for the SAT or the ACT tend to be more affluent, more motivated, and generally more academically ready to take the tests than students who do not.

“This pattern of differences suggests that an analysis restricted to test-score changes will overestimate the effect of coaching,” he writes.

After controlling for such differences, Mr. Briggs concludes that the average test-preparation boost on the math section of the SAT is 14 to 15 points, and 6 to 8 points on the verbal section, which also uses an 800-point scale. He adds in the study that the effect of test preparation on students’ ACT scores is similar.
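To make the distinction between a raw score gain and an adjusted coaching effect concrete, here is a minimal sketch using simulated data; none of the numbers, variable names, or the simple regression model come from Mr. Briggs’ study. It shows how a naive comparison of score gains can overstate a coaching effect when coached students are, on average, more affluent, and how controlling for background characteristics shrinks the estimate toward the true value.

```python
# Illustrative sketch only -- not Mr. Briggs' actual model. Simulated data show
# why a raw gain comparison overstates a coaching "effect" when coached students
# differ systematically (here, in family income) from uncoached ones, and how a
# covariate-adjusted regression recovers a smaller estimate.
import numpy as np

rng = np.random.default_rng(0)
n = 10_000

# Simulated background: higher-income students are more likely to be coached.
income = rng.normal(0, 1, n)                          # standardized family income
coached = rng.random(n) < 1 / (1 + np.exp(-income))   # selection into coaching

# PSAT-like baseline and SAT-like follow-up math scores (800-point scale).
baseline = 500 + 60 * income + rng.normal(0, 40, n)
true_coaching_effect = 15                             # assumed for illustration
followup = (baseline + 20 + 30 * income
            + true_coaching_effect * coached + rng.normal(0, 40, n))

# Naive comparison: average gain of coached vs. uncoached students.
gain = followup - baseline
naive = gain[coached].mean() - gain[~coached].mean()

# Covariate-adjusted estimate: regress the follow-up score on coaching,
# baseline score, and income (ordinary least squares via lstsq).
X = np.column_stack([np.ones(n), coached, baseline, income])
beta, *_ = np.linalg.lstsq(X, followup, rcond=None)
adjusted = beta[1]

print(f"naive gain difference:     {naive:5.1f} points")
print(f"covariate-adjusted effect: {adjusted:5.1f} points (true value: {true_coaching_effect})")
```

In this toy setup the naive gain difference comes out several times larger than the assumed 15-point coaching effect, simply because coached students started out better positioned, which is the “a gain is different from an effect” point in miniature.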

Mr. Basili of Kaplan said that Mr. Briggs’ findings don’t contradict claims that his company has made. A 1995 study commissioned by Kaplan found that students who had spent 36 hours in a Kaplan course averaged a 120-point improvement on the SAT over what they had scored on the PSAT, he said.

Mr. Briggs said that such studies by commercial companies are based on surveys of their own students and lack the important ingredient of a control group of students who didn’t participate in test preparation.

Kaplan and other commercial companies haven’t conducted studies with a control group of students in part because almost all students now use some kind of test-preparation services and it’s difficult to find any who don’t, said Mr. Basili.

He added that the data used in Mr. Briggs’ study were collected in the early 1990s, when test-preparation services were less popular than they are now and students also were less likely to admit they used them.

Stephen P. Klein, a senior research scientist who specializes in educational assessment for the RAND Corp., a think tank based in Santa Monica, Calif., characterized Mr. Briggs’ study as an “important contribution” because it was produced by an independent party and “adjusts for a lot of factors that are important to adjust for.”

A version of this article appeared in the April 04, 2001 edition of Education Week as Study: Test-Preparation Courses Raise Scores Only Slightly
