Students and their parents shouldn’t expect scores on college-entrance exams to improve significantly through test-preparation courses or tutors, a study suggests.
The study, published in the winter issue of Chance, a magazine of the American Statistical Association in Alexandria, Va., calls into question some companies' claims about their coaching services, such as the claim that coaching raises students' SAT scores by an average of more than 100 points, said Derek C. Briggs, the study's author.
Mr. Briggs, a doctoral student in education at the University of California, Berkeley, found that students who have taken either the SAT or the ACT once and want to take it again can raise their scores with special test preparation, but likely by only a very small amount.
“ ‘A gain is different from an effect’ is essentially the message” of the study, Mr. Briggs said in an interview last week. He challenged test-preparation companies to conduct more rigorous analyses of the effects of their services on test-takers’ scores than they have so far.
Seppy Basili, the vice president for learning and assessment for a leading test-preparation company, Kaplan Inc., based in New York City, criticized Mr. Briggs’ study for not drawing a distinction between comprehensive, time-intensive courses and short-term sessions that may last for less than a day.
“You can’t really lump all test preparation together,” Mr. Basili said. “We don’t feel his study applies to Kaplan students or comprehensive programs like ours.”
From Tutors to Videos
The study analyzes the scores of students who took the PSAT—the preliminary test that serves as practice for the SAT—and then took either the SAT or the ACT. It compares the scores of students who participated in test preparation with those of test-takers who did not, taking into account student characteristics such as family income.
Mr. Briggs based his analysis on the National Education Longitudinal Study of 1988, a database that tracks a representative national sample of 16,500 U.S. students from the 8th grade through high school and beyond.
The study used responses to a survey question about whether students had prepared for the SAT or the ACT by taking a special course in school, taking a course offered by a commercial service, receiving private one-on-one tutoring, or studying from test-preparation books, videotapes, or computer programs.
The type of test preparation students choose may affect their scores, the study found. Students with private tutors, for example, improved their scores on the math section of the SAT by an average of 19 points more, on an 800-point scale, than students without tutors; a commercial class had a similar effect. By contrast, preparing for the SAT with a video showed no effect on math scores, and the verbal scores of students who used videos actually declined.
Even those findings come with a caveat, Mr. Briggs notes: students who participate in test preparation for the SAT or the ACT tend to be more affluent, more motivated, and generally more academically prepared than students who do not.
“This pattern of differences suggests that an analysis restricted to test-score changes will overestimate the effect of coaching,” he writes.
After controlling for such differences, Mr. Briggs concludes that the average test-preparation boost is 14 to 15 points on the math section of the SAT and 6 to 8 points on the verbal section, which also uses an 800-point scale. The effect of test preparation on students' ACT scores is similar, he adds.
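The kind of adjustment the study describes can be sketched, in spirit, as a regression that compares coached and uncoached students while holding a background variable such as family income constant. The short Python sketch below uses entirely invented data (only the roughly 15-point effect size is borrowed from the study's adjusted math estimate) and illustrates the general technique, not Mr. Briggs' actual model:

    import numpy as np

    rng = np.random.default_rng(0)
    n = 1000

    # Invented data: family income (standardized) drives both the decision
    # to get coaching and the SAT math score itself.
    income = rng.normal(0, 1, n)
    coached = (rng.random(n) < 1 / (1 + np.exp(-income))).astype(float)
    true_effect = 15.0  # points; roughly the study's adjusted math estimate
    score = 500 + 40 * income + true_effect * coached + rng.normal(0, 30, n)

    # Naive comparison: coached students look far better off than they are,
    # because coaching is confounded with income.
    naive = score[coached == 1].mean() - score[coached == 0].mean()

    # Adjusted comparison: regress score on coaching and income together,
    # so the coaching coefficient holds income constant.
    X = np.column_stack([np.ones(n), coached, income])
    beta, *_ = np.linalg.lstsq(X, score, rcond=None)

    print(f"naive coached-minus-uncoached gap: {naive:.1f} points")
    print(f"income-adjusted coaching estimate: {beta[1]:.1f} points")

Run on invented data of this shape, the naive gap comes out far larger than the adjusted estimate, which is exactly the overestimation Mr. Briggs warns about.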
Mr. Basili of Kaplan said that Mr. Briggs’ findings don’t contradict claims that his company has made. A 1995 study commissioned by Kaplan found that students who had spent 36 hours in a Kaplan course averaged a 120-point improvement on the SAT over what they had scored on the PSAT, he said.
Mr. Briggs said that such studies by commercial companies are based on surveys of their own students and lack a crucial ingredient: a control group of students who didn't participate in test preparation.
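The arithmetic behind that objection is simple: with a control group, the estimated effect is the difference between two gains, not the coached students' raw gain itself. A brief illustration (Kaplan's 120-point figure is from the company's study; the 100-point control figure is invented for the example):

    # Invented illustration of "a gain is different from an effect."
    coached_gain = 120    # PSAT-to-SAT rise Kaplan reported for its students
    uncoached_gain = 100  # hypothetical rise a comparable uncoached group
                          # might show anyway, from retesting and growth

    # Without a control group, all 120 points look like coaching; with one,
    # the estimated effect is only the difference between the two gains.
    print(f"raw gain:         {coached_gain} points")
    print(f"estimated effect: {coached_gain - uncoached_gain} points")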
Kaplan and other commercial companies haven't conducted studies with a control group in part because almost all students now use some kind of test-preparation service, making it difficult to find any who don't, Mr. Basili said.
He added that the data used in Mr. Briggs' study were collected in the early 1990s, when test-preparation services were less popular than they are now and students were also less likely to admit to using them.
Stephen P. Klein, a senior research scientist who specializes in educational assessment for the RAND Corp., a think tank based in Santa Monica, Calif., characterized Mr. Briggs’ study as an “important contribution” because it was produced by an independent party and “adjusts for a lot of factors that are important to adjust for.”