Study Affirms Bates College Officials’ Hunch In Dropping Requirement for Admissions Tests
By Mark Pitsch
The officials acknowledged that they were taking a gamble.
For decades, the S.A.T. and the American College Testing Program’s A.C.T. have been used by college-admissions officials as a “common yardstick” to determine whether students from New England boarding schools, one-room schools, and everywhere in between are ready for college-level work.
But the Bates officials did not jump into the uncharted waters precipitously. In true academic fashion, they commissioned a study to determine if the policy resulted in a change in admissions decisions, and decided not to make the policy permanent until the results of the study were in.
A year and a half ago, the study was completed, and its findings confirmed the hopes of the policy’s backers: Not only did students who had not taken the S.A.T. succeed academically as well as those who had, the officials also found that the optional-test policy resulted in a more diverse student body. “We have to believe that students are right when they tell us that they’re better students than their test scores might indicate,” says William C. Hiss, the dean of admissions and financial aid at Bates.
“As a nation,” he adds, “we’re caught on a hook of defining intelligence in certain ways and then letting our colleges and universities be defined in the same narrow ways.”
Although researchers at other colleges and at the College Board, which sponsors the S.A.T., maintain that the test remains a valuable predictor of college success, Bates officials say their study casts doubt on its usefulness.
As a result, the faculty last year voted 94 to 1 to make all standardized admissions and achievement tests optional. The 392 freshmen who started their collegiate careers at Bates this fall were the first who could choose whether Bates admissions officials would consider their performance on any standardized test, not just the S.A.T.
Those students will be studied, in turn, to determine how those who were admitted without having reported any test scores stack up academically against those students who wanted their scores to be used in their admissions applications.
‘Scaring Off Great Kids’
According to the National Center on Fair & Open Testing, or FairTest, some 107 colleges and universities, or state systems, do not require the S.A.T. or A.C.T.
Bates, a small, private liberal-arts college, was one of the first to reduce its reliance on the tests.
Under its 1984 policy, students were no longer required to submit S.A.T. scores. But the admissions office required the submission of three of the College Board’s Achievement Tests, which measure performance in specific subjects, or results of the A.C.T. Students could voluntarily submit S.A.T. scores.
Those who declined, according to Mr. Hiss, were evaluated on a personal interview, recommendations, high-school performance, and other qualities that might not show up on an academic record.
“What Bates [was] trying to do [was] find a way--and we make no claims to have any magic wands--to allow the youngster who may have these other skills, these other intelligences,” an opportunity to express them, Mr. Hiss says.
“We came to feel we were scaring off more great kids that would do well at Bates by requiring tests than we were in finding great kids by using the tests,” he says.
The study, which was conducted by Bates faculty members and students, suggests that they were right.
Comparing the academic performance of the 1,379 students who submitted their S.A.T. scores over the five-year period with that of the 417 who declined to submit their scores, the study found that the optional-S.A.T. policy had “no visible disadvantageous consequence.”
Although the average S.A.T. scores of submitters and non-submitters differed by 160 points out of a possible 1,600, the difference in cumulative first-year grade-point averages was very small, the study found.
Submitters averaged a freshman G.P.A. of 2.89 on a 4.0 scale, compared with an average 2.84 G.P.A. for non-submitters, according to the study.
Moreover, between 92 percent and 99 percent of Bates students over the five years were found to be in good academic standing, and the percentage of non-submitters in that category exceeded that of submitters.
Only 1 of the non-submitters, compared with 14--or 1 percent--of the submitters, had been dismissed for academic reasons, the study found.
Minorities’ Applications Doubled
Mr. Hiss also notes with pride that applications from minority students doubled during the years Bates had an optional S.A.T. policy, and that the trend has continued since the college dropped all testing requirements.
Bates received 3,650 applications for the 392 available spots this semester, an increase of more than 7 percent over last year, he notes. Of the 392, 110, or 28 percent, were non-submitters.
Of the non-submitters, 65 percent were women, and about two-thirds of the blacks and one-half of the Latino enrollees chose not to submit their test scores, he says.
“It speaks volumes to the real or perceived bias against women and minorities,” Mr. Hiss says, adding that “the increase in applications in a down market is worthy of note.”
The admissions officer acknowledges that making all standardized tests optional for admissions has become an impressive marketing tool.
It sends the right signals to students, he says, telling them that Bates believes they are more than their test scores and that it doubts that the tests are fair to women and minorities.
Although Mr. Hiss has found encouragement in the research conducted on optional testing at his school, other higher-education officials say that standardized tests are valuable tools for predicting a student’s first-year college performance.
Susan Murphy, the dean of admissions and financial aid at Cornell University, says a study she conducted found that using the S.A.T. along with high-school rank was nearly twice as effective as using high school grades alone in predicting freshman performance.
“When you can see that kind of difference... that’s an important factor for us,” Ms. Murphy says.
But she acknowledges that the test and grades together did not predict much of the performance of students in one part of the study, from the university’s school of human ecology.
“Is that sufficient?” she asks. “That’s a tough question to answer. We’re not looking for a magic number.”
Ms. Murphy points out, however, that institutions like hers that use standardized tests to make admissions decisions also take into consideration such things as personal characteristics, recommendations, and talents that would not be recognized by such tests.
“It’s incumbent as we do this to look at a number of other factors,” she says. “We don’t just plunk kids into a computer with their class ranks and S.A.T. scores.”
The results of the Cornell study are similar to those found in research at the College Board.
That study acknowledges that simply using a high-school record as a predictor of first-year success is better than using solely the S.A.T.
But the study notes that combining those two factors--“the way [the S.A.T.] was intended to be used,” according to the board’s executive director for research and development, Robert Cameron--is more effective than either used alone.
Mr. Cameron also notes that more than 150 institutions each year--most of them small liberal-arts schools--use a College Board validity service to test their use of the S.A.T. as an admissions criterion.
Mr. Cameron discounts the findings of the Bates study by noting that most institutions similar to Bates do require submission of standardized-test scores. Since most students apply to at least three colleges, he says, those applying to Bates know how they would compete at similar schools.
“If all colleges stopped using admission tests, and students stopped taking them, there would soon be a return to a test requirement,” Mr. Cameron predicts.
But Mr. Hiss says the policy is here to stay.
“I have yet to have my first comment from students, parents, or counselors saying [the policy] is wrong, that it’s ethically unfair to measure some kids one way and other kids another,” he says.
A version of this article appeared in the November 27, 1991 edition of Education Week as Study Affirms Bates College Officials’ Hunch In Dropping Requirement for Admissions Tests.