Low-Key ACT Avoids Uproar On College Tests
Just off the Interstate here, in an ordinary brick building partially hidden by groves of Japanese maples and crab apple trees, the headquarters of one of the nation's college-admissions-test leaders spreads across a grassy campus draped in spring shadows.
It's the type of unassuming home befitting ACT Inc., whose low-key style has helped transform this once small, egalitarian admissions-testing alternative into a national testing powerhouse and, more recently, a player in state high school assessments for accountability.
The place has an academic air. Test-writers work on exams in secured rooms. Researchers and analysts go about their bookish business in a maze of cubicles. With about 1,000 employees, ACT Inc., a nonprofit organization, is the largest private employer in Iowa City.
Nine hundred miles from the skyscrapers of New York City—where its better-known competitor, the College Board, which owns the SAT, has its headquarters amid the valuable real estate of Columbus Avenue in Manhattan—the ACT has in many respects avoided the public scrutiny directed at the SAT.
This university town of about 50,000 in the rolling prairie of eastern Iowa, home to the University of Iowa, also is a long way from the prestigious Ivy League town of Princeton, N.J., home of the Educational Testing Service, which administers the SAT for the College Board.
But even as the SAT holds the leading role—both savoring the attention and sweating it out as the main attraction in the admissions-testing world—the ACT has been anything but a backup singer.
Some 1 million students took the ACT last year, the highest number in the company's 42-year history and 250,000 more than in 1990.
And while the SAT, taken by 1.3 million students last year, is still the test most students take on the East and West coasts, a majority of colleges located in the vast middle swath of the country use the ACT as their primary admissions test. More students take the ACT than the SAT in 26 states.
Peter Sacks, the author of Standardized Minds: The High Price of America's Testing Culture and What We Can Do To Change It, published this year, says the ACT enjoys its relatively low profile.
"The ACT has relished its role as this quiet, serious alternative to the big, bad SAT behemoth," Mr. Sacks said. "They like the role of number two in the industry. They prefer to have the College Board take all the heat."
Richard C. Atkinson, the president of the University of California system, dropped a bombshell on the education world in February when he declared that an overemphasis on college-entrance exams had led to "the educational equivalent of a nuclear arms race."
In a speech to hundreds of educators gathered for a higher education conference in Washington, he recommended eliminating SAT I scores as a requirement for admissions to the 10-campus UC system. A slew of ensuing media coverage once again put the SAT under the microscope. ("UC President Pitches Plan To End Use of SAT in Admissions," Feb. 28, 2001.)
Throughout those discussions on what role college-entrance-test scores should play in admissions decisions, most of the attention centered on the SAT. The ACT was rarely mentioned.
But in interviews here recently, ACT officials said their test meets many of the standards Mr. Atkinson outlined for what he says would constitute a better exam: an assessment tool more closely tied to the actual high school curriculum.
In his proposal, Mr. Atkinson recommended that the UC system move toward a more "holistic" admissions process. He proposed that UC use only tests like the SAT II that assess the mastery of specific subjects—like writing or biology—until standardized tests could be developed that were more directly tied to college-preparatory courses.
Gaston Caperton, the College Board's president, later released a statement in which he agreed that too much emphasis was placed on admissions tests. But he asserted that the SAT "is the only common yardstick in an era of grade inflation."
Mr. Atkinson declined recently to comment about the ACT. But Richard L. Ferguson, ACT Inc.'s president, said in an interview that "much of what [Mr. Atkinson] was describing is what the ACT has been doing for the last 40 years. Our whole philosophy from day one was to measure what is taught in high school, and what college professors say are required skills."
ACT accomplishes that, he said, by regularly sending curriculum surveys to high schools and colleges. Hundreds of high school teachers and college professors trained by ACT submit potential test questions to the testing service, where they go through a thorough review by testing experts.
"Our message is we're really measuring what the classes are teaching," said Mr. Ferguson, a former high school math teacher who has been with ACT since 1972 and has served as its president since 1988. "You would be hard pressed if you took a copy of the ACT math test, for example, and could not fit that into exactly what is being taught in the classroom."
Mr. Ferguson sees that as one of the most significant differences between the ACT and the SAT. "What we're measuring," he said, "are the important skills that the schools are saying youngsters need to have."
Before 1959, when the American College Testing program was founded, the SAT was the only national college-admissions test. What was then called the Scholastic Aptitude Test became the primary measure for students seeking admission into the nation's elite colleges and universities, mainly those on the East Coast.
ACT's founders, University of Iowa professor E.F. Lindquist and Ted McCarrel, the university's dean of admissions, recognized the need for another admissions test for the remainder of American colleges, which often admitted students based mainly on family connections or a hodgepodge of entrance exams offered by individual institutions.
A new national test, the founders hoped, would help provide more detailed information to colleges, spur improvements in the high school curriculum, and help students make more informed decisions about postsecondary education.
In a 1958 speech he gave to an Educational Testing Service conference in New York City just a year before ACT was formed, Mr. Lindquist, a giant in the world of assessment who also developed the Iowa Tests of Basic Skills and several other tests, broadly touched on the themes for a better admissions exam.
"The right kind of tests will therefore make the high schools more keenly aware of their own responsibilities and shortcomings," he said, "and at the same time will give them positive aid in meeting these responsibilities, by drawing their attention to broad areas or aspects of achievement most in need of improvement.
"I need hardly point out," he continued, "that college-entrance examinations of the type generally regarded as intelligence tests or scholastic aptitude tests ... are almost wholly useless for these purposes, as they are for motivating the individual students."
Presaging a debate that has become more intense over the past four decades, Mr. Lindquist also warned the conference-goers about the potential for an overreliance on admissions tests because of their "impersonal and objective basis for making unpopular decisions about the applicants," and the likelihood of "superficial cramming" and "bad coaching practices."

The American College Testing program changed its name to ACT Inc. in 1996.
The ACT is a three-hour multiple-choice test that assesses critical reasoning and higher-order thinking skills in English, math, reading, and science. Students receive a score on a scale of 1 to 36 in each subject and a composite score.
Last year, the average national score was 21 for the fourth consecutive year. White and Asian-American males had the highest composite score, averaging 21.9. African-American men and women had the lowest scores, with average scores of 16.7 and 17.2, respectively.
While the college-admissions exam receives the most attention, officials here are quick to emphasize that the exam is just one part of ACT's programs. A "Standards for Transition" program called Explore begins in 8th grade, when students are tested on their academic progress.
High school sophomores can take ACT's Plan, a midpoint review of students' progress toward their educational and career goals. Students receive detailed feedback about how their scores in each subject area coincide with their skills. Ultimately, ACT officials say, that information can be valuable in helping students better understand what is required of them if they want to pursue college or other types of postsecondary study.
In 1989, ACT officials revised their college-entrance tests to match changes in college-prep high school curricula. And as more states align their curricula with state standards that are increasingly being measured through statewide assessments, the ACT has been used in some states along with state exams as one type of assessment tool. ACT officials agreed to become more involved in high school assessment under the condition that their tests not be used as high-stakes exams that determine whether students will graduate.
For the first time, for example, some 120,000 high school juniors in Illinois this year took the ACT, along with the Prairie State Achievement Examination in writing, science, and social studies, as part of a new state testing program. In Colorado, as part of an education improvement effort supported by Gov. Bill Owens, about 49,000 students took the ACT in April.
And in Oklahoma, the state regents for higher education are spending about $750,000 a year on ACT's educational planning and assessment system. Last year, 438 of the state's more than 500 school districts volunteered to give the 8th grade Explore and 10th grade Plan assessments. ("K-12 and College Expectations Often Fail To Mesh," May 9, 2001.)
While the ACT has not attracted as much public scrutiny as the SAT, it is not immune to similar criticisms that the test has race, class, and gender biases resulting in disparate scores that reflect social and economic inequalities, not differences in academic ability.
Mr. Sacks, the testing expert and author, contends that the ACT's claim of offering a curriculum-based test, rather than an aptitude test, is "more superficial than real." The test may do a better job phrasing questions in a way that makes them seem to be more about subject content, he said, but the ACT, like most tests that assess achievement, has roots in intelligence testing.
John Katzman, the founder and chief executive officer of the Princeton Review, a New York City-based test-preparation company, said that changes in the ACT test in 1989 made the assessment more like what he refers to as the "pseudo-aptitude" SAT test.
The ACT—both the test and the organization—generally is viewed, he said, as more straightforward and fairer than the ETS in its dealings with students and educators.
"ETS has the rare combination of arrogance and incompetence that allows them to make bad moves very publicly and repeatedly," Mr. Katzman asserted. "The public relations strategy of the ACT is not to have a public relations strategy."
Officials of other firms in the test-prep world say students generally feel more comfortable with the ACT.
"Our experience with the ACT is that the test generally provokes less anxiety than does the SAT," said Robert Margolis, a partner with PrepMatters Inc., a Bethesda, Md.-based tutoring service that provides students with preparation for the SAT and the ACT. "The SAT has dominated the East Coast high school and college market for several years and has an accumulated mythology that the ACT so far lacks."
Mr. Margolis, whose students are middle- to upper-income youths, also sees the ACT as less deceptive than the SAT.
"The ACT tends to be about what people are used to dealing with in high school," he said. "There is a feeling with the SAT that you have to know all the tricks. The SAT has very little to do with what the average students are used to dealing with in their curriculum."
Seppy Basil, the vice president of learning and assessment at Kaplan Inc., a major test-prep company based in New York City, said that while both major college-admissions tests are coachable, tutors spend more time on testing strategy when working with students for the SAT.
Mr. Basil also said bilingual and English-as-a-second-language students feel more comfortable with how the ACT tests language skills. The ACT does not have sections where students are asked to complete sentences, and there is no vocabulary-analogy section.
Some students also prefer the ACT because it allows them to send their best scores to colleges. Students can send their best SAT math and verbal scores, but college- admissions officers also can view all scores received by a student.
Alan J. Tuchtengan, the director of admissions at the University of Wisconsin-River Falls, a 5,800-student school in the 26-campus University of Wisconsin system, said the ACT provides college officials with a broad picture of applicants.
Being able to look at specific subscores in geometry and algebra achievement on the math exam, for example, makes the ACT an effective advising tool, he said.
"I really like the ACT better. It helps us look more holistically at a student," said Mr. Tuchtengan, who has worked in admissions at various institutions for 20 years. "It just gives us more information. Students feel more comfortable with it, and guidance counselors like it."
As with many Midwestern universities, most of the students applying to the University of Wisconsin-River Falls take the ACT. The university began accepting the SAT in place of the ACT only about 1½ years ago.
Sarah Lindsey, South Carolina's deputy superintendent for public schools, said that while most students in that state take the SAT, education officials have been encouraging them to take the ACT also. Five years ago, 4,648 students took the ACT in South Carolina. By last year, that number had jumped to about 9,000.
Sometimes, students who are academically able but who struggle with standardized testing, Ms. Lindsey said, have more success on the ACT. "They are much more likely to use what they have learned in the classroom on the exam because it is content-based," she said.
But College Board officials say there is little difference between the SAT and the ACT. "They both measure important skills that are related to college success," said Wayne Camera, the College Board's vice president of research and development. Mr. Camera said research shows that there is a near identical correlation between students' scores on the SAT and the ACT.
The perception that the ACT is a test students are more comfortable with because it more closely measures what students cover in high school, he said, may come in part from the SAT's inclusion of more problem- solving questions that ask students to use analytical or higher reasoning skills to apply knowledge. But he argued that such skills are an important measure of a student's ability that can't be completely captured in a content test that simply measures what students have learned in the classroom.
Open-ended math and verbal questions that ask students to write out and explain their answers, rather than fill in a correct answer in a multiple-choice format, he said, also provide a type of measurement a solely multiple-choice test like the ACT can't provide.
Mr. Camera called the SAT II, which students can take in 23 different subject areas such as writing, history, biology, and social studies, the closest standardized tests can come to an end-of-course exam closely aligned to high school standards. About 20 percent of students who take the SAT take those subject-area tests.
How Good a Predictor?
Christina Perez, an official with FairTest, a Cambridge, Mass.-based watchdog group that pushes for universities to make admissions tests optional, said there is some validity to ACT officials' contention that their test measures what students are learning in high school.
But she maintained that the ACT contains language and format biases, and she argued that the exam is a poor indicator of students' ability to perform well in college. "The ACT is not a better option to the SAT," Ms. Perez said. "It adds very little to the application process."
According to a FairTest survey, at least 383 colleges nationwide do not use the SAT I or ACT scores to choose significant portions of their entering classes. Those institutions include selective schools such as the University of Texas at Austin; Mount Holyoke College in South Hadley, Mass.; Bates College in Lewiston, Maine; and Bowdoin College in Brunswick, Maine.
"A lot of schools are realizing these tests are acting as gatekeepers in the admissions process," Ms. Perez said, "and are realizing the SAT and ACT are oversold and limiting students' access."
Cynthia Schmeiser, the vice president of the development division at ACT Inc., insists that test scores' role in college-admissions decisions has been exaggerated. "The debate has been misdirected and misguided in that it is really focused on selective admissions, which is an issue for only a small number of elite colleges," she said.
ACT officials don't deny that the socioeconomic levels of students influence the broader array of opportunities and skills a student may bring to a test. But they emphasize the importance of making sure students in all schools have access to a rigorous curriculum.
Since 1990, there has been a significant increase in the number of ACT test-takers who have enrolled in upper-level high school classes. Ten years ago, fewer than half the graduates reported taking a core curriculum of at least four years of English and three years each of math, social sciences, and natural sciences. Last year, 63 percent of students reported taking those classes.
And statistics indicate that following a core curriculum pays off for students when it comes to success on the ACT. Students with four or more years of high school math, for example, have an average ACT math score of 22.6, compared with 16.7 for students with two years or less.
"The real debate needs to be about helping students make successful transitions to college," Ms. Schmeiser said. "The emotional part of this debate criticizes test scores that exclude students from college. But in most schools, testing is used in positive ways. The debate is inside out because it focuses on the exceptional cases."
Vol. 20, Issue 40, Pages 1,12-13