Driven by the federal No Child Left Behind law, state exams have proliferated at the high school level in recent years. And, as part of the growing effort to ensure that students graduate from high school “college and career ready,” more and more states are taking care to align those tests with the academic expectations of colleges and universities.
So, if that’s the case, why should students have to take college-entrance exams on top of state tests? Wouldn’t it make more sense if universities and colleges could use those same tests to predict which students are likely to succeed on their campuses and which are likely to fail?
In a study published in the summer issue of Educational Measurement: Issues and Practice, a trio of researchers—Adriana D. Cimetta and Joel R. Levin of the University of Arizona, and Jerome V. D’Agostino of Ohio State University—tested exactly that proposition with Arizona’s state tests, known as the Arizona Instrument to Measure Standards, or AIMS. In particular, they zeroed in on results from the 1999 and 2000 test administrations, comparing them with data for the SAT I verbal and math exams that students generally take in their junior or senior years of high school, and tracking students as they moved on to state colleges and universities. What the researchers aimed to find out was whether the state exams could be just as useful as more traditional college-entrance exams at predicting students’ grade point averages in their freshman year of college.
You might think, as I did, that the state exams would be less effective at predicting college performance because they test more basic material, but you would be only partly right. The results were mixed. For the 1999 cohort of students, the AIMS test did not do as well as the SAT at adding predictive value to students’ high school GPAs, which, on their own, are considered pretty good indicators of college performance. For the 2000 test-taking group, however, the state test predicted students’ freshman-year grades just as well as the SAT did. (We should note here that passing the state test did not become a requirement for Arizona students until the 2000 administration.)
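For readers curious about what “adding predictive value” means in practice, here is a minimal, purely illustrative sketch. It is not the study’s analysis; the data are fabricated and the variable names (hs_gpa, test_score, college_gpa) are hypothetical. It simply shows the general idea of comparing how well freshman GPA is predicted by high school GPA alone versus high school GPA plus a test score.

```python
# Illustrative only: a rough sketch of "incremental predictive validity,"
# i.e., how much a test score improves prediction of freshman GPA beyond
# high school GPA alone. All data below are synthetic, not the study's.
import numpy as np

rng = np.random.default_rng(0)
n = 1_000

# Fabricated, correlated predictors and outcome (for demonstration only).
hs_gpa = rng.normal(3.0, 0.5, n)
test_score = 0.6 * hs_gpa + rng.normal(0, 0.4, n)
college_gpa = 0.5 * hs_gpa + 0.3 * test_score + rng.normal(0, 0.5, n)

def r_squared(X, y):
    """R^2 from an ordinary least-squares fit with an intercept."""
    X = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return 1 - resid.var() / y.var()

r2_gpa_only = r_squared(hs_gpa.reshape(-1, 1), college_gpa)
r2_gpa_plus_test = r_squared(np.column_stack([hs_gpa, test_score]), college_gpa)

# The gap between the two R^2 values is the test's incremental contribution.
print(f"HS GPA alone:        R^2 = {r2_gpa_only:.3f}")
print(f"HS GPA + test score: R^2 = {r2_gpa_plus_test:.3f}")
print(f"Increment:           {r2_gpa_plus_test - r2_gpa_only:.3f}")
```

In broad strokes, this is the kind of comparison the researchers made for each cohort, asking whether the AIMS test or the SAT contributed more once high school GPA was already in the picture.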
There also were few—if any—differences between the two exams in predicting how specific racial and ethnic subgroups of students would do in college.
There are all kinds of practical reasons why higher-education institutions might prefer to stick with conventional college-entrance exams, the authors note. “Nonetheless,” they add, “we also generated considerable evidence to support the notion that state tests (at least in one state) are aligned enough with college expectations to render them useful as indicators of college readiness.”
This team of researchers isn’t alone. In 2006, a similar but broader study of the Connecticut Academic Performance Test, or CAPT, also found that the state test was pretty good at forecasting college success.
It’s worth noting, too, that these findings come at a time when growing numbers of colleges are already abandoning the SAT or the ACT as a requirement for admission. The latest tally from FairTest, a Massachusetts-based group that tracks these things, puts the number of four-year schools that don’t require the SAT or the ACT at more than 830.
From the universities’ perspective, of course, a national test, such as the SAT, would be fairer to applicants, who hail from all corners of the country. On the other hand, students could clearly save some time and a few bucks by not having to take traditional college-entrance exams—and that’s not counting all those expensive SAT-prep classes.
Policymakers could also consider doing the reverse: Why not use the SAT as a state exam? That idea may not be as far-fetched as it seems at first blush. According to this report, both Colorado and Maine have already integrated either the SAT or the ACT into their state testing programs.