In an effort to raise high school graduation standards, some states are incorporating college-admissions or -placement tests into their testing programs. But a new analysis urges the states to proceed with caution.
The analysis by the Washington-based Achieve of more than 2,000 questions from admissions and placement exams found that the tests vary considerably from one another and may not fully measure the knowledge and skills necessary for college.
The report, “Aligned Expectations? A Closer Look at College Admissions and Placement Tests,” is available from Achieve.
“What we found is that the tests that are out there, developed for very specific purposes, don’t fully or completely reflect the kinds of knowledge and skills that are being incorporated into state high school standards in math and English,” said Michael Cohen, the president of Achieve, a nonprofit group created by governors and business leaders to help states increase the rigor of their academic standards and tests.
“And if they don’t reflect those skills, but the tests are used anyway, and used for accountability,” he added, “then they will have the effect of narrowing the curriculum, and thereby reducing, or at least not improving, preparation for college.”
Comparisons of Rigor
The analysis, slated for release April 11, was conducted with the cooperation of the Iowa City, Iowa-based ACT, which administers both the ACT college-admissions test and the Compass placement test, and the New York City-based College Board, which sponsors the SAT college-admissions test and the Accuplacer placement test. Accuplacer and Compass are both computer-adaptive tests, meaning that the selection of questions varies based on a test-taker's previous answers. For the study, Achieve examined a sample of questions from each test.
Both groups provided Achieve with access to their admissions and placement exams. Achieve also acquired placement tests from a number of other organizations and postsecondary institutions.
In addition to comparing the tests with one another, Achieve examined how well they measure the English and mathematics benchmarks set by the American Diploma Project, which are being used by 29 states to align high school standards, curricula, assessments, and accountability systems with the demands of college and work.
Based on its study of exams used for college admissions and placement, and of how they are used in K-12 testing systems, Achieve recommends that state policymakers consider a range of steps.
The study found that college-admissions tests in reading are more rigorous than college-placement tests, though the reading passages on placement tests, which favor informational texts over literary ones, more accurately reflect the kinds of reading students will encounter in college.
Both admissions and placement exams in math emphasize algebra, but they tend to favor pre-algebra and basic algebra rather than the more advanced concepts and skills needed for college readiness. Of the two types of exams, placement tests are more narrowly focused on algebra, while admissions tests are broader, measuring a range of other topics such as data analysis, statistics, and geometry, the study found.
In writing, both admissions and placement tests are more rigorous than most high school exams, according to the report, and generally reflect the kinds of writing students will be asked to do in college. Institution-developed placement tests were the strongest of the writing tests analyzed by Achieve.
More than 2 million students take the ACT or the SAT each year, and Colorado, Idaho, Illinois, Kentucky, Maine, and Michigan are now incorporating a college-admissions test into their state testing systems.
But while such tests do some things well, the study cautions that neither the ACT nor the SAT includes the full range of concepts and skills reflected in the American Diploma Project benchmarks and, increasingly, in state high school standards.
Achieve and two other Washington-based groups, the Thomas B. Fordham Foundation and the Education Trust, developed the ADP benchmarks.
The report recommends that states augment the ACT and the SAT with extra test questions or performance measures to ensure stronger alignment with state standards and to assess more advanced skills.
Other states are planning to use end-of-course tests to measure college readiness because they can be tied closely to the curriculum and to the courses that states require for high school graduation. But the report notes that for end-of-course tests to serve as an indicator of college readiness, they have to be given in higher-level courses, such as Algebra 2 and 11th- or 12th-grade English. Higher education should play a role in their development or review, the report argues.
Still other states are considering adding questions to existing high school tests so they better measure college readiness, or are making college-placement tests available for students to take voluntarily in high school.
But the report warns that placement tests should not be used as a substitute for building more comprehensive high school assessment systems. A majority of the placement tests Achieve reviewed focused narrowly on a subset of knowledge and skills and, in math and reading, reflected relatively low levels of rigor.
A version of this article appeared in the April 11, 2007 edition of Education Week as Caution in Use of College-Entry Tests Urged