When Andrew Renner sat down to take his state’s high school graduation exam this past spring, he had seen a copy of the writing test only once: the first time he failed the exam. A typical teenager, Renner did not write more than he believed necessary to pass the retake of the Arizona Instrument to Measure Standards. He failed the writing exam a second time.
Renner, 16 and a junior at McClintock High School in Tempe, Ariz., was in good company. Eighty percent of his classmates who retook the test last year failed.
Although he is sanguine about passing it this year and getting his diploma, Renner laments his lack of knowledge about what the test questions require of him. “They kept all the test questions pretty secret until the day of the test,” he says.
Around the nation, exams remain shrouded in mystery, with few people having access to test items, usually because the questions are used again in later administrations of the exams. But as states attach more serious consequences to student performance, the public clamors ever more loudly for the release of tests. Proponents of full disclosure argue that the tests should be freely available to ensure that the exams are fair and free of bias and that students are given the opportunity to learn the content being tested.
In Arizona, the state has released three sample writing items since its test was given in 1999 to help familiarize students, teachers, and parents with the test. Examples of student work, which are benchmarked to a passing or failing grade, accompany the questions.
That’s been insufficient for many, including the state’s largest newspaper. The Arizona Republic has sued the Arizona education department over the issue. The Phoenix paper argues that the public should be able to examine the fairness of the exact questions for which students and schools are being held accountable.
The newspaper has won the first court matchup, according to Deputy Managing Editor John C. D’Anna, but the education department filed an appeal last October. A decision is pending.
Other states have also been unwilling to release state tests in their entirety. Only Maine, Massachusetts, New Hampshire, New York, Ohio, and Texas release all state tests after every administration, according to a 50-state survey by Education Week. Seventeen states release some of their state tests, usually just the performance items. New Jersey and Oregon take a different approach and publish so-called predictive exams that are similar to the real test. Those allow students to find out before taking the exam if they are likely to pass or fail.
When it comes to so-called high-stakes tests, states are even less inclined to want publicity about the content. Of the 18 states where diplomas hinge on tests, only New York, Ohio, and Texas release the entire exit exam each year. Only three states decide whether to promote students to the next grade level based on test results, and none of the three, Louisiana, New Mexico, and North Carolina, unveils those tests.
‘Show Us the Tests’
Fairness is but one reason for test openness, says Mark D. Musick, the president of the Atlanta-based Southern Regional Education Board and the chairman of the National Assessment Governing Board, which oversees the National Assessment of Educational Progress. He argues that revealing entire versions of state exams would assuage the increasing complaints about testing and defuse some of the criticisms of what is seen as biased or excessively fact-heavy testing.
In a recent opinion essay in Education Week, he challenged states “to show us the tests” and wrote that “we may soon, in fact, come to think that it was bizarre not to make the tests public.”
Most states balk at demands such as Musick’s and argue that releasing entire exams would be too expensive. Arizona’s schools superintendent, Lisa Graham Keegan, estimates that releasing all her state’s exams would increase annual testing costs eight-fold. The state now spends about $1 million each year on administering its assessment.
But Jeffrey M. Nellhaus, the official in charge of the testing program in Massachusetts, which releases almost all its test items, says advance planning can significantly reduce the cost of releasing the questions.
Each time Massachusetts gives its assessment, he says, the state both field-tests new exam items and administers items on which students’ scores depend, meaning that an ample supply of new questions is always in the pipeline. Nellhaus estimates the cost of the procedure at 10 percent to 15 percent of the state’s $5 million annual program.
The rewards are enormous, he argues. “You can really deal with some of the criticism in a much more concrete way,” he says. “It’s a PR thing, and it helps schools to know what the test is really about.”
Concern about cost is not the only reason states do not make tests public, according to Michael H. Kean, the vice president for public and governmental affairs at CTB/McGraw-Hill, in Monterey, Calif., one of the country’s biggest commercial test-makers. If a state uses an off-the-shelf test, he notes, the questions are owned by the testing company, and the state is prohibited from publishing them.
In contrast, states that design test items from scratch are free to release them.
Some testing experts question the necessity of disseminating tests and instead emphasize greater oversight of the test-making process. “I am happy with [the release of] a couple of items” from an exam, says Lawrence M. Rudner, the director of the ERIC Clearinghouse on Assessment and Evaluation at the University of Maryland College Park.
Rudner argues for more oversight of assessment programs and assurances that the tests are technically sound. Guarding against culturally biased test items and measuring improvement from year to year are complicated issues that need close examination, Rudner says. He recommends that third parties, such as universities, provide independent reviews of exams to ensure their validity and overall soundness.
Releasing whole tests might also encourage students to hone their test-taking skills and memorize rather than learn the concepts that are being tested, Rudner suggests. “We are trying to teach concepts and not items,” he says.
Testing experts point to Texas as a place where such practices might be occurring. In their view, the state may have wound up with inflated test scores because students can continually practice on previously released exams without fully understanding the content.
For instance, the state’s 4th grade math tests often contain word problems with multiple-choice questions similar to the following: “If six times a number equals 54, which expression could best be used to find the number?” The test item is supposed to measure a student’s ability to find an unknown variable. Teachers in the state, though, have reported teaching students that if they see a test question with an “if” and a “times,” then they should make sure to choose the answer that includes the division function in the expression.
Although releasing tests could produce such negative effects, political pressures have forced some states to make exams public.
In 1997, a father in Ohio sued the state to gain access to the state test and won. And in California, secrecy surrounding the California Learning Assessment System contributed to then-Gov. Pete Wilson’s scrapping of the 1-year-old exam in 1995. He echoed claims made by others that the exam was invasive and antireligious. The exam contained essay questions asking students about their views on death and religion.
The arguments that fly back and forth about the openness of exams might soon be moot, according to Stephen Klein, a senior social scientist at the RAND Corp., a research organization in Santa Monica, Calif. He describes a future of Internet-based testing, when a bank of test items would be so large that it wouldn’t benefit students to memorize the answers to specific questions. Schools would pay for using tests that could be given annually, or even monthly, and be tailored to individual student performance.
For The Arizona Republic’s D’Anna, the future has little to do with Internet-based testing. The newspaper editor is looking forward to finishing the legal fight to see the contents of the Arizona assessment, if only to figure out why 90 percent of students failed the exam during its first administration.
“Given the controversy and low pass rate,” D’Anna says, “we want to see what is exactly in the test.”
A version of this article appeared in the January 11, 2001 edition of Education Week