Fix It Or Forget It

My 20 years as a writer, my 10 years as a teacher of high school and college English, and my independent research have convinced me that the current advanced placement exams do not reliably indicate whether a student is qualified to do advanced college-level work.

These tests, administered to more than 400,000 American high school students in 16 academic disciplines, do not do what the testmakers claim they do. My experience is with the two tests taken by more students than any others--the AP Literature and Composition and the AP Language and Composition examinations. Here are some reasons why they don't work:

1. On both AP English exams, 45 percent of a student's score comes from reading passages from literature and interpreting the passages by answering multiple-choice questions. Few college humanities professors use multiple-choice questions extensively in their courses. To assess students' readiness for college-level work, a test must ask them to write answers, not merely choose a letter from a list of possibilities.

2. Since the time of Socrates, a vital element in a good liberal arts education--some professors say the quintessential element--has been the seminar: open discussion in which students can speak freely, test ideas, and learn from each other as well as from the teacher. The AP English exams ignore this process. Indeed, the multiple-choice format discourages Socratic discussion.

3. Each of these examinations claims to test students' ability in composition by requiring them to write three essays on three previously unseen assignments within two hours. This "free response" section counts for 55 percent of the score. But composition is the art and craft of composing, and composing connotes careful revision. We English teachers spend 12 years teaching our students to revise and revise and revise, to take several days, if necessary, to complete a piece of writing. On the AP exam, students are told to knock off three essays in 40 minutes each. For newspaper columnists such as Russell Baker and George Will, composing three essays is a full week's work, yet we expect high school AP English students to compose three essays in two hours. As Truman Capote once said of the work of a writer he viewed as facile, "That's not writing--that's typing."

4. Multiple-choice questions that are graded by computers can have only one "correct" answer. But to avoid questions with obvious and easy correct answers, the testmakers must include answers that they call "distracters"--answers close enough to being "right" that a thoughtful reader might be distracted into choosing them.

Every year as I prepare my students for their AP exams, this situation arises several times in each discussion: A student chooses one of the distracters and defends it eloquently as the "right" answer. I have to tell him or her, "Yes, you've chosen a reasonable answer to that multiple-choice question, and yes, you can make a good case for your answer, and yes, a college professor would probably agree that yours is a thoughtful answer--and might even commend you for your insight and perspective. But the testmakers at the Educational Testing Service want you to choose a different answer, and if you want to do well on the AP exams, you must, just for this three-hour period, conform to ETS's view of this reading and the world."

Students who are intellectually mature beyond their years struggle with these multiple-choice questions. So do students who are global rather than analytical in their thinking and learning styles. Mature students may be thinking at the level of college juniors or seniors, and global thinkers may be seeing the question from a fresh angle, but the tests are aimed at an imaginary, "standardized," analytical high school junior or senior. Students who see subtle connections or nuances they're not supposed to see are penalized.

5. Some educators have long complained that the standardized tests prepared by ETS and other organizations are biased in favor of certain groups of students. Because most school districts are funded by property taxes, students from higher-income households usually live in districts with better schools. Students from higher-income households are more likely to be able to afford private schools and tutoring to prepare them for standardized tests. (Overwhelming evidence shows that students who take one of the privately run preparation courses can markedly boost their scores on ETS standardized tests.) And since, presumably, most of the psychometricians who develop and administer these tests are graduates of the kinds of high schools, colleges, and graduate schools attended by children of high-income families, their views of the world are likely to be similar.

One generation of children of high-income families writes the tests for the next generation, thus passing the advantages of the privileged class from generation to generation. This may be inadvertent; the biases in the test are so subtle, so deep-rooted, so much a part of our society's structures and conventions that the testmakers can't even see them, let alone correct them.

Look at the number of AP courses offered in economically poor school districts as opposed to the number offered in wealthier school districts. In the wealthier school districts, look at the racial makeup of the AP classes. I talked to one teacher who said that in four years of teaching AP courses in a public high school that was 40 percent African-American, about 2 percent of his AP students were African-American.

The same holds true for cultural bias. The College Board News of January 1990 reported that about 60,000 minority students who took AP examinations received a qualifying score of three or better: 37,000 Asian Americans, 15,000 Hispanic Americans, and 6,000 African Americans. But the ratio of these groups in the U.S. population is just the reverse: African Americans are the largest of the three and Asian Americans the smallest. What does it mean that 37,000 Asian Americans make the qualifying scores but only 6,000 African Americans do?

I teach AP English courses, and I support the idea of high school students doing rigorous college-level work and getting academic credit or advanced placement in college courses or both. But the present AP English exams do not fairly evaluate a student's qualifications for college-level reading, writing, and thinking in the humanities.

We should drop the current exams and recast the AP English program. One alternative: Require students to submit a portfolio of their best work over the school year, along with certification by their teacher that the work was done under the specified guidelines. There is precedent for this proposal: The AP Studio Art exam is currently administered this way.

We owe it to the 400,000 American high school students who take these exams to make sure they measure what they claim to measure and to make sure they measure it fairly and equitably. If the AP program isn't doing this, then we need to fix it; if we can't fix it, then we need to drop it.
