Opinion

Writing: An Unexamined Gatekeeper

By Ardith D. Cole, February 26, 2007

“Kids just aren’t learning like they used to. Have you seen those test results?” I hear this sentiment expressed by people everywhere, and often feel compelled to respond, “But tests are far more difficult now. They’re not just multiple-choice anymore. Today, they include writing.”

Everyone seems surprised to learn that our high-stakes assessments now include writing. Most say that multiple-choice tests are easier than written-response ones. They’d rather select a response than construct one.


Yet some states now include written-response items on every test—in reading, science, even math. And some of the writing tasks are lengthy and complex. This inclusion may present a significant variable, one that influences student performance and test scores. That’s how writing has become a gatekeeper for student promotion and graduation, as well as for schools’ “adequate yearly progress” and federal funding.

Researchers have reported many testing discrepancies in the past, but has anyone investigated how written response affects test results? After results for the 2006 SAT showed the largest drop-off in scores in 31 years, administrators of the test took a close look and blamed the exam’s new writing section. How did writing influence that drop in scores? Does it influence state tests the same way?

Right now, we don’t know how performance is affected when the writing variable is added. This means we cannot use current state-test comparison data until we first investigate all between-state assessment variables and their effects on performance.

Some of the between-state differences that may be significant include test formatting, writing tasks, types of writing and instruction, scoring and rubrics, and access to instructional supports.

A change in test formatting on that 2006 SAT produced significantly different results. So when states add written-response tasks to tests that previously contained only multiple-choice items, will their students face a similarly taxing challenge? Students in the nation’s capital encountered such a challenge last year, when only 28 of 146 District of Columbia schools reached the 2006 benchmark. Did the addition of writing influence those scores? If so, how?

When comparing written-response and multiple-choice tests, we must also study how test length, difficulty level, and guessing affect results. Such differences currently invalidate comparisons of reading data between California and New York state, for example, or between Oregon and Washington state, because New York and Washington use written-response on their reading tests, while California and Oregon use only multiple-choice. What’s more, New York students construct written responses on all assessments (math, science, social studies), while states such as California require students to use writing only for the writing section.

Think about it. In every subject but one, students in California or Oregon read a prompt, followed by a question, and then “bubble in” their answer. Multiple-choice allows “bubblers” to guess, which gives them some chance of correctly selecting the one right answer.

On the other hand, in Washington, New York, New Jersey, Kentucky, Delaware, and other written-response states, guessing is less of an option. Instead, responders must analyze a multifaceted prompt, then organize right-answer facts from which they construct a single- or multiple-paragraph response—a far cry from coloring in circles on a Scantron sheet.

We also need to investigate how the tasks on written-response tests differ between states and between subject areas. Tasks currently vary in quantity, length, type, and, thus, difficulty. Answers range from mere one-word completion responses to multiple-page essays.

The variety of written-response-test prompts boggles the mind. Take a brief, cross-state cyber trip through state education departments’ websites and click on “released items.” You’ll notice that some prompts evoke self-based responses, while others require text-based answers.

English writing prompts generally call for the self-based type and welcome creativity, such as this one from Nevada’s 2000 writing test: “Discuss an event or situation that taught you a lesson or created a change in your life.”

But when states use writing to assess subject areas, such as reading, prompts usually require text-based responses. Thus, after reading the article titled “Porcupines,” Vermont students use text facts to answer this prompt: “Explain why porcupines do not have to fight. Use information from the article to support your answer.”

Some tests complicate matters by combining self-based and text-based response within the same task, as this one from Illinois does: “Would a person like Doc Marlowe best be described as someone to be admired or not admired? Explain why. Use information from the story and your own observations to support your answer.”

Whether written responses are brief or extended, machine-scored or human-scored, text-based or self-based, writing is elevated to the status of gatekeeper for those subjects. Might we conceivably predict, then, that students who have trouble with writing will have difficulty in every subject that’s tested through writing?

But what if the student knows the subject well—even knows the right answers—but does not write well? What if he can’t spell? What if his handwriting resembles that on the last prescription you took to the pharmacy? What if the student is from another country and confuses syntax? Should this person be encouraged to move out of New York, Washington, Ohio, Kentucky, Connecticut, and other written-response states and into a state like California? After all, his graduation could depend on it. But so might his career.

In 2004, the National Commission on Writing declared that students must learn to write well “in all subjects.” The panel called writing a “‘threshold skill’ for both employment and promotion, particularly for salaried employees.” Writing, it said, “could be your ticket in … or it could be your ticket out.”

There’s little doubt that students need instruction in all forms of writing. But here’s the catch: Who’s teaching written response? English teachers, who have always been in charge of the writing realm, do not usually focus on just-the-facts responses, but rather on writing characterized by strong voice, enticing leads, clever metaphors, and creative description. Yet it is right-answer writing that many test prompts demand, such as this one on Washington state’s science test:

“A scientist at Acme Chemical found an unlabeled container. He knew the contents were either HCl, NaOH, or NaCl. Using Chart A on page 2 of the Direction and Scenario Booklet:

Describe four tests that can be used to determine which chemical was in the container.

For each test, tell how it would help him determine which chemical is in the container.”

Unfortunately, too many students labor long and hard composing a creative response to one of these right-answer prompts, thus making the task more difficult than it needs to be. What’s more, in the working world, will employers care about leads and voice? They’ll probably want creative thinking and just-the-facts writing.

Should we then throw creative writing out the window? Indeed not. But let’s acknowledge the influential scoring differences between creative writing and the right-answer writing needed to produce a test response.

Writing tests are scored holistically using rubric scales, which allow for shades of correctness. Most rubrics stretch to accommodate a nearly infinite range of responses, so debates over scoring arise. That’s why states such as Nevada offer a writing-scores appeals process to those who disagree with their scoring.

Conversely, right-answer writing, because of its tight, text-based boundaries, makes scoring less debatable. Is that why some states are moving away from creative writing tasks or excluding the writing section of their tests from accountability calculations?

Moreover, expense and training related to correcting tests vary. Some written responses are machine-scored, but most are hand-scored, sometimes by educators, other times by noneducators. Is the added expense of training and correcting why states like Texas choose to use written-response only on their high school assessment?

And what about differences in accessibility to instructional-support systems? Staff development varies significantly between states, as do procedures related to released items and sample responses. Some states keep tests under tight wraps. Such “secure” tests make it almost impossible for anyone to view a particular student’s actual assessment to locate that student’s individual needs.

In most states, scores are returned to schools but the tests themselves are not; a few states, however, quickly return both scores and tests. Some reuse tests. Others do not. These differences create between-state accessibility discrepancies, but do they also affect between-state results? If so, how much?

These are only some of the unexamined variables related to state assessments. But they bear witness to the fact that researchers must stop comparing apples to pot roast. Comparisons that use state assessments currently are unreliable and invalid because of potentially influential, unexamined differences. In light of the testing craze inspired by the No Child Left Behind Act, it seems unconscionable that writing’s gatekeeper status has gone unrecognized.

Regardless of federal or state mandates, however, one thing remains obvious: We need to help all students develop skills in both creative and right-answer writing, using authentic experiences that demonstrate the diversity and the importance of writing. It is, after all, increasingly the ticket the gatekeeper will require.

A version of this article appeared in the February 28, 2007 edition of Education Week as Writing: An Unexamined Gatekeeper
