In what top federal officials called a significant advance in the state of the art, the National Assessment of Educational Progress has concluded that it can conduct a large-scale assessment of students’ writing abilities by using their classroom work.
Releasing a study based on in-class writing samples from about 2,000 4th and 8th graders, the officials said last week that NAEP was able to classify and evaluate widely different pieces on a comparable basis.
But the results of the pilot study, the officials acknowledged, suggest that the capabilities of the assessment itself may have surpassed the level of writing instruction in many schools. Although the study was not based on a representative sample of students, it found little evidence of the practices a school-based writing assessment is expected to capture, such as “pre-writing” strategies and other techniques of the writing process, or the composition of a broad range of pieces.
Diane S. Ravitch, the Education Department’s assistant secretary for educational research and improvement, said the NAEP report could spur schools to “begin a revival of writing.”
“Educators say tests drive instruction, and you get what you test,” she said at a press conference here. “If you take these two comments seriously, this is one of the most important events in education in 1992.”
“By this act,” Ms. Ravitch said, “we are helping to break the iron grip of the standardized multiple-choice test.”
But Ruth Mitchell, the associate director of the Council for Basic Education and the author of a recently published book on alternative assessments, cautioned against overstating the significance of the NAEP study. The real test, she said, will come later this year, when NAEP conducts a large-scale assessment of student portfolios, which some states and districts are using to measure a range of writing over time.
“If this had been a real national portfolio, the way Vermont uses them and Pittsburgh uses them,” Ms. Mitchell said, “that would be a breakthrough.”
Keeping Pace
A Congressionally mandated project, NAEP has since 1969 tested national samples of students in a range of subjects.
During this period, said Emerson J. Elliott, the acting commissioner of the National Center for Education Statistics, NAEP has tried to remain innovative in the use of new forms of assessment, while maintaining technical rigor.
But, he noted, many language-arts educators have said that NAEP writing assessments have failed to keep pace with changes in instruction, which have emphasized more elaborate and detailed pieces of writing. Although several states and districts have moved to evaluating portfolios of student work, he said, NAEP has continued to measure students’ writing on assigned topics under timed conditions.
“Educators criticized the national assessment’s tests for not measuring what was important,” he said.
The new study, Mr. Elliott said, was aimed at testing whether NAEP could go beyond the limitations of past assessments by measuring the writing students performed as part of their classroom work.
Low Participation Rate
To that end, NAEP asked 4,000 students--2,000 each in grades 4 and 8--who had participated in the regular 1990 writing assessment to take part in the study. The students and their teachers were asked to select a piece of writing they considered their best.
“The goal,” the report states, “was to create a ‘Nation’s Portfolio’--a compilation of the best writing produced by 4th and 8th graders in classrooms around the country.”
However, only 55 percent of the 4th graders and 54 percent of the 8th graders agreed to participate in the study, producing a sample of work that was not representative of the nation. Although officials were unable to completely explain the low participation rate, the report suggests that it may reflect the fact that NAEP gave teachers only a few days’ notice to collect student work.
In the 1992 assessment, it notes, NAEP will notify teachers months in advance. NAEP is also conducting a survey of teachers this year to determine why they are taking part, according to Eugene Owen, the chief of the operations and instrumentation branch of the education-assessment division of the N.C.E.S.
Mr. Owen also noted that those who did participate in the 1990 study tended to be older, higher-achieving, and more advantaged than the population in the regular writing assessment.
“What we got was a sample of writing that was better than we might expect, most likely,” he said.
Little Evidence of Reform
The results are nevertheless disappointing, said Phyllis W. Aldrich, a curriculum coordinator in Saratoga Springs, N.Y., and a member of NAEP’s governing board.
“We don’t have a hidden treasure trove of wonderful student writing out there,” she said.
Specifically, the study found little variation in the types of writing students performed.
The majority of papers submitted by students in each grade were informative, it found, and an additional third were narrative. Very few--1 percent at grade 4 and 5 percent at grade 8--were persuasive, one of the types featured prominently in NAEP tests.
The report also notes that several teachers commented that they did not teach writing until late in the year, and instead submitted worksheets and skill sheets.
Almost all of the papers were addressed to an unspecified audience, the study found, and fewer than half showed evidence of the use of the writing process, which emphasizes pre-writing strategies, drafts, and revision.
Miles Myers, the executive director of the National Council of Teachers of English, said the findings provide additional evidence that reforms in the teaching of writing are not yet widespread.
“If you take a 20-year slice, from 1972 to 1992,” he said, “you would say, yes, there is more attention to the writing process in writing classes in K-12 schools than you had before.”
“On the other hand,” he added, “we see there is a lot of work to be done in teaching writing.”
Longer, But Still Short
The study also found that students’ performance on the writing tasks was relatively low.
Using a separate six-point scale for each type of writing, the study found that a third of the 4th graders’ informative papers were simple lists of ideas, while slightly more than half tried to relate the ideas. Although 8th graders performed better on the informative papers, only 22 percent were considered “discussions,” in which ideas were clearly related by the use of rhetorical devices, and 8 percent were “partially developed discussions.”
Similarly, more than half of the 4th graders’ narrative papers were lists of related events, and another fourth described a series of events. Fewer than a fifth of the 8th graders’ narratives were “extended stories,” which describe a sequence of episodes with details about most elements.
To provide a sense of the scoring guides, the report includes a chapter of samples of student writing.
Comparing the results with those of the NAEP timed assessments, the study found that the papers in the portfolio were significantly longer and more developed.
However, Ms. Ravitch observed, the in-class papers were still fairly short. The median length of the 4th graders’ papers was 84 words; for 8th graders, it was 140 words.
“That’s [only] over half a page,” Ms. Ravitch said.
Both Formats Urged
The report also notes that, unlike on the timed tests, students’ performance on the classroom-based assessment depended heavily on the type of instruction they received.
While this is a virtue, the report says, because the assessment can reflect the instruction students receive in the subject, it is also a potential weakness, since assessors cannot compare student performance across a common range of tasks.
In fact, it states, the assessment “may be as much a measure of the classroom activities and amount of time spent on writing instruction as of the students’ achievements.”
As the assessment techniques are refined, it concludes, “using both portfolio and traditional modes of assessment in concert may provide educators with rich, detailed portraits of students’ writing abilities.”
Copies of “Exploring New Methods for Collecting Students’ School-Based Writing” are available for $10 each from the U.S. Government Printing Office, Superintendent of Documents, Mail Stop: SSOP, Washington, D.C. 20402-9328.