New Assessments Have Little Effect on Content, Study Finds
Performance assessments may prompt students to do more writing, but so far they are having little effect on what gets taught in classrooms.
So concludes a three-year study of 14 schools across the country that have pioneered the use of performance assessments in evaluating their students' progress. The study, conducted by the Washington-based research firm Pelavin Associates, was paid for by the U.S. Education Department's office of educational research and improvement.
The Pelavin researchers presented some of their findings last month during a national conference on assessment held by the Council of Chief State School Officers in Phoenix.
Unlike multiple-choice and short-answer tests, which gauge what students know, performance assessments are aimed at evaluating what students can do with what they know. For these assessments, students might collect samples of writing for a portfolio, explain solutions to mathematical problems in writing, or conduct experiments in a group with other students.
Many proponents of this approach reason that it can change the kind of teaching that goes on in classrooms because teachers tend to "teach to the test."
For their study, the Pelavin researchers focused on elementary, middle, and high schools in states or districts on the cutting edge of experimenting with these newer forms of assessment. The 13 states included Arizona, Kentucky, Maryland, New York, Oregon, and Vermont, all of which have recently overhauled their assessment systems; the districts included Colorado Springs and South Brunswick, N.J., which are currently rethinking assessments at the district level.
The researchers also visited schools trying out new forms of assessments on their own.
Pros and Cons
They visited all 14 schools in the spring of 1994 and then paid seven of them a second visit a year later. They interviewed parents, teachers, students, principals, and school board members at every site.
In most cases, they found, performance assessments had brought about some positive changes at the classroom level. For one, teachers in six sites were using the scoring guidelines from the assessments as a way to show students what was expected of them, thus setting a common frame of reference. Students also appeared to be more motivated by the kinds of projects they were doing--especially when they knew their efforts would be included in their portfolios. And, in some cases, teachers were exercising more creativity and were collaborating more closely with one another to plan classroom activities.
However, the researchers also found shortcomings. One was that the content actually taught in the classrooms had changed little.
"Part of the reason the content had not changed much is either because the curricular frameworks are still being defined in many of the places we've been to or teachers aren't quite familiar with changes at the state level in curricular frameworks," said Nidhi Khattri, one of the researchers. This was less true in mathematics, she said, because many educators already were familiar with standards for teaching that subject that were published in 1989 by the National Council of Teachers of Mathematics.
The researchers also found that teachers complained that the new assessment methods left them less time to cover all the material they had taught in the past.
"We wind up with the old zero-sum game in education," said Michael B. Kane, the study's director. "From some perspectives, the purpose is to get greater depth, but teachers don't necessarily perceive that as a valued tradeoff."
And, while students were writing more, the study said, they were not necessarily writing better. Teachers, for example, vacillated over whether to stress the mechanics of writing, including draft writing and brainstorming, or to attend to its stylistic, communicative, and expressive aspects.
The researchers concluded that the schools where changes in teaching and learning had taken a firmer hold were those in which teachers had been involved with the new assessment systems from the start.
Vol. 14, Issue 40