Seeing Is Believing: State-of-the-art performance assessments have raised the hackles of parents in some states and school districts, but a new study suggests that when parents are actually exposed to such exams they prefer them to more traditional kinds of tests. As part of a larger, ongoing study, University of Colorado researchers Lorrie Shepard and Carribeth Bliem, both affiliated with the National Center for Research on Evaluation, Standards, and Student Testing, showed the parents of 33 3rd graders sample multiple-choice questions from a standardized test and sample open-ended questions from a performance-based assessment. Overwhelmingly, the parents said they favored the performance assessments because they “make children think” and allow teachers to better understand the child’s thought process. Some of the parents who preferred the performance assessments said they might have answered differently had they not had the opportunity to actually see the questions. The researchers say their findings suggest that educators can counter the misconceptions parents have about new forms of assessment simply by exposing them to various test items.

The Handwritten Word: When it comes to taking essay exams, neatness may not count for much, according to a study published in the fall issue of The Journal of Educational Measurement. Four researchers from the Educational Testing Service--Donald Powers, Mary Fowles, Marisa Farnum, and Paul Ramsey--sought to determine whether handwritten essays or those produced on a word processor earned higher scores. They asked 32 prospective teachers who were taking a written pilot test to produce two essays--one handwritten, the other composed on a computer. Both were scored. Professional typists then typed up the handwritten essays but did not correct any spelling or grammatical errors. The computer-written essays were transcribed--errors and all--into handwritten copies.
All of the papers were then re-scored by trained readers who knew nothing about the experiment. When the typed versions of the original handwritten essays were re-scored, the average score dropped significantly. When the handwritten versions of the computer essays were re-scored, the average score increased slightly. The researchers could not pinpoint why the neater computer versions were scored lower. They speculate, however, that one reason may be that the typed versions looked shorter than the handwritten ones. The overall findings, they say, could point to a potential scoring problem as more and more states and schools turn to performance-based assessments that involve writing.

Don’t Forget The Students: Teachers in reform-minded schools often say they support an active, inquisitive brand of learning, but according to the Restructuring Collaborative, a loose organization of seven federally funded research laboratories, they are not getting that message across to their students. The study found that students at schools involved in the collaborative were unaware that their teachers wanted them to think critically and take charge of their own learning. When asked to define the term “successful learner,” for example, elementary students said it meant “pleasing the teacher.” Middle school students said it meant being compliant, making an effort, and getting good grades. High school students defined it as “putting out the effort”--being responsible, having good study habits, being involved in school, and using well-developed organization skills. Robert Blum, director of school, community, and professional development for the Northwest Regional Laboratory, said the researchers plan to take their findings back to the schools that were involved in the study. “This should become part of what schools consider as they go about changing themselves,” he says.
“A lot of restructuring efforts sort of miss the focus on learning and jump right in and focus on restructuring the schedule of the day, even though they may not be quite sure why.”
--Debra Viadero