To the Editor:
What happened to norm-referenced testing? I worked as a writer, editor, subtest developer, and trainer for a major testing company for more than eight years. When I began, most nationwide testing was done with nationally normed tests.
These tests took years to develop, as they had trials all across the country and in a variety of schools and neighborhoods. Results were statistically analyzed, and test items were used only if they showed no gender or demographic bias. “Average” meant that those students were mastering skills shared by their peers across the country. If a majority of the children in a school were not mastering such skills, there was a problem.
As state assessments got under way in response to the No Child Left Behind Act, I noticed that the money began quickly shifting from the states and districts to the testing companies as states “created” their own standards, curricula, and assessments. My job was to interpret the standards from individual states to determine what they wanted to test. I observed that states were using basically the same standards, objectives, and subskills, just wording them a bit differently. There are only so many ways to slice and dice a domain. Sometimes the committees were most concerned that the wording of a question reflected their teachers’ syntax. Understanding other ways to address the same skill might add undue challenge to the item.
Granted, analysis and higher-order thinking skills are difficult to teach and test. But assuming that students will learn different material in different ways across the nation seems counterproductive in a country that needs a mobile, flexible workforce and a common base of knowledge from which to take on future possibilities. Normed testing is a fair way to compare.
By the way, a level playing field would also mean funding schools equally (adjusted for cost of living) across the United States.
Susan Bassett
Tucson, Ariz.