The high-school mathematics and science assessment, expected to be implemented in the 1991-92 school year, got under way last month when state officials received word of a forthcoming $1-million grant from the National Science Foundation and began training teachers from seven states in the new method.
In contrast to conventional tests, the new program will measure student performance on a series of tasks that may take as long as a semester to complete. Students will be asked to work individually and in groups to frame problems, collect data, and analyze and report their results.
A handful of states--including New York, California, and Connecticut--have experimented with performance tasks in their regular assessment programs, and Vermont is developing a program that would include student work portfolios in addition to standardized tests. But the Connecticut math and science assessment, which is being closely watched by educators around the country, is considered the first to be based solely on sustained performance tasks.
“This moves us one step closer to the assessment of genuine achievement, and one step away from just looking at standardized tests,” said W. Ross Brewer, director of planning and development for the Department of Education in Vermont, one of the states working with Connecticut on the project. “This will test what kids need to know, not just what is easy to test.”
Joan B. Baron, director of the Connecticut Assessment of Educational Progress, added that the project is also aimed at influencing teaching in the state. Over time, she said, the assessment will encourage teachers to shift from merely imparting facts to facilitating learning.
National testing experts praised the project but cautioned that it may not, at least at first, be able to achieve all of its ambitious goals. Since the state will be testing only a sample of students, teachers who are not involved in the project may not change their instructional methods, noted Grant Wiggins, former director of research for the Coalition of Essential Schools, a network of high schools following the reform ideas outlined in Theodore R. Sizer’s book Horace’s Compromise.
In addition, suggested Mr. Wiggins, who is also working with Connecticut on the project, the idea that changing what is tested will change what is taught remains “a hunch.”
“We’ll see if it works out that way,” he said. “Cynics say all good tasks and tests are pervertable.”
“This is a first step, necessary but not sufficient,” he said.
‘Common Core of Learning’
The new project begins at a time of mounting criticism of standardized tests’ influence on instruction.
In response to calls for alternative methods of testing, a number of states have developed assessments that measure achievement, at least in part, on the basis of performance. For example:
Eleven states test students’ writing ability by evaluating samples of their writing.
New York State, as part of a statewide science test administered to some 200,000 4th graders, this year required pupils to conduct a short experiment and report the results.
“The kids really loved it,” said Douglas Reynolds, bureau chief for science education in the state department of education. “I don’t think I’ve ever before heard kids say, ‘Can we take the test again tomorrow?”’
California includes open-ended questions as part of a 12th-grade mathematics test. This fall, officials there plan to ask the legislature for additional funds to begin developing a statewide performance assessment for all students in math, science, history, and literature.
The Connecticut effort, according to Ms. Baron, represents the “next giant step” in the drive toward performance testing.
It began in 1987, when the state board of education adopted a “common core of learning.” That document outlined the board’s “policy on the skills, knowledge, and attitudes that are expected of Connecticut’s public secondary-school graduates.”
In addition to recommending an understanding of such subjects as literature, history, and mathematics, the document urges the development of reasoning and problem-solving skills and the fostering of attitudes such as “intellectual curiosity” and good “interpersonal relations.”
When it adopted the statement, the board also asked the department of education to develop a system for monitoring the extent to which students attained the goals set forth. The board agreed that the Connecticut Assessment of Educational Progress, which tests high-school students in a broad range of subjects, should be converted to the “common core of learning” assessment.
In addition to seeking ways to measure the skills included in the document, the department was also motivated, Ms. Baron said, by national and international assessments that showed students across the country performing poorly in math and science.
“It’s not that we were doing such a wonderful job we should be afraid of tampering with the system,” she said. “It seemed like a good time to be experimental.”
Measuring New Skills
In undertaking the new assessment system, state officials agreed to create an alternative to traditional multiple-choice tests.
“We believe that the time has come to develop assessments that are catalysts for the kind of learning that we value,” states a report on the assessment prepared by department officials. “The model currently in place--atomistic tasks, passive learning, and primarily convergent thinking--has been too well served for too many years by multiple-choice testing.”
Unlike conventional tests, the new system calls for evaluating student performance on a range of tasks. Such tasks--including exhibitions, work portfolios, and writing projects--are expected to be designed to enable students to formulate questions, investigate evidence, analyze data, and discuss results. (See example on page XX.)
In addition to evaluating students’ problem-solving abilities, the performance assessments will also, Ms. Baron said, “open a window into measuring a set of skills nobody ever measured.”
“If you don’t put students into groups,” she said, “you don’t know how well they work in groups.”
Not ‘Reinventing Wheels’
To begin implementing the project, officials last month began training teachers from about 35 schools in Connecticut, Michigan, Minnesota, New York, Texas, Vermont, and Wisconsin, as well as members of the Coalition of Essential Schools. During the workshops, officials outlined sample tasks and methods of scoring, as well as guidelines teachers can use to design their own tasks.
Such an effort will help promote teachers’ professionalism by enabling them to participate in the creation of the statewide assessment, noted Mr. Wiggins. “Teachers will not be mere implementers of someone else’s tasks, but they can design and implement them as well,” he said.
Although Connecticut expects to implement the test statewide in 1991, officials from several of the other participating states said they will examine the program to see whether it would fit into their own testing programs.
“We’re looking at various assessments from different states,” said Joseph Huckstein, director of science education in the Texas Department of Education. This year, the Texas legislature mandated a science test for students in grades 3, 5, 7, 9, and 11.
Connecticut officials also intend to coordinate their project as broadly as possible with existing school-reform efforts. Ms. Baron noted, for example, that among those attending last month’s workshops were representatives from San Antonio, one of the sites involved in the American Association for the Advancement of Science’s Project 2061 to redesign math and science curricula.
“We hope not to reinvent wheels,” she said.
Political, Not Technical
Critics of conventional testing programs praised the Connecticut program as a promising alternative.
“It’s very consonant with the direction we recommend,” said Senta A. Raizen, director of the National Center for Improving Science Education. In a report on elementary schools issued last month, the center recommended the development of “authentic” forms of assessment that would probe students’ depths of understanding as well as factual knowledge. (See Education Week, Aug. 2, 1989.)
But, Ms. Raizen warned: “My concern is that this may aggravate the notion teachers have that hands-on science consists of observation and recording, period. If that’s all you do, you’re not getting very far.”
“You need to ask why you observe, why you record,” she said.
Mr. Brewer of Vermont added that some educators have questioned whether the project will succeed in influencing instruction throughout Connecticut, since the state will be testing only a sample of schools.
“They are wise to start out the way they are, but if we are to be successful, we ought to do it on a large scale,” he said. “We hope teachers will teach to this test. The only way to do that is to report results by school.”
Regardless of whether the project affects all teachers, Mr. Wiggins said, it has succeeded in convincing policymakers that reforms in assessment are linked to changes in curriculum and instruction.
“Too often,” he said, “they are done in isolation, as though assessment is only a psychometric issue. It isn’t.”
“The issues invariably turn on political questions, not technical questions,” he added. “The technical issues are persnickety, but solvable. The question is, are we willing to spend money on public relations to convince the public that standardized tests are inadequate?”