U.S. Opts Out of International Performance Assessment
The United States has pulled out of an international performance-based assessment in mathematics and science, following objections to the project raised by members of a National Academy of Sciences panel.
The 20-nation study of 9- and 13-year-olds' knowledge and skills in those subjects, expected to be administered next March, will be directed by the Educational Testing Service under grants from the U.S. Education Department and the National Science Foundation.
Although the agencies have agreed to provide funds for the United States to participate in a multiple-choice test, they declined to fund participation in a separate component that would have measured students' abilities on tasks such as conducting experiments. Only 12 of the 20 nations will participate in the performance-based component.
Members of the national academy's board on international comparative studies in education, which advises the federal government on such efforts, told officials at a private meeting this summer that performance assessment is still in the developmental stage, and should not be used in such a high-profile comparison.
"I don't think performance testing is yet at the stage where one would want to put it in place operationally," Norman M. Bradburn, the panel's chairman, said in an interview. "It's an important thing, and interesting things are going on with its development. But it's still in development."
Archie E. Lapointe, director of the Center for the Assessment of Educational Progress at the ETS, said he was "disappointed" by the decision to opt out of the study.
"My feeling is, let's try it," he said. "Let's see how it works. We'll learn in the process."
The 1991 international assessment will be the second multinational comparison of student achievement conducted by the ETS.
In 1988, the International Assessment of Educational Progress tested 13-year-olds from five countries and four Canadian provinces in math and science. Over all, U.S. students performed at the bottom of the international ranking in math, and near the bottom in science. (See Education Week, Feb. 8, 1989.)
The new study, which involves more countries, including the Soviet Union and China, will test 1,650 students from each of the participating nations. The sample includes students from 110 U.S. schools.
The state of Colorado also agreed to fund a larger sample of its students, to permit comparisons between their performance and that of their peers in other nations.
In addition to the United States, Slovenia (a republic of Yugoslavia), Brazil, France, Israel, Italy, Jordan, Portugal, and Spain have opted out of the performance-assessment component.
The new study responds to many of the questions researchers have raised about international assessments, according to Mr. Lapointe.
"We have done as much as we can think of, and the [academy's] board could recommend, to make sure the data will be accurate," he said.
For example, he said, the student samples will be drawn by Westat Inc., a Maryland-based private research firm, in order to ensure that the quality of the sample is the same in all participating countries.
In addition, the ETS will place monitors in 20 percent of the participating schools to see that the test is administered uniformly, Mr. Lapointe said.
"This is the first time there will have been that kind of quality control over the procedure," he said. "In the past, a lot of countries tended not to pay attention to time limits."
The director added that the new study will also use items developed jointly by the participating countries, rather than those developed for American students, as in the 1988 assessment.
The items on the performance-based component--which require students to conduct experiments and perform measurements, such as pouring water from a large container into a smaller one--were developed in England and Scotland, he said. ETS officials field-tested them in the area around Princeton, N.J., where the firm is located.
Nevertheless, said James W. Guthrie, a member of the academy board, panel members were not convinced that the performance-based items were of the same quality as the multiple-choice questions.
"We didn't think it was ready yet," said Mr. Guthrie, professor of education at the University of California at Berkeley. "We had no philosophical objection to performance-based assessment; quite the contrary. We thought it was premature to impose this on a system."
Although he urged the inclusion of the items, Mr. Lapointe acknowledged that there is little research on the validity and reliability of performance assessments.
"We've had 40 or 50 years of [multiple-choice] testing," he said. "They've been analyzed this side and the next. None of that exists for performance items."