Schools Search For Yardstick To Gauge Reforms' Effects
Boulder, Colo.--In states that have poured millions of dollars into education reform, pressure is steadily building on public officials to demonstrate that the investment is producing better schools, educators agreed at a conference here last week.
But the state testing directors and others who participated in discussions on assessment sponsored by the Education Commission of the States acknowledged that they are unsure how or what to measure in order to evaluate the results of school reform.
And while they are searching for "indicators" or signs of progress to evaluate in their own states and to compare their efforts with those of other states, the officials said they worry that they are running out of time. The willingness of the public to spend more money on school improvement will soon evaporate, they said, if progress cannot be demonstrated.
In the past four years, policymakers have received "relatively little response" to their requests for evidence of gains made under the school-reform movement, said John T. Casteen 3rd, secretary of education for Virginia.
Legislators do not demand immediate results from their investments in education, he said, but they are asking with "increasing urgency" what results to expect and when to expect them. And they are "increasingly uneasy with results that are not comparable over time and over geographic boundaries."
Wilhelmina Delco, a Texas state representative, added that as legislators spend more and more of their limited state funds on education, they are beginning to wonder whether the education community is doing an adequate job of accounting for those funds.
Business leaders, in particular, are voicing "irritation ranging up to disgust" at some educators' insistence that they do not want to be pushed to measure the effects of the investment in schools, said Martha Darling, executive director of an education study for the Business Roundtable of Washington State.
Part of the problem is that the questions about the effects of school reform may not have answers, given the current knowledge about assessment, some participants said.
"We have the technology to see whether or not basic changes in achievement have taken place," said Frank B. Womer, professor of education at the University of Michigan, who served as director of the National Assessment of Educational Progress from 1967 to 1971. "I'm not at all sure whether we will be able to pin that achievement to specific reforms."
In general, he said, states have made little use of the evaluative tools that are available to direct educational policy, and have invested little money in assessing school reform.
Calvin M. Frazier, commissioner of education for Colorado, noted that his state has spent millions of dollars on education reform in the past few years, but is proposing to spend "one-eighth of 1 percent" to judge the outcome of that investment.
"We don't know how to evaluate multiple approaches to school reform," said Margaret E. Goertz, a research scientist with the Educational Testing Service.
"It takes much more extensive data, which the states don't routinely collect, and much more sophisticated analysis," she said.
So far, she noted, educators have not come up with alternatives to paper-and-pencil tests. As a result, she said, "there will be pressure to test more."
States are already "virtually testing anything that moves," as J. Robert Coldiron, chief of the division of educational quality assessment in the Pennsylvania State Department of Education, put it.
Beverly Anderson, with the ECS, noted that in 1970 only six states had statewide testing programs, and minimum-competency tests for students were virtually unknown. But she said that "if you look around now, there are only a couple of states in this country that don't have some kind of state assessment or minimum-competency test."
Ms. Goertz said that states are also adding more grade levels and subjects to existing tests and raising test standards.
Similarly, by early 1984, 18 states had adopted some sort of testing requirement for new teachers, and 29 more states were considering it, according to the ECS. States are also moving forward with increased testing of school administrators and more testing in higher education.
Roger Neppl, director of planning and evaluation for the Colorado State Department of Education, noted that this year no fewer than five bills concerned with testing were introduced in the Colorado legislature.
There was surprisingly little argument here against the use of assessments to provide state-by-state comparisons of educational progress, and some state testing directors even seemed willing to discuss the development of a national test of student achievement. Many said they believed that state-by-state comparisons of achievement are inevitable.
But one problem in devising such comparisons, they noted, will be the great diversity in who is tested and when; the kinds of data collected; the subjects tested and the test items used; and the way in which tests are scored and the results released.
Thomas H. Fisher, director of student assessment for the Florida State Department of Education, recently reviewed student testing programs in the Southeastern states for the Southern Regional Education Board. "Everybody is doing something different and nothing can be compared," he remarked.
Ms. Goertz said that in examining testing requirements for teachers for the National Institute of Education, ETS officials "were very surprised by the diversity we found across the 50 states. We couldn't find two states that looked the same."
"When one talks of a national teacher examination," she added, "one wonders what the states are going to be willing to give up."
'Important and Appropriate'
The push is on to develop some indicators of educational progress. "The data currently used to assess the health of the educational system are inadequate," said William Pierce, executive director of the Council of Chief State School Officers, adding that if educators do not develop state comparisons based on "important and appropriate" measures of educational progress soon, "someone else will."
At their annual meeting last year, the chief school officers voted to work on a plan for developing such state-by-state comparisons. This November, the chiefs will consider adoption of the plan, which will include measures of student achievement.
The council is also establishing a center for the coordination of educational assessment and evaluation, for which it hopes to have hired a staff by Sept. 1.
Meanwhile, 8 of 41 states surveyed by the chiefs--California, Florida, Kentucky, Ohio, South Carolina, Utah, Washington, and West Virginia--have developed some statewide indicators of their education systems' health, and a number of other states are moving in that direction.
By next year, the number of states with state "report cards" on educational progress will probably have doubled, predicted Jane Armstrong, a senior policy analyst with the ECS.
California, for example, has developed a performance report that takes into account course enrollments; scores on state tests; SAT and Advanced Placement scores; dropout and attendance rates; participation in extracurricular activities; and the amount of homework and writing that students complete.
James R. Smith of the California State Department of Education said the indicators were developed in part to "buy time," since recent reforms may require 10 years or longer to come to fruition.
"I worry that people will want to see results too fast," said Gerald W. Bracey, director of research, evaluation, and testing for the Virginia State Department of Education. Policymakers may be tempted to use inappropriate measures to assess education reform, he said, "not just as an expedient to get re-elected, but also because they want to show their constituency some results."
"State leaders generally understand the inappropriateness of using such gross measures as the SAT as the nation's educational scorecard," agreed Mr. Casteen of Virginia. "But, like most nonspecialists, they find themselves forced to use the measures available to them."
Vol. 04, Issue 39