In New York, State 'Indicators' Provoke Anger, Reform Plan in Largest District

While the Council of Chief State School Officers was voting last month to proceed with cross-state assessments of educational progress, the political volatility of such measures was being underscored in New York, where release of the state's first school-by-school "indicators" provoked angry charges of unfairness from New York City school officials.

Following reports that the state evaluation had found most of New York State's weakest schools in New York City, Schools Chancellor Nathan Quinones immediately called a press conference, denouncing the comparison of the city's schools to others in more affluent areas of the state and questioning the validity of the assessment program.

At the same time, however, he announced he was launching an intensive school-reform program in his district--including the establishment of minimum performance standards for schools.

Describing the evaluation system as "a punitive measure," Mr. Quinones said in an interview last week: "Some schools on the list may be extremely poor, and need to be shaken up, but for those schools that were making progress, it's extremely demoralizing."

The chancellor said he believed states are turning to such public evaluations because they "want to hold schools to a greater sense of accountability." But he suggested that there may also be "internal rivalry among some school officials to upstage one another at times, at both the local and state levels."

"If you want to have a fair comparison," he added, "then it should be a comparison among similar groups. If you're going to compare New York City youngsters, then it's valid to compare them with youngsters in other urban cities throughout New York State."

Assessing Student Performance

The state report assessed all 6,000 public and non-public schools in the state and identified the 10 percent "most in need of improvement." Of the approximately 600 schools on the list, 417 are in New York City.

Called the "Comprehensive Assessment Report," or CAR, the school-by-school comparison was mandated by the New York Board of Regents' 1984 "Action Plan" for education reform. Its intent is to "professionally and publicly assess the performance of students and educational institutions" in the state, according to Robert Maurer, executive deputy commissioner of education. This year's study is the first of what will be annual reports from the education department to the state's public and non-public elementary, junior-, and senior-high schools.

The CAR study evaluates each school on the basis of such criteria as three years of student scores on the State Board of Regents' examinations, enrollment and graduation-rate information, dropout and attendance rates, and class size.

In addition to providing indicators of the quality of education in each school, the CAR requires the state education department to determine the 10 percent of all schools "in most need of assistance," according to Winsor A. Lott, director of the department's division of educational testing. The purpose, he said, is one not of "criticizing the work being done" by educators in those schools, but rather of "working with them to solve the problems."

Those schools identified as needing special assistance must submit an improvement plan to the state by next April.

The state made no public announcement of results but reported them to individual schools at the end of October; the schools have until Dec. 15 to make the results of their evaluation public.

Reporting of the scores, a state education official said, is "entirely a local matter," but one purpose of the CAR is "to help local citizens get a better understanding of what their schools are doing."

'Guided' By the Study

In developing the New York City performance standards--the first ever for the city's schools--Mr. Quinones said he would be "guided" by the CAR study, although he charged that it used a "relatively narrow range of criteria" to assess schools. The chancellor also said he would establish a standards-development commission, with adoption of those standards currently set for May.

Schools will then be asked to develop improvement goals and will be given three years to meet them. Those schools that fail to meet their goals will be subject to rezoning, reorganization, and possible closing, Mr. Quinones said.

Other elements of the reform effort, he said, will include lowering class size, increasing remediation, guidance, and dropout-prevention efforts, and constructing new and smaller school buildings.

Other 'Indicators' Programs

Mr. Quinones is not alone in being rankled by the local impact of statewide indicators. But in spite of opposition at the local level, at least nine states have educational-evaluation programs in place, and many more are expected to develop similar plans soon, education analysts say.

And while the specifics of the states' programs vary, some analysts say, they share the common goals of improving school performance statewide and increasing public awareness of the relative success of schools and school systems.

But Irene G. Bandy, Ohio's assistant superintendent of public instruction and a member of the CCSSO steering committee that developed the cross-state assessment plan, says the former purpose is much more important than the latter.

She argues that indicators programs in the states are intended less to "compare educational delivery in districts" than to "look at the state profile of education," with the intent of "knowing who is there and what they're dealing with so they can adjust policies and practices to affect student outcomes."

Ohio's indicators program, for example, does not compare schools with each other, said Ms. Bandy, who conducted a 50-state survey of indicators projects for the state chiefs. "We have a statewide summary, and school districts have their own data. They look at themselves and determine whether they need changes."

Indicators programs that include comparisons of districts should proceed with caution, Ms. Bandy suggested. "Although competition many times increases awareness and outcomes," she noted, "in all fairness, you've got to take a look at whether you are in fact comparing like districts."

To those who argue that socioeconomic factors have to be considered when comparing education results in schools, however, New York's Mr. Lott suggests an opposing view: that certain "objective" evaluations of educational delivery are valid.

"There is a school of thought," he said, "that says you can teach anyone to read and write and do mathematics, and it is legitimate to say: 'Here is a building where they are not teaching kids to read and write and do mathematics, and they should develop a plan to do that."'

Comparisons within Groups

In California, educators have made comparison of schools a part of their three-year-old indicators program, but they did so after separating the 800 high schools they evaluate into five socioeconomic groups, "in order to make comparisons more fair," according to Jeannemarie Solak, a consultant in the state education department's program evaluation and research division.

Each school receives an annual performance report, based on criteria such as course enrollments, achievement-test scores, and dropout and attendance rates, that shows the school's overall "score" for the current year, the change in that score from the previous year, and the school's percentile rank within its socioeconomic group, she said.

The California grouping plan was developed in line with the view of state education officials that educational "achievement and background are connected," Ms. Solak said. "It is not equitable to compare schools that have a great deal of resources--both human and financial--with other schools that don't."

State Comparisons

Florida has chosen to develop an indicators plan whose main purpose is to compare the state's educational program with those of all other states.

The plan was triggered in 1981, when the state board of education resolved that Florida's education system "should be at least as good as the best one-fourth of the state educational programs," according to Thomas H. Fisher, administrator of testing programs in the education department.

Florida officials applaud the recent CCSSO cross-state assessment initiative, Mr. Fisher said, because thus far their efforts to compare Florida with other states have been hampered by the lack of available comparative data.

"The idea of competition, which has worked so well in the athletic arena, can work in the academic arena," said Mr. Fisher. "If every state had a state board of education goal like I do, we would be chasing each other around, moving ever upward toward a goal of excellence."
