The recent news that 97 percent of New York City public schools got an A or B under the district’s grading system might be seen as reason for celebration, but critics suggest the grades hold little value and highlight the need to rethink the state assessment system.
The results, they say, reveal far more about flaws in the city’s so-called “progress reports”—and the state testing regime that largely drives them—than they do about the quality of education in the 1.1 million-student district.
Eighty-four percent of the city’s 1,058 public elementary and middle schools received an A on the city’s report cards this year, compared with 38 percent in 2008, while 13 percent received a B, city officials announced this month.
“It tells us virtually nothing about the actual performance of schools,” Aaron M. Pallas, a professor of sociology and education at Teachers College, Columbia University, said of the city’s grades.
Diane Ravitch, an education historian at New York University, was even sharper: She declared the school grades “bogus” in a Sept. 9 opinion piece for the Daily News of New York, saying the city’s report card system “makes a mockery of accountability.”
But Andrew J. Jacob, a spokesman for the New York City Department of Education, defended the ratings, even as he said the district’s demands on schools would continue to rise next year.
“A lot of our schools simply zoomed past what they had been able to achieve in the past,” he said, and thus helped their students “make a great amount of progress.”
He cautioned, however: “The fact that a school earned an A does not mean that it’s where it needs to be.” Mr. Jacob also noted that the report cards provide plenty of information beyond a simple overall letter grade.
New York City employs a complex methodology to devise its overall letter grades, with the primary driver being results from statewide assessments in reading and mathematics, which themselves have encountered considerable skepticism lately.
The city’s grades are based on three categories: student progress on state tests from one year to the next, which accounts for 60 percent; student performance for the most recent school year, which accounts for 25 percent; and school environment, which makes up 15 percent. The school environment category includes the results of surveys of parents, students, and teachers.
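The weighting described above can be sketched as a simple calculation. The 60/25/15 split comes from the article; the sample category scores and the letter-grade cutoffs below are illustrative assumptions, not the city’s actual formula.

```python
# Sketch of a weighted report-card score. The 60/25/15 weights are from
# the article; the sample inputs and letter-grade cutoffs are hypothetical.

WEIGHTS = {"progress": 0.60, "performance": 0.25, "environment": 0.15}

def overall_score(progress, performance, environment):
    """Combine category scores (each on a 0-100 scale) using the weights."""
    return (WEIGHTS["progress"] * progress
            + WEIGHTS["performance"] * performance
            + WEIGHTS["environment"] * environment)

def letter_grade(score, cutoffs=((85, "A"), (70, "B"), (55, "C"), (40, "D"))):
    """Map a composite score to a letter grade (cutoffs are illustrative)."""
    for threshold, grade in cutoffs:
        if score >= threshold:
            return grade
    return "F"

score = overall_score(progress=90, performance=75, environment=80)
print(score, letter_grade(score))  # 84.75 B
```

Because the year-to-year progress category dominates the formula, a school with middling absolute performance can still earn a high grade, which is the feature critics such as Mr. Pallas single out.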
Mr. Pallas of Teachers College argues that one key flaw in the city’s rating system is that it depends heavily on what he deems a “wholly unreliable” measure of year-to-year student growth on test scores, one that fails to account adequately for statistical error.
Further complicating matters is recent controversy about the reliability of the New York state testing system in gauging student achievement.
Since the implementation of a revamped curriculum and testing system in 2006, the state has been posting some apparently impressive gains in the proportion of students meeting state standards, called Level 3, or exceeding state standards, Level 4.
In June, for example, the state department of education announced that 86 percent of students in grades 3-8 statewide had a passing score on the math exam, up from 66 percent in 2006. For New York City, the passing rate in math for those grades was 82 percent, up from 57 percent three years earlier.
Some other New York cities saw even more dramatic growth. In Buffalo, the proportion of students who passed the math test rose to 63 percent, from 29 percent, over that time period; in Rochester, it grew to 63 percent, from 33 percent.
The state agency did caution in a press release that increases in the average scale scores for students in particular grades were “sometimes relatively smaller.” One explanation: Many students who were just below the passing score may have made slight gains that carried them across the threshold.
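The agency’s explanation, large jumps in passing rates alongside modest average gains, can be illustrated with a toy calculation. All of the scores and the cut score below are hypothetical; the point is only the arithmetic.

```python
# Toy illustration (hypothetical numbers): when many students score just
# below the cut score, even a small uniform gain moves a large share of
# them across the passing threshold, while the average barely changes.

CUT_SCORE = 650

# Hypothetical scale scores, clustered just under the cutoff.
before = [630, 640, 645, 646, 648, 649, 652, 660, 670, 700]
after = [s + 5 for s in before]  # a small, uniform 5-point gain

def pass_rate(scores):
    """Fraction of students at or above the cut score."""
    return sum(s >= CUT_SCORE for s in scores) / len(scores)

avg_gain = sum(after) / len(after) - sum(before) / len(before)
print(f"average gain: {avg_gain:.1f} points")                    # 5.0
print(f"pass rate: {pass_rate(before):.0%} -> {pass_rate(after):.0%}")  # 40% -> 80%
```

A doubling of the passing rate from a five-point average gain is exactly the pattern that makes scale scores, rather than proficiency percentages, the more informative statistic.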
The reliability of the state’s assessment system has been debated for a variety of reasons, from concerns that the state may be manipulating the results to make schools look better to worries that the state’s exams are especially vulnerable to teaching to the test.
“What people really want to know in New York is, can we trust these big increases?” said Daniel M. Koretz, an education professor at Harvard University. “And my answer is, I don’t know.”
Some observers say recent data from the National Assessment of Educational Progress, dubbed “the nation’s report card,” raise questions about the New York tests. In general, student achievement for 4th and 8th graders in New York state in both reading and math on NAEP has stayed relatively flat. Results from the 2009 reading and math tests are expected out in coming months.
“The New York state assessment has gotten further and further out of line with the National Assessment of Educational Progress, and it now just doesn’t seem to be reporting the same reality at all,” said Michael H. Holzman, an independent research consultant who has done work for the Schott Foundation for Public Education, which is based in Cambridge, Mass.
Ms. Ravitch has been especially outspoken about the state tests. Writing last week for the Bridging Differences blog she co-writes on edweek.org, she accused the New York state education department of rigging the testing system to produce higher scores and argued that the “double-digit gains are phony.” She said the proportion of correct answers a student must supply to advance to a higher level “has steadily fallen in many grades.”
“When states play games with cut scores and conversions from raw scores to scale scores, testing becomes a mighty scam,” she wrote.
Jonathan Burman, a spokesman for the state education department, did not explicitly address that and other criticisms when asked to respond for this story, but he said the state is working on changes to the testing program.
The New York state board of regents has indicated that the state “will raise standards next year, with a higher cut score required to demonstrate proficiency on the state exams,” he said in an e-mail.
State officials have previously said the number of correctly answered items required to meet state standards has declined on some of the state’s tests because the questions have become more difficult. In effect, the adjustments were made to ensure comparability from year to year, according to those officials.
“That makes perfect sense if the test has become sufficiently harder,” said Mr. Koretz, the Harvard professor. “But we don’t know if that’s true.”
Mr. Koretz said he is also concerned about whether the results are being skewed by the test-preparation practices of schools. He said there are a lot of “clone items” on the New York state tests that look virtually identical from year to year.
“I think that’s very risky in a high-stakes assessment, because it encourages people to look for shortcuts,” he said.
A version of this article appeared in the September 23, 2009 edition of Education Week as New York Test Scores Raise Eyebrows