Arizona’s Peoria Unified district is accustomed to doing well. The 33,000-student school system touts a fully certified teaching staff, a comprehensive K-12 curriculum, and student test scores well above state and national averages.
So when the state education department released the results of its new accountability system for schools earlier this month and not a single Peoria school appeared at the top of the list, district leaders wondered what had happened.
Arizona has joined a handful of states and some school districts that use an accountability tool called value-added assessment. Rather than simply ranking schools based on raw test scores, the education department measured academic gains this year by comparing the 1998 and 1999 standardized-test scores of about 300,000 students in 1,000 elementary and middle schools, including 45 charter schools.
“These data complete the picture of school effectiveness,” state schools Superintendent Lisa Graham Keegan said in a statement prepared for the Jan. 25 release of the department’s analysis. “This data is vital to parents as they can see the impact of instruction at their children’s schools. By looking at progress, this analysis captures the effect a school has on a student’s progress regardless of where he starts.”
Schools were given a rating of one to five stars in mathematics and reading, depending on how their academic growth over the year compared with that of other schools. The Peoria district, located northwest of Phoenix in the cities of Peoria and Glendale, has no five-star schools. Most instead ended up in the middle of the pack, with three stars.
“I don’t have a great deal of confidence this is sending the right message to our communities,” said Linda R. Bromert, Peoria’s assistant administrator for elementary and academic services and K-12 assessment. “Those star ratings were assigned based on simple rank order, rather than any specific criteria set by the state. I really do applaud the state’s efforts to look at standardized-test scores to measure growth, but I disagree with their methodology.”
Method Called Fairer
But state education officials and some researchers say value-added analyses are a much fairer accountability tool than traditional means of reporting test results.
The common practice is to judge schools based on students’ average test scores, or the percentage of students who score at or above a predetermined level. But the strong link between test scores and socioeconomic background has led some to argue that exam scores say more about a student’s background than about how well schools are doing their jobs. (“A Question of Value,” May 13, 1998.)

Proponents of value-added assessment say a more reasonable way of gauging a school’s performance is to focus on gains in the test scores of its students. That method also makes it possible to see which schools’ students are making the most academic progress, regardless of whether their scores are higher or lower than those of other schools.
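The article does not publish Arizona’s exact formula, but the basic idea it describes, ranking schools on average year-over-year score gains rather than raw scores, can be sketched in a few lines. In this illustrative sketch the school names, student scores, and the even one-fifth split into star tiers are all hypothetical assumptions, not the state’s method.

```python
# Illustrative sketch only: the data and the even one-fifth star tiers
# below are assumptions for demonstration, not Arizona's actual formula.

def average_gains(scores_1998, scores_1999):
    """Mean per-student score gain for each school, using matched students."""
    gains = {}
    for school, prior in scores_1998.items():
        current = scores_1999[school]
        per_student = [current[s] - prior[s] for s in prior if s in current]
        gains[school] = sum(per_student) / len(per_student)
    return gains

def star_ratings(gains):
    """Assign one to five stars by simple rank order on average gain."""
    ranked = sorted(gains, key=gains.get)          # lowest gain first
    return {school: 1 + (i * 5) // len(ranked)     # each fifth = one tier
            for i, school in enumerate(ranked)}

# Hypothetical reading scores for five schools, two matched students each.
scores_1998 = {"A": {"s1": 40, "s2": 50}, "B": {"s1": 42, "s2": 52},
               "C": {"s1": 44, "s2": 54}, "D": {"s1": 46, "s2": 56},
               "E": {"s1": 48, "s2": 58}}
scores_1999 = {"A": {"s1": 42, "s2": 52}, "B": {"s1": 46, "s2": 56},
               "C": {"s1": 50, "s2": 60}, "D": {"s1": 54, "s2": 64},
               "E": {"s1": 58, "s2": 68}}

gains = average_gains(scores_1998, scores_1999)
stars = star_ratings(gains)
```

Note that, as with the state’s report, a school with low absolute scores can still earn the top tier here if its students gained the most.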
The value-added method was pioneered by University of Tennessee researcher William L. Sanders, who has tracked teacher-quality effects in Tennessee. Texas and North Carolina are using customized versions of the Sanders model, and Florida is expected to be using a similar system by next year, said Mark C. Duffy, a research analyst for the Consortium for Policy Research in Education, based at the University of Pennsylvania.
“What’s neat about this is it gives you a way to chart growth rather than setting an absolute target schools have to jump over,” Mr. Duffy said.
Arizona officials said they took an additional step to make sure the assessments were an accurate picture of each school’s performance compared with that of others. Because it may be easier to make gains in low-performing schools, the education department weighted the results to make the process fair for schools with higher test scores.
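The department did not spell out how it weighted results for higher-scoring schools. One common statistical approach to the same problem, offered here purely as an assumption and not as the state’s published method, is to judge each school on its residual gain: how far its actual gain departs from the gain a simple least-squares fit would predict from its starting score.

```python
# Assumption: a simple residual-gain adjustment, NOT the department's
# published method. A school is credited for how far its gain deviates
# from the gain predicted by its 1998 starting level.

def residual_gains(baselines, gains):
    """Fit gain on baseline by least squares; return gain minus prediction."""
    n = len(baselines)
    mean_x = sum(baselines) / n
    mean_y = sum(gains) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(baselines, gains))
    var = sum((x - mean_x) ** 2 for x in baselines)
    slope = cov / var
    intercept = mean_y - slope * mean_x
    return [y - (intercept + slope * x) for x, y in zip(baselines, gains)]

# Hypothetical data: gains shrink as baseline scores rise, so raw gains
# alone would penalize the schools that started out well.
baselines = [40, 50, 60, 70]
observed_gains = [10, 8, 6, 4]
residuals = residual_gains(baselines, observed_gains)
```

Under this adjustment, a high-baseline school with a modest raw gain is no longer automatically ranked below a low-baseline school with a large one; each is compared with what its own starting point predicts.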
“We are not suggesting this report be taken in isolation,” said David R. Garcia, the director of research and policy for the department. “We do not want to put undue pressure on schools that started out well.”
Some Schools Pleased
Some schools find the results of Arizona’s assessment report reason to celebrate.
The Tempe Elementary School District in the Phoenix area serves 13,000 students, many of whom are poor and are struggling to learn English. But several Tempe schools got five- and four-star ratings, underscoring what district leaders say they’ve been saying for years: Test scores alone don’t tell the whole story.
“Typically when our district is compared to some of our neighboring districts, we find our scores are lower, and we take hits for it,” said Katherine Bareiss, a district spokeswoman. “This new assessment shows our students are making gains, even when we’re compared to other schools.”
A version of this article appeared in the February 09, 2000 edition of Education Week as Ariz. Ranks Schools By ‘Value Added’ to Scores