New Orleans, La.
John Deasy, the superintendent of Los Angeles public schools, opened the annual meeting of the Association of Education Finance and Policy here today with a call for researchers to help school and district administrators make decisions in heated political environments.
“In districts that are pushing this [school improvement] work hard, research gets violently polarized,” Deasy said, adding that he regularly is presented with research by advocates rather than researchers.
“I think the piece that has been disappointing to me—as someone who reads and uses research very carefully—is its misuse as a tool as opposed to its primary purpose of helping me make a decision,” Deasy said. “I need to hear research in the most neutral setting possible to help me make decisions.”
However, Deasy said it is critical for administrators to integrate regular research and data analysis into any effort to improve schools. “Nobody’s going to say they are in favor of ‘education for some children,’ so you have to focus on the behavior of the system: Who’s getting into what courses, who’s getting kicked out of what schools, who is being put in front of these teachers?” he said.
At the same time, he added, researchers need to work harder to help teachers, principals, and others understand the results of those observations without immediately putting them on the defensive.
“The descent on the district to tell us how screwed up we are is not a helpful approach,” Deasy said. “Saying you know we have a problem and you’re going to descend on us, write it up, and the first time we see it is in the L.A. Times is not helpful. ... Give us time to digest [a study] before it shows up on the internet.”
Coincidentally, Deasy’s keynote was followed in the afternoon session by “Does the Market Value Value-Added?” a study of the effects of the high-profile release of Los Angeles district data on teacher effectiveness in 2010 and the resulting changes in the district’s evaluation system for teachers.
Co-authors Scott A. Imberman, of Michigan State University, and Michael F. Lovenheim, of Cornell University, tracked housing-price trends in the neighborhoods associated with the schools and teachers discussed in last year’s media stories on teachers’ value-added effectiveness scores.
In an earlier discussion of the study, co-author Imberman noted that the Los Angeles public was confronted with three separate data releases: the initial stories from the Los Angeles Times, the school district’s response and explanation, and then an updated set of data from the newspaper. A family buying a home could look up the value-added scores of a school in the neighborhood to which they were considering moving, or even look at the specific effectiveness score of the teacher their child might have after a move.
However, Imberman and Lovenheim found that while homeowners and parents did seem to change their home-buying in response to changes in average test scores at a given school (and home prices increased as a result), the school and teacher value-added reports did not change how families chose homes.
Each set of new data and the stories around it discussed the value added by teachers in slightly different ways, Imberman said, and “I’m not sure anyone believed any of them, frankly.”
The researchers concluded:
That we find no effect of school or teacher value-added information on home prices suggests these school quality measures are not valued by local residents, at least on the margin. This is a surprising result, given the strong relationship found in other studies between these measures and student academic and future labor market success. ... In some sense, however, the heightened controversy could have driven the public to ignore the value-added. Not only did the public debate and the widespread coverage of the L.A. Times' release in the media likely increase awareness of these methods, it also probably made the public more aware of the flaws in these measures.
Deasy, who came in as schools chief in 2011, during the teacher-data debate, said the difference between the L.A. Times’ method for calculating teacher effectiveness and the district’s own value-added method caused widespread confusion among teachers and the public.
“It was a conflagration at the time,” Deasy added. “The entire oxygen around improvement shifted to publication, and that was a problem.”
In the end, the episode did help the district start a broader conversation around multiple measures that should be part of teacher evaluation, he said, as well as a separate conversation about how the district approaches its own research.