The best-laid plans for teacher evaluation systems don’t always survive contact with the classroom, and a new study suggests that how principals “tinker” can make a big difference in how teachers really get judged.
University of Connecticut researchers Morgaen Donaldson and Sarah Woulfin found principals were most likely to tweak, shorten, and adjust their classroom observations as they implemented new evaluations. Some of these changes were designed to enhance teachers’ professional development in the system, but the researchers found that often principals were trying to ease the burden of accountability measures they were not yet convinced would help.
“For a lot of these evaluation reforms, there were grand plans and a lot of optimism from policymakers, but then the nuts and bolts of how do we make that a reality didn’t get put in place,” said Donaldson, an associate professor of educational leadership and the director of the Center for Education Policy Analysis at the University of Connecticut.
Federal incentives at the end of the No Child Left Behind era have led to an explosion of new teacher evaluation systems across states and districts. Since 2009, when the federal Race to the Top grants first encouraged states to overhaul their measures of teacher quality, the Education Commission of the States estimates that nearly every state has changed its evaluation system—some multiple times. Yet a 2017 analysis by the National Council on Teacher Quality found that since 2015, “states have, in many cases, not only stopped advancing [in teacher evaluation systems] but also appear to have lost their sense of urgency.”
In the study released this week in the journal Educational Evaluation and Policy Analysis, the researchers repeatedly interviewed 44 principals over 2013-14, the school year following the launch of Connecticut’s System for Educator Evaluation and Development, or SEED. While the state laid out requirements and guidelines for the evaluations, the weight of implementing them fell on school leaders, including those the researchers tracked from 13 large, medium, and small districts across the state.
They found that, on average, principals made nearly seven changes to the evaluation systems over the course of the year, most often to the classroom observation process, rubric, and feedback discussions with teachers. Principals overwhelmingly reported trying to adapt the evaluation system to improve teachers’ learning, but in a few cases, they also admitted gaming the system by observing specific classes at a teacher’s request.
Principals reported many minor changes just to fit evaluations into the rest of their work, from pushing back deadlines for teacher conferences to outright skipping some of the evaluation requirements so that they could target a few teachers or instructional problems.
“I think we got our three formal [observations] in for everybody ... And I just had to call it quits,” one principal reported. “I just said, ‘Whatever’s done by Memorial Day is done, and we’re just not doing anymore.’ And I think that’s life.”
School leaders who thought well of the evaluation system at the beginning were more likely to focus their changes on using the evaluations to improve teacher professional development, Donaldson found, while principals who were less confident in the system tended to simply try to streamline or limit the process.
So how can states and districts improve how principals implement evaluation systems? Donaldson argued they need to work more closely with principals when designing the systems and focus principal training on practical strategies.
“A lot of districts jumped into teacher evaluation and started to do a fair amount of training for principals, but most of the training at least initially was on calibration, ... how to assess teaching with reliability,” she said. “I think district leaders really need to clarify what their goal is for teacher evaluation and what they want to achieve through it, and then think about what principals have to do to make that a habit.”
A version of this news article first appeared in the Inside School Research blog.