Proof Lacking on Success of Staff Development
Anecdotes about districts’ success stories with particular professional-development brands, services, and approaches are common in today’s marketplace. But is there proof that any of them actually work?
For the most part, the answer is no, according to scholars who have studied the link between postlicensure teacher training and student academic achievement. Reasons for that dearth of evidence include a general lack of rigor in education research, as well as specific obstacles that make studying professional development’s impact on student achievement a challenge.
Few studies of professional development employ scientifically rigorous methodologies. The research literature on the training, scholars say, is largely qualitative or descriptive, and therefore not capable of answering nuanced cause-and-effect questions.
At the same time, there are many problems with those programs and studies that do purport to tackle the student-achievement question.
Sorting Through the Jumble to Achieve Success
“First, the intervention itself should be workable, and some are not supported by theory or scientific action. Second, the program needs to be fully implemented if you want to see any effects, and in many cases, fidelity is a real challenge,” said Kwang S. Yoon, an analyst at the Washington-based American Institutes for Research who studies in-service training. “And the third piece is the intervention research itself. It may be weak in design.”
For a 2007 review commissioned by the U.S. Department of Education, Mr. Yoon and colleagues pulled more than 1,300 potentially relevant research studies of professional development conducted between 1986 and 2006.
Only 132 specifically focused on K-12 in-service training in reading, mathematics, or science. And of those, only nine studies met evidence standards set by the What Works Clearinghouse, the arm of the federal Institute of Education Sciences that reviews experimental research on program impact.
Time on Task
The review found that, across the nine studies, only teachers who received a substantial amount of professional development boosted students’ scores. Three of the nine studies, those examining summer institutes or workshops lasting between five and 14 hours, showed no effect on student achievement. The studies of teachers who received between 30 and 100 hours of training found positive student-achievement gains.
Advocates have seized on those findings as evidence that the amount of time currently spent on professional development is insufficient. But given the small sample sizes of most of those studies, Mr. Yoon said, they don’t provide conclusive answers to the question of how much time spent in training matters.
Rather, he said, the effects of professional development are likely functions of both the time and the quality of the specific training. “You can have many, many hours without much engagement,” he noted. “Any serious teacher change or teacher learning requires intensive treatment of some topic of significance.”
An added problem is that professional development is mediated through teachers’ own practices. A successful in-service-training program, therefore, must inculcate in teachers behaviors that improve outcomes for students.
“Even if professional development is effective and a teacher learned something, what makes them really improve their practice in the classroom when they are so busy and so tired?” Mr. Yoon said. “I don’t think there is a huge external incentive for the teachers to practice their new learning. ... I think that’s a huge [research] gap that we do not think to pay much attention to.”
Some of the largest-scale studies have found that even when teachers have indeed changed practices in response to a professional-development intervention, those changes haven’t led to greater student learning.
A 2008 federally financed study used a randomized experiment to look at the impact of two early-reading intervention programs. It found that the intervention caused significant increases in teachers’ knowledge and changes in their teaching practices, but did not significantly enhance students’ reading achievement.
Many professional-development advocates say one way to ensure that teachers both have enough time for professional development and work to improve their own practice is through site-based professional learning communities. Such communities consist of teams of teachers who meet frequently to review student-achievement data and tailor instruction accordingly.
High-quality studies specifically focused on the effect of the PLC format of professional development remain sparse, despite the model’s common-sense appeal. The studies that do exist suggest that the success of such endeavors might hinge on having a formal, systematic approach and possibly experts to help guide teachers.
A study published in 2009 in the Elementary School Journal found an achievement edge for schools whose learning teams relied on a set of formal protocols, a leadership structure that guided meetings, and a process for setting forth and solving problems. The study used a quasi-experimental methodology to compare students in nine Title I schools that used that specific framework with students in six other schools using a variety of other school improvement models.
Some scholars worry that the pendulum has already swung too far toward site-based development without proper attention to how the training is structured and led.
“For a long time, most professional development was guided or directed by a central office or a regional office, and those efforts lacked the contextual relevance that was really necessary,” said Thomas R. Guskey, a professor of educational psychology at the University of Kentucky. “Now, we’ve swung the other way and said we have to be completely site-based. ... Solutions can’t always come from inside, and oftentimes the findings from research can be particularly instructive, but teachers need guidance and direction on what can be done to bring it to bear in their classrooms.”
Russell M. Gersten, a professor emeritus of education at the University of Oregon, seconds the idea that researchers need to do more to investigate features that seem to yield the most effective site-based training. He and colleagues crafted and tested their own approach for building 1st grade teachers’ capacity to teach reading comprehension and vocabulary, through facilitated study groups that met to discuss empirical reading research and create aligned lesson plans and instruction.
A randomized study of the approach conducted by Mr. Gersten and his team found that teachers aligned their practices with the research, producing modest but statistically significant progress on measures of oral-vocabulary development. Released this year, the study compared teachers in 19 schools receiving federal Reading First funding in three states.
Gains didn’t show up in other areas, but Mr. Gersten said his team is working on a larger-scale study with more statistical “power” to see if the results can be replicated.
Vol. 30, Issue 11, Pages s4-s5. Published in Print: November 10, 2010, as “No Proof-Positive for Training Approaches.”