Middle school Teach For America teachers in Texas seem to be holding their own in the classroom, outperforming other novice teachers in math, according to a recently released study from the San Antonio, Texas-based Edvance, an independent evaluation firm.
TFA alumni also did better than other veteran teachers in that subject, the study found.
The study stands out for several reasons. For one, it draws on a very large sample: more than 1,500 TFA teachers and alumni, nearly 500 schools, and more than 11,000 students in each of reading and math. It also looked across several Texas districts' programs. Finally, the quasi-experimental methodology allows for more confidence that the TFA teachers, rather than confounding factors, are responsible for the student achievement results, measured here using 2010-11 state standardized test scores.
That’s important, because one problem with much of the research on TFA is that it tends to compare groups of students taught by TFA and non-TFA teachers, without ensuring that those groups’ baseline characteristics are similar. The research is also mainly descriptive.
The research gold standard is a random-assignment experiment, using carefully matched treatment and control groups. But these are costly, time-consuming, and logistically complicated. For that reason, only one study on TFA so far has employed a randomized design. That 2004 study, from Mathematica, was smaller in scale. It found that while student achievement overall remained low, students taught by TFA teachers did seem to outpace those taught by other novices in math.
In the meantime, scholars have developed techniques to approximate randomized studies, and the Edvance research uses one such method, known as propensity-score matching. Essentially, for each campus with TFA teachers in the sample, analysts located a demographically similar non-TFA school. Then, for each set of students taught by a TFA teacher, they selected students with similar baseline characteristics taught by a non-TFA teacher, creating a control group. The matching characteristics included whether students were economically disadvantaged or limited-English proficient, along with their prior test-score history.
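For readers curious about the mechanics, the core of propensity-score matching can be sketched in a few lines of code. This is an illustrative simplification, not the Edvance study's actual procedure: the function names, the gradient-descent logistic regression, and the nearest-neighbor matching rule are all stand-ins for whatever the researchers actually used.

```python
import numpy as np

def estimate_propensity(X, treated, lr=0.1, n_iter=2000):
    """Estimate P(treated | covariates) with a simple logistic
    regression fit by gradient ascent. X holds baseline covariates
    (e.g., poverty status, prior test scores); treated is 0/1."""
    X = np.column_stack([np.ones(len(X)), X])  # add intercept column
    w = np.zeros(X.shape[1])
    for _ in range(n_iter):
        p = 1 / (1 + np.exp(-X @ w))
        w += lr * X.T @ (treated - p) / len(X)  # log-likelihood gradient
    return 1 / (1 + np.exp(-X @ w))

def nearest_neighbor_match(treated_scores, control_scores):
    """For each treated unit, pick the control unit whose propensity
    score is closest (matching with replacement)."""
    control_scores = np.asarray(control_scores)
    return [int(np.argmin(np.abs(control_scores - s)))
            for s in treated_scores]
```

Once matched, the estimated effect is simply the average outcome difference between treated students and their matched controls; the balance on observed covariates is what lets analysts treat the comparison almost like a randomized one.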
Then, the researchers examined whether, on average, students taught by the TFA teachers made more progress on state standardized tests than those taught by non-TFA teachers.
For middle school students, defined in this study as grades 6-8, the scholars found that those taught by TFA novices did better than their peers, with a positive effect size of .19 for novice TFA teachers, or about half a year of learning. The effects seemed to compound over time: TFA alumni had an effect size of .27 in math, compared with experienced non-TFA teachers. That amount, the study says, corresponds to close to an additional year of learning for students of TFA alumni.
Reading gains for students of TFA alumni were positive but somewhat weaker, with an effect size of .11. (There was no effect for TFA novices.)
“What we’ve learned is that at middle school, in math, TFA corps members who are early in their career are able to move the needle in mathematics,” said Herbert M. Turner III, the vice president of research for Edvance. “And if they continue teaching in Texas, they’re able to move the needle even more.”
The ‘if,’ of course, is a big one, since TFA’s commitment is just two years. There’s evidence to suggest that more than half of TFA teachers leave their initial teaching positions after this mark, and only about 15 percent remain by their fifth year, according to a study by the University of Connecticut’s Morgaen Donaldson.
As always, the study has limitations. Data on the teachers' certification, degree type, and length of training weren't available because of privacy concerns. (One of the criticisms of the earlier Mathematica study centered on the lower-than-average level of training and certification of the control group of teachers.)
Finally, the study didn’t find the elementary-level effects that the Mathematica study did. And the weaker results in reading also raise questions about what TFA could do to improve its programming in that area, the study concludes. (Across most teacher-quality literature, reading effects have tended to be smaller than math effects, possibly because language development is probably more dependent on home factors than explicit math instruction.)
Like all studies on TFA, this one is likely to be heavily dissected. Still, it is worth noting that few other teacher-training programs have undergone a comparable level of outside research scrutiny.
A version of this news article first appeared in the Teacher Beat blog.