In Huntsville, Alabama, the school board has decided to spend $1.7 million to bring Teach For America interns to district classrooms. This has prompted an assistant professor at the University of Alabama, Huntsville, to raise some critical questions. This is the third post in this series.
Guest post by Philip Kovacs.
On the web page where Teach For America shares research, they boldly state: “A large and growing body of independent research shows that Teach For America corps members make as much of an impact on student achievement as veteran teachers.” I will show this is an absurd claim simply by analyzing the reports made public on their “research” page. I will not look at or include other research showing that TFA has negative effects on student test scores in some places, as others have already done.
However, it pains me to engage in this analysis because it forces me to enter a conversation that I don’t believe we should be having, a conversation that is undergirded by the belief that the best way to evaluate teachers is to look at student scores on tests that have been shown to be invalid, unreliable and flat out ridiculous.
When we reduce children to numbers and teachers to spreadsheet-managers we undermine education by dehumanizing the process. In our test-driven world, teachers have become bureaucrats; we’ve reduced “learning” to scoring, and, shamefully, we’ve reduced schooling to sorting, equating a slight change in a test score with “achievement.” Take a moment to consider the achievements in your life...now quantify them and tell me in the comments below that the number equals the reality of that achieving moment.
Educare (from the Latin, meaning “to draw out”) has left the building, and education today involves shoveling as much product as possible into subdued vessels, followed by the obligatory weighing to see if the vessels hold more value, where value has nothing to do with the values that lead to happiness or success robustly defined.
Rather than staring at spreadsheets, myopically focused on data, I propose re-imagining educational reform by asking the question “what makes a great teacher a great teacher?” Yes, there are other questions that must be asked, but the teacher is the issue at hand, so before turning to problematizing TFA, allow me to wax poetic and imagine answers to the question, “what makes a great teacher a great teacher?”
Great teachers help children identify, access and utilize information from various knowledge systems in order to effect progressive change in a given space over a given amount of time.
Great teachers are well-organized multi-taskers who help children: synthesize ideas, evaluate and appraise work, discriminate amongst results, and critique and defend critiques.
Great teachers are individuals who encourage children to take calculated risks and engage passionately with work both chosen and assigned. Great teachers ask children to employ creative curiosity, to think meta-cognitively, to act patiently as deconstructive-reconstructive thinkers who question, pose problems, imagine alternatives, and transfer ideas and artifacts from one realm into another.
Finally, great teachers make a commitment to the children and communities they serve, going above and beyond classrooms to help all students develop and grow.
Unfortunately, until we move schools away from the industrial model that we have relied on for far too long, we’ll continue to prevent the above from happening, talking as we do about children as numbers and teachers as producers, paying for all of it with an educational enterprise that reduces children to “mere articles of commerce”. The above “great teacher attributes,” un-testable, will remain ignored while we listen to philanthrocapitalists talk about an educational bottom line that they know nothing about. And when they are done talking, they’ll force their reforms down our throats.
In an effort to increase the amount of light being directed towards the “research” that is being used to justify the replacement of credentialed professionals with fast-tracked do-gooders, allow me to briefly address TFA’s reports, and then I will return to the often grueling but always rewarding work of helping dedicated individuals mature into great teachers.
These reports are available on TFA’s “research” page.
This report is an apples and fruit basket comparison that relies on Value Added Measurement (VAM) and opens with the following disclaimer: “The analysis contained within this report is not based on a comprehensive set of measures upon which the quality of teacher training programs should be ranked.” Importantly, a peer-reviewed study found “several logical and empirical weaknesses of the [VAM]” used in the report.
One major problem is that the TFA group is compared to the entire state rather than teachers in the same schools. Indeed, even comparing teachers in one school to teachers in the same school can be problematic, especially in schools where students have multiple teachers or in schools where some teachers are not inclined to help TFA recruits swim, rather than tread water, during their first year.
For purposes of the “Report Card,” comparing the test-score gains made by teachers in low-performing schools to gains made by teachers in higher-performing schools is problematic because it is easier, and looks more impressive, to raise scores from, for example, the 45th to the 67th percentile than from the 85th to the 90th, which is no doubt what is happening in many upper-level schools and classrooms. Given that TFA recruits are only in the lowest-performing schools, they are much likelier to be working with students who are easier to move, making it appear they have larger gains on one small indicator of a child’s education.
Taking this into consideration, it appears that TFA has an impact in math and reading instruction, though that impact is likely to disappear once discrepancies in population are taken into account. The authors’ warning, the peer-reviewed report arguing that the TVAAS “contains several logical and empirical weaknesses,” and the “apples to fruit basket” comparison make the report problematic at best.
“Portal Report From University of NC Chapel Hill”
While TFA members outperform recent UNC graduates in 5/9 indicators, the study does not include teachers with more than 5 years of experience. This is problematic because many teachers continue to grow as professionals in multiple ways. Those who do not continue to grow do not deserve the title of teacher. Therefore we must ask ourselves, how many great teachers were left out of the comparison group?
To be fair, if you believe the VAM used in this analysis, which you should not, high school math scores show sizable gains for TFA recruits on this particular measurement.
One out of four middle school indicators shows larger improvements for TFA, and at the elementary level there is no difference, on test scores using VAM, from the control group: graduates of UNC with less than 5 years of experience.
These results are mixed but this study is also problematic for those who claim TFA has a long-term positive effect on school districts because of the documented high turnover of TFA recruits: 85% are gone after four years. And this makes the “Portal Report” not just problematic, but damning. In the report’s own words:
The final and in some ways most important finding of this study is that first year teachers perform worse than those with four years of experience in 10 out of 11 comparisons, and in their second year as teachers perform worse in 6 out of 11 comparisons. To provide perspective, we estimated that elementary students taught math by a first year teacher lose the equivalent of 21 days of schooling when compared to similar students taught by teachers with four years of experience.
This report finds TFA recruits to be better than novice teachers (less than 3 years of experience) in math and science instruction.
This finding, however, is problematic by the authors’ own admission: they could only match 84% of students to their teachers. From the What Works Clearinghouse: “...differences in performance in TFA and non-TFA classes may be influenced by differences in student ability in specific subjects. As a result, the study may not accurately measure the effect of having a TFA teacher.”
From the authors, further problematizing the report (emphasis mine): “When both teacher quality and student performance are systematically related to student ability and motivation, the relationship between teacher and student performance cannot be reliably estimated.”
This report also acknowledges that teachers with high-performing students may show reduced effects because of the difficulty of raising high-performing students’ test scores, an issue I pointed out in the Tennessee data.
TFA recruits look slightly better on math tests but no better on reading, making the report mixed.
The small sample size is problematic because the authors are comparing smaller groups of TFA recruits to groups of teachers about five times as large. The larger the sample, the closer to the middle its results are going to be. And again, as with the Tennessee data and the “Portal Report,” the effects of teachers working with higher-performing students are going to look smaller than those of recruits working with lower performers.
The authors of this report come close to acknowledging issues with their own data, as they conclude with words that are hardly an endorsement. On page 46 they note (emphasis mine): "...a more focused approach in observing differences between math and reading classrooms by TFA and non-TFA-led classrooms might generate insight into why TFA teachers may show positive results with respect to math achievement and why non-TFA teachers may show similar results to TFA teachers for reading achievement.”
“May show”? If that is the case, TFA’s claim needs the word “may” inserted. Tellingly, the authors suggest a more focused approach to observing classrooms. I support finding a more focused approach than looking at test score data. We might start by looking at children.
According to this report, conducted in Austin, Texas, TFA teachers are producing gains in math scores, doing about the same as certified teachers on reading scores, and doing worse with Hispanic students on both reading and math tests. The results are therefore mixed, unless the reader believes we should dismiss the Hispanic students.
Rendering the entire report problematic are the words on the first page (emphasis mine): “Given data limitations and the requirements of the rider, the analyses were limited to descriptive and inferential statistics. As such readers are encouraged to interpret the findings related to student achievement with caution.”
“With caution” is nowhere near “shows.”
This report is problematic at best but ultimately contradicts TFA’s claim regarding veteran teachers. From page 31 of the report (emphasis mine): “TFA teachers produce student achievement gains in middle school math that exceed those of teachers from other pathways with comparable experience.” Not, contra TFA’s claim, those with more experience.
Furthermore and importantly, from pages 23-24 of the report (emphasis mine): “However, this [where the “this” means gains on middle school math tests] is largely eliminated once the much higher attrition of TFA teachers is taken into account.”
Said differently, when you start to consider how quickly TFA recruits abandon ship, the gain is negated. Concerned taxpayers might stop to consider why traditional routes into classrooms produce teachers who stay so much longer. Regardless, this report should probably be taken off the TFA “research” page given the authors’ addendum.
This report argues: “In all areas except for social studies, TFA corps members were statistically significantly more effective than other new teachers.” The key phrase here is “new teachers.” The authors do not reveal whether or not the “new teachers” contained unlicensed, uncertified teachers teaching out-of-field, as is common in poor, hard to staff schools.
When compared to experienced teachers, the authors report that there was “no significant difference” between the test data for TFA recruits and traditionally trained, experienced teachers. Furthermore, the authors do not define what “experienced” means. Does that body of “experienced” teachers contain only teachers with three years of experience? After all, they are, by the report’s own admission, more experienced than most TFA recruits, because most TFA recruits leave after year three. Arguably, if the authors had used only teachers with five years of experience in the control group, the results would have been significantly different.
All that being said, this is the only report, out of a total of 12 (see my previous post for analysis of the other five reports) that even remotely supports TFA’s claim that “Teach For America corps members make as much of an impact on student achievement as veteran teachers.”
There is simply no “large and growing body of evidence” suggesting TFA corps members “make as much of an impact on student achievement as veteran teachers.” In fact, there is a growing body of research suggesting otherwise: that TFA members have a negative impact on student achievement. But to treat that at length, I will have to ask Mr. Cody for another post.
A note to the authors of these reports...I don’t doubt you or your methods, but I do believe your research is being misused and misrepresented. Furthermore, with all due respect, you should really acknowledge the issues with using Value Added Measures to determine student performance on tests. And all of us need to stop equating achievement with gains on test scores.
What do you think of the evidence Teach For America has presented regarding their effectiveness?
Previous posts in this series:
Philip Kovacs Takes on TFA in Huntsville
Philip Kovacs: Huntsville Takes a Closer Look at Teach For America’s “Research”
Philip Kovacs is a tenure-track assistant professor at the University of Alabama, Huntsville.
The opinions expressed in Living in Dialogue are strictly those of the author(s) and do not reflect the opinions or endorsement of Editorial Projects in Education, or any of its publications.