Teach For America: Two Testimonials
To the Editor:
As a Teach For America alum, I was very interested in your article on the national study of this program by the Mathematica research group (“Study Finds Benefits in Teach for America,” June 16, 2004).
Yes, TFA’s structure has drawbacks, including the required two-year commitment, which can contribute to these teachers’ high turnover rate, and the limited training the program provides its new recruits. But in response to concerns about turnover, Teach For America typically provides the school with a new TFA teacher if the original placement chooses to leave at the completion of his or her commitment. While there still is some disruption to the school’s staff, this practice minimizes it.
What critics of the program don’t realize is that TFA teachers are placed in schools that very few teachers want to work in. Many TFA schools, including the one where I worked, in the Mississippi Delta, must hire TFA teachers or the students will have rotating substitutes “teaching” their classes all year.
Too many well-educated people complain about the state of education, but fail to “dig in” and actually work in low-income and minority areas. TFA teachers may be young and inexperienced, but we are willing to teach where few others will go, and we work hard at providing a quality education for our students.
Teaching students who have been forgotten by most middle- and upper-class Americans is extremely difficult. It is astonishing that our students score only in the 15th percentile, but no one has discovered the magic bullet that takes a student from the 15th percentile to the 70th in one year. Instead, TFA teachers are working to improve our students’ education percentile by percentile, day by day.
Teach For America teachers fill many needs in schools with teacher shortages, and if we manage to improve student scores, as the study found, then we are making progress with children who need it the most.
To the Editor:
Both of my children, 5th and 9th graders attending the public schools in Horseshoe Lake, Ark., were taught by Teach For America teachers this past year. I am pleased with my sons’ progress, and can report that the TFA teachers at their schools faced each class with energy and enthusiasm.
Teach For America may have its shortcomings, but in my sons’ schools, the program’s teachers behaved with professionalism and taught with skill. I would welcome more of them.
Horseshoe Lake, Ark.
Reader Takes Issue With Hopkins Dropout Study
To the Editor:
I would like to voice some concern about the report “Locating the Dropout Crisis,” published in June by researchers from the Center for Social Organization of Schools at Johns Hopkins University in Baltimore (“‘Dropout Factories’ Identified in Hopkins Study,” July 14, 2004).
Let me start with the term “dropout factories,” which is used to categorize schools not graduating significant percentages of students. Such a derogatory term is best reserved for columnists and commentators; it is not the kind of judgmental language expected from researchers. Such a characterization does a serious injustice to schools and educators working diligently to overcome both school and student shortcomings. Notwithstanding that there are low-performing schools and staff members, the reality is that some of the schools designated as dropout factories should more appropriately be applauded for attaining the results they do.
I would suggest that the report supports conclusions that were drawn before the data were collected, confuses cause and effect, and fails to use the data to identify how some “low performing” schools are actually created. Unfortunately, the study may create a sense of validity from the sheer volume of data collected. It also has the aura of validity because of the institution with which it is associated. Considering the conclusions drawn, this confidence may be ill-placed.
Is the focus of the Center for Social Organization of Schools minority issues? It would appear to be. Had the focus been poverty, I would suggest, the outcome of the research would have been exactly the same. Is there a high correlation between poverty and minority and immigrant status? Yes. The real question, it would seem, is whether minority and immigrant students are unsuccessful in school because they are minority and immigrant or because they are poor. The report indicates that minorities from more affluent areas do as well as nonminorities. The data, to me, suggest that the real problem is poverty.
To imply that this problem of low-performing students is primarily related to the Brown v. Board of Education decision simply leads educators away from finding solutions to the real problem: children raised in homes with little or no support for learning.
New York state is singled out in the full report both for its low-performing and its high-performing schools. Delving into the whys and wherefores of each situation leads to consideration of such concepts as charter schools, vouchers, and the like. But the effect of creating highly competitive schools is to draw the best and brightest from the schools ostensibly serving all students. The elite schools draw students with the ability and, in most cases, the parental support to be successful. What is the effect on other schools when those students, virtually guaranteed to graduate, are removed from a school’s population? The number of “unsuccessful” students then represents a larger percentage. Pull enough students out and only the “unsuccessful” are left.
There is no question that all schools can and should be improved. Nor is there any question that with the right mixture of staff, resources, students, and so on, some schools will be capable of outperforming other schools with similar students. The questionable assumption is that this “chemistry” can be replicated wherever needed. It cannot.
If we are to use research to improve our schools, we need to determine what is cause and what is effect. If our low-performing schools are low-performing because they have a larger proportion of students with characteristics that seem to mark unsuccessful students in all schools, research should be able to isolate those characteristics so that we might address them—for all students in all schools. This research, unfortunately, does not do that.
Joseph H. Crowley
Warwick Area Career & Technical Center
To the Editor:
Your article on the research being done at Johns Hopkins University on dropouts reports on a noteworthy study that found 2,000 American high schools with notably poor “promotion power.” Not surprisingly, the 15 states showing the worst dropout rates include California, Florida, New York, Ohio, and Texas—all states with large urban centers that have high immigrant and minority populations. The researchers also found that “the number of schools with weak promoting power”—as evidenced by disappearing students—rose 75 percent since 1993, while the overall number of high schools rose just 8 percent during the decade. This dropout epidemic correlates with the trends toward high-stakes testing and the harsh penalties for school failures.
The real meaning of the phrase “no child left behind” seems to depend on which children we are talking about, and how far behind we let them fall.
Betty Raskoff Kazmin
Retired Los Angeles Teacher
Small-School Benefits Should Not Be Lost
To the Editor:
In response to your front-page article (“Chicago to ‘Start Over’ With 100 Small Schools,” July 14, 2004):
Smaller schools can make a significant difference in Chicago (and elsewhere) so long as school organizers have the authority to hire competent and committed teachers, and operate outside the public school bureaucracy. But the small-school effort will be marginalized if these schools are forced into the “Chicago public school” mold. All Renaissance 2010 schools should be given the same latitude in the design of their academic programs as is given to charter schools (one-third of the new schools are slated to be charter schools under the initiative).
Recent small-school start-ups in the Chicago public school system have not been given sufficient time and resources for planning. These schools need to receive full approval from the district at least one year before they are scheduled to open. Also, the district should assign at least one full-time staff member to coordinate year round with the small school’s planners.
Michael H. Farley
Director of Education
George C. Marshall Foundation
Lake Forest, Ill.
To the Editor:
I am all in favor of small schools that can provide individual attention to students, but I am unsure if housing several small schools in a single building is an effective way to create more personalized learning environments.
It seems to me that crossovers, transitional times, entry and exit times, and lunch and auditorium use will corrupt the “ideal” of the small-school design. Also, it may be difficult for students to perceive that they are actually a part of a small school when, in fact, they go each day to a large school that is now divided up into many schools-within-schools.
I worry that the many advantages of small schools, where students can make use of personalized learning environments, may be lost in the school-within-a-school design.
Education’s Clearinghouse: One Researcher Tells What’s Not Working
To the Editor:
Your front-page article on the What Works Clearinghouse (“‘What Works’ Research Site Unveiled,” July 14, 2004) caught my attention immediately, since, like many researchers, I have been awaiting the initial reports. “Study reviews” are now available on two topics, peer-assisted learning strategies and effective middle school math curricula.
Frankly, I was shocked at the outcomes reported, for many reasons. Since I know nothing of middle school mathematics instruction, I cannot evaluate what it means that only one study met the evidence-based standards of this service. But I know considerably more about peer-assisted learning, enough that I was surprised to learn that a study I had co-authored was selected as one of the eight investigations passing muster with respect to peer-assisted learning.
To be sure, it was a pretty good study, but one simply not designed to be revealing about peer-assisted learning. It was, more correctly, an experimental evaluation of reciprocal instruction of reading-comprehension strategies, which involves peer-assisted learning as one component in a multicomponent treatment. There is no way to draw a conclusion about peer assistance or cooperative learning per se from this study.
How could the Consumer Reports for research in education make such a conceptual error, and include in its analysis of research on peer-assisted learning a study that is insufficiently analytical to inform about peer-assisted learning? Just as disturbing is that 173 studies, most of which ostensibly pertain to peer-assisted learning, were listed as not relevant to the topic. One of the worst aspects of the analyses is that no information is provided as to why some screened reports did not meet the analysts’ criteria. To me, many of these seemed like pretty good studies of peer-assisted learning.
The clearinghouse products must be independently peer-reviewed before they are given any credibility, especially since passing peer review is the gold standard for scientific analyses, and the service’s purpose is about excellence in educational science. Self-labeling as the “Consumer Reports of educational research” is not good enough, nor is sponsorship by the U.S. Department of Education.
There is no doubt that the intent of the clearinghouse documents is to impact educational policy and practice. However, before anyone takes these products seriously, they should be reviewed carefully by the minds and eyes of top-notch educational scientists, individuals with strong track records of contributing to the science. I strongly suspect that, if this were to occur, the clearinghouse would be dealing with a set of very critical reviews. Given the $18.5 million investment in this enterprise, if it is flawed, the Bush administration and the U.S. Congress should be served notice as its sponsors, as should the American public.
Finally, I was offended by the “Reader’s Digest style” of making a condensed report of my study. Lynda Lysynchuk, Nancy Vye, and I wrote a much better article than the condensation that resulted. It is arrogance in the extreme for the clearinghouse to think that its staff can portray excellent studies in condensations.
I learned years ago as an editor—a rank I held for 14 years at three different journals—that it is always a mistake for anyone but the original author of a paper to do a short version of it. Third parties always make embarrassing mistakes. For certain, the individual(s) who prepared the condensation of my paper—the one that they believed is revealing about peer-assisted learning—has/have plenty to be embarrassed about.
Michigan State University
East Lansing, Mich.
A version of this article appeared in the July 28, 2004 edition of Education Week as Letters