Letters to the Editor
Proceed With Caution on Whole-School Reform
To the Editor:
As the $150 million in federal funds begins flowing this summer, "comprehensive school reform" will move from the buzzword of the hour to a national strategy involving 2,500 schools ("Comprehensive School Reform Will Need Comprehensive Support," and "Comprehensive School Reform Can Debunk Myths About Change," June 24, 1998; "Reform Effort Expects Wave of New Takers," April 22, 1998). While I am a supporter of this movement (I work with a whole-school design model), I believe there are several reasons to proceed with caution.
First, the recent RAND Corp. report, which found that only half of 40 schools studied were implementing or fulfilling the core practices of the New American Schools design models, gives one pause. One interpretation of these findings is that the most effective designs demonstrate higher levels of implementation. An equally plausible explanation is that ease of implementation signals a lack of comprehensiveness--that the changes a design requires of schools are only superficial. On the other hand, a school design that "breaks the mold" might be arduous for a school to adopt, demanding deep-seated rather than skin-deep change. The ultimate proof, as RAND points out, will be in improved student performance, not merely ease of implementation.
Second, once a school adopts a school design, all its needs tend to be filtered solely through the lens of the design. In theory, whole-school designs are supposed to be able to coordinate all aspects of school restructuring--professional development, curriculum, assessment, parental involvement. But can a school design really be all things to all schools? For instance, several school designs do not provide curriculum, nor do they promote specific instructional strategies. Instead, these designs provide content-free "frameworks" or "guides" that facilitate the writing of curriculum. But suppose a school has a critical need, such as improving reading comprehension, that can't be met by the design? Where's the incentive for the design to direct the school to another service provider that might address its need? In this highly competitive marketplace of school designs, will they do the right thing and admit when they can't meet schools' professional-development needs?
Third, it's remarkable to suggest, as Robert Slavin does in your April news article, that there will be many poor-quality school designs, and that those schools which adopt these "garbage" designs will be no worse off than before. Mr. Slavin seems to have bought into the popular myth that most schools are so bad that any change is better than no change.
Yet there's plenty of evidence to suggest that change, if carelessly implemented, can actually make things worse. As Michael Fullan has pointed out, superficial innovations that are introduced in an atmosphere of crisis can make matters worse. Indeed, I work with many teachers who are in a perpetual state of panic, caught between crushing district mandates and the need to raise standardized-test scores. Unless a school design can address these concerns while it works in a meaningful way to improve teaching and learning, it can actually drain resources and attention away from school improvement.
If Mr. Slavin is right about the potential for low-quality school designs--and I think he is--then it's worrisome that the sorting out of strong designs from weak ones will be so public a process. To make matters worse, the subjects of this national experiment are America's most vulnerable children: Title I students.
Finally, the notion of "comprehensive" is not without its problems. There's a "sink or swim" mentality--schools are encouraged to embrace comprehensive change at breakneck speed. But it's far from settled whether it is possible or even wise for a school to undergo restructuring on multiple fronts simultaneously. What's the effect of sweeping, rapid change? When does too much of a good thing lead to burnout and unnecessary stress?
There's a lot to celebrate in the movement toward whole-school change, but there are also reasons we should continue to subject it to critical analysis.
'Gain Scores' Tell Us More About a Teacher's Impact
To the Editor:
Much appreciation to Dale Ballou and Michael Podgursky for insightful answers to the "unanswered questions concerning the national certification of teachers" ("Some Unanswered Questions Concerning National Board Certification of Teachers," June 10, 1998). It's odd, however, that in their discussion of "identifying superior teachers," while the authors point up the need to relate teaching quality to advances in student learning, they ignore the growing interest in using the "gain scores" of students, aggregated by teacher, for assessing the teaching-learning process.
The University of Tennessee statistician William L. Sanders has for 11 years gathered "value added" test results for all of Tennessee's students, by each subject teacher. Mr. Sanders found that "high gain" teachers--that is, teachers whose students averaged growth of a full year or more for each school year--were 85 percent likely to produce high value the next year, regardless of the background, poverty, and ethnicity of their students. And conversely, weak teachers produced few test gains or even negative progress--likewise, on a consistent basis.
Gain scores tell us more than periodic normed tests do about how much students learned between September and June. Traditional tests produce a snapshot of where students stand academically, say, in May, not of how far they have actually come since September.
Take two students. Sally is reading on the 2nd grade level going into the 5th grade; her teacher helps her gain two years' progress in just nine months through focused activities, tutoring, extra help, teamwork, and lots of attention. On a "value added" test--subtracting her incoming score from her end-of-the-year score--Sally gained two years in reading in a single, nine-month period, a remarkable achievement usually obscured by traditional test data reported by level. Sally remains two years below grade level in her reading, however, and would still score "low" on normed reference tests--and the teacher might look bad.
In contrast, Loren came into the 5th grade reading at the 7th grade level; he then tested at the 6th grade at year's end. He is a full year ahead but actually lost a year's standing (a negative gain score) because his teacher failed to motivate him and even appeared to confuse some of his ideas and concepts.
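The arithmetic behind these two hypothetical cases can be put in a short sketch (illustrative only: the grade-equivalent values come from the letter's examples, expressed in years, and the function name is invented):

```python
# Illustrative sketch of the "value added" gain-score arithmetic.
# Grade-equivalent reading levels are expressed in years
# (e.g., 2.0 = 2nd grade level); gain_score is a hypothetical name.

def gain_score(september_level, june_level):
    """End-of-year score minus incoming score."""
    return june_level - september_level

# Sally enters 5th grade reading at the 2nd grade level and gains
# two years' progress in nine months: a large positive gain, even
# though she still scores "low" on normed tests.
sally = gain_score(2.0, 4.0)   # +2.0

# Loren enters reading at the 7th grade level but tests at the 6th
# grade level at year's end: a negative gain score, despite his
# scoring above most classmates in absolute terms.
loren = gain_score(7.0, 6.0)   # -1.0
```

The sketch makes the letter's point concrete: it is the sign and size of the gain, not the absolute level, that reflects the teacher's contribution.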
Why not, as Messrs. Ballou and Podgursky suggested, select outstanding, national-board-certified teachers based in part on how much their students learn--that is, acknowledge those particular educators who add value? As in Tennessee, data can assess progress and provide an incentive for high-powered teachers to work closely with low-scoring students. Value-added measures may even be cheaper and faster than trying to evaluate boxloads of portfolios, peer recommendations, and fancy videotapes that overlook students' learning.
Until we know how much progress students make each year under the tutelage of individual teachers, we cannot easily identify those professionals who really make a difference. The United States should thus recognize and license a national corps of teachers who add real value to all students' achievement. These teachers deserve it, and so do the kids.
Bruce S. Cooper
Professor and Vice Chair
Division of Administration, Policy, and Urban Education
New York, N.Y.
Administrators Have Woes, But Less Pay Isn't One
To the Editor:
I was surprised to read in Paul D. Houston's Commentary that prospective and potentially excellent school administrators elect not to pursue that career due to the low level of remuneration they would receive as compared with what they would earn as teachers ("The ABCs of Administrative Shortages," June 3, 1998). After careful reading, I noted that the teachers referred to in the essay were "shocked to find they had to take a cut" in their daily rate of remuneration.
This seemed incredible to me. Many of the students I work with who are taking classes to become certified in school administration tell me just the opposite. A recent student of mine put it quite bluntly when he said, "I need to make more money, and in schools the only way to do that is to go into administration." I thought that perhaps my personal experiences were somehow unique or unusual. I decided to look into things a bit more.
I checked the salary of one of the local superintendents in my area. His salary was $98,435 per year. Five days per week for 50 weeks per year (he was given a two-week vacation each year) yielded a "daily rate" of about $394. I checked the principals' salaries. The average in the district was about $69,000, yielding a daily rate for these individuals of $276. The average teacher's salary in the same district was $35,789 for nine months of work. Five days per week for nine months (about 180 days) yielded a daily rate of $198, about half of what the superintendent earned and about 72 percent of what the principals earned.
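The daily-rate arithmetic can be reproduced with a small sketch (assumptions: five paid days per week, 50 working weeks for administrators given the two-week vacation, and roughly 36 working weeks for a nine-month teaching year; the function name is invented):

```python
# Sketch of the letter's daily-rate calculation: annual salary divided
# by paid workdays (5 days per week times weeks worked).

def daily_rate(annual_salary, weeks_worked):
    return annual_salary / (5 * weeks_worked)

superintendent = daily_rate(98_435, 50)  # ~$394 over 250 workdays
principal      = daily_rate(69_000, 50)  # $276
teacher        = daily_rate(35_789, 36)  # ~$198, assuming 180 workdays
```

On these assumptions the teacher's daily rate is roughly half the superintendent's and about 72 percent of the principals', matching the letter's figures.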
I then checked national statistics. In looking at data from 1997, I found that superintendents earned, on average, about $94,229, high school principals earned about $69,277, junior high principals about $64,452, elementary principals about $60,922, and teachers about $34,153. The national differentials between teachers' salaries and those of administrators were similar to those in the local district I had selected at random.
I have had many positions in public schools over the years, and I certainly do not deny that the life of the school administrator is hardly an easy one. It is very demanding, and those who work hard and remain dedicated to children and the profession deserve much more credit and money than they receive. However, the suggestion that the alleged impending shortage of administrators stems from teachers (those most likely to pursue an administrative career) earning more as teachers is simply not borne out by the data.
James H. Quillen Chair of Excellence in Teaching & Learning
East Tennessee State University
Johnson City, Tenn.
Why Not Fund the Schools Through a Consumer Tax?
To the Editor:
In "Fatally Flawed School Funding Formulas" (Commentary, June 17, 1998), Lewis C. Solmon and Michael Fox mention seven guidelines for policymakers to use in shaping funding formulas. They list adequacy, equity, efficiency, school performance, district stewardship, accountability, and a funding framework that promotes learning. All of these are important considerations individually, and together provide an integrated structure to ensure that school funding is prudently spent to improve the education of all students. The guidelines are to be used to convince taxpayers that the money collected from them is being used wisely.
I believe we should keep the guidelines but change our approach to school funding from direct to indirect means. Members of the taxpaying public pay for many things every time they purchase consumer goods. They fund television programming, including salaries of staff and actors, even though the commercial advertisers select the actual programming. They fund the sports industry, including players' signing bonuses. They fund multimillion-dollar salaries of corporate executives and elaborate corporate entertaining and incentive schemes.
Why do we tacitly agree to let American business shape the cultural life of the country with a portion of the price of the food, detergent, cars, or clothes we purchase, without insisting that the public good be funded in the same manner?
I would suggest that state officials use collected funds to be spent on public goods the same way businesses fund the entertainment and sports industries: Incorporate a school tax on consumer goods. This would not be a sales tax, but would be incorporated into the base purchase price of consumer goods, just the way money used for advertising and public relations is now incorporated. At the end of a fiscal year, firms would declare their business expenses for advertising and public relations. An equal amount would be declared as a business expense for school support and would be distributed among the states according to factors of school-age population, need, and company sales in the individual states.
The sports industry certainly benefits from funds spent on coaches, playing fields, and uniforms. Brand-name clothing is a "must" for schoolchildren. The music industry depends heavily on purchases by those 18 years old or younger. Teens drive cars and trucks, purchase insurance, and so on. All profit directly from schools and past and present members of the school community. Therefore, all should be committed to assisting in the collection of monies to fund schools on the basis of the guidelines outlined by Messrs. Solmon and Fox.
College of Education
Middle Grades, Not Middle Schools, at Issue
To the Editor:
In their Commentary, "International Competitiveness in Science" (June 17, 1998), Gerald LeTendre and David Baker sound a warning about what they perceive to be an unfounded wave of criticism of both science instruction in U.S. middle schools and middle schools as organizations. They voice particular concern about the criticism that is based on the Third International Mathematics and Science Study and that uses the comparative rankings and average scores for countries in this study as the sole basis for policy recommendations. They further decry the indiscriminate commingling of mathematics and science results in the recommendations.
Taken at face value, there is merit in the authors' concern about premature or misdirected policy recommendations. If the nation were currently suffering from such a phenomenon, their Commentary might serve as a timely warning. But the basis for their concern is unclear. The authors cite few specific examples of the alleged wave of unfounded critique and abuse of data, and so it is difficult to judge the accuracy of their claims. Moreover, one of the three specific instances they do cite is a report that I wrote for the U.S. Department of Education. Messrs. LeTendre and Baker claim that my report "proposes sweeping reforms for U.S. middle schools based on the TIMSS-scores slump."
The fact that the authors cited my report, "Improving Mathematics in the Middle School: Lessons from TIMSS and Related Research," or IMMS, as an example of the problem that spurred their Commentary leads me to question the validity of their concerns. Even a casual reading of my report would reveal that it does none of the things that seem to be of great concern to Messrs. LeTendre and Baker.
Consider the following facts about the report:
- IMMS does not make policy recommendations solely on the basis of comparative rankings and average scores (what the authors call "the TIMSS-scores slump"). Instead, the report is based on a review of three separate analyses conducted as part of TIMSS for Population 2 (grades 7 and 8). In addition to the student-achievement data, from which the average scores and rankings are derived, the report is based on data drawn from the curriculum and textbook analysis and from the analysis of videotaped lessons in three countries. As the title of the report implies, IMMS links the findings reported in TIMSS to other relevant research on students' mathematics achievement, on curriculum materials, and on instructional practice in the upper-elementary and middle grades.
- IMMS does not propose reforms for middle schools as organizations. Instead, the report offers recommendations for mathematics education in the middle grades. In this respect, the report's title (which contains the words "middle schools") may be unfortunate and convey an unintended message to those who read only the title. But the content of the report is quite clear in this regard.
The reported analysis does not point to the organization of middle schools as a contributor to the problem, nor are the recommendations intended to change the organization of middle schools. Rather, IMMS (and a companion report that contains more details) contains an analysis of evidence and a set of recommendations concerning critical aspects of mathematics teaching and learning in the middle grades (grades 5-8). The recommendations apply equally well when those grades are found entirely or partially in a middle school, a junior high school, or a K-8 school.
- IMMS does not commingle mathematics and science. Instead, it focuses clearly on mathematics. Given the authors' concern about others who commingle, I find it curious that they themselves appear to have done this by choosing to cite my paper in a Commentary which has science as its subject-matter focus.
I agree with Messrs. LeTendre and Baker that unfounded educational critiques and policy recommendations that are based on superficial analyses of available evidence are indeed a problem. But many critiques and recommendations are well-founded and based on careful analysis; it is unwise to dismiss these without serious consideration. As "Improving Mathematics" clearly documents, there are compelling needs to be addressed in order to improve the teaching and learning of mathematics in the middle grades, and there are specific steps that can be taken toward improvement.
If the actions needed to improve mathematics teaching and learning in the middle grades are in conflict with fundamental characteristics of the organization of the middle school--if a focus on specific subject matter does not mesh well with an educational philosophy that emphasizes interdisciplinary studies--progress will likely be made only if experts in mathematics education and in middle school education do not dismiss each other's concerns, but instead engage in a serious effort to resolve the problems.
Edward A. Silver
Professor of Education
University of Pittsburgh
Weighted-Grading Dilemmas: A Firsthand Account
To the Editor:
I couldn't agree more with the issues raised by your article on weighted grades ("Weighted Grades Pose Dilemmas in Some Schools," June 17, 1998). We experienced the most disturbing aspect of weighted grades when our son transferred for his senior year of high school. The school district he had attended did not have weighted grades or courses that were designated as "honors." Our son has always excelled in school and consistently took accelerated and advanced courses. He was immediately accepted into honors courses for his senior year at the new school. During the registration process, we were assured that grades from equivalent courses would be adjusted to weighted grades when transferred to his transcript.
When we began the process of applying for scholarships, we found that the assurances we had been given were false. Our son's official transcript showed no additional credit for any courses taken in his first three years. The result was a grade point average of 3.9 rather than the 4.3 he would have received had he been given weighted credit for courses labeled accelerated or advanced. Possibly more damaging was that his class ranking was below the top 20 percent, rather than in the top 5 percent. Although we tried to fight this discrepancy, our son was forced to file scholarship applications with a note explaining what had happened.
After fighting this battle through the high school and then the district office, we found that "district policy" was that weighted transcript grades could only be given when the prior school transcript also reported weighted grades. Nothing could be done if the former school did not award weighted grades. There was no other option, despite numerous conversations between the two schools and, eventually, the superintendent. (The superintendent did offer to write a letter to the colleges to which our son applied.) Our son earned admission to the university he wanted to attend, but he did not receive any merit scholarships or awards.
The most disappointing comment we heard, over and over in this process, was: "What is happening to your son is unfair, but there is nothing we can do about it." I told the superintendent that the district should be erring on the side of the student rather than on the side of policy. And if the policy results in this type of damage, it has to be changed. Even though my son has moved on and is doing well in his university studies, it was heartbreaking to see his public education career end this way.
On the other hand, our daughter is beginning her junior year in the same district, and has attended honors courses in all of her core subjects for the past two years. We have found that these courses are nothing more than the local label for the advanced courses we encountered in the past. So our daughter will benefit from the same system that penalized our son.
High/Scope Study Raises Direct-Instruction Questions
To the Editor:
Your recent article on direct reading instruction in Houston, "Drilling in Texas" (On Assignment, June 10, 1998), was a fascinating presentation. The crux, however, was in the last six paragraphs, where the question shifted to "Can the children understand?" rather than "Can the children name the words?" Most of the current spate of articles and commentaries--such as the one in April by columnist William Raspberry of The Washington Post, "Direct Instruction Working Very Well"--focus on the test-score results. (Mr. Raspberry does point out the disquieting data on high teacher-turnover rates.)
I do not dispute the data you give, nor do I even question that they might be obtained by other schools doing the same procedure. As Mr. Raspberry said: "Direct instruction really does sound awful. But it works." While there is an effort nationally to integrate teaching phonics and literature-based reading programs, there are strong advocates on both sides.
My concern is over the other consequences of using direct-instruction methodology. Most of your readers are probably familiar with the High/Scope Perry Preschool study, in which we tracked 123 randomly assigned children from ages 3 through 27. This study found that programs that allow children to initiate their own learning within a teacher-organized environment show significant outcomes for decades.
Of special importance are reductions in both juvenile and adult crime and in welfare use, along with increased employment and increased home and car ownership. These factors are indicators of a stable family and community life. Further, the program is a sound economic investment. These are extraordinary outcomes for high-quality early-education programs for disadvantaged children.
Your readers probably are not familiar with a second High/Scope long-term study of the outcomes of different preschool curriculum models. We have just reported on a group of young people, now age 23, who were randomly assigned when 3 years old to three different curriculum methodologies. The direct-instruction system was one; the High/Scope methodology and the traditional nursery school approach were the other two. The direct-instruction system calls for children to carry out teacher direction systematically and persistently. This approach is described in your article. The other two methods call upon children to plan their own learning, to work cooperatively, and to consider the consequences of their actions. The High/Scope approach is documented and systematic; the nursery school approach was developed by two skilled teachers as an expression of their training in understanding children.
The results from the study at age 23, two decades after it started, find serious negative social consequences for children enrolled in a direct-instruction system. Felony arrests were three times as frequent, suspensions from work were significantly higher, reported problems with their families were greater, and the schools identified 47 percent of the youngsters in that program as emotionally impaired and in need of special assistance. These outcomes did not occur at the same rates for children in the other two programs.
Even though the SRA Reading Mastery group received two years of high-quality academic-skill instruction delivered through a scripted program, taught by teachers with master's degrees who were committed to the approach, and supported by home visits every two weeks to involve families actively in the program, these students are not prepared to meet either the economic or the social expectations of adulthood.
While SRA Reading Mastery may be effective with elementary children in the short term, as outlined in your article, the broader conclusions are not encouraging. There is very strong evidence from our rigorous High/Scope study, from the large-scale though less rigorous comparative study in Washington run by Maurice Sykes and Rebecca Marcon, and from an intense comparative study done in Lisbon, Portugal, by Maria Emilia Nabuco and supervised by Oxford University lecturer Kathy Sylva, that direct-instruction systems are not useful in the development of children as students and citizens.
As you write about school reform efforts, long-term performance and outcomes are a vital consideration.
It is possible to orchestrate basic skills and arrange for silent students in hallways, but these are not marketable skills. Independent thinking, group work, and problem-solving, as well as the use of basic skills in daily performance, are required for adult success.
David P. Weikart
High/Scope Educational Research Foundation
On Annenberg Challenge, Author Was 'Dead Wrong'
To the Editor:
About the only thing Evan Keliher got right in his Commentary is his opening premise: "It could be that I'm dead wrong" ("If It Wasn't Around in the Middle Ages, It's a Fad!," June 17, 1998).
Mr. Keliher is dead wrong about the Annenberg Challenge. He got his information not from The Los Angeles Times but from a syndicated column written by Patrick Reilly, who does not work for the Times. Mr. Reilly's attack on the Annenberg Challenge contained a great deal of misinformation--and much that was deceptive, misleading, and downright libelous.
It's doubly disappointing that Mr. Keliher and Education Week chose to give national prominence to third-hand information that should never have been published in the first place.
The most damaging and incorrect bit of information Mr. Keliher repeats is the assumption that Annenberg sites have had no success. In fact, there has been a great deal of success; Mr. Reilly simply didn't bother to look at any of the Annenberg program's data. In citing no one but Mr. Reilly--who didn't even manage to correctly cite how many years the challenge has been operating--Mr. Keliher has given us gossip, not journalism.
San Francisco's Hewlett-Annenberg Challenge--which is in its third year, not its fifth and nowhere near its final year--is documenting the progress not only of individual reforms, but of individual schools and of our project as a whole. The newly released Stanford University report on the progress made by the Bay Area School Reform Collaborative states that "core BASRC strategies are beginning to transform the way schools think about their work. Important issues of implementation exist in some areas ... [y]et we also see that where schools have identified successful activities to address reform, the results have been powerful."
Two cases in point from among our schools:
Four years ago, San Francisco's John Muir Elementary had the second-worst reading scores in the district. Working with the BASRC, the school received funds for an improved library, books, computers, and a parent-educator, among other things. Reading scores have risen by 16 points.
Thurgood Marshall, an inner-city San Francisco high school, has focused its reform effort on math and science. The school, which serves two of the city's poorest neighborhoods, just graduated its first class. Eighty-five percent of its graduates are going to college; 20 percent of those who applied won coveted spots at the University of California, Berkeley.
Our sister Annenberg sites have posted some equally impressive gains. For example:
- Eighty-two percent of Philadelphia's schools improved over last year's baseline, with 71 out of 208 schools meeting their two-year performance targets in one year.
- At 11 New York City Annenberg schools, 81 percent of the first graduating classes were accepted into college, a rate well above that of high school graduates in the rest of the city and state.
- Two Challenge schools, one in Chicago and one in New York, were among five nationwide honored by the U.S. Department of Education as New Urban High Schools.
It's sad that Mr. Keliher, a retired teacher, feels so pessimistic about school reform. But despite his assertions, there have been advances since the Middle Ages. For example, one of the important reforms we urge on our schools and our pupils is critical thinking. That means stopping to ask, "Whose information is this?" and, "What evidence do we have that this is true?"
Writers and editors also need to practice critical thinking--which means checking their facts and using original sources. Those skills are taught in schools of journalism, along with courses in journalistic ethics and libel. Since Mr. Keliher is a retired teacher, he may not have gone to journalism school. But editors--particularly those who work for publications as prestigious as Education Week and The Los Angeles Times--should know better.
San Francisco, Calif.
To the Editor:
Evan Keliher displays both wisdom and graciousness in acknowledging in the very first sentence of his Commentary, "It could be that I'm dead wrong." He is, profoundly so.
Mr. Keliher's unrelenting diatribe on school reform, and, in effect, public education, has little if anything to do with the reality of change and progress wrought by reformers in some of the nation's most troubled schools. Has every effort been successful? Not by a long shot. But does that mean we should turn our backs on millions of at-risk students who grow up in poverty, often in broken homes and broken communities?
Of course, in a Commentary an author has license to vent ad hominem arguments without regard to facts and data. On the other hand, one would think that a former public school teacher would recognize that change takes time, that there are no magic wands, and that countless dedicated teachers and administrators, working collaboratively with parents and the larger community, have begun to give a new generation of students an opportunity to succeed in school and in life.
Across the Annenberg Challenge, which is but one of many important reform efforts now under way, we're seeing improved test results and improved graduation rates in some of our inner-city schools. In addition, with Annenberg support, thousands of teachers have participated in professional-development programs, upgrading their skills. All this, though our grants, in most cases, have been operational in schools for only a year or two (not five, as Mr. Keliher implies).
To give up on reform is to give up on young people who need the most help. Thankfully, the vast majority of teachers, and, dare I say, former teachers are simply not ready to do that. They know that. Mr. Keliher should, too.
Director, Special Projects
Vol. 17, Issue 42, Pages 41, 43-45
Published in Print: July 8, 1998, as Letters to the Editor