The 10 Worst Educational Disasters of the 20th Century: A Traditionalist's List
I am sure my fellow educational traditionalists might substitute some of their own items for the following list of disastrous developments in the field, and might renumber others. But I think all of us would agree that the list includes phenomena that have made major contributions to the decline of public education during the last half of the 20th century, which, contrary to popular opinion, includes the year 2000. The list:
10. Multiculturalism. What started out as an innocent, well-meaning attempt to add or enhance inclusiveness in curricula (for example, celebrating the achievements, holidays, cultures, and other attributes of those besides European-Americans) is now out of control, having mutated into a leveling of all cultures even to the point of denigrating things American. (The National Education Association, not too long ago, proposed the replacement of Thanksgiving Day with a kind of "multicultural-awareness day.") The melting pot as a metaphor is scorned, and in its place we are even told to preserve native languages in American classrooms in hope of achieving a "mosaic." Sorry, but the result, instead, will be a nation of ethnic divisions, closer to a Bosnia.
|Tenure does more damage by protecting the mediocre and incompetent than it does in safeguarding the rights of those who should remain in the profession.|
9. Failure To Challenge Gifted Students. Of course, "gifted" is a term both subjective and controversial. Putting the definition issue aside, however, we must challenge appropriately the group of students that emerges from whatever selection process districts use. Many times this group is not challenged. Why not? Because educators assume one or both of the following: First, that those intellectually "quicker" are also highly motivated and need little structure; second, that it is wrong to provide for the gifted a curriculum the result of which would place them visibly ahead of "the rest." Instead, we have "enrichment" programs often involving the arts or technology, where students have many choices and are left to learn at their own paces. A student with an IQ of 150 is not necessarily more willing to exert the energy necessary to learn than those deemed "average," except, of course, in an area of strong personal interest, which is also true for most students anyway.
A related phenomenon—and actual policy in some districts—is the deliberate refusal to group gifted students with peers, with the straight-faced alibi given that they can grow so much from helping those less able.
8. The Misinterpretation of Bloom's Taxonomy. We all learned this in Education 101: knowledge, comprehension, application, analysis, synthesis, evaluation. However, Benjamin Bloom's hierarchy of learning levels, from lowest to highest, has often been misapplied. Teachers have been taught to expose pupils to the higher levels and to shun the lower levels, which are given such pejoratives as "rote memorization of facts," "regurgitation," and so on. Of course, to ignore these "basics" is like trying to build the roof of a house before laying the foundation. Thus, we see students of geography working on a group project such as the making of a three-dimensional, Play-Doh map of Australia without ever having to learn (thus, be held accountable for on a later test) any information about that continent; or English students not understanding much of Shakespeare's use of English (for example, that "Wherefore art thou Romeo?" meant, in 1600, "Why are you called 'Romeo'?" not, "Where are you, Romeo?"), but being asked to write an English sonnet. Both of these practices have been justified by claiming that "synthesis"—often incorrectly defined as simply building or creating things, rather than forming an original communication or work based on the mastery of ideas—is a higher level of learning according to Bloom than low-level "knowledge." A dearth of basic facts and information makes mastery of "higher levels" of knowledge impossible.
The "caving" of the College Board destroyed an accurate and valuable barometer of the nation's educational achievement that had served well for more than half the century.
7. Rabid Developmentalism. No, we should not expect many 3-year-olds to be expert calligraphers, to solve quadratic equations, or even to hold a pen correctly. But in the field of early-childhood education, developmentalism is practically the only philosophy in town; it is rigidly applied and often overlooks unpleasant empirical evidence. E.D. Hirsch's excellent The Schools We Need (1996) provides a detailed, scholarly study of this problem. Yes, we can observe stages of natural development where most children (that is, the average) can optimally learn various concepts; however, do not forget that this is a generality, and that a little prodding will not produce a mental meltdown. "Learn at your own pace" and "She'll learn this when she's ready" are wonderful-sounding phrases, but society can't wait forever. Some children do read very early, some like the challenge of memorizing math facts and other data, some prefer that too-maligned ditto sheet to playing with clay or building with blocks. The irony is that in Europe, the home base of the developmental guru Jean Piaget, most young children are performing skills for which many American educators say they are developmentally unready.
6. Faulty Educational Theories. Sometimes grouped under the inexact label "progressive education," a number of romantic hypotheses metastasized in the education schools and, of course, resulted in the indoctrination of thousands of educators: egalitarianism, overemphasis on self-esteem, feeling over thinking, and a contempt for authority in general and direct instruction in particular. Somehow the question "How can we educate students for a democratic society by using authoritarian methods?" came to be answered, "We can't," or, "We shouldn't"; however, the nation that has led the world in technology since Edison's time and defeated Hitler produced its educated citizens from classrooms where pre-1970 teachers were, for the most part, authoritative, respected leaders of their respectful students.
The philosophy that mankind is basically good and not in need of direction is not a recent one, being seen in the writings of Rousseau and in those of Dewey's whelps, but during the late 1960s, it exploded. Teachers in many universities were told to avoid being authority figures. Arrange the chairs in a circle to show how we're all equal. I'm your moderator, not your teacher. Notice how I don't dress in a coat and tie, or skirt, so that I don't look like "The Establishment."
As the adult-child distinction began to be minimized, so also grew the tendency to reduce all knowledge to pure subjectivity: There is no real answer to most questions, so what do you think? At first, this shunning of objectivity was relegated to the arts (poetry interpretation, literary criticism, and the like), but more recently it has been seen in the form of creative spelling, optional answers to math problems, and even the politicization of the hard sciences. At its worst, we see in many universities the prevalence of deconstructionism, the absence of any common, objective meaning.
Closely akin to egalitarianism is the much-written-about overemphasis on a student's self-esteem. This movement has many manifestations, all masking true achievement and trying to minimize real differences in students' abilities. Heterogeneous grouping, aimed at eliminating the hurt feelings of the less able and/or less motivated, furnished the fertile ground for "teaching to the middle." Mainstreaming, even when special instructors were added to aid the severely disabled, made effective learning even less possible. When not all students made the honor roll, the honor roll was either eliminated or not published. Class ranking was watered down by equating the index of a student taking remedial reading and basic math with that of one choosing Advanced Placement English and calculus. Curving grades became common. When disparities still could not be disguised, "multiple intelligences" were discovered.
|As the adult-child distinction began to be minimized, so also grew the tendency to reduce all knowledge to pure subjectivity.|
5. The Re-Norming of the SATs. As Paul Copperman points out in The Literacy Hoax (1977), the Scholastic Aptitude Test was one of the few constant, reliable measures of verbal and mathematical competence since it began in 1941. A score of 500 on either the math or verbal sections, which was the original norm, meant essentially the same thing in 1941, 1951, 1961, 1971, 1981, and 1991. Not so in 1996, when an adjustment was made for the major decline in student scores since 1963, their peak year. In my own school, teachers saw the mean score on the SATs (taken by gifted 7th graders under a special program) jump 83 points from 1995 to 1996, yet the academic ability of the two classes was approximately the same. If, in the face of a declining number of home runs, the baseball commissioner were to require all ballparks to move their fences in to the 250-foot mark, the effect would be much the same: the numbers would rebound, but they would no longer mean what they once did. The "caving" of the College Board was disappointing; it destroyed an accurate and valuable barometer of the nation's educational achievement that had served well for more than half the century.
4. Anti-Merit Faculty-Compensation Systems. Ask any competent building principal to name the five teachers she would gladly let go, given the choice. Then ask her to name the five teachers she would keep if she had to let go all except these. Next, ask essentially the same two questions of the president of the school's PTA and the teachers' union building representative. Then ask some random students who have spent a few years at that school, "Who were your best teachers, the ones that taught you the most and really cared about you? Who were the ones who were your least favorites?" Compile all these responses, and you will see an amazing correlation: Most will list the same stars and duds.
Since practically everyone recognizes these differences—despite the inability to rate educators using some productivity measure appropriate to the business world—what prevents us from compensating accordingly? Answer: reluctant principals and resistant unions. They bleat, "Who are we to actually judge someone on the basis of one or two classroom visits?" In genuine professions, practitioners do not advance economically strictly on the basis of seniority and number of college credits. The most perceptive teachers realize, and may admit tacitly, that their mediocre peers, often more highly paid owing to more years of service, hurt the profession and cause taxpayers to vote against referendums.
Those legendary days are gone when, if parents heard that their son was disciplined in school, he would get a second dose at home.
3. Teacher Tenure. Probably having its genesis during the paranoia of the McCarthy era and based on the practice at colleges and universities—hardly a parallel model—tenure was later granted to elementary and secondary teachers. The practice may well have grown because it seemed to school boards an inexpensive benefit. Whatever its origin, tenure does more damage by protecting the mediocre and incompetent than it does in safeguarding the rights of those who should remain in the profession.
Tenure is fundamentally unfair, a one-way street for the employee. What recourse is there for the principal or superintendent who has a teacher, without notice, turn his keys in to the office in October and never return? There is no criminal, civil, or financial penalty, and school attorneys are even advising administrators against writing negative references. On the other hand, if a principal asked for a teacher's keys without notice ...
2. Students' Rights Court Decisions. It began with one or two U.S. Supreme Court decisions in the 1960s. Due v. Florida A&M University (1963) and Tinker v. Des Moines (1969), respectively, gave to students procedural constitutional due process rights and certain rights of free expression. The latter decision stated that the word "persons" in the U.S. Constitution included minor students. The precedent of legally equating students with adults opened the floodgates for similar decisions during the next decades, which had the cumulative effect of shattering the in loco parentis status that had existed in the public school system since its earliest days.
The parent-school partnership is in greatly weakened condition as this century ends. Teachers fear the absence of backup from administrators in disciplinary matters, and administrators fear lawsuits from parents. I believe a case can be made that union demands for teacher tenure grew at the same rate that teachers' control over students was taken away.
Gone are the legendary days when, if parents heard that their son had been disciplined in school, he would get a second dose at home. In almost every opinion poll, parents list poor school discipline at or near the top of their complaint list. Yes, families and society have changed radically since the 1950s, but the courts have gutted a very strong means of positive behavior reinforcement that schools once had.
1. Church-State Court Decisions. I can remember when in 1963 the Supreme Court ruled in Abington Township v. Schempp that Bible reading and prayer in public school classrooms were unconstitutional. Although the strongest support for the decision came from the famous atheist Madalyn Murray, the American people did not revolt. After all, Christianity was not a state religion, so why should this sect have a theological monopoly in the public schools? What was not foreseen, however, was that the censorship of prayer and sectarian theology would also result in the banishment of values. It was one thing to forbid devotions and the presentation, as facts, of doctrinal beliefs such as the Resurrection, but it was quite another matter to forbid any presentation of moral absolutes such as "Thou shalt not kill."
The vacuum inevitably was filled with moral relativism, situation ethics, and "values clarification." Powerful and effective words like "evil," "wrong," "immoral," and "sin" were replaced with feeble terms such as "inappropriate" and "bad choice." Additionally, school boards soon found that this "wall" of separation had other implications. Teachers could conduct themselves in their private lives in ways which, a decade earlier, might have resulted in their loss of employment. Now the once universally understood term "proper role model" became purely a matter of opinion.
Traditionalists do not believe that the First Amendment's intent was to create a secular society. John Adams stated that the United States was founded on the premise that its people embraced universal moral and religious beliefs.
Like the students' rights decisions, this 1963 religious decision caused the proliferation of similar restrictions. Schools soon found that they could punish and/or remove disruptive students only by wading through thick procedural red tape (given the noble term "due process"), and that their preventive measures based on moral absolutes were gone. No longer could they even tell a cheating student that his act was morally wrong.
Because of the undermining of America's moral foundation, we see the guns-and-metal-detectors world of 2000, where veteran public school teachers and administrators yearn for the days when the worst offenses involved spitballs and smoking (cigarettes) in the lavatory. The public school system will never recover unless the American people restore to it at least some measure of ideological unity.
Kenneth M. Weinig is the founding headmaster of The Independence School in Newark, Del., and has been a teacher and an administrator for 34 years.
Vol. 19, Issue 40, Page 31