I am not sure who said it (possibly Northrop Frye), but it has been remarked that there are three characters from literature who are known all over the world: Hamlet, Alice, and Sherlock Holmes. I don’t know how Shakespeare and Lewis Carroll felt about their creations, but it is known that Arthur Conan Doyle believed Holmes to be something of a nuisance and thought that his stories about Holmes did not represent his best work. The irony, of course, is that no one reads what Doyle regarded as his best work, but everybody reads about Holmes.
Something like this will happen, on occasion, to lesser--much lesser--authors. For example, me. Among the many pages I have written about education over the years, I have included a few ideas that seemed to me to rise above the others and could be expected to receive special consideration from readers. But the ideas have been largely ignored, mostly because readers believed they were included as an attempt at humor. This is depressing for two reasons: first, that readers believed my sense of humor was so uninspired; second, that their sense of possibilities was so limited. You may judge for yourself. Here is one of the ideas. We could improve the quality of teaching overnight, as it were, if math teachers were assigned to teach art, art teachers science, science teachers English. My reasoning is as follows: Most teachers, especially high school and college teachers, teach subjects they were good at in school. They found the subject both easy and pleasurable. As a result, they are not likely to understand how the subject appears to those who are not good at it, or don’t care about it, or both. If, let us say, for a semester, each teacher were assigned a subject that he or she hated or always had trouble with, the teacher would be forced to see the situation as most students do, would see things more as a new learner than as an old teacher. Perhaps he or she would discover how boring the textbooks are, would learn how nerve-racking the fear of making mistakes is, might discover that a question that has unexpectedly aroused his or her interest must be ignored because it is not covered by the syllabus, might even discover that there are students who know the subject better than he or she could ever hope to. Then what?
All in all, I believe the experience would be chastening and even eye-opening. When teachers returned to their specialties, it is possible they would bring with them refreshing ideas about how to communicate their subject, along with an increased empathy for their students.
Here is another idea, not meant to be funny: We can improve the quality of teaching and learning overnight by getting rid of all textbooks. Most textbooks are badly written and, therefore, give the impression that the subject is boring. Most textbooks are also impersonally written. They have no “voice,” reveal no human personality. Their relationship to the reader is not unlike the telephone message that says, “If you want further assistance, press two now.” I have found the recipes on the backs of cereal boxes to be written with more style and conviction than most textbook descriptions of the causes of the Civil War. Of the language of grammar texts, I will not even speak. To borrow from Shakespeare, it is unfit for a Christian ear to endure. But worse than this, textbooks are concerned with presenting the facts of the case (whatever the case may be) as if there can be no disputing them, as if they are fixed and immutable. And still worse, there is usually no clue given as to who claimed these are the facts of the case, or how “it” discovered these facts (there being no he or she, or I or we). There is no sense of the frailty or ambiguity of human judgment, no hint of the possibilities of error. Knowledge is presented as a commodity to be acquired, never as a human struggle to understand, to overcome falsity, to stumble toward the truth.
Textbooks, it seems to me, are enemies of education, instruments for promoting dogmatism and trivial learning. They may save the teacher some trouble, but the trouble they inflict on the minds of students is a blight and a curse. On one occasion when I made this argument before a group of teachers, one of them asked, “But if we eliminated textbooks, what would replace them?” My answer--again, not meant humorously--was as follows: “When Jonas Salk’s vaccine eliminated polio, did anyone ask, ‘But what will replace it?’” You might think I was being a wise guy, and so I was. The teacher deserved a better answer, and I will come to one.
But first, I will offer one other idea that has been widely and consistently ignored. This one, I confess, was originated inadvertently by Reed Irvine, who heads a right-wing group called Accuracy in Media (AIM). The group’s purpose is to monitor newspapers, radio, and television in a search for left-wing bias, which, when found, is to be exposed and condemned. A few years ago, Mr. Irvine began to extend his surveillance by forming a group known as Accuracy in Academia (AIA), whose purpose is to expose left-wing bias in the classroom. The idea is to have students secretly and carefully monitor the lectures and remarks of their teachers so that the latter’s inaccuracies, clichés, and unjustified opinions may be brought to light. It is probably not entirely irrelevant to note that those of a “liberal” bent reacted with disdain, chagrin, and righteousness to the thought of student spies evaluating everything their teachers say. Perhaps it was the secrecy of it all that disturbed them. I hope so because, putting the secrecy aside, Accuracy in Academia is about the best idea yet invented for achieving what every teacher longs for: first, to get students to pay careful attention, and second, to get them to think critically. Of course, the flaw in Irvine’s idea is that he wishes students to think critically in only one direction. But this is easily corrected. All that is necessary is that at the beginning of each course, the teacher address students in the following way:
During this term, I will be doing a great deal of talking. I will be giving lectures, answering questions, and conducting discussions. Since I am an imperfect scholar and, even more certainly, a fallible human being, I will inevitably be making factual errors, drawing some unjustifiable conclusions, and perhaps passing along my opinions as facts. I should be very unhappy if you were unaware of these mistakes. To minimize that possibility, I am going to make you all honorary members of Accuracy in Academia. Your task is to make sure that none of my errors goes by unnoticed. At the beginning of each class, I will, in fact, ask you to reveal whatever errors I made in the previous session. You must, of course, say why these are errors, indicate the source of your authority, and, if possible, suggest a truer or more useful or less biased way of formulating what I said. Your grade in this course will be based to some extent on the rigor with which you pursue my mistakes. And to ensure that you do not fall into the torpor that is so common among students, I will, from time to time, deliberately include some patently untrue statements and some outrageous opinions.
There is no need for you to do this alone. You should consult with your classmates, perhaps even form a study group that can collectively review the things I have said. Nothing would please me more than for one or several of you to ask for class time in which to present a corrected or alternative version of one of my lectures.
I am banking on readers’ agreeing that these three ideas are neither humorous nor impractical. Neither do I see them as gimmicks. To try to renew a teacher’s sense of the difference between teaching and learning, to eliminate packaged truths from the classroom, and to focus student attention on error are part of an uncommon but, I believe, profound narrative capable of generating interest and inspiration in school. It is, in fact, a refutation of a story that infuses so much of schooling as we know it and have known it for so long. I am referring to the story that says the following in a hundred ways to students: You come to school to learn important facts and enduring truths. Your teacher knows many of these, your textbooks still others. It is not your business to know where they came from or how. It would, in any case, be a waste of valuable time to burden you with the mistakes of those who thought they had discovered important facts and enduring truths. School is not a place for documenting error but for revealing the true state of affairs.
Do I exaggerate? I don’t think so. The sentences above, with some variations and a few addenda, express the attitude of most schools toward knowledge (especially, by the way, most colleges) and explain why a course or two in “critical thinking” is quite irrelevant. They also explain the easy appeal of the “cultural literacy” project developed by E.D. Hirsch Jr. The idea there is for students to become acquainted with a thousand facts without pausing to know whose facts they are, how we come to know them, why they are deemed important, and by whom. This leads quite directly to the state of mind sometimes called “justificationism.” As used, for example, by Henry Perkinson in The Possibilities of Error, the word refers to the tendency of most people to engage in a rigorous and “natural” defense of their own beliefs, not so much to explain their beliefs as to justify them.
Although we are all accustomed to such performances, is there not something strange about this--this idea of education in which everyone is encouraged to justify, fight for, and defend what they believe, as if we did not know that our beliefs are flawed, imperfect, badly in need of improvement? It would take a satirist of Swiftian talent to show, first, how unseemly it is and then how deeply it offends the way people learn. I risk contradicting John Dewey’s most famous aphorism by saying that though we may learn by doing, we learn far more by not doing--by trial and error, by making mistakes, correcting them, making more mistakes, correcting them, and so on. We are all in need of remedial work, all the time. And this includes teachers, students, and textbooks.
Can you imagine a school organized around this principle--that whatever ideas we have, we are in some sense wrong? We may have insufficient facts to support an idea; or some of the facts we have may be incorrect, perhaps generated by a festering emotion; or the conclusions we have drawn may not be entirely logical; or some definition we are employing may not be applicable; or we may be merely repeating an idea we have heard expressed by some authority and have not examined its implications carefully. Can you imagine schools whose epistemological story does not aim at producing a flotilla of fanatics but, rather, people who proceed to learn with full consciousness of their own fallibility, as well as the fallibility of others?
How could such schools be created? Any plan would, of necessity, have its origin in a new way of educating teachers, because it would require a refocusing of the purpose of teaching. As things stand now, teachers are apt to think of themselves as truth tellers who hope to extend the intelligence of students by revealing to them, or having them discover, incontrovertible truths and enduring ideas. I would suggest a different metaphor: teachers as error-detectors who hope to extend the intelligence of students by helping them reduce the mistakes in their knowledge and skills. In this way, if I may put it crudely, teachers become less interested in making students smart, more interested in making students less dumb. This is not a question of semantics. Or, if it is, it is not “mere” semantics. It is, in fact, the point of view taken by those who practice medicine and law. Physicians do, of course, have a conception of what good health is, but their expertise resides in their ability to identify ill health and to provide remedies for it. That is why, upon being consulted, their first and most important question is, What’s wrong?
The same may be said of lawyers, whose expertise resides in their ability to identify injustice and to pursue methods to eliminate it. In fact, to be realistic about the matter, for most physicians, good health is defined as the absence of illness; for most lawyers, justice is defined as the absence of injustice. Physicians and lawyers, we might say, function as painkillers. The good ones know how to relieve us of illness and injustice. I am suggesting the role of painkiller for teachers whose purpose would be to relieve students of the burdens of error--in their facts, their inferences, their opinions, their skills, their prejudices.
It would not be easy to educate teachers to approach matters in this way. Unlike the study of sickness and injustice, the study of error has rarely been pursued in a systematic way. But this does not mean that the subject has no history. There are many honorable books that take human error as a theme. The early dialogues of Plato are little else but meditations on error. Acknowledging that he did not know what truth is, Socrates spent his time exposing the false beliefs of those who thought they did. Erasmus’ In Praise of Folly also comes to mind, as does Jonathan Swift’s Gulliver’s Travels. In a more modern vein, one thinks of Jacques Ellul’s A Critique of the New Commonplaces, Stephen Jay Gould’s The Mismeasure of Man, I.A. Richards’ Practical Criticism, Mina Shaughnessy’s Errors and Expectations, and S.I. Hayakawa’s Language in Thought and Action.
Such books are not normally included as part of the education of teachers. Were they to be used, teachers would be likely to come to three powerful conclusions. The first is that everyone makes errors, including those who write about error. None of us is ever free of it, and we are most seriously endangered when we think we are. That there is an almost infinite supply of error, including our own, should provide teachers with a sense of humility and, incidentally, assurance that they will never become obsolete.
The second conclusion is that error is reducible. At present, teachers consume valuable time in pointless debates over whether intelligence is fixed, whether it is mostly genetic or environmental, what kinds of intelligences exist, and even how much intelligence one race or another has. About error, no such debates are necessary. Error is a form of behavior. It is not something we have; it is something we do. Unlike intelligence, it is neither a metaphor nor a hypothetical construct whose presence is inferred from a score on a test. We can see error, read it, hear it. And it is possible to reduce its presence.
The third conclusion is that error is mostly committed with the larynx, tongue, lips, and teeth--which is to say, error is chiefly embodied in talk. It is true enough that our ways of talking are controlled by the ways we manage our minds, and no one is quite sure what “mind” is. But we are sure that the main expression of mind is sentences. When we are thinking, we are mostly arranging sentences in our heads. When we are making errors, we are arranging erroneous sentences. Even when we make a nonverbal error, we have preceded the action by talking to ourselves in such a way as to make us think the act is correct. The word, in a word, brings forth the act. This fact provides teachers with a specific subject matter in which they may become “experts”: Their expertise would reside in their knowledge of those ways of talking that lead to unnecessary mischief, failure, misunderstanding, and even pain.
I believe Bertrand Russell had something like this in mind when he said that the purpose of education is to help students defend themselves against “the seductions of eloquence,” their own “eloquence” as well as that of others. As I have previously mentioned, the ancient Greeks--that is, the Sophists--believed that the study of grammar, logic, and rhetoric would provide an adequate defense. These arts of language were assumed to be what may be called “metasubjects,” subjects about subjects. Their rules, guidelines, principles, and insights were thought to be useful in thinking about anything.
In poking fun at those who saw no purpose in learning about language, Erasmus (in his In Praise of Folly) wrote with sharp irony: “. . . what use of grammar, where every man spoke the same language and had no further design than to understand one another? What use of logic, where there was no bickering about the double-meaning words? What need of rhetoric, where there were no lawsuits?”
He meant to say that as humans we will always have difficulty understanding one another, will always bicker about the meaning of words, will always claim we have been injured by another. There is nothing that happens among humans that is not instigated, negotiated, clarified, or mystified by language, including our attempts to acquire knowledge. The Greeks, and, indeed, the medieval Schoolmen, understood well something we seem to have forgotten--namely, that all subjects are forms of discourse and that, therefore, almost all education is a form of language education. Knowledge of a subject mostly means knowledge of the language of that subject. Biology, after all, is not plants and animals; it is a special language employed to speak about plants and animals. History is not events that once occurred; it is a language describing and interpreting events, according to rules established by historians. Astronomy is not planets and stars but a special way of talking about planets and stars, quite different from the language poets use to talk about them.
And so a student must know the language of a subject, but that is only the beginning. For it is not sufficient to know the definition of a noun, or a gene, or a molecule. One must also know what a definition is. It is not sufficient to know the right answers. One must also know the questions that produced them. Indeed, one must also know what a question is, for not every sentence that ends with a rising intonation or begins with an interrogative is necessarily a question. There are sentences that look like questions but cannot generate any meaningful answers, and, as Francis Bacon said, if they linger in our minds, they become obstructions to clear thinking. One must also know what a metaphor is, and what is the relationship between words and the things they describe. In short, one must have some knowledge of a metalanguage--a language about language--to recognize error, to defend oneself against the seductions of eloquence.
Here I should suggest some other means of educating students to be error-detectors--for example, that all subjects be taught from a historical perspective. I can think of no better way to demonstrate that knowledge is not a fixed thing but a continuous struggle to overcome prejudice, authoritarianism, and even “common sense.” Every subject, of course, has a history, including physics, math, biology, and history itself. William James once said that any subject becomes “humanistic” when taught historically. He almost certainly meant to imply that there is nothing more human than the stories of our errors and how we’ve managed to overcome them, and then fell into error again, and continued our efforts to make corrections--stories without end. Robert Maynard Hutchins referred to these stories as the Great Conversation, a dynamic and accurate metaphor, since it suggests not only that knowledge is passed down from one thinker to another but also that it is modified, refined, and corrected as the “conversation” goes on.
To teach about the atom without including Democritus in the conversation, electricity without Faraday, political science without Aristotle or Machiavelli, astronomy without Ptolemy, is to deny our students access to the Great Conversation. “To remain ignorant of things that happened before you were born is to remain a child,” Cicero said. He then added, “What is a human life worth unless it is incorporated into the lives of one’s ancestors and set in a historical context?” When we incorporate the lives of our ancestors in our education, we discover that some of them were great error-makers, some great error-correctors, some both. And in discovering this, we accomplish three things. First, we help students see that knowledge is a stage in human development, with a past and a future. Second (this would surely please Professor E.D. Hirsch Jr.), we acquaint students with the people and ideas that make up “cultural literacy”--that is to say, give them some understanding of where their ideas come from and how we came by them. And third, we show them that error is no disgrace, that it is the agency through which we increase understanding.
Of course, to ensure that the last of these lessons be believed, we would have to make changes in what is called “the classroom environment.” At present, there is very little tolerance for error in the classroom. That is one of the reasons students cheat. It is one of the reasons students are nervous. It is one of the reasons many students are reluctant to speak. It is certainly the reason why students (and the rest of us) fight so hard to justify what they think they know. In varying degrees, being wrong is a disgrace; one pays a heavy price for it. But suppose students found themselves in a place where this was not the case? In his book Mindstorms, Seymour Papert contends that one of the best reasons for using computers in the classroom is that computers force the environment to be more tolerant of error. Students move toward the right answer (at least in mathematics) by making mistakes and then correcting them. The computer does not humiliate students for being wrong, and it encourages them to try again. If Papert is right, then we do, indeed, have a good reason for having students use computers. Of course, if he is right, it is also an insult to teachers. Is it only through the introduction of a machine that the classroom can become a place where trial and error is an acceptable mode of learning, where being wrong is not a punishable offense?
Suppose teachers made it clear that all the materials introduced in class were not to be regarded as authoritative and final but, in fact, as problematic--textbooks, for example. (And here is my more serious answer to the teacher who wondered what we would do without them.) It is best, of course, to eliminate them altogether, replacing them with documents and other materials that are under the control of the teacher. (What else is the Xerox machine for?) But if elimination is too traumatic, then we would not have to do without them, only without their customary purpose. We would start with the premise that a textbook is a particular person’s attempt to explain something to us and thereby tell us the truth of some matter. But we would know that this person could not be telling us the whole truth. Because no one can. We would know that this person has certain prejudices and biases. Because everyone has. We would know that this person must have included some disputable facts, shaky opinions, and faulty conclusions. Thus, we have good reason to use this person’s textbook as an object of inquiry. What might have been left out? What are the prejudices? What are the disputable facts, opinions, and conclusions? How would we proceed to make such an inquiry? Where would we go to check facts? What is a “fact,” anyway? How would we proceed in uncovering prejudice? On what basis would we judge a conclusion unjustifiable?
Professor Hirsch worries about such an approach, indeed, condemns it, because he believes that by learning about learning, students are deflected from getting the facts that “educated” people must have. But proceeding in this way permits students to learn the “facts” and “truths” in the text, as one hopes they will, while also permitting them to learn how to defend themselves against “facts” and “truths.” Do we want our students to know what a noun is? The text will tell them, but that is the beginning of learning, not the end. Is the definition clear? Does it cover all cases? Who made it up? Has anyone come up with a different definition?
Do we want students to know what a molecule is? The text will tell them. But then the questions begin. Has anyone ever seen a molecule? Did the ancients believe in them? Was a molecule discovered or invented? Who did it? Suppose someone disbelieved in molecules, what then?
Do we want students to know about the causes of the Revolutionary War? A text will give some. But from whose point of view? And what sort of evidence is provided? What does objectivity mean in history? Is there no way to find out the “real” truth?
If students were occupied with such inquiries, they would inevitably discover the extent to which facts and truths have changed, depending upon the circumstances in which the facts were described and the truths formulated. They would discover how often humans were wrong, how dogmatically they defended their errors, how difficult it was and is to make corrections. Do we believe that our blood circulates through the body? In studying the history of biology, students will discover that 150 years after Harvey proved that blood does circulate, some of the best physicians still didn’t believe it. What will students make of the fact that Galileo, under threat of torture, was forced to deny that the Earth moves? What will students think if they acquaint themselves with the arguments for slavery in the United States?
Will our students become cynical? I think not--at least not if their education tells the following story: Because we are imperfect souls, our knowledge is imperfect. The history of learning is an adventure in overcoming our errors. There is no sin in being wrong. The sin is in our unwillingness to examine our own beliefs, and in believing that our authorities cannot be wrong.
Far from creating cynics, such a story is likely to foster a healthy and creative skepticism, which is something quite different from cynicism. It refutes the story of the student learner as the dummy in a ventriloquism act. It holds out the hope for students to discover a sense of excitement and purpose in being part of the Great Conversation.
Since I began this with three ideas that were not taken as seriously as they were intended, I will end it with another one that is likely to have the same fate. I suggest that the following test be given in each subject in the curriculum. We might think of it as the “final” exam:
Describe five of the most significant errors scholars have made in (biology, physics, history, etc.). Indicate why they are errors, who made them, and what persons are mainly responsible for correcting them. You may receive extra credit if you can describe an error that was made by the error-corrector. You will receive extra extra credit if you can suggest a possible error in our current thinking about (biology, physics, history, etc.). And you will receive extra extra extra credit if you can indicate a possible error in some strongly held belief that currently resides in your mind.
Can you imagine this question being given on the SATs?
A version of this article appeared in the August 1, 1995 edition of Teacher as “The Error of Our Ways.”