Education Opinion

Good English And Bad

By Bill Bryson — September 01, 1990

English is often commended by outsiders for its lack of a stultifying authority. Otto Jespersen as long ago as 1905 was praising English for its lack of rigidity, its happy air of casualness. Likening French to the severe and formal gardens of Louis XIV, he contrasted it with English, which he said was “laid out seemingly without any definite plan, and in which you are allowed to walk everywhere according to your own fancy without having to fear a stern keeper enforcing rigorous regulations.”

Without an official academy to guide us, the English-speaking world has long relied on self-appointed authorities such as the brothers H.W. and F.G. Fowler and Sir Ernest Gowers in Britain and Theodore Bernstein and William Safire in America, and of course countless others. These figures write books, give lectures, and otherwise do what they can (i.e., next to nothing) to try to stanch (not staunch) the perceived decline of the language. They point out that there is a useful distinction to be observed between uninterested and disinterested, between imply and infer, flaunt and flout, fortunate and fortuitous, forgo and forego, and discomfort and discomfit (not forgetting stanch and staunch). They point out that fulsome, properly used, is a term of abuse, not praise, that peruse actually means to read thoroughly, not glance through, that data and media are plurals. And from the highest offices in the land they are ignored.

In the late 1970s, President Jimmy Carter betrayed a flaw in his linguistic armory when he said: “The government of Iran must realize that it cannot flaunt, with impunity, the expressed will and law of the world community.” Flaunt means to show off; he meant flout. The day after he was elected president in 1988, George Bush told a television reporter he couldn’t believe the enormity of what had happened. Had President-elect Bush known that the primary meaning of enormity is wickedness or evilness, he would doubtless have selected a more apt term.

When this process of change can be seen happening in our lifetimes, it is almost always greeted with cries of despair and alarm. Yet such change is both continuous and inevitable. Few things are more salutary than looking at the writings of language authorities from recent decades and seeing the usages that raised their hackles. In 1931, H.W. Fowler was tutting over racial, which he called “an ugly word, the strangeness of which is due to our instinctive feeling that the termination -al has no business at the end of a word that is not obviously Latin.”

So if there are no officially appointed guardians for the English language, who sets down all those rules that we all know about from childhood—the idea that we must never end a sentence with a preposition or begin one with a conjunction, that we must use each other for two things and one another for more than two, and that we must never use hopefully in an absolute sense, such as “Hopefully, it will not rain tomorrow”? The answer, surprisingly often, is that no one does, that when you look into the background of these “rules” there is often little basis for them.

Consider the curiously persistent notion that sentences should not end with a preposition. The source of this stricture, and several equally dubious ones, was Robert Lowth, an 18th-century clergyman and amateur grammarian whose A Short Introduction to English Grammar, published in 1762, enjoyed a long and distressingly influential life both in his native England and abroad. It is to Lowth that we can trace many a pedant’s most treasured notions: the belief that you must say different from rather than different to or different than, the idea that two negatives make a positive, the rule that you must not say “the heaviest of the two objects,” but rather “the heavier,” the distinction between shall and will, and the clearly nonsensical belief that between can apply only to two things and among to more than two. (By this reasoning, it would not be possible to say that St. Louis is between New York, Los Angeles, and Chicago, but rather that it is among them, which would impart a quite different sense.) Perhaps the most remarkable and curiously enduring of Lowth’s many beliefs was the conviction that sentences ought not to end with a preposition. But even he was not didactic about it. He recognized that ending a sentence with a preposition was idiomatic and common in both speech and informal writing. He suggested only that he thought it generally better and more graceful, not crucial, to place the preposition before its relative “in solemn and elevated” writing. Within a hundred years, this had been converted from a piece of questionable advice into an immutable rule.

Today in England, you can still find authorities attacking the construction different than as a regrettable Americanism, insisting that a sentence such as “How different things appear in Washington than in London” is ungrammatical and should be changed to “How different things appear in Washington from how they appear in London.” Yet different than has been common in England for centuries and used by such exalted writers as Defoe, Addison, Steele, Dickens, Coleridge, and Thackeray, among others. Other authorities, in both Britain and America, continue to deride the absolute use of hopefully. The New York Times Manual of Style and Usage flatly forbids it. Its writers must not say, “Hopefully the sun will come out soon,” but rather are instructed to resort to a clumsily passive and periphrastic construction such as “It is to be hoped that the sun will come out soon.” The reason? The authorities maintain that hopefully in the first sentence is a dangling modifier—that it doesn’t belong to any other part of the sentence. Yet they raise no objection to dozens of other words being used in precisely the same unattached way—admittedly, mercifully, happily, curiously, and so on.

Considerations of what makes for good English or bad English are to an uncomfortably large extent matters of prejudice and conditioning. Until the 18th century it was correct to say “you was” if you were referring to one person. It sounds odd today, but the logic is impeccable. Was is a singular verb and were a plural one. Why should you take a plural verb when the sense is clearly singular? The answer—surprise, surprise—is that Robert Lowth didn’t like it. “I’m hurrying, are I not?” is hopelessly ungrammatical, but “I’m hurrying, aren’t I?”—merely a contraction of the same words—is perfect English. Many is almost always a plural (as in “Many people were there”), but not when it is followed by a, as in “Many a man was there.” There’s no inherent reason why these things should be so. They are not defensible in terms of grammar. They are because they are.

Nothing illustrates the scope of prejudice in English better than the issue of the split infinitive. Some people feel ridiculously strongly about it. When the British Conservative politician Jock Bruce-Gardyne was economic secretary to the Treasury in the early 1980s, he returned unread any departmental correspondence containing a split infinitive. (It should perhaps be pointed out that a split infinitive is one in which an adverb comes between to and a verb, as in to quickly look.) I can think of two very good reasons for not splitting an infinitive.

    1. Because you feel that the rules of English ought to conform to the grammatical precepts of a language that died a thousand years ago.
    2. Because you wish to cling to a pointless affectation of usage that is without the support of any recognized authority of the last 200 years, even at the cost of composing sentences that are ambiguous, inelegant, and patently contorted.

It is exceedingly difficult to find any authority who condemns the split infinitive—Theodore Bernstein, H.W. Fowler, Ernest Gowers, Eric Partridge, Rudolf Flesch, Wilson Follett, Roy H. Copperud, and others too tedious to enumerate here all agree that there is no logical reason not to split an infinitive. Otto Jespersen even suggests that, strictly speaking, it isn’t actually possible to split an infinitive. As he puts it: “’To’ … is no more an essential part of an infinitive than the definite article is an essential part of a nominative, and no one would think of calling ‘the good man’ a split nominative.”

Lacking an academy as we do, we might expect dictionaries to take up the banner of defenders of the language, but in recent years they have increasingly shied away from the role. A perennial argument with dictionary makers is whether they should be prescriptive (that is, whether they should prescribe how language should be used) or descriptive (that is, merely describe how it is used without taking a position). The most notorious example of the descriptive school was the 1961 Webster’s Third New International Dictionary (popularly called Webster’s Unabridged), whose editor, Philip Gove, believed that distinctions of usage were elitist and artificial. As a result, usages such as infer as a synonym for imply and flaunt in the sense of flout were included without comment. The dictionary provoked further antagonism, particularly among members of the U.S. Trademark Association, by refusing to capitalize trademarked words. But what really excited outrage was its remarkable contention that ain’t was “used orally in most parts of the U.S. by many cultivated speakers.”

So disgusted was The New York Times with the new dictionary that it announced it would not use it but would continue with the 1934 edition, prompting the language authority Bergen Evans to write: “Anyone who solemnly announces in the year 1962 that he will be guided in matters of English usage by a dictionary published in 1934 is talking ignorant and pretentious nonsense,” and he pointed out that the issue of the Times announcing the decision contained 19 words condemned by the Second International.

One of the undoubted virtues of English is that it is a fluid and democratic language in which meanings shift and change in response to the pressures of common usage rather than the dictates of committees. It is a natural process that has been going on for centuries. To interfere with that process is arguably both arrogant and futile, since clearly the weight of usage will push new meanings into currency no matter how many authorities hurl themselves into the path of change.

But at the same time, it seems to me, there is a case for resisting change—at least slapdash change. Even the most liberal descriptivist would accept that there must be some conventions of usage. We must agree to spell cat c-a-t, and not e-l-e-p-h-a-n-t, and we must agree that by that word we mean a small furry quadruped that goes meow and sits comfortably on one’s lap and not a large lumbering beast that grows tusks and is exceedingly difficult to housebreak. In precisely the same way, clarity is generally better served if we agree to observe a distinction between imply and infer, forego and forgo, fortuitous and fortunate, uninterested and disinterested, and many others. As John Ciardi observed, resistance may in the end prove futile, but at least it tests the changes and makes them prove their worth.

Perhaps for our last words on the subject of usage we should turn to the last words of the venerable French grammarian Dominique Bouhours, who proved on his deathbed that a grammarian’s work is never done when he turned to those gathered loyally around him and whispered: “I am about to—or I am going to—die; either expression is used.”

A version of this article appeared in the September 08, 1982 edition of Education Week as Good English And Bad
