If one word could summarize American education in the 20th century, that word would be “more.” Today there are more students attending school for more of the year and for a longer period of their lives than at any time in our history.
It’s a stunning achievement--yet one that is often overlooked, given Americans’ rising educational aspirations.
“Education is opening the paths for people, and that’s what our system of public education has done,” says Daniel J. Boorstin, the distinguished historian and a former librarian of Congress.
At the beginning of the century, a majority of Americans ages 7 to 13 attended school. But only one in 10 remained in school beyond age 14, and fewer than 7 percent of 17-year-olds graduated from high school.
But while access to schools is no longer an issue, access to quality is. Despite enormous progress, the opportunities students have to learn, and how well they are expected to do so, vary significantly based on where they live, what their parents earn, and the color of their skin. It’s as if everyone has been invited to the banquet, but some are served an appetizer and others a five-course meal.
“I think the issue now,” says John I. Goodlad, an education professor at the University of Washington, “is to make our schools commonly good.”
New Role for High Schools
Early in the 20th century, “access” primarily meant securing enrollment in a public high school. By the late 1800s, reformers had largely won the battle for publicly financed elementary schools. But public high schools were still a rarity. They remained primarily an urban and middle-class phenomenon into the 1870s, with selective admissions based on entrance exams and modest enrollments. (“The Foundation of Universal Education.”)
Where high schools existed outside the big cities, they were typically small and varied greatly in curriculum and organization, notes Marvin Lazerson, an education historian at the University of Pennsylvania. Most students left school by age 13 to get a job. And a high school diploma was not required for college or the workplace.
But from the late 1800s until 1940, high schools underwent a remarkable transformation. On average, the Stanford education historian David B. Tyack points out, Americans built one new high school a day from 1890 to 1918. High school enrollment swelled from about half a million in 1900 to 2.4 million in 1920 and more than 6.5 million in 1940. By that year, more than three-fourths of young Americans ages 14 to 17 were in secondary school. And, for the first time, more than half of 17-year-olds had earned a diploma.
David F. Labaree, an assistant professor of education at Michigan State University, captures the early years of that explosive growth in his history of Central High School in Philadelphia. When the school opened its doors in 1838, it was the only public high school in the city. Two-thirds of its students came from middle-class homes. By 1885, in response to a public clamor for secondary education, the school board had added Central Manual Training School a few blocks away, which served a higher proportion of students whose parents were among the “employed” middle class rather than proprietors or professionals. Five years later, it built Northeast Manual Training School, and the enrollment at Central High itself had doubled. By 1915, Philadelphia had 13 secondary schools, including two normal schools (for training teachers), a vocational school, and 10 regional comprehensive high schools, of which Central was one.
Historians point to a web of economic and social factors to explain the boom in secondary education. From 1890 to 1920, 18 million immigrants poured into the country, and many more Americans abandoned the farms for the nation’s increasingly crowded and industrialized cities. Those population shifts provided the concentrated pool of young people needed to sustain large high schools. At the same time, new technologies replaced jobs once held by children, shrinking the labor market for minors. Laws restricting or banning child labor also pushed youngsters out of the workplace, while compulsory-attendance laws pulled them into the schools. Massachusetts enacted the first compulsory-attendance law in 1852. By the turn of the century, 33 states and the District of Columbia had followed, with laws aimed primarily at 8- to 14-year-olds.
Often, the targets of compulsory-attendance laws were the poor, immigrant children arriving daily on American shores. Many educators saw a desperate need to instill “civilized” values in these alien newcomers. Often, the concern was less to prepare an educated workforce than to preserve American democracy and morals.
In 1922, as part of a strategy to “Americanize” the Oregon schools, the Ku Klux Klan and the Scottish Rite Masons organized a voter referendum requiring all children between 8 and 16 to attend public schools. Parents or guardians who refused to comply were subject to a maximum punishment of a $100 fine and a 30-day jail term.
“The assimilation and education of our foreign-born citizens in the principles of our government, the hopes and inspiration of our people, are best secured by and through attendance of all children in our public schools,” stated a resolution in favor of the ballot measure. “We must now halt those coming to our country from forming groups, establishing schools, and thereby bringing up their children in an environment often antagonistic to the principles of our government.”
The measure was opposed--unsuccessfully--by members of the Evangelical Lutheran Synod, Seventh Day Adventists, private school principals, members of the Presbyterian Church, and the Catholic Civil Rights Association. But in 1925, the U.S. Supreme Court declared the law unconstitutional, saying it forced children to “accept instruction from public teachers only.”
“Americanization,” however, was never the only goal of public education. As the elementary schools filled up, parents also came to view the high schools as offering their children a distinct economic and social advantage.
As Robert S. and Helen Merrell Lynd observed in Middletown, their classic 1929 study of Muncie, Ind.: “This thing, education, appears to be desired frequently not for its specific content but as a symbol--by the working class as an open sesame that will mysteriously admit their children to a world closed to them, and by the business class as a heavily sanctioned aid in getting on further economically or socially in the world.”
It was the Great Depression, however, that gave children a final shove out of the workforce and cemented the idea of high school as a virtual necessity in the minds of Americans. The value of children in the job market plummeted. And they competed with unemployed adults for those jobs for which they remained eligible. In response, many states raised the age of compulsory school attendance from 14 to 16.
“From my own experience at that time, all of a sudden, the enrollments above the elementary school began to increase enormously,” Goodlad recalls.
The question of what kind of education would suit this much larger and more diverse population of high school students had perplexed educators since the late 19th century. But the issue would become increasingly pressing as the ranks of students swelled.
On one side of the argument was the 1893 report from the Committee of Ten, chaired by President Charles W. Eliot of Harvard University. The group, composed largely of college presidents, argued that all students should master an equally rigorous curriculum: “Every subject which is taught at all in secondary school should be taught in the same way and to the same extent to every pupil so long as he pursues it, no matter what the probable destination of the pupil may be, or at what point his education is to cease.”
The committee recommended at least four years of English, four years of a foreign language, and three years each of mathematics, science, and history.
Eliot was hardly a populist, however. At the time the committee wrote its report, only a small fraction of young Americans graduated from high school. And the committee itself described secondary schools as being for a small--but important--proportion of children who could profit from a prolonged education and whose parents could afford it.
Opposing the committee were men who favored a much more utilitarian curriculum for students who were not bound for college. In 1906, a report by the Massachusetts Commission on Industrial and Technical Education argued that the schools were “too exclusively literary” for the great body of youths. What most youngsters needed, the commission concluded, was “training of a practical character” that would prepare them for jobs in industry.
In 1918, the National Education Association published the “Cardinal Principles of Secondary Education,” which endorsed different curricula for different students, with separate courses of study in such fields as industrial and fine arts. It suggested that secondary education be oriented around seven broad objectives: health, command of fundamental processes (literacy, numeracy), worthy home membership, vocation, citizenship, worthy use of leisure, and ethical character. The academic disciplines were barely mentioned.
Reformers of the day wanted to stem the tide of young people dropping out of high school, a problem they attributed to the dominance of bookish, academic coursework. By providing students with more options that would “best fit our boys and girls for life,” the reformers saw themselves as increasing access to education. True, those pathways would have different standards and different academic content, but at least everybody would come away with a diploma.
Tracking Takes Hold
Over time, such arguments proved persuasive. High schools responded to the polyglot mix of youths by stratifying and diversifying the curriculum.
“At the moment that the expansion really booms, you find high schools like Central and a lot of the others that were already in existence shifting to a tracking system,” Labaree observes. By 1919, the Philadelphia school had four courses of study: academic, commercial, mechanical, and industrial. Those would continue into the late 1930s, when Central High became once again a selective-admissions school.
To Labaree, tracking served an invidious purpose. It enabled public high schools to nominally meet their democratic agenda (serving all youths), while preserving the value of the high school credential for the few who made it into the academic track.
Tracking was reinforced--and made easier--by the new generation of intelligence tests, which purported to be able to assign children to different programs based on their abilities. In 1917, a Harvard psychologist, Robert Yerkes, devised the first multiple-choice intelligence test, called the Army Alpha, to help screen and place military recruits during World War I.
The use of multiple-choice tests that could be administered on a mass scale quickly spread to the schools. By 1932, three-fourths of large U.S. cities reported using such written, standardized intelligence tests to assign pupils. Typically, working-class, minority, and immigrant children scored worst on such exams and ended up in the lowest tracks.
On the ‘General’ Track
A national leader in the adoption of a differentiated curriculum was Detroit. In their forthcoming book, Democracy’s High School Re-examined, historians David L. Angus and Jeffrey E. Mirel detail how the Motor City shifted from a predominantly academic curriculum in the high schools to a watered-down mixture of course offerings.
By the 1920s, they note, Detroit’s system reflected that in many other big cities, with four tracks: college preparatory, commercial (or business), vocational (industrial arts and home economics), and general. Still, three-quarters of student course-taking remained in the academic categories.
Soon, though, the Depression brought thousands of working-class children into the schools. Between 1929 and 1934, enrollment in Detroit’s comprehensive high schools jumped by more than 43 percent. And the general track, designed to serve students without specific college or career plans, boomed. “By the middle of the 1930s, educators have pretty much given up on vocational education because none of these kids were getting jobs,” says Mirel, a professor of education at Emory University. “So the general track becomes a place where you can keep them in school, not demand a tremendous amount, and at least give them a diploma.”
By 1934, only about 62 percent of student course-taking in Detroit was in the academic category. And school officials lowered standards and weakened the content of the remaining academic courses. Science laboratory courses gave way to large lecture classes in biology, chemistry, and physics. In social studies, required courses in civics and economics were replaced by classes that dealt with such topics as juvenile delinquency, finding a job, and traffic safety.
By the late 1940s, the general track was serving the largest percentage of Detroit students of any of the four tracks--a pattern that would do a long-term disservice to the African-Americans who were migrating from the South and filling the city’s schools.
Decline in Academics
The same pattern occurred nationally. In 1928, Mirel and Angus say, more than 67 percent of the courses taken by American students were academic. Six years later, the proportion of academic course-taking had fallen to 62 percent. Over the next two decades, the proportion continued to drop, to 57 percent in 1961.
Winifred Weislogel, who was a teenager in Elizabeth, N.J., in 1943, remembers having to choose a high school track. “In 9th grade, you had to choose between a general course, a commercial course, or an academic course,” says the 71-year-old Arlington, Va., resident. “The general course included things like basic shop and basic math and arithmetic,” she recalls. The commercial course offered typing, shorthand, and bookkeeping, “and then there was the academic course, which was where you were expected to take at least one foreign language, mathematics, science, English, history.”
“I took the commercial course and then switched” to the academic track, she adds. “In those days, not quite so many people thought they would go on to college, and a lot of us who did were the first in our families to do so.”
Weislogel graduated from Barnard College in New York City in 1949 and eventually went into the Foreign Service. She retired from the U.S. State Department in 1983.
Calls for Realignment
With the passage of the GI Bill during World War II, a college education suddenly became affordable for millions of Americans, and questions of access largely shifted to higher education. Those changes had two notable effects on the high schools. (“GI Bill Paved the Way for a Nation of Higher Learners.”)
First, increased access to college devalued the high school diploma in the job market. Second, it temporarily resurrected concerns about the quality of the college preparation that many young Americans were receiving. A prominent historian, Arthur Bestor, led the attack against the “anti-intellectual” nature of high schools in his 1953 book, Educational Wastelands: The Retreat From Learning in Our Public Schools.
Bestor was particularly alarmed by the “life adjustment” movement of the 1940s, which involved lessons in “the problem of improving one’s personal appearance,” the “problem of developing and maintaining wholesome boy-girl relationships,” and other practical-sounding studies. Such programs were primarily aimed at the “60 percent” of American youths whom educators assumed were not fit for either college or technical careers.
Bestor lambasted educators for de-emphasizing mathematics, science, history, and foreign languages in favor of such topics. The onset of the Cold War also created demands for reinvigorating the high school curriculum, particularly in math, science, and foreign languages. But as much emphasis was placed on accelerating education for the best students as on changing the content of what most young people learned.
Even without tracking, access to education would hardly have been equal. For much of the century, large segments of the population were excluded from a high-quality public education.
Differences in educational opportunity based on race, class, ethnicity, geography, and gender would come to dominate the nation’s consciousness from midcentury onward.
“It’s hard for us to remember how fatally unequal schools were, say, about 1930 or 1940,” Tyack of Stanford University says. “If you think, not of the big cities, but of the rural areas, there were whole big swaths of the country and whole groups of people--especially black people--who had no access to high schools.”
Before World War II, he notes, many rural areas in the South, the Dust Bowl, and other impoverished regions had no public high schools. In 1899, the Supreme Court had ruled in Cumming v. Richmond County Board of Education, a Georgia case, that the doctrine of “separate but equal” did not mean that black students automatically had the right to a high school because white students had one.
In 1913, a survey of Atlanta’s black schools found that enrollment exceeded seating capacity by 2,111 children, and that students without seats were forced to stand or sit on the floor. Because of the use of double sessions, 5,000 children received only three hours of instruction a day, in classrooms filled with 60 children per session. Despite repeated petitions from its black citizens, the Atlanta school board made no provision for black high schools until 1920, when the district had 7,100 black students and 21,300 whites.
In his book on education during the Depression, Public Schools in Hard Times, Tyack and his colleagues Robert Lowe and Elisabeth Hansot describe the conditions of black children in a Southern school, where they experienced an “educational inequality so gross that one questions whether the term ‘education’ is appropriate.” Four grades of youngsters squeezed into an unheated shack, where they crowded onto broken benches. While the youngest children were learning to write their names and draw pictures of dogs and cats, the older ones memorized lines from an antiquated grammar lesson and multiplied and checked their sums by counting on their fingers.
Even into the 1960s, conditions in black schools, particularly in the South, remained grossly unequal. James D. Anderson, the head of the department of educational policy studies at the University of Illinois at Urbana-Champaign, recalls his years as a student at Carver High School in Eutaw, Ala., in a county that was about 80 percent African-American.
“The white high school had a gymnasium, science labs, football fields with lights--most of the things you would expect in an American high school,” says Anderson, who was the valedictorian of the Class of 1962. Carver, which served grades 1-12, was an amalgam of brick and wood-frame buildings dating to the 1920s.
It lacked janitorial services, so Anderson and the other boys scrubbed and waxed the floors themselves and raised and lowered the high windows with long sticks. In the winter, they carried buckets of coal from a pile out back to keep the school warm. “We had only one science teacher, and he taught biology, physics, and chemistry. We didn’t even have a gymnasium until my senior year,” Anderson recalls.
“One of the things people don’t realize,” he adds, “is how recent access has been for some populations in this country. My mother’s generation didn’t even have a high school. They only went to 9th grade. And so it was really my generation, in the 1960s, that was the first generation of African-Americans to have universal public high school education.”
Push for Quality
The civil rights movement that began in the 1950s and the federal “war on poverty” in the 1960s, combined with aggressive intervention by the federal courts, finally caused Americans to confront such inequities. Slowly, then with gathering speed, the schoolhouse door opened for large numbers of previously excluded youths: African-Americans; handicapped children; those who spoke little or no English. (“Immigrants: Providing a Lesson in How To Adapt” and “Bringing Special Education Students Into the Classroom.”)
Through court battles and helped by federal legislation, poor children, migrant children, girls, disabled youngsters, and those wanting bilingual education clamored for--and got--special programs to serve their needs.
But the education historian Diane Ravitch argues that, while many Americans gained greater access to schools, too little attention was paid to the content of what they learned once they got there. In The Troubled Crusade: American Education, 1945-1980, she writes: “Sometimes those who led the battles seemed to forget why it was important to keep students in school longer; to forget that the fight for higher enrollments was part of a crusade against ignorance, and that institutions would be judged by what their students had learned as well as by how many were enrolled.”
Indeed, the social turmoil of the 1960s temporarily overwhelmed the schools, which tried to respond with a host of changes to make education more “relevant,” more engaging, and less structured than it had been before. Often, though, such innovations came at the expense of a strong, core curriculum and an emphasis on high achievement. It wasn’t until the 1970s, and the back-to-basics movement, that states began to impose minimum-competency tests to ensure that high school graduates could at least read, write, and compute at an 8th grade level.
The proportion of 17-year-olds with a high school diploma peaked in 1968-69, at 77.1 percent, in part as a result of the civil rights movement and federal anti-poverty efforts and in part as young people stayed in school to avoid the draft during the Vietnam War.
Although that figure had fallen to 69.7 percent in 1996-97, such figures do not include the many young people who obtain alternative credentials, such as a General Educational Development diploma. The United States, notes Larry Cuban, an education professor at Stanford University, is remarkable as a “second chance” system. The eventual graduation rates, including GED holders, are now “higher than they’ve ever been,” says Tom Snyder, the director of annual reports for the National Center for Education Statistics. In 1996, 87.3 percent of Americans ages 25 to 29 had a high school diploma.
By the 1980s, most states were rushing to raise the standards for a high school education by mandating new tests, lengthening the school year, raising salaries and entrance requirements for beginning teachers, and tightening their graduation standards. There is evidence that this burst of activity, coming after the earlier gains from the civil rights battles and the war on poverty, has begun to pay off.
Test scores have risen gradually since the early 1970s. Academic course-taking among high school students is up. More than half of all high school graduates now take four years of English and three years each of math, science, and social studies. Some of the biggest gains have been posted by poor and minority students. From 1971 to 1984, black students ages 9 to 13 showed significant gains in scores on the National Assessment of Educational Progress, a federal program that tests a sampling of students in core academic subjects.
The enormous performance gap between students in high-poverty and low-poverty schools--more than two grade levels in math--also has begun to close. And young African-Americans now graduate from high school at about the same rate as their white counterparts, although the rate for Hispanics remains significantly lower.
Despite these advances, Americans remain dissatisfied with their public schools. And their aspirations, both for themselves and for what schools should accomplish, continue to rise.
In part, that’s because schools continue to reflect the inequalities in the larger society. “On the one hand, kids have gotten access to education, and that includes all kids,” observes Kati Haycock, the executive director of the Education Trust, a nonprofit advocacy group for poor and minority students. “That said, if you ask the question, ‘Do they have access to schools of equal quality?’ the answer is decidedly not, and the results show that.”
In particular, she argues, the rigor of the academic courses taken by many poor and minority students in urban areas does not match that offered in more affluent suburbs. Equally troubling, the neediest and least skilled students typically end up with the least experienced, least qualified teachers.
Meanwhile, the whole notion of “access” is shifting. An overwhelming majority of Americans now believe that a college education, not just a high school diploma, is necessary to get and keep a good job.
“Now, to get a good job, you need to go to college,” Labaree of Michigan State says. “And now that college is filling up, you’ve got to go to graduate school. The race continues.” President Clinton, for one, has declared: “Our goal must be nothing less than to make the 13th and 14th years of education as universal to all Americans as the first 12 are today.”
Whether the goal of universal access to higher education is justified is hotly debated by academicians and economists. “It’s nice to be worried about universal access to higher education,” Mirel of Emory University says. “But there’s only one way that’s going to happen, and that’s if we really begin to make a concerted effort to make our middle schools and our high schools institutions that offer outstanding college-preparatory education.”
At the same time, the most highly educated group of parents in history has brought a demanding, consumer-oriented attitude to their relationship with the public schools. Increasingly, parents are clamoring not for access to a common school, but for access to the school of their choice--whether it’s a private school, a charter school, or a magnet school in a neighboring district.
Labaree cautions that, as such attitudes deepen, there is a danger that public education may increasingly be viewed as “a public subsidy for private ambition.”
If such views prevail, he and others caution, it could serve to widen the educational gap between the haves and have-nots in American society, rather than supporting the schools’ traditional democratic purpose.
Focus on Standards
That’s one reason Mirel and others contend the movement by states to adopt academic standards for what students should know and be able to do is so important.
By focusing on the substance of what young people are learning in their courses--and how well they are learning it--the standards movement has the potential to ensure that all students have access to a top-quality curriculum.
Standards, “wisely developed and applied, can greatly benefit American education,” Mirel and Angus write. “Such measures could constitute major steps toward equalizing educational quality and ensuring that all American students, particularly poor and minority students, have access to the same challenging programs and courses that students in the nation’s best schools now receive.”
But others are skeptical that the movement will live up to its promise. By 2004, for example, all students in Virginia will be expected to pass new tests based on detailed academic standards to graduate from high school. Test results also will help determine whether students are promoted to the next grade, and whether schools maintain their accreditation.
“I think all kids can learn,” says R. Wayne Ellis, a high school math teacher in Richmond, Va., “but I don’t think they can all learn at the level we’re setting right now, as far as everybody is going to learn algebra when they’re in 9th grade, and everybody can learn geometry in 10th grade, and so on.”
“I think we need to take a serious look at the rate of student learning,” adds Ellis, who teaches at the city’s Huguenot High School. “And I think the standards are not age-appropriate.”
The declaration that “all children can learn” may be even more difficult to fulfill given the changing demographics of the United States. In 1997, the country’s population of children roughly equaled the record set when the baby boomers were coming of age--69.5 million in 1997, compared with 69.9 million in 1966.
An increasing proportion of U.S. children are poor and minority, groups that historically have been the least well-served by the public schools.
“The unfinished agenda will continually be the low-income minorities, basically poor people,” Cuban of Stanford says. “And that inequity, which reflects the larger inequalities in this society, will remain what I would call a persistent kind of illness in the American educational system and the body politic.”
A version of this article appeared in the January 11, 1999 edition of Education Week as The Common Good