The ambivalence that people feel toward children can be heard in Linda Bird-Davies’ voice when she talks about her career.
She began a family-child-care program in the mid-1970s so she could be at home with her daughter. Now, more than 20 years later, Bird-Davies is the director of a Los Angeles child-care center--part of an industry that has made it possible for parents to be separated from their children for long hours while they work.
It’s a setting where young children--currently viewed as resilient and independent--have been forced to adapt to the fast-paced schedules of families in the ‘90s. But it’s one in which Bird-Davies says she sometimes feels sorrow when she knows a child just wants to be with his or her mother.
The truth is, Americans’ feelings about children and their place in society have always been mixed. In the introduction to his 1982 book The Rise and Fall of Childhood, C. John Sommerville, an associate professor of history at the University of Florida, writes: “We do not seem to know exactly how we feel about children.”
And as the end of the 20th century approaches, a confluence of changes--social, economic, technological--has led many authors and child-development experts to warn that the very concept of childhood is in danger. Youngsters are scooted off to daily programs, bombarded by electronic media, and saddled with increasing responsibilities.
“Children are much more mature, and their parents treat them that way,” Bird-Davies says.
In the preface to the 1988 revised version of The Hurried Child, David Elkind, a professor of child development at Tufts University, writes: “Our new conception of children and youth is epitomized in the metaphor of the Superkid. Like Superman, Superkid has spectacular powers and precocious competence even as an infant. This allows us to think that we can hurry the little powerhouse with impunity.”
As the Victorian era was coming to a close in the early 1900s, children were depicted through artwork and advertising as innocent, almost angelic creatures to be pampered and protected from corrupt influences.
For the more advantaged classes, that idealistic image often held true, says Robin Love, a child-development professor at San Jose State University who teaches a course on the changing notions of childhood in this century.
But the early 1900s were also an era of child labor. Such labor played a major part in the Industrial Revolution, and it was widely assumed that children over the age of 9 should be working and contributing to the family income. Before 1920, children from poor and immigrant families, some as young as 3, worked in mills and factories as long as 10 hours a day.
Ultimately, the increasingly widespread use of a modern medium of communication--photography--helped alert the public to the abuses of child labor and strengthened the hand of reformers who sought to curtail such practices.
“The faces staring out from magazines and newspapers were shocking contradictions to what had become the mainstream American definition of childhood. Neither healthy, nor innocent, nor happy, they must have seemed not to be children at all,” writes Mary Lynn Stevens Heininger in A Century of Childhood: 1820-1920.
The end of child labor and the spread of compulsory-education laws were part of a movement to preserve childhood as a special time in life. Organizations devoted to children’s issues were founded, among them the Child Welfare League of America in 1921 and the National Association for the Education of Young Children in 1926.
As a result of such efforts, a separate justice system for juveniles, child-protection laws, and even playgrounds emerged.
Ironically, at the end of the century, states and communities are again rethinking the way they handle juvenile offenders: Now, the prevailing sentiment is that minors who have committed violent crimes should be subject to the same laws and receive the same punishments as adults.
A Scientific Approach
The early 20th century also saw the beginning of the child-study movement and acceptance of the belief that children should not all be held to one standard of development.
Religious influences--viewing children as either inherently evil or inherently innocent--gradually were overshadowed by more secular and scientific approaches to childhood, says Martha Minow, a law professor at Harvard University and one of the creators of a course there that examines the role of children in society.
Throughout the first two decades of the century, a variety of measurements were created to gauge children’s development, both physical and psychological. And research on child development rapidly splintered into different schools of thought.
Sigmund Freud, the founder of psychoanalysis, made popular the idea that the problems of adults could be traced to their childhood years.
Of Freud, Sommerville writes: “Childhood was not, in his view, simply a convenient time for establishing ‘good’ behavior patterns; rather, it was the most fateful part of life. The neuroses that troubled his patients in Austria seemed always to go back to their earliest years.”
The increasing prestige accorded what were seen as scientific approaches to child development was reflected, too, in the influence of another school of child-rearing--behaviorism. The belief was that children, almost like Pavlov’s dog, could be conditioned to behave a certain way. Parents were advised by psychologists such as John B. Watson to keep their babies on a strict feeding schedule. Early toilet training was encouraged, while affection, rocking, and cuddling were not.
In the early decades of the century, the federal government weighed in for the first time with the establishment of the Children’s Bureau, which was housed in the U.S. Department of Labor, and the publication of a manual called Infant Care. The 1914 document was intended to provide mothers with practical information on such topics as child health and nutrition, but it also emphasized strict routines and rules.
Greater attention was being paid to hygiene, sanitation, and routine health examinations. Documents such as the “Children’s Charter,” from the 1930 White House Conference on Child Health and Protection, spoke of providing children with “pure food, pure milk, and pure water.”
A reaction was building, meanwhile, to the cold and inflexible methods of child-rearing practiced by so many parents early in the century. It was a generation raised on those methods that, as its members became parents themselves, embraced wholeheartedly the much different advice of Dr. Benjamin Spock. His Baby and Child Care, first published in 1946, told parents to eschew rigid methods and trust their own instincts. Spock was later criticized as being too permissive--some even blamed him for the youthful rebelliousness of the 1960s--but today’s parents still turn to updated versions of his international best seller for counsel.
Embedded in this trend, which Sommerville calls “the liberation of children,” was the message that childhood should be fun and that children should be allowed to enjoy it.
Parenting Advice Goes Commercial
The publishing success and celebrity status enjoyed by Dr. Spock point out another aspect of child-rearing: While advice for parents has always been available, in the 20th century it became a commercial enterprise. A bewildering array of books and magazines--spouting a variety of opinions on how to “parent”--now lines bookstore shelves. Many of the current recommendations on child-rearing, Minow notes, are “child centered,” meaning that the child’s viewpoint is considered vital.
But Kay Hymowitz, a contributing editor of City Journal, a magazine published by the Manhattan Institute, argues that giving children a lot of freedom to make decisions and being concerned about their self-esteem have created in the 1990s what she calls “egotistical” children.
Hymowitz is writing a book she says is about “the end of childhood,” to be released next fall. She says that what used to be, in the 19th century, an obsession with protecting children began to shift in the 1950s and 1960s to the opposite extreme, with parents being unable to restrain their children.
While “people don’t like bratty kids,” she says, Americans think that precociousness in young children is cute and often encourage, for example, an early interest in the opposite sex.
Changes in the structure and well-being of families have had, of course, a significant effect on children. For one, children are simply more likely to live long enough to become adults.
In 1900, children had only a 79 percent chance of living past age 15, sociologist Peter Uhlenberg points out. By 1979, those chances had increased to 98 percent.
“As infant mortality has declined, childhood has become a more clearly differentiated stage of life, and families have increasingly focused upon children and emphasized the nurturance of children,” Uhlenberg says in a chapter called “Death and the Family” in the 1985 book Growing Up in America: Children in Historical Perspective.
With child-labor laws in place, fathers typically became the sole breadwinners. Americans left behind farming jobs and moved to cities for work, writes Donald J. Hernandez in an article on changing demographics published in a 1995 “Future of Children” report from the David and Lucile Packard Foundation.
And while oral contraceptives were not available until the 1960s, the family-planning movement grew throughout the century. By 1930, the average number of children in a family had dropped from seven to only two or three.
By 1940, most children over the age of 7 were enrolled in public school. But the history of organized programs for younger children is far more complicated.
During the early part of the century, babies and young children were predominantly cared for by their mothers, except in the case of poor mothers who needed to work. Most child-care arrangements were informal.
But “day nurseries,” operated by charity groups, also were available. Emily D. Cahan, the author of Past Caring: A History of U.S. Preschool Care and Education for the Poor, 1820-1965, estimates that there were 700 day nurseries in the United States by 1916. She also documents the substandard conditions under which many of those programs operated.
The nurseries eventually attracted critics, who said that children should not be separated from their mothers. That sentiment contributed to the rise of “pensions” that many states paid to lower-income mothers to keep them at home with their children and out of the workforce.
Nursery schools, which served as a type of laboratory for the child-study movement, were developed in the 1920s and 1930s. And kindergartens, developed by the German teacher Friedrich Froebel in the 1830s, began to spread through the United States in the early part of this century.
While such influential figures as Froebel, the Italian physician Maria Montessori, and the Swiss psychologist Jean Piaget are best known for their theories on education, they have also shaped the way society in general views and treats children.
It was Froebel who emphasized that children learn through play. He introduced the idea of teachers as “facilitators,” instead of authoritarians.
Child-size tables, chairs, and cups--now common in many homes with young children--can be traced to Montessori in the early 1900s. In the 1920s, Piaget identified stages of development that children go through as they move from exploring the world with their senses to understanding abstract concepts.
Women in the Workforce
Experts often point out that the fields of early-childhood education and child care have developed simultaneously, but with little connection between the two.
The United States faced its first major child-care dilemma during the Second World War: With millions of fathers in the military, many mothers confronted the decision of whether to stay home or go to work in defense factories.
Simply being a mother was patriotic enough, the nation’s leaders stressed. Still, tens of thousands of women placed their children in special wartime child-care centers operated by the federal government and went to work.
While most of those centers closed in the years after World War II, the end of the war didn’t necessarily mean the end of demands for child-care services.
Between 1940 and 1960, Hernandez writes, the number of homes with mothers and fathers both working increased significantly. The economic benefits of being employed, combined with escalating divorce rates and an increase in never-married mothers, led to a greater need for child care, and the demand for organized programs is expected to grow in the 21st century.
Since 1975, the percentage of working mothers has increased dramatically, according to the Labor Department’s Women’s Bureau. In 1975, 47.3 percent of mothers with children under 18 were in the labor force. Last year, the figure was 72 percent.
And current U.S. Census Bureau figures show that more than 10 million children under 5--about half the children in that age group--are either in child care or have parents who are seeking child care of some sort.
It’s a trend that the media, even women’s magazines, largely ignored until publications like Working Mother came on the scene in the late 1970s.
Now, in the last decade of the century, child care and educational programs for young children have become major political issues at the local, state, and federal levels.
Growing interest in brain development, combined with changes in welfare policy, has led many states to expand child-care and preschool programs. The issue has also received more attention as women seek more flexible work schedules that allow them to balance their home and job responsibilities.
In Images of Childhood, published in 1996, Maris A. Vinovskis notes that movements that focus on children often begin with efforts to address poverty. A 20th-century example is Head Start, the popular federal initiative that began as a summer program for low-income children in 1965 and has grown to a full-year, and in some cases full-day, program serving about 800,000 children.
While research on the effectiveness of Head Start has been mixed, many Americans have come to view early education as an essential part of the childhood experience and a necessity for future success in school and life.
Another place youngsters came to spend more and more of their time was at home--in front of the television set. And almost from the time TV was introduced into American households in the late 1940s, the effects of the “electronic babysitter,” the “boob tube,” the “vast wasteland” on children have been a near-constant subject of national worry and debate.
Throughout the latter part of the century, parents have had to confront such issues as violence, profanity, and sexual explicitness on television as well as the sheer number of hours their children are watching.
In the 1993 book Children and Television: Images in a Changing Sociocultural World, TV is described as a part of the American household that competes with traditional means of socialization, such as the family, school, and church.
And concerns have been raised over how children are portrayed on television and through the media in general. Advertising images of teenage girls wearing makeup and dressed in alluring fashions have once again helped create a concept of children as “miniature adults,” as they were viewed through much of history.
Cultural and societal changes--not just working mothers, but significant levels of drug use, sexual activity, and violence among children and teenagers--have led many experts and social critics to argue that the lines between childhood and adulthood have been irrevocably blurred.
But it’s not just negative influences that have robbed children of their special childhood years, says Elkind. The efforts of parents and the practices of schools can also force children to grow up too fast.
In The Hurried Child, Elkind writes that pushing children into sports and other activities at a young age, as well as emphasizing reading skills during the preschool years, can create children who are overloaded with adultlike decisions. And schools, by pushing more demanding curricula and testing into lower and lower grades, may actually be harming children, he contends.
But children aren’t the only ones influenced by technology and the information age. Adults’ impressions of children have shifted because of the media, says Cynthia Scheibe, the director of the Center for Research on the Effects of Television at Ithaca College.
For one, educational television programs--especially “Sesame Street”--have raised expectations of what children should know when they enter kindergarten, Scheibe says.
And news reports--often disturbing accounts of violence and drug use by young people--also shape adults’ views of children, says Lillian Brinkley, the principal of the Willard Model Elementary School in Norfolk, Va., and a 38-year veteran of education.
But while sensational cases often make it seem as if children are out of control, Brinkley says that hasn’t been her experience. Students wind up in her office for the same reasons they did 30 years ago--fighting, name-calling, setting off the fire alarm.
While the society children live in has changed dramatically, “there is still that line that runs down the middle,” Brinkley says. “As filled as their world is with all kinds of things, they still want to know that there is someone there to provide some direction for them.”
A version of this article appeared in the February 24, 1999 edition of Education Week as Changing Versions of Childhood