Bridging the Summer Slump

By Debra Viadero — September 01, 1994

Teachers have always known that after a long, lazy summer break, their students need to spend most of September reviewing material they learned the year before. What has been less obvious is that the amount of review students need could have a lot to do with their socioeconomic status.

While learning slows down for all students when school is out, a small but growing number of studies show that it practically grinds to a halt for those who come from disadvantaged homes. Moreover, some of those students even lose ground. “Being out of school can actually be detrimental to some kids,” says Nancy Karweit, a researcher at Johns Hopkins University’s Center for Research on Effective Schooling for Disadvantaged Students. “Teachers don’t see their students the prior spring. I don’t think they have any clue there is this differential in learning rates over the summer.”

As far back as 1906, researchers documented a “summer effect” on learning for all students. Some of those early studies pointed out, for example, that over the summer, young students forget more of what they learned in mathematics than in other subjects, such as reading. But it wasn’t until the 1960s and ’70s that researchers began to notice the connection between the intensity of the summer effect and students’ economic backgrounds. In 1972, Barbara Heyns, who is now a sociology professor at New York University, conducted one of the first and most comprehensive studies of the phenomenon.

Heyns undertook her work in response to the prevailing view at the time that schooling had no effect on closing the learning gaps between black and white students, or between rich and poor ones. Her idea was to use the summer period as a sort of experimental control—a stand-in for no school—that could provide a point of comparison for looking at whether schooling makes a difference for disadvantaged children.

Toward that end, Heyns tracked nearly 3,000 Atlanta 6th and 7th graders over a period of two years. She looked at their scores on standardized achievement tests given in May and October of each year, surveyed their parents for information on their socioeconomic status, and interviewed a smaller sample of 500 students to find out how they had spent their summers.

She found that children of every income level and all racial groups learned at much slower rates over the summer than they did during the school year. But advantaged children continued to make academic gains even when school was closed. Disadvantaged children, by comparison, seemed either to tread water academically or to lose ground. “Those rates of inequality sometimes doubled and tripled,” Heyns says. From October to May, poorer children were able to make great strides in narrowing the achievement gap that separated them from their more advantaged peers. However, they weren’t able to completely catch up before another summer rolled around, and so the achievement gap widened further.

“Heyns’ data . . . suggest that, if all schools were to be closed down, converting childhood into one endless summer, the eventual achievement gap between initially advantaged children and initially disadvantaged children would be even greater than it now is,” writes Christopher Jencks, one of Heyns’ colleagues, in a foreword to a 1978 book on the study.

Researchers Karl Alexander and Doris Entwistle have carried that conclusion a step further. In a 1992 paper, they suggest that the differential slowdown in learning that occurs over the summer may account for most of the gap between the achievement of disadvantaged and relatively advantaged students throughout the course of their schooling.

Their analysis draws on an ongoing longitudinal study of 790 Baltimore students who entered 1st grade in 1982. In that sample, disadvantaged and relatively advantaged students began their school careers at pretty much the same academic level. Verbal and mathematics achievement scores on standardized tests given to those students at the beginning of 1st grade showed only modest differences between the two socioeconomic groups. During school months, both groups learned at about the same rate. It was only during the summer that the achievement gap began to open up. “That gap between high [socioeconomic status] students and low-SES students increases steadily over the years,” says Alexander, a sociology professor at Johns Hopkins in Baltimore. “And that mostly reflects the more substantial strides upper-SES kids make during the summer months.”

Researchers do not know with certainty why learning seems to slow down so much for poor students when they are not in school. They do, however, have some suspicions. “It seems clear there are differences in what society is giving kids,” says Idorenyin Jamar, a postdoctoral fellow at the University of Pittsburgh’s Learning Research and Development Center. “Advantaged kids seem to get more structured activities and more enrichment.” After all, she notes, summer camp, ballet lessons, museum visits, and trips to Europe all cost money.

In her study, Heyns pinpointed several kinds of summer activities that seemed to contribute to greater gains in summer learning for all students. Primary among those activities was reading. Somewhat inexplicably, however, her data suggest that owning a bicycle and being allowed to take a trip alone were also factors linked to greater learning gains. The latter, she writes in her book Summer Learning and the Effect of Schooling (Academic Press, 1978), may hold true because those children were more mature or motivated to begin with, and, therefore, their parents trusted them to travel alone. She speculates that child-rearing styles that encourage children to be independent may also contribute to learning.

Surprisingly, summer school did not make much of a difference, Heyns found. At the time she conducted her research, Atlanta had extensive summer school programs, which more than one-fourth of the district’s students attended on a voluntary basis. Those half-day programs were located mostly in schools with large numbers of disadvantaged children. But the offerings varied widely, the sessions lasted only six weeks, and the thrust was largely recreational. Other studies have drawn similar conclusions. “You’d think, well, give them summer school and that should remedy the situation,” says Karweit of Johns Hopkins. “In general, the literature doesn’t seem to indicate that it’s been effective.”

One possible exception to the dismal trend may be the Summer Training and Employment Program, operated by Public/Private Ventures, a Philadelphia-based foundation. Targeted to disadvantaged students between ages 14 and 16, STEP takes place over two summers and offers a combination of paid employment, general education, and life-skills training. By the end of their first summer, students gain an average of four months in reading and eight months in mathematics. In the long run, however, the program has been less successful at curbing dropout rates among participants. For that reason, STEP’s founders have created successor summer programs aimed at older students. Those newer programs have not been formally evaluated.

Some educators suggest that year-round schooling may offer a solution to the summer effect. Although the data have yet to provide conclusive evidence that moving to a year-round calendar improves students’ academic achievement, several studies suggest that it does—particularly for disadvantaged students. Researchers, however, have a difficult time teasing out whether the improvements are due to shortened vacation breaks, more days of schooling, or some other factors. “Some kids just don’t like school,” says Jamar of the University of Pittsburgh, “and if you give them more of the same, I’m not sure it would make a difference.”

Despite the potential significance of the summer effect on student learning overall, the subject has garnered relatively little attention within the research community. Moreover, existing studies suffer from some shortcomings. Heyns’ study, for example, measured achievement through tests given in October and May. That means two months of instructional time were included in the “summer” period. Jamar also questions the use of traditional standardized tests to measure student achievement because they may not tap the full range of children’s capabilities.

Still, the bottom line in all of the studies is a positive one for educators, Karweit says. “It says what a good job schools are doing for disadvantaged kids.”

A version of this article appeared in the September 01, 1994 edition of Teacher as Bridging the Summer Slump