When the U.S. Department of Education published the results from the newest “nation’s report card” on reading last month, some of the usual statistics were missing. Gone were the data showing what students did in and out of school, how much television they watched at home, what instructional practices their teachers used, and what books they read for fun.
Such background analyses had become a staple of the National Assessment of Educational Progress’ reports on achievement in reading. The analyses appeared in NAEP reading reports published in 1992, 1994, 1998, and 2000. The statistics are still readily available on the NAEP Web site—they’re just not in the hard copy of the 2002 report.
While Education Department officials say the analyses were left out to streamline the production of NAEP reports and to prevent misinterpretations of the data, some researchers and advocates are troubled by their disappearance.
“I think it’s a terrible loss,” said Harold H. Wenglinsky, an associate professor at City University of New York’s Baruch College of Public Affairs. He is a frequent user of the data yielded by the background questions.
“NAEP is the only common yardstick we use to see how different groups are doing and how different kinds of schools perform,” Mr. Wenglinsky added. “Are schools with small class sizes performing better than schools with larger class sizes, or are schools with inexperienced teachers performing better than schools with more experienced teachers?”
The missing questions on television watching were of particular concern to the TV Turnoff Network, a nonprofit group working to reduce students' television-watching time.
“Obviously, we’d like to see more of that background data,” said Frank Vespe, the executive director of the Washington-based group. He noted that the NAEP data in previous years had consistently suggested a link between too much TV watching and lower reading scores. In more recent years, some of the trend data showed some students, at least, were cutting down on weekly television viewing.
“Those are two things that you want to track,” Mr. Vespe said.
Federal education officials have been rethinking how to package the background data for all the NAEP reports for some time. Last year, the National Assessment Governing Board, which sets policy for the congressionally mandated exams given to samplings of students in key subjects, cut back on the number of background questions being asked.
At its meeting next month, the governing board will take that effort a step further. It is set to take final action on a proposal to refocus the “noncognitive” questions on topics that directly relate to academic achievement, or on factors that other research has shown to be linked to such achievement.
Under the proposal, the Education Department could still produce supplemental reports that draw on the background data. The report topics, however, would be determined by a research advisory board.
In addition, the plan calls for divorcing the descriptive statistics that the background questions yield from any correlations with NAEP test scores. The problem with the correlations, federal officials say, is that they are widely taken to suggest cause and effect, when they merely indicate possible links.
For example, a question asked in the 2002 reading-test administration—but not included in the current report card—asked 4th grade teachers to indicate what kinds of reading materials they used in their classrooms.
The results suggest that students in classrooms that rely heavily either on trade books or on a combination of trade books and basal readers score higher on the reading tests than do students who primarily use basal readers.
Yet, said Grover J. “Russ” Whitehurst, the director of the department’s Institute of Education Sciences, “we know more affluent districts in general are more likely to use trade books, and that districts in poverty are more likely to use basals.”
“The data don’t tell the whole story,” added Mr. Whitehurst, whose institute oversees much of the Education Department’s research. He wants to save such statistics for more thorough analyses in separate reports, rather than squeeze them into NAEP reports meant to provide mostly descriptive information.