Assessment

‘Report Card’ Lacking Usual Background Data

By Debra Viadero — July 09, 2003 3 min read

When the U.S. Department of Education published the results from the newest “nation’s report card” on reading last month, some of the usual statistics were missing. Gone were the data showing what students did in and out of school, how much television they watched at home, what instructional practices their teachers used, and what books they read for fun.

The background data omitted from the printed version of the 2002 reading report are available online from the National Center for Education Statistics.

Such background analyses had become a staple of the National Assessment of Educational Progress’ reports on achievement in reading. The analyses appeared in NAEP reading reports published in 1992, 1994, 1998, and 2000. The statistics are still readily available on the NAEP Web site—they’re just not in the hard copy of the 2002 report.

While Education Department officials say the analyses were left out to streamline the production of NAEP reports and to prevent misinterpretations of the data, some researchers and advocates are troubled by their disappearance.

“I think it’s a terrible loss,” said Harold H. Wenglinsky, an associate professor at City University of New York’s Baruch College of Public Affairs. He is a frequent user of the data yielded by the background questions.

“NAEP is the only common yardstick we use to see how different groups are doing and how different kinds of schools perform,” Mr. Wenglinsky added. “Are schools with small class sizes performing better than schools with larger class sizes, or are schools with inexperienced teachers performing better than schools with more experienced teachers?”

The missing questions on television watching were of particular concern to the TV Turnoff Network, a nonprofit group working to reduce students’ television-watching time.

“Obviously, we’d like to see more of that background data,” said Frank Vespe, the executive director of the Washington-based group. He noted that the NAEP data in previous years had consistently suggested a link between too much TV watching and lower reading scores. In more recent years, some of the trend data showed some students, at least, were cutting down on weekly television viewing.

“Those are two things that you want to track,” Mr. Vespe said.

Federal education officials have been rethinking how to package the background data for all the NAEP reports for some time. Last year, the National Assessment Governing Board, which sets policy for the congressionally mandated exams given to samplings of students in key subjects, cut back on the number of background questions being asked.

Refocusing Questions

At its meeting next month, the governing board will take that effort a step further. It is set to take final action on a proposal to refocus the “noncognitive” questions on topics that directly relate to academic achievement, or on factors that other research has shown to be linked to such achievement. (“NAEP Board Wants to Reduce Background Queries,” May 14, 2003.)

Under the proposal, the Education Department could still produce supplemental reports that draw on the background data. The report topics, however, would be determined by a research advisory board.

In addition, the plan calls for divorcing the descriptive statistics that the background questions yield from any correlations with NAEP test scores. The problem with the correlations, federal officials say, is that they are widely taken to suggest cause and effect, when they merely indicate possible links.

For example, a question asked in the 2002 reading-test administration—but not included in the current report card—asked 4th grade teachers to indicate what kinds of reading materials they used in their classrooms.

The results suggest that students in classrooms that rely heavily on trade books, or on a combination of trade books and basal readers, score higher on the reading tests than students who primarily use basal readers.

Yet, said Grover J. “Russ” Whitehurst, the director of the department’s Institute of Education Sciences, “we know more affluent districts in general are more likely to use trade books, and that districts in poverty are more likely to use basals.”

“The data don’t tell the whole story,” added Mr. Whitehurst, whose institute oversees much of the Education Department’s research. He wants to save such statistics for more thorough analyses in separate reports, rather than squeeze them into NAEP reports meant to provide mostly descriptive information.
