Assessment

‘Report Card’ Lacking Usual Background Data

By Debra Viadero — July 09, 2003 3 min read

When the U.S. Department of Education published the results from the newest “nation’s report card” on reading last month, some of the usual statistics were missing. Gone were the data showing what students did in and out of school, how much television they watched at home, what instructional practices their teachers used, and what books they read for fun.

The background data omitted from the printed version of the 2002 reading report are available from the National Center for Education Statistics.

Such background analyses had become a staple of the National Assessment of Educational Progress’ reports on achievement in reading. The analyses appeared in NAEP reading reports published in 1992, 1994, 1998, and 2000. The statistics are still readily available on the NAEP Web site—they’re just not in the hard copy of the 2002 report.

While Education Department officials say the analyses were left out to streamline the production of NAEP reports and to prevent misinterpretations of the data, some researchers and advocates are troubled by their disappearance.

“I think it’s a terrible loss,” said Harold H. Wenglinsky, an associate professor at City University of New York’s Baruch College of Public Affairs. He is a frequent user of the data yielded by the background questions.

“NAEP is the only common yardstick we use to see how different groups are doing and how different kinds of schools perform,” Mr. Wenglinsky added. “Are schools with small class sizes performing better than schools with larger class sizes, or are schools with inexperienced teachers performing better than schools with more experienced teachers?”

The missing questions on television watching were of particular concern to the TV Turnoff Network, a nonprofit group working to reduce students’ television-watching time.

“Obviously, we’d like to see more of that background data,” said Frank Vespe, the executive director of the Washington-based group. He noted that the NAEP data in previous years had consistently suggested a link between too much TV watching and lower reading scores. In more recent years, some of the trend data showed some students, at least, were cutting down on weekly television viewing.

“Those are two things that you want to track,” Mr. Vespe said.

Federal education officials have been rethinking how to package the background data for all the NAEP reports for some time. Last year, the National Assessment Governing Board, which sets policy for the congressionally mandated exams given to samplings of students in key subjects, cut back on the number of background questions being asked.

Refocusing Questions

At its meeting next month, the governing board will take that effort a step further. It is set to take final action on a proposal to refocus the “noncognitive” questions on topics that directly relate to academic achievement or on factors that other research has linked to such achievement. (“NAEP Board Wants to Reduce Background Queries,” May 14, 2003.)

Under the proposal, the Education Department could still produce supplemental reports that draw on the background data. The report topics, however, would be determined by a research advisory board.

In addition, the plan calls for divorcing the descriptive statistics that the background questions yield from any correlations with NAEP test scores. The problem with the correlations, federal officials say, is that they are widely taken to suggest cause and effect, when they merely indicate possible links.

For example, one question from the 2002 reading-test administration, not included in the current report card, asked 4th grade teachers to indicate what kinds of reading materials they used in their classrooms.

The results suggest that students in classrooms that rely heavily on either trade books or a combination of trade books and basal readers score higher on the reading tests than do students who primarily use basal readers.

Yet, said Grover J. “Russ” Whitehurst, the director of the department’s Institute of Education Sciences, “we know more affluent districts in general are more likely to use trade books, and that districts in poverty are more likely to use basals.”

“The data don’t tell the whole story,” added Mr. Whitehurst, whose institute oversees much of the Education Department’s research. He wants to save such statistics for more thorough analyses in separate reports, rather than squeeze them into NAEP reports meant to provide mostly descriptive information.
