Assessment

NAEP Board Sets Rules For Background Questions

By Debra Viadero — September 03, 2003

The board that sets policy for the National Assessment of Educational Progress has approved new guidelines aimed at reducing and better focusing the nonacademic questions that students and educators are asked to answer as part of the federally sponsored exams.

Besides testing what students know in specific subjects, such as reading and mathematics, the NAEP tests have long included some “noncognitive” or background questions on everything from the amount of time students spend on homework to the kinds of reading materials they use in class.

Members of the National Assessment Governing Board, the independent body that oversees the testing program, were concerned that the additional questions made the exams too cumbersome and yielded results that could be misinterpreted.

Grover J. “Russ” Whitehurst, the director of the Department of Education’s Institute of Education Sciences, shared some of those concerns. He opted to drop analyses of the data yielded by the background questions from recent printed copies of NAEP reports. (“‘Report Card’ Lacking Usual Background Data,” July 9, 2003.)

By law, the exams are required to gather information on race, ethnicity, gender, socioeconomic status, disability, and English-language proficiency for the students participating in the assessment. Those questions will continue to be part of future exams, according to the final guidelines approved by the governing board at its Aug. 1 meeting here.

The new guidelines allow for a trimmer set of questions on socioeconomic status, however, with some appearing in every assessment and some popping up periodically or accompanying only limited samples of tests.

Likewise, the exams could also include questions on “contextual variables,” such as student mobility, school safety issues, or discipline, but only if those topics have been shown through other research to have an impact on academic achievement.

The same holds true for subject-specific background questions. Queries in that category might probe relevant course content, teacher preparation, or other factors related to student achievement.

Experts’ Advice

In all, the framework approved last month says, background questions should not take up more than 10 minutes of students’ time, 20 minutes for teachers, and 30 minutes for school administrators. To keep within those limits, the guidelines encourage the Education Department to try to get some of the same information from other sources, such as school transcripts and other federal surveys.

In addition, since 4th grade test-takers can’t be counted on to provide reliable information on their parents’ income levels, the framework encourages federal test designers to try developing a wider index of proxy variables that might give a more accurate reading of a family’s socioeconomic status than the ones that are currently used.

“We think if we do a good job of reforming the NAEP, it will continue to be an important source of data for the research community,” said John H. Stevens, the executive director of the Texas Business and Education Coalition and the NAGB member who spearheaded the revision of the background questions.

While generally supportive of the board’s new direction on background questions, some researchers and national education groups expressed disappointment that the board had dropped earlier plans to set up an advisory board to help select appropriate questions.

“I don’t know where the capacity is to do the good work that the board wants to do,” Gerald E. Sroufe, the director of government relations for the Washington-based American Educational Research Association, told a board committee last month. “You need people really steeped in theory and research to suggest where the field is.”

And, while the guidelines say that analyses of the background data should be included in federal NAEP reports, they don’t require it.

When the data are presented, the framework says, the Education Department should refrain from suggesting any cause-and-effect relationships between the background factors and any variations in student achievement. Drawn from cross-sectional samples of students, the best such findings can do is suggest possible links for others to probe further, the guidelines note.
