Assessment

Survey Data Missing From NAEP Site, At Least Temporarily

By Lynn Olson — September 10, 2003

The results of a 2002 federal survey of teachers’ and administrators’ attitudes toward their states’ accountability systems have yet to find their way onto the Web site where most data from “the nation’s report card” are available. Nor were those questions, which were part of the background questionnaire that routinely accompanies the National Assessment of Educational Progress, repeated this year.

“It’s troubling, not only because the accountability issue is a rather critical one these days,” said Bella Rosenberg, the assistant to the president of the American Federation of Teachers, “but, even if it was a useless set of questions, this is public information. It was collected with taxpayer dollars. And there should not be any sort of a priori determination of whether it’s useful or not.”

Peggy Carr, the associate commissioner for assessment at the National Center for Education Statistics, which manages NAEP, said no decision was made to withhold information from the public. “There really was not any malice,” she said.

As the NCES gets ready to report NAEP results, it makes the reports and any accompanying data available to the public through the NAEP data tool, an interactive database on the Web, she said. Since the NCES has no plans to do a special report on the accountability questions, the quality and integrity of the data haven’t been evaluated, she said, and the center has not asked its contractor, the Princeton, N.J.-based Educational Testing Service, for the results.

“We haven’t seen the findings,” Ms. Carr said. “I can’t emphasize more that we haven’t seen the data yet, so we don’t know what it looks like.”

Usefulness Questioned

Ms. Carr said the NCES, an arm of the Department of Education, decided to pull the accountability questions and others for 2003 primarily because of new language in the No Child Left Behind Act. The federal law requires all background questions to be clearly related to student achievement and to be “secular, neutral, and nonideological.”

In retrospect, she added, the NCES thought the questions, which were added to the survey on an exploratory basis in 2002, might not be appropriate for the federal government to ask, and the states weren’t too happy with the questions either.

For example, teachers were asked the extent to which they agreed or disagreed that “the requirements of the state accountability system are clear” and that they have the resources to meet the system’s requirements.

Educators also were asked whether they were under pressure to improve student performance on state tests and whether the accountability system had had a positive effect on their schools. A related series of questions focused on their states’ reading and mathematics standards. Those included the extent to which teachers used the standards to guide instruction and whether teachers had been provided with adequate instructional materials and training to use the standards.

Some members of the National Assessment Governing Board, which oversees NAEP, raised concerns last month about the questions as part of a broader review of the background questionnaires.

“I think you will find in states in which there are controversies about the accountability system, the responses from principals and teachers are entirely predictable,” said Thomas H. Fisher, the director of student-assessment services for the Florida education department.

But the governing board never made a decision about pulling the questions for 2003, since the NCES had already done so. The board also never decided whether to produce a special report on the 2002 findings or include them on the NAEP Web site. In fact, its release plan for the 2002 NAEP results called for the Web site to post “fully accompanying data” for the reading report.

Ms. Carr said it was up to the board to decide whether the findings would end up in a special report, an issue she’ll ask NAGB members to consider.

“As long as there’s nothing wrong with the data, they will probably find their way into the data tool,” she said, “regardless of whether there’s a report or not.”

John H. Stevens, the NAGB member who chaired the ad hoc committee on background questions, said he doesn’t think the board has a strong view on publishing the results. “I think their value is somewhat limited,” he argued. “It was pretty early in the accountability game in a lot of states, so I really don’t know what that’s going to tell us.”
