The results of a 2002 federal survey of teachers’ and administrators’ attitudes toward their states’ accountability systems have yet to find their way onto the Web site where most data from “the nation’s report card” are available. Nor were those questions, which were part of the background questionnaire that routinely accompanies the National Assessment of Educational Progress, repeated this year.
“It’s troubling, not only because the accountability issue is a rather critical one these days,” said Bella Rosenberg, the assistant to the president of the American Federation of Teachers, “but, even if it was a useless set of questions, this is public information. It was collected with taxpayer dollars. And there should not be any sort of a priori determination of whether it’s useful or not.”
Peggy Carr, the associate commissioner for assessment at the National Center for Education Statistics, which manages NAEP, said no decision was made to withhold information from the public. “There really was not any malice,” she said.
As the NCES prepares to release NAEP results, the reports and any accompanying data are made available to the public through the NAEP data tool, an interactive database on the Web, she said. Since the NCES has no plans to do a special report on the accountability questions, the quality and integrity of the data haven’t been evaluated, she said, and the center has not asked its contractor, the Princeton, N.J.-based Educational Testing Service, for the results.
“We haven’t seen the findings,” Ms. Carr said. “I can’t emphasize more that we haven’t seen the data yet, so we don’t know what it looks like.”
Usefulness Questioned
Ms. Carr said the NCES, an arm of the Department of Education, decided to pull the accountability questions and others for 2003 primarily because of new language in the No Child Left Behind Act. The federal law requires all background questions to be clearly related to student achievement and to be “secular, neutral, and nonideological.”
In retrospect, she added, the NCES thought the questions, which were added to the survey on an exploratory basis in 2002, might not be appropriate for the federal government to ask, and the states weren’t too happy with the questions either.
For example, teachers were asked the extent to which they agreed or disagreed that “the requirements of the state accountability system are clear” and that they have the resources to meet the system’s requirements.
Educators also were asked whether they were under pressure to improve student performance on state tests and whether the accountability system had had a positive effect on their schools. A related series of questions focused on their states’ reading and mathematics standards. Those included the extent to which teachers used the standards to guide instruction and whether teachers had been provided with adequate instructional materials and training to use the standards.
Some members of the National Assessment Governing Board, which oversees NAEP, raised concerns last month about the questions as part of a broader review of the background questionnaires.
“I think you will find in states in which there are controversies about the accountability system, the responses from principals and teachers are entirely predictable,” said Thomas H. Fisher, the director of student-assessment services for the Florida education department.
But the governing board never made a decision about pulling the questions for 2003, since the NCES had already done so. The board also never decided whether to produce a special report on the 2002 findings or include them on the NAEP Web site. In fact, its release plan for the 2002 NAEP results called for the Web site to post “fully accompanying data” for the reading report.
Ms. Carr said it was up to the board to decide whether the findings would end up in a special report, an issue she’ll ask NAGB members to consider.
“As long as there’s nothing wrong with the data, they will probably find their way into the data tool,” she said, “regardless of whether there’s a report or not.”
John H. Stevens, the NAGB member who chaired the ad hoc committee on background questions, said he doesn’t think the board has a strong view on publishing the results. “I think their value is somewhat limited,” he said. “It was pretty early in the accountability game in a lot of states, so I really don’t know what that’s going to tell us.”