
Policymakers Weigh Gathering More Data for NAEP

March 13, 2012

As many experts raise questions about the future of “the nation’s report card,” the governing board for the assessment program is exploring changes aimed at leveraging the achievement data to better inform education policy and practice.

The core idea, outlined in a report to the board, is to expand and make far greater use of the background information collected when the National Assessment of Educational Progress is given. In doing so, the report suggests, NAEP could identify factors that may differentiate high-performing states and urban districts from low performers.

The effort, it says, would parallel the extensive reporting of background variables in global assessment systems, such as the Program for International Student Assessment, or PISA.

The report was released just weeks after the Obama administration proposed a fiscal 2013 budget that would cut the NAEP budget by $6 million, while funding a pilot program of state participation in PISA.

“Currently, the NAEP background questions are a potentially important but largely underused national resource,” says the report by a six-member expert panel commissioned by the National Assessment Governing Board, or NAGB, which sets policy for the testing program. “These data could provide rich insights into a wide range of important issues about the nature and quality of American primary and secondary education and the context for understanding achievement and its improvement.”

In addition, the report says NAEP background questions could help track policy trends, such as implementation of the Common Core State Standards or new teacher-evaluation systems.

The report, presented this month to NAGB at a meeting in New Orleans, was apparently well-received by many board members, including the chairman, former Massachusetts Commissioner of Education David P. Driscoll. But some of the ideas are generating pushback from current and former federal officials.

“NAGB has a tool that they want to use for everything,” said Mark S. Schneider, a former commissioner of the National Center for Education Statistics, the arm of the U.S. Department of Education that administers the test. He argues that NAEP should stick to its core strengths, namely measuring student achievement and serving as a benchmark for state assessments.

“I find this just a distraction,” Mr. Schneider said of the proposed plan.

Causation vs. Correlation

Although the report emphasizes that correlations, such as between math achievement and rates of absenteeism, must not be mistaken for causation, Mr. Schneider argues that such distinctions would be lost on the public and risk damaging NAEP’s reputation.

“They will make statements that will inevitably push the boundaries, and you will end up with questionable reports, in my opinion,” said Mr. Schneider, who is now a vice president of the Washington-based American Institutes for Research.

Other concerns raised about the proposals include the cost involved, especially given the president’s proposed cut to NAEP, and what some experts say may be resistance, on privacy grounds, to the federal government’s collecting and reporting more information on students.

The report notes that complementing the NAEP tests is a “rich collection” of background questions regularly asked of students, teachers, and schools. But the collection and public reporting of such information have been significantly scaled back over the past decade, the report says.

“NAEP should restore and improve upon its earlier practice of making much greater use of background data,” the report says, “but do so in a more sound and research-supported way.”

It offers recommendations in four areas related to the background questions: asking “important questions,” improving the accuracy of measures, strengthening sampling efficiency, and reinstituting what it calls “meaningful analysis and reporting.”

It’s the fourth area, analysis and reporting, that is proving especially controversial.

Marshall S. “Mike” Smith, a co-author of the report and a former U.S. undersecretary of education in the Clinton administration, notes that the report comes at a time when NAEP’s long-term relevance is at issue. He cites the work to develop common assessments across states in English/language arts and mathematics, as well as the growing prominence of international exams, like PISA.

“The future of NAEP is somewhat in doubt,” Mr. Smith said.

PISA’s use of extensive background questions, he said, has enabled it to have wide influence.

“They’ve built narratives around the assessments: Why are there differences among countries” in achievement, he said. “We can’t do that with NAEP. We’re not able to construct plausible scenarios or narratives about why there are different achievement levels among states. And we’ve seen that can be a powerful mechanism for motivating reform.”

Mr. Driscoll, the chairman of NAGB, said the next step is for board staff members to draft recommendations on how the proposed changes could be implemented.

“I have challenged the board to think about how NAEP and NAGB can make a difference and have an impact,” he said. “There is some very valuable information that we can lay out ... that would be instructive for all of us.”

The report makes clear that NAEP should not be used to assert causes for variation in student achievement, but that a series of “descriptive findings” could be illustrative and help “generate hypotheses” for further study. For example, it might highlight differences in access to 8th grade algebra courses or to a teacher who majored in math.

“A valid concern over causal interpretations has led to a serious and unjustified overreaction,” the report says.

But some observers see reason for concern.

“It’s a mistake to present results that are purely descriptive,” said Grover J. “Russ” Whitehurst, a senior fellow at the Brookings Institution in Washington who was the director of the federal Institute of Education Sciences under President George W. Bush. “It is misleading, and it doesn’t make any difference if you have a footnote saying these results should not be considered causally.”

Jack Buckley, the current NCES commissioner, expressed reservations about some of the suggestions, especially in the analysis and reporting of the background data.

“The panel is looking toward PISA as an exemplar,” he said. “Folks at [the Organization for Economic Cooperation and Development, which administers PISA] write these papers and get a broad audience, but it’s not always clear that the data can support the conclusions they reach about what works.”

Mr. Buckley said he understands NAGB’s desire to be “policy-relevant,” but he cautioned that “we have to carefully determine what is the best data source for measuring different things.”

Mr. Driscoll said he is keenly aware of the need not to go too far in how the background data are used.

“I agree ... that we have to be careful about the causal effects,” he said. “I think we’ve gone too far in one direction to de-emphasize the background questions, and the danger is to go too far in the other direction.”

A version of this article appeared in the March 14, 2012 edition of Education Week as NAEP Board Considering Gathering Additional Data
