Policymakers Weigh Gathering More Data for NAEP

March 13, 2012

As many experts raise questions about the future of “the nation’s report card,” the governing board for the assessment program is exploring changes aimed at leveraging the achievement data to better inform education policy and practice.

The core idea, outlined in a report to the board, is to expand and make far greater use of the background information collected when the National Assessment of Educational Progress is given. In doing so, the report suggests, NAEP could identify factors that may differentiate high-performing states and urban districts from low performers.

The effort, it says, would parallel the extensive reporting of background variables in global assessment systems, such as the Programme for International Student Assessment, or PISA.

The report was released just weeks after the Obama administration proposed a fiscal 2013 budget that would cut the NAEP budget by $6 million, while funding a pilot program of state participation in PISA.

“Currently, the NAEP background questions are a potentially important but largely underused national resource,” says the report by a six-member expert panel commissioned by the National Assessment Governing Board, or NAGB, which sets policy for the testing program. “These data could provide rich insights into a wide range of important issues about the nature and quality of American primary and secondary education and the context for understanding achievement and its improvement.”

In addition, the report says NAEP background questions could help track policy trends, such as implementation of the Common Core State Standards or new teacher-evaluation systems.

The report, presented this month to NAGB at a meeting in New Orleans, was apparently well-received by many board members, including the chairman, former Massachusetts Commissioner of Education David P. Driscoll. But some of the ideas are generating pushback from current and former federal officials.

“NAGB has a tool that they want to use for everything,” said Mark S. Schneider, a former commissioner of the National Center for Education Statistics, the arm of the U.S. Department of Education that administers the test. He argues that NAEP should stick to its core strengths, namely measuring student achievement and serving as a benchmark for state assessments.

“I find this just a distraction,” Mr. Schneider said of the proposed plan.

Causation vs. Correlation

Although the report emphasizes the importance of not letting correlations between math achievement and rates of absenteeism, for instance, be confused for causation, Mr. Schneider argues that such distinctions would be lost on the public and risk damaging NAEP’s reputation.

“They will make statements that will inevitably push the boundaries, and you will end up with questionable reports, in my opinion,” said Mr. Schneider, who is now a vice president of the Washington-based American Institutes for Research.

Critics have also raised concerns about the cost of the proposals, especially in light of the president’s proposed cut to NAEP’s budget, and about possible resistance, on privacy grounds, to the federal government’s collecting and reporting more information on students.

The report notes that complementing the NAEP tests is a “rich collection” of background questions regularly asked of students, teachers, and schools. But the collection and public reporting of such information have been significantly scaled back over the past decade, it says.

“NAEP should restore and improve upon its earlier practice of making much greater use of background data,” the report says, “but do so in a more sound and research-supported way.”

It offers recommendations in four areas related to the background questions: asking “important questions,” improving the accuracy of measures, strengthening sampling efficiency, and reinstituting what it calls “meaningful analysis and reporting.”

It’s the fourth area, analysis and reporting, that is proving especially controversial.

Marshall S. “Mike” Smith, a co-author of the report and a former U.S. undersecretary of education in the Clinton administration, notes that the report comes at a time when NAEP’s long-term relevance is at issue. He cites the work to develop common assessments across states in English/language arts and mathematics, as well as the growing prominence of international exams, like PISA.

“The future of NAEP is somewhat in doubt,” Mr. Smith said.

PISA’s use of extensive background questions, he said, has enabled it to have wide influence.

“They’ve built narratives around the assessments: Why are there differences among countries” in achievement, he said. “We can’t do that with NAEP. We’re not able to construct plausible scenarios or narratives about why there are different achievement levels among states. And we’ve seen that can be a powerful mechanism for motivating reform.”

Mr. Driscoll, the chairman of NAGB, said the next step is for board staff members to draft recommendations on how the proposed changes could be implemented.

“I have challenged the board to think about how NAEP and NAGB can make a difference and have an impact,” he said. “There is some very valuable information that we can lay out ... that would be instructive for all of us.”

The report makes clear that NAEP should not be used to assert causes for variation in student achievement, but that a series of “descriptive findings” could be illustrative and help “generate hypotheses” for further study. For example, it might highlight differences in access to 8th grade algebra courses or to a teacher who majored in math.

“A valid concern over causal interpretations has led to a serious and unjustified overreaction,” the report says.

But some observers see reason for concern.

“It’s a mistake to present results that are purely descriptive,” said Grover J. “Russ” Whitehurst, a senior fellow at the Brookings Institution in Washington who was the director of the federal Institute of Education Sciences under President George W. Bush. “It is misleading, and it doesn’t make any difference if you have a footnote saying these results should not be considered causally.”

Jack Buckley, the current NCES commissioner, expressed reservations about some of the suggestions, especially in the analysis and reporting of the background data.

“The panel is looking toward PISA as an exemplar,” he said. “Folks at [the Organization for Economic Cooperation and Development, which administers PISA] write these papers and get a broad audience, but it’s not always clear that the data can support the conclusions they reach about what works.”

Mr. Buckley said he understands NAGB’s desire to be “policy-relevant,” but he cautioned that “we have to carefully determine what is the best data source for measuring different things.”

Mr. Driscoll said he is keenly aware of the risk of going too far with how the background data are used.

“I agree ... that we have to be careful about the causal effects,” he said. “I think we’ve gone too far in one direction to de-emphasize the background questions, and the danger is to go too far in the other direction.”

A version of this article appeared in the March 14, 2012 edition of Education Week as NAEP Board Considering Gathering Additional Data
