Cross-posted from Catherine Gewertz at Curriculum Matters.
In all the high-profile attention to the periodic results of the National Assessment of Educational Progress, often forgotten is the wealth of data that is collected from students as part of those tests. The National Assessment Governing Board, which sets policy for NAEP, is taking early steps to revamp that bank of “background questions,” but got a big blast of criticism Friday from an influential source: the U.S. Department of Education’s chief statistician.
It unfolded during the quarterly meeting of the NAGB board, which Jack Buckley attends as chief of the National Center for Education Statistics.
At a committee meeting, two respected experts—Alan Ginsburg, the former director of policy and program evaluation services at the federal education department, and Marshall “Mike” Smith, a top advisor in the department during three administrations—laid out their proposal for how NAGB could streamline its current list of more than 1,400 bits of information that are collected about students’ backgrounds through questionnaires, and instead expand and regroup the questions to form composite indicators that bear on policy and practice.
Putting groups of variables together on given topics yields important information about the factors that affect student achievement, Smith argued. Ginsburg offered examples of what that might look like. Information collected from NAEP-tested students about their teachers’ expectations, combined with their reported race or ethnicity, for instance, shows that two-thirds of white students report that their teachers have high expectations for them, while only 47 percent of African American students report the same.
Smith and Ginsburg proposed a framework that would embrace student-level factors such as home educational climate, preschool and afterschool experiences, and student motivation, and school-level factors like teacher quality and professionalism, schools’ use of technology and school climate.
The presentation drew some warm remarks from committee members about its potential value in helping illuminate performance patterns that can shape policy to improve education. But during the question-and-answer period, Buckley rose from the spectator section to change the tone of the conversation.
“I find this whole line of reasoning baffling,” said Buckley, who recently announced his resignation to take a senior job with the College Board. The committee seems to be functioning, he said, “as if we live in a world where the only source of data is NAEP.”
Noting that the NCES produces staggering amounts of data every year, in such forms as “The Condition of Education,” which includes dozens of tables of NAEP data, and the “Digest of Education Statistics,” he urged the committee to recognize that “there’s an awful lot of information out there already... That’s why we have a federal agency for [education] statistics,” he said.
Smith said that the proposed new approach would produce information in different combinations than are currently available.
“We’re talking about concepts that aren’t typically captured in one question,” he said. “It is different, I think.”
The tension was broken by the committee chair announcing that there was no more time for discussion, since a working lunch session of the entire NAGB board was about to begin.
At that session, Smith and Ginsburg gave a modified version of their presentation to the NAGB board. It drew many interested questions, and supportive remarks.
“Wonderful work,” said NAGB member and assessment expert W. James Popham. “Don’t you dare stop.”
This wasn’t the first whack that NAGB has taken at the background questionnaires given to students who take the tests; they’ve been undergoing revision for a while. And today’s exchange wasn’t the first sign of tension between NCES and NAGB, either; the two entities have an uneasy alliance peppered with disagreements. The NAGB board members, appointed by the U.S. Secretary of Education, are officially independent of the department.
A version of this news article first appeared in the Inside School Research blog.