
‘Nation’s Report Card’ Has a New Reading Framework, After a Drawn-Out Battle Over Equity

By Sarah Schwartz — August 13, 2021 10 min read

The governing board that oversees the test known as the “nation’s report card” has adopted a new framework for designing the reading assessment, one that will provide more granular information about student performance by socioeconomic status and race and test students’ ability to read across disciplinary contexts.

But even after a unanimous vote to approve the new framework last week, some members of the panel tapped to develop the document have lamented what they see as missed opportunities for a fairer test—the aftereffect of a heated back-and-forth over equity in assessment that played out over the past several years of the development process.

The National Assessment of Educational Progress, or NAEP, is given to a nationally representative sample of U.S. students to measure what they know and can do across subjects.

The National Assessment Governing Board supervises the NAEP, and leads the process for updating the frameworks that guide how the test is constructed. The reading framework was last revised in 2009. The new changes will go into effect for the 2026 test administration.

A key consideration in updating the framework is maintaining NAEP’s long-term trend line, the ability to compare results from upcoming years to past scores, so as to draw conclusions about whether students are improving or not. (The National Center for Education Statistics, which conducts and analyzes NAEP tests, has said that the new adopted framework is likely to maintain trend.)

Understanding what the trends are is especially important now, said Lesley Muldoon, the NAGB’s executive director, to evaluate the effect that COVID-19 has had on student achievement “so that people can have a trusted baseline that they can use going forward.”

The framework development process has always included a diversity of perspectives, with varying factions working to hammer out their differences to develop a consensus document. But tensions ran especially high this time.

The debate raised questions central to the construct of reading itself: What does “real-world reading” actually look like? And how much of it is influenced by readers’ cultural backgrounds and the social contexts in which they learn?

At the same time, these conversations were taking place in the middle of a national conversation on race that has pushed educational organizations to consider how teaching, learning, and assessment can better support students of color.

Framework offers more data on students’ reading across disciplines

There are significant changes in the consensus document—changes that advocates on both sides of the framework debate said, in interviews with Education Week, would make NAEP a richer source of data on students’ reading ability.

The new framework calls for more detailed reporting on NAEP subgroups. Scores won’t just be disaggregated by race, ethnicity, and English-language learner status, but also differentiated by socioeconomic status within race and ethnicity. So, going forward, it would be possible to see the differences in scores between Black students from high-income families and Black students from low-income families, for example.

Students will also be tested on their ability to read informational text in social studies and science. This isn’t meant to evaluate students’ content knowledge—“this is not a test about whether they know the causes of the American Revolution,” Muldoon said—but rather whether students can use discipline-specific reading skills in genres they’ll encounter in the classroom and the real world.

And the framework adds a new “comprehension target,” or tested component of reading comprehension ability. Previously, the framework included three: 1) locate and recall information, 2) integrate and interpret information, and 3) analyze and evaluate information.

Now, students will also be expected to “use and apply” what they read, to solve problems or create something new. For example, after reading a series of opinion pieces on a subject, a student might be asked to write a blog post that synthesizes the different positions or offers their own argument.

“This is not just your mother’s and father’s ‘find the main idea,’” said David Steiner, a professor of education at Johns Hopkins University and the executive director of its Institute for Education Policy. (Steiner was not involved in the drafting of the framework, but has commented publicly on the process.)

Other updates to the framework formalize changes that have already been made to the NAEP, following its shift from paper to digital administration. These include incorporating more digitally native text—such as what might be read on websites—and virtual “characters” that simulate a classroom environment or group work.

One new feature added to this list: test-takers will also see examples of student responses to questions, to better illustrate what a strong response looks like.

‘What kind of reading do we want to draw inferences about?’

At last week’s board meeting, held both in person in McLean, Va., and streamed online, members praised the consensus process that resulted in the framework adoption.

Still, some members of the development panel felt that the final version diverged too far from the initial drafts—and that commitments made to equity were stripped at the 11th hour by a vocal minority of NAGB’s main board.

At the heart of this disagreement were two interconnected questions: How to define reading comprehension and what constitutes “real-world” reading.

Early versions of the framework, written by the NAGB-appointed development board, put forth a sociocultural model of reading comprehension. The model argues that reading is in part about what’s going on inside a student’s head—the cognitive processes—but that comprehension is greatly influenced by social and cultural contexts like home, school, and community.

These early drafts also broadened the use of “informational universal design elements,” or UDEs: text introductions, pop-ups, and videos that give students some background knowledge about the passages they are about to read. This change was suggested because research has shown that reading comprehension is greatly influenced by readers’ background knowledge on the topic. (Students will probably have an easier time reading Animal Farm, for example, if they have some understanding of the Russian Revolution.)

Gina Cervetti, an associate professor of literacy at the University of Michigan School of Education, and a member of the framework development panel, said that beefing up these knowledge scaffolds would have made NAEP a truer test of students’ reading comprehension ability. It would test their knowledge of text structures, or their skills in analyzing information, rather than their content knowledge, she said. It would level the playing field for students who come to the test with different stores of knowledge.

When this version of the framework was put out for public comment, though, it brought forth harsh criticism from some corners of the education world. “This came to be seen as an attempt to inflate the scores of traditionally underperforming students,” Cervetti said. “And nothing could be further from the truth.”

But Steiner, who criticized the draft framework when it was released for comment, said that providing all that supporting information would have created conditions on the NAEP that don’t exist in real-world reading. Take a word like yacht, he said. “You could argue, and this is argued in many state assessments, you can’t use a word like yacht, because less-affluent students have not grown up in a world of yachts.”

But “yacht,” Steiner said, is a word that regularly shows up in works that students might be expected to read as adults: news, magazines, novels. It’s part of a broad public vocabulary that students would be expected to know, and that teachers could reasonably be expected to make sure students know, he said.

Testing whether students are prepared for reading in college and career should include testing whether they can read and make sense of texts that include that word, he argued—and not testing this could mask indicators that students might have trouble with reading later on.

The draft framework was released for public comment last summer, and the development panel incorporated changes resulting from that feedback. But in May, when the revised framework was presented to the full board, some members thought the changes didn’t go far enough.

Grover (Russ) Whitehurst, a NAGB board member and former director of the Institute of Education Sciences, conducted his own, further revision of the document, striking most of the references to sociocultural frameworks and toning down the use of informational UDEs, to the alarm of many members of the original development panel.

“The goal ... is to handle background knowledge in ways that strengthen the validity of the assessment, rather than trying to define it out of existence as a factor in reading comprehension,” Whitehurst wrote at the time.

To hammer out these differences and create a consensus document, NAGB’s chair, Haley Barbour, assembled a smaller, cross-committee working group, which put forth the final framework as adopted.

Informational UDEs are still in the framework, but they play a much smaller role. This concerns Cervetti, who maintains that a more robust set of informational UDEs would make the NAEP more like “real” reading, not less.

“In the real world, outside of a standardized assessment, we rarely read completely unfamiliar texts in isolation,” she said. If a student reads a word they don’t know, they can look it up. “We all have phones, and computers, and people [around us], and dictionaries,” Cervetti said.

“What constitutes real reading is, I think, a real bone of contention. And it makes a huge difference,” said P. David Pearson, a professor emeritus at the University of California, Berkeley’s Graduate School of Education, and the chair of NAGB’s development panel. “But the question is, what kind of reading do we want to draw inferences about?”

Possible changes to framework development process on the horizon

Pearson said the final framework is “something to be celebrated,” but also that he would want to see more work done—in defining reading in more of a sociocultural context, which he said would bring NAEP in line with other national and international assessments, and in gathering more data about students’ school and community environments. And he questioned the framework development process, which requires that NAGB approve new frameworks through consensus.

“I think that’s a great tradition, but if things get controversial, and if there are ideological and theoretical differences, then I’m not convinced that consensus is the only way to make important decisions,” he said. “The other thing about consensus is that it’s another name for minority rule, just as the filibuster in the Senate is another name for blocking the majority.”

The majority of the framework development committee supported the version of the document put forth in earlier drafts, Pearson said, and changes were introduced by a small group of dissenters in the full NAGB board.

But Whitehurst, one of these dissenters, said that his position does not represent a minority view. He argued that many in the reading education community—researchers and school-level educators alike—would endorse a model of reading that puts more emphasis on cognitive processes than on sociocultural contexts. But, he said, this diversity of viewpoints wasn’t represented on the framework development panel.

“Those of us on the board who sort of had to take that position would not have had to if there were greater diversity in the views of those who developed the document,” he said.

After a drawn-out public battle over the reading framework, the framework development process itself is up for review this September by the NAGB board—in part, so the team can “have an easier time with framework development in the future,” said Sharyn Rosenberg, NAGB’s assistant director for assessment development, in the board meeting last week.

Ideally, Whitehurst said, the framework development process going forward would produce documents in which “the tensions are already worked out.”
