Opinion Blog

Rick Hess Straight Up

Education policy maven Rick Hess of the American Enterprise Institute think tank offers straight talk on matters of policy, politics, research, and reform.

Assessment Opinion

The Future, Present, and Past of ‘the Nation’s Report Card’

By Rick Hess — May 26, 2022 7 min read

The National Assessment of Educational Progress, otherwise known as “the nation’s report card,” is the federal test taken every couple of years by a sample of students in 4th, 8th, and 12th grades. NAEP has come to serve as the authoritative gauge of school performance in the U.S. How did NAEP come to play that role? And what’s ahead for the nation’s most important test? Chester E. “Checker” Finn Jr., who 30-odd years ago served as the first chair of the National Assessment Governing Board charged with overseeing NAEP, has been a close chronicler of NAEP over time. Now, Finn, the president emeritus of the Fordham Foundation, has published Assessing the Nation’s Report Card: Challenges and Choices for NAEP, in which he explores the history and future of NAEP. I recently spoke to Checker about his takeaways and the challenges ahead for the assessment. Here’s what he had to say.


Rick: So, Checker, what moved you to write the book?

Checker: My career has intersected with NAEP in various ways over half a century. Most consequential were seven years starting in 1985 when I had a considerable amount to do with shifting NAEP onto its modern trajectory. I served as (U.S. Secretary of Education) Bill Bennett’s assistant secretary for research, with oversight of the National Center for Education Statistics and NAEP, and then as first chair of the new National Assessment Governing Board. I helped recruit my friend, outgoing Tennessee Gov. Lamar Alexander, to chair the “Alexander-James” study group that wrote the influential turnaround report, and Emerson Elliott and Mike Kirst and I helped draft the report. I negotiated with Terry Hartle of Sen. (Edward) Kennedy’s staff as most of the panel’s recommendations became law. I helped Bill Bennett select the first members of NAGB. And then as chair, I pushed to launch state-level reporting and to develop and defend the board’s trio of “achievement levels”—“basic,” “proficient,” and “advanced”—which to this day are the closest thing the U.S. has to national education standards. Since then, I’ve retained some interest and involvement with NAEP as cheerleader, adviser, and critic. So when Lesley Muldoon and Laura Logerfo of the NAGB staff pointed out that nobody in recent decades had written a “biography” of NAEP, undertaking such a thing myself seemed like a natural fit. And it turned out to be, at least in part, a labor of love.

Rick: How did NAEP actually get started, anyway?

Checker: Back in Lyndon Johnson’s time, Commissioner of Education Frank Keppel realized that he, and the country, had lots of conventional quantity-and-inputs data on K-12 schooling in America but absolutely no information as to whether anyone was learning anything. He asked the eminent psychologist Ralph Tyler if anything could be done. Tyler came back with a set of ideas, and the Carnegie Corporation of New York helped him turn them into a plan. By 1970, the first NAEP tests were being given.

Rick: Looking back, what’s been NAEP’s biggest contribution over the past half-century?

Checker: It’s by far the country’s most important and respected barometer of educational achievement, and the lack thereof, in our K-12 schools, as well as of achievement gaps between groups of students. The No Child Left Behind Act (NCLB) also turned it into auditor and arbiter of state standards and achievement results, a sort of truth squad as to whether states are being honest with their citizens about their students’ proficiency, and lack thereof, and how their students compare with other states and the nation as a whole.

Rick: You’ve noted that NAEP tends to be invisible to those who aren’t Beltway education diehards. What are a couple of things that even non-wonks should know about NAEP?

Checker: Thanks to good advice from my editor, that’s what the first chapter is all about! NAEP is invisible because it doesn’t report on anything at the student, teacher, or school level—and only reports for a couple dozen big districts. It’s really about the country and the states. So it’s not immediately relevant to most people. But it’s hugely valuable to policy types and researchers. Not many people know that it periodically tests 10 different subjects; that it tests at three grade levels—essentially the end of elementary, the end of middle, and the end of high school; that its governance is remarkably independent of politics and, so far, of culture wars; and that, for the most part, its trend lines give us the best data we have on achievement changes over time.

Rick: What is it about the composition and administration of the tests that makes NAEP so credible, anyway?

Checker: The school and student samples are carefully drawn so the data are valid. The tests are carefully built and rigorously double-checked, pilot-tested, and evaluated. The assessment methodology—though now beginning to grow creaky—has long been “state of the art” in the world of large-scale education assessment. The analysis and reporting of results are meticulous—which is why they’re sometimes awfully slow. And NAGB’s extremely elaborate process of deciding what to test has a lot of stakeholder involvement and has been carefully done, albeit is potentially vulnerable to worsening conflict over culture and curriculum.

Rick: With polarization infecting so much in education, can NAGB keep NAEP from becoming one more front in education’s culture wars?

Checker: You’re touching on my biggest worry. NAEP’s credibility depends in huge part on the widespread understanding that what it’s testing represents a reasonable and nonpolitical consensus about what young Americans should be learning. That, along with achievement levels, is NAGB’s foremost responsibility. Yet the board nearly came unglued over its new reading framework and is beginning to struggle with a revised science framework. “Consensus” may prove to be a big challenge there. Think, for example, how NAEP should deal with the touchy issue of “climate change.” It may even encounter pushback on evolution! Then history and civics lie ahead …

Rick: While NAEP’s 4th and 8th grade tests are treated as authoritative, why is it that nobody has much confidence in the 12th grade results?

Checker: I get into this in the book. Turns out there’s no problem. People should trust the results. NCES and NAGB have done a lot of research and have shown that 12th graders participate enough to yield valid data and that they take the test seriously. Moreover, “NAEP proficient” at 12th grade is approximately the same as genuinely “college ready,” at least in reading, and nearly so in math. For me, the big 12th grade problem is that NAEP doesn’t deliver its 12th grade results at the state level, as it does for grades 4 and 8. You’d think the end of high school would be the data most in demand by state leaders.

Rick: In the book, you explain where NAEP’s notion of “proficient” came from. What does that tale mean for how we think about proficiency today?

Checker: “Proficient” is the middle achievement level set by NAGB. Establishing, refining, and defending these levels might be the board’s heaviest lift over the past three decades. It began with the 1989 Charlottesville summit and the setting of national education goals for the year 2000. One goal declared, “By the year 2000, American students will leave grades 4, 8, and 12 having demonstrated competency in challenging subject matter, including English, mathematics, science, history, and geography.” But how was anyone to know if we were making progress toward such a goal? NAGB said NAEP could do it, but only if benchmarks were added that defined “demonstrated competency” and “challenging subject matter,” and only if it could gauge the subjects spelled out by the governors and the president. My NAGB colleagues agreed that we could, and should, make NAEP take on that job. And the rest, I would say, is history.

Rick: What’s the status of NAEP on Capitol Hill?

Checker: NAEP is expensive, heading toward $200 million a year, and it’s grown too complicated and inefficient. There are lots of ways to make NAEP leaner—but at the same time, there are lots of things it could and should be doing that it cannot do because of budgetary constraints. Congress has seemed willing to keep funding it and even to increase its budget. What’s missing on Capitol Hill are members who understand NAEP, want it to work better, and are willing to champion it. Statistical programs just aren’t that sexy. This is one of many reasons that Sen. Alexander is much missed!

Rick: So, what’s next for NAEP?

Checker: NAEP is currently facing the task of modernizing this very complicated, large, and surprisingly delicate 50-year-old assessment instrument. I’m not sure NAEP’s “insiders” can pull this off alone. It may call for a full-fledged rethink from outside the government. Yet, where that big-picture view of NAEP’s future might come from and how to ensure its balance, credibility, and ultimately its consensus—all this may be too daunting to pull off in today’s political and cultural environment. Possibly, NAEP will prove a sleeping dog that shouldn’t be awakened. But it definitely needs to get rid of some fleas and worms. The further risk, however, at a time when testing is playing defense on all fronts and results-based school accountability is in trouble, is that the dog itself may be deemed an unnecessary companion, possibly even an embarrassing nuisance to keep around.

This interview has been edited and condensed for clarity.

The opinions expressed in Rick Hess Straight Up are strictly those of the author(s) and do not reflect the opinions or endorsement of Editorial Projects in Education, or any of its publications.
