Assessment

Everybody Loves the Nation’s Report Card. But How Should It Evolve?

By Stephen Sawchuk — August 04, 2017 8 min read

Washington

Educational testing in the K-12 space is, let’s face it, at the top of nobody’s popularity list. But there is one exception: the National Assessment of Educational Progress.

The exam, better known as the Nation’s Report Card, is probably the shining example of educational measurement. It’s not “high stakes” for students or teachers. It’s difficult, if not impossible, to “game” or teach to the test, since no student takes a full test form. (Scores are aggregated from a sample of students.) The exams are typically updated regularly with an eye to high-quality items and innovations in curriculum.

But as the times change, the National Assessment Governing Board, the panel that sets NAEP policy, doesn’t want to rest on its laurels, and has been grappling with how its priorities should evolve. The last major policy change to NAEP, after all, was in 2002, when the No Child Left Behind Act mandated that all states participate in its reading and math tests every two years to serve as a check on their own assessment results.

Think about that: 2002 was long before the Common Core State Standards, before STEM was on everyone’s lips, before graduation rates hit 70 percent, and before “college and career ready” became the latest example of edu-jargon.

NAEP’s evolution has been a major point of discussion over NAGB’s last few quarterly board meetings. At the one held here Aug. 4, the panel tapped Lillian Lowery of the Education Trust; Michael Petrilli, of the right-leaning Thomas B. Fordham Institute; and Carmel Martin, a self-described policy wonk who served both in Congress and in the White House under President Obama, to offer feedback on some core themes. Board members discussed the topic both in full session and in small breakout groups.

It’s important to note, first of all, that there are as yet no scheduled board actions or votes on any of these ideas. NAGB is trying to get on top of key themes and work through all the implications.

The board does, however, want to set some priorities next spring so that it can better prepare to devise the upcoming schedule of assessments, which will probably be drafted sometime in the following fall. It will also, at some point, have to interface with Congress. (Skip to the bottom for the details on that.)

Now, let’s take a look at some of the core questions the board is considering.

Should Main NAEP Testing Be Every Two, Three, or Four Years?: By law, the main NAEP tests, in reading and math, must be administered every two years. But is that still necessary? What if they were given every three or four years, instead? That might save up some cash to put to other uses, such as developing more innovative items in the tests.

Petrilli, for one, told the board he thought it was time to consider less-frequent main NAEP exams. It made sense in an NCLB era, he said, but not so much now. “I think a check every three years, maybe every four years, would be enough ... It would free up a ton of financial resources and get us out of this crazy conversation where every two years we look to see whether there’s a little bump in scores, and everyone tries to draw conclusions about why,” he said.

Lowery and Martin, though, were less sanguine about this idea, because states’ testing plans and priorities are still evolving under NCLB’s successor, the Every Student Succeeds Act. It’s important to keep an external check in place for now, they said.

Lowery also pointed out that the two federally financed testing consortia created under the Obama administration were supposed to offer better, more comparable results across states. But as states have gone their own way on testing, those goals haven’t been realized. “As the number of states dwindled—and continues to dwindle—who are participating in the consortia, NAEP becomes even more important,” she said.

Keep, Omit, Deepen, or Integrate Content-Area Tests?: If the board can redirect some of its funds—perhaps by administering the main NAEP less often—what might it invest them in?

Martin suggested that the panel take a closer look at the content-area tests to make sure they reflect states’ current priorities, such as scaling back the number of tested topics but going into greater depth, and perhaps looking more closely at adaptive reasoning and problem solving. Most states have adopted so-called college- and career-ready standards that make those changes, she pointed out. “Maybe you do want to assess fewer things, so it’s more aligned with what schools are teaching,” she said. “If NAEP doesn’t make a similar adjustment, I worry NAEP won’t be as relevant as a check on what’s happening.”

Board members also talked about whether the board might consider integrating some of the subjects, given the push for more interdisciplinary learning. Does it make sense, for example, to combine NAEP’s science measurement with its technology- and engineering-literacy exam? How about combining history and civics? (In truth, changes like that would require a deliberative process of rewriting exam frameworks; you can’t just smoosh different tests together without considering their technical properties and length, for starters.)

Finally, panelists suggested administering certain subjects more often, or giving state-by-state results in subjects that currently are only reported out nationally. Petrilli, Martin, and several other panelists all put civics at the top of this list, as did some board members.

“Let’s face it: If you look at the history of American education, why schools were founded, it’s to have an educated citizenry. To me the civics assessment has a higher priority, because there is so much confusion—no matter where you stand on the political spectrum—about our democratic way of life,” said NAGB member Fr. Joseph O’Keefe.

Adding subjects and exams, of course, also potentially means cutting back others that are either less useful or whose results seem to have less impact on education policy.

Getting Serious About Testing 12th Graders?: The main NAEP data in reading and math for high school seniors exists primarily at the national level, which limits its usefulness. Only a handful of states participated in state-by-state testing at grade 12 before those assessments were cut for budget reasons, and getting students to take the exam seriously is a perennial challenge. Several board members suggested that the time is right to start assessing at this grade level at least every four years, and possibly to issue state-by-state data for 12th graders.

Count board member Dale Nowlin among the proponents: “That’s the end of the system,” he said. “It’s the only way we measure students who are coming out of it.”

It’s also, Petrilli suggested, a way of figuring out whether higher and higher graduation rates really signal something or not: “All the states have been trying to reach this mark and grad rates are going up, up, up, up, up—maybe because standards for graduation are going down, down, down, down, down—it would be an incredibly useful check.”

What to Do About NAEP’s Long-Term Trend Paradox?: All of the outside experts who testified urged the board to preserve the Long-Term Trend exam, which is given to students at ages 9, 13, and 17. After all, they noted, it is unique in the landscape of testing. There is no other exam that has tracked U.S. students’ performance over time for decades.

But the exam has gone virtually unchanged since the 1970s, and its age is showing, some board members said. For example, the math portion tends to measure low-level skills and prioritizes computation, rather than more sophisticated skills, such as analyzing data or modeling with mathematics.

Updating this exam raises complicated measurement questions. If you update the test too much, for example, you will not be able to preserve the trend line. But keeping it exactly the same also doesn’t seem logical. As board member Tonya Matthews pointed out, it’s a little bit like building something assuming the earth is flat, and continuing to pretend it’s flat even when you know better.

It’s not clear whether there’s middle ground here—a way to preserve the trend line while updating the test—but expect NAGB to have its measurement experts on speed dial to see if it’s possible.

The elephant (or camel) in the room: NAGB and the NAEP exams are funded by Congress, which sets some pretty strict parameters on the assessments. The board does have some maneuvering room, but some of the things it’s looking at would require congressional approval.

Congress, for example, would have to legislate a mandatory 12th grade state NAEP, or to permit the main reading and math tests to be given only every three or four years. Given the volatility of Congress and the budgeting process, some board members noted that proposing a bunch of ideas would inevitably invite congressional scrutiny—and potentially allow the proverbial camel’s nose under the tent.

But Bill Bushaw, NAGB’s executive director, pointed out that perhaps it’s better to make the case proactively than to wait for Congress to act.

“I have a sense that there’s a window here, given the bipartisanship with ESSA. If the governing board re-established its priorities, built an assessment schedule totally based on those priorities in the out years, and then we asked Peggy [Carr, an official at the U.S. Department of Education’s statistics wing] to cost it out, and we put it together with some rationale that’s not reactive, that’s a proactive approach,” he said in one of the breakout meetings. “I just think we have a better chance of designing something that could make more sense. But it’s a sequence of things we need to do, starting with what’s most important.”

And that’s where the difficulty lies, he added. “It’s hard to figure out what to do; we don’t have consensus.”


A version of this news article first appeared in the Curriculum Matters blog.