Guest post by Jaclyn Zubrzycki, originally posted at Curriculum Matters.
The release this month of scores from NAEP’s newest test, the Technology and Engineering Literacy Assessment, has revived a debate about plans to postpone the organization’s oldest test.
The four-decade-old Long-Term Trend Assessments, used to compare students’ scores in math and reading over time, will not be given again until 2024. The postponement has raised red flags for some researchers looking for a constant measure of student learning in an era of changing assessments.
The long-term trend tests have been given approximately every four years, with some variation, since the early 1970s. They are paper-and-pencil tests given to 9-, 13-, and 17-year-old students, and provide results at a national level, rather than the state-level information provided by the main NAEP test in reading and math.
The test was last administered in 2012; exams planned for 2016 and 2020 have both been nixed.
Bill Bushaw, the executive director of the National Assessment Governing Board, or NAGB, which sets policy for NAEP, said that the delay was the result of a “classic resource issue,” in which other tests were more in line with the governing board’s priorities at a time when the NAEP budget is tight. But he said that there has been debate among board members about the continued value of the long-term trend test when the main NAEP has been in place since the early 1990s.
The decision to delay the test from 2020 to 2024 was made at a meeting of the NAGB in November 2015.
At the time, NAGB chair Terry Mazany said in a press release, “I believe the board made the best decision it could to both reflect the budget realities that face NAEP, and maintain its standing as the nation’s gold standard in assessment.”
“Without a change in funding outlook, there could be more cuts to the assessment schedule, and upholding the breadth and depth of these valuable measures of educational progress would be compromised,” Mazany wrote.
But after the NAGB and NCES released results from the new Technology and Engineering Literacy exam, some observers renewed questions about why the trend assessment was sacrificed while the TEL and other new tests received funding.
“The long-term trend test was supposed to be the stable, granite-like test that we could always look at as a country and see, how are we doing compared to 40 years ago?” said Tom Loveless, a senior research fellow with the Brookings Institution.
Loveless said that the 12-year gap would leave a hole in observers’ understanding of how current education policies and reforms, including the implementation of the Common Core State Standards, are playing out.
“During a time when we’re engaged in this project called Common Core, during which we really do need alternative data and measures of how well kids are doing in reading and math, we’ve poured resources into these other assessments,” Loveless said. He said he worried that some of the so-called 21st-century skills assessed on the TEL, like communication and problem-solving, are more nebulous or “faddish” than the fairly concrete reading and math tests.
“I question the value of what it’s measuring, and I wonder if they’re measuring what they think they’re measuring,” Loveless said of the new test. (Here’s more on the structure of TEL.)
Bushaw said that the governing board made its decisions about which tests to administer on a limited budget based on a set of priorities:
- Transitioning to digitally based assessments for reading and math while maintaining trend and state validation studies;
- Assessing broad-based curricular areas with a priority for STEM;
- Providing state-level data in curriculum areas outside of reading and math; and
- Including more TUDAs, or Trial Urban District Assessments.
As far as tracking basic skills over time on the long-term test or trying new, complex assessments like TEL, “it’s a balancing act,” Bushaw said. He said that while the governing board is committed to having trend data, “NAEP has a history of innovation and of finding new ways of assessing students in various topics.”
NAEP did add more TUDA cities this year, but has not added state-level data in subjects like history, also because of budget restrictions.
“The governing board has always had a very ambitious agenda, but the NAEP budget has not kept pace,” said Jack Buckley, a senior vice president for research at the College Board and former NCES commissioner. Buckley said that while the data isn’t heavily used by researchers, “it’s the longest-running time series of its kind in education, so it is always of interest to policymakers.”
But as far back as 2001, governing board members have considered scrapping the test. Minor modifications have been made to the test’s language since then, and most of those have been aimed at removing references to outmoded technologies.
Bushaw said that debate over the test’s merit is still ongoing among the governing board’s members. He said that some board members argue that the main NAEP test, which also assesses reading and math, could replace the long-term trend test, while others argue that the long-term trend test fills a unique role.
Sarah Lubienski, a researcher at the University of Illinois and a former chairwoman of the American Educational Research Association’s study group for NAEP, said that the main NAEP provided a more valuable set of information for researchers in her field.
“I think it’s a good move to delay it,” she said. “It’s very basic skills-focused, and in terms of math education, we value the content in the main NAEP more.”
She said she could imagine questions from the long-term trend data eventually being incorporated in the main NAEP for the purpose of tracking longer-term trends.
Former NCES commissioner Buckley said in an email that the long delay didn’t seem promising for the future of the test. “Waiting that long for the next data point seems almost like they wanted to kill it entirely but didn’t have the heart,” Buckley said.
A version of this news article first appeared in the Inside School Research blog.