
What 150 Years of Education Statistics Say About Schools Today

By Sarah D. Sparks — November 16, 2017

Long before there was an independent federal education department—before many states had school systems, in fact—there was a federal education statistics agency.

Today, the National Center for Education Statistics celebrates its 150th anniversary (albeit without a permanent commissioner in place). Though the agency is far older than the Education Department that now houses it, its work has laid the groundwork for education policy in the United States in areas from large-scale testing to tracking students over time to using surveys and local administrative data to understand changes in schools.

“NCES, even if people aren’t aware of it, has played a huge role in shaping education research,” said Sean P. “Jack” Buckley, a former commissioner of NCES. “The idea of standardized assessments in longitudinal studies … really all grew out of NCES and IES [the Institute of Education Sciences], and it drives so much research now that probably more than half of researchers aren’t aware of where that came from.”

Early Years

In 1867, Congress passed a law creating the first education department, focused primarily on collecting education statistics, in the Department of the Interior. (The Department of Education did not exist as a full-scale, cabinet-level agency until 1979.) The statistics agency’s first goal was “collecting such statistics and facts as shall show the condition and progress of education in the several States and Territories.”

“You were looking at a growing nation and people were just trying to get a handle on the scope of education in the country,” said Thomas Snyder, the current program director for annual reports and information at NCES.

Yet some of the key questions from those early reports would sound very familiar to education watchers today. The early annual reports often drew comparisons with the education systems of France and Prussia, though one noted with pride that the United States spent more per pupil than any other country.

Equity was also an issue from the start. In its 1870 report, for example, the agency detailed the limited schooling available for newly freed black students—the “achievement gap” at that time meant that 80 percent of black adults and 20 percent of white adults couldn’t read or write their own names.

This print, published by the American Chromo Co. in 1872, shows a school classroom in which a child at right center is being admonished by both the teacher, seated on a platform at the center of the background, and a woman, possibly the child's mother, seated on a bench in the left foreground. The boy does not seem to care; it is possibly his lack of initiative that has both teacher and parent concerned.

Data was hand-collected and varied wildly. For example, New Mexico did not provide any data in the first collection, because citizens voted against establishing any public schools that year. In Greeley, Colo., officials of a newly established school district complained that they were struggling to build a common curriculum “... with as great a variety of textbooks as there were number of pupils.”

“The quality of the data was very dependent on what the states were able to put together,” Snyder said of the agency’s early work. “A lot of states just weren’t able to provide data because they didn’t have the resources. In a way, our use of administrative data has come full circle today.”

Throughout the first part of the 20th century, the agency continued to expand in scope—though its staff never rose much above 120 people—as policymakers’ interest in education grew. After World War I, the agency began to study more vocational and career training, and after the G.I. Bill passed and returning war veterans began heading to colleges and universities, it added significantly more studies of postsecondary access.

The agency began its first major longitudinal studies in the late 1970s and ’80s.

Emerson Elliott, the first NCES commissioner to hold that title and receive a presidential appointment, took charge of the agency in 1985, just as NCES faced a blistering evaluation by the National Academy of Sciences. The academy criticized NCES for slow turnaround and for lacking established standards for its statistical practices.

“It was a wonderful report from the perspective that it said, in very authoritative terms, what we had been saying for years,” Elliott said. The academy report argued that if NCES could not be turned around, it should be “folded up, and the responsibility sent to the [National Science Foundation],” recalled Elliott, now the director of special projects at the Council for the Accreditation of Educator Preparation.

Instead, the agency developed explicit standards for data quality and privacy, as well as guidelines for planning studies to answer educators’ and policymakers’ questions, which remain largely in use today. “By the time I left, there was a consistent feeling that the center was a respectable member of the federal statistical community,” he said. “You could trust the data.”

Building NAEP

Elliott also expanded the National Assessment of Educational Progress from a single long-term trend study to state-level studies of math and reading, and national and internationally benchmarked studies of academic subjects, from civics and social studies to technology and engineering.

“I do remember that perceptions were very different” in NAEP’s early years, Elliott said. He recalled standing next to Washington Gov. Booth Gardner, then president of the National Governors Association, just before the announcement of NAEP results in 1990. “He said, ‘Ah, this won’t make any difference because all the states [results] will be the same.’ He couldn’t possibly have been more wrong.”

Peggy Carr, the current acting NCES commissioner and an NCES staff member since the early 1980s, said the agency learned a lot as NAEP expanded. For example, anomalies in NAEP reading scores once led to a massive investigation and review panels with other researchers. In the end, officials learned that brown ink in one of the test booklets was being read incorrectly by the automated scoring machines, skewing the results. “The color of ink makes a difference. ... Things like that matter,” Carr said. “We have learned from those nuance errors that we have to be methodical.”

Today, the agency is working to move NAEP from pencil-and-paper to computer-adaptive testing, and to shift from rapidly shrinking surveys toward studies that integrate more of the demographic, program, and other data that schools collect for general reporting and accountability purposes, commonly called administrative data. But those changes, too, raise new issues for the agency.

Mark Schneider, a vice president and an institute fellow at the American Institutes for Research and the NCES commissioner from 2005 to 2008, recalled that in the mid-2000s, a CD of data was lost when the delivery truck shipping it got into a wreck. The CD contained potentially personally identifiable information for more than 17,000 students; NCES had a backup of the data but had to contact every family to let them know what had happened. While merging electronic data can protect against physical losses like that one, Schneider said he worries that large merged data sets could become bigger targets for hackers. It will be a constant balance, he said, between ensuring that policymakers and researchers have the information they need and protecting the privacy of the students who provide the data.

“The key thing is the administrative data is unstructured; you have data collected for X and you need to repurpose it for Y,” Schneider said.

Schneider believes NCES will be grappling with how to best use administrative data for years to come. “We had 100 years of experience in making surveys better and better—there was a whole science around question ordering, question writing … and what do we have on administrative data? Ten years, 20, maybe? It’s a new world.”

A version of this article appeared in the November 29, 2017 edition of Education Week as Happy Birthday, NCES! Agency Turns 150
