
What 150 Years of Education Statistics Say About Schools Today

By Sarah D. Sparks — November 16, 2017

Long before there was an independent federal education department—before many states had school systems, in fact—there was a federal education statistics agency.

Today, the National Center for Education Statistics celebrates its 150th anniversary (albeit without a permanent commissioner in place). Though the agency operates with statistical independence within the Education Department, its work has laid the bedrock for education policy in the United States in areas from large-scale testing, to tracking students over time, to using surveys and local administrative data to understand changes in schools.

“NCES, even if people aren’t aware of it, has played a huge role in shaping education research,” said Sean P. “Jack” Buckley, a former commissioner of NCES. “The idea of standardized assessments in longitudinal studies … really all grew out of NCES and IES [the Institute of Education Sciences], and it drives so much research now that probably more than half of researchers aren’t aware of where that came from.”

Early Years

In 1867, Congress passed a law creating the first education department, focused primarily on collecting education statistics, in the Department of the Interior. (The Department of Education did not exist as a full-scale, cabinet-level agency until 1979.) The statistics agency’s first goal was “collecting such statistics and facts as shall show the condition and progress of education in the several States and Territories.”

“You were looking at a growing nation and people were just trying to get a handle on the scope of education in the country,” said Thomas Snyder, the current program director for annual reports and information at NCES.

Yet some of the key questions from those early reports would sound very familiar to education watchers today. The early annual reports often drew contrasts to international education systems in France or Prussia—though one noted with pride that the United States spent more per pupil than any other country.

Equity was also an issue from the start. In its 1870 report, for example, the agency detailed the limited schooling available for newly freed black students—the “achievement gap” at that time meant that 80 percent of black adults and 20 percent of white adults couldn’t read or write their own names.

This print, published by the American Chromo Co. in 1872, shows a school classroom in which a child at right center is being admonished by both the teacher, seated on a platform at the center of the background, and a woman, possibly the child's mother, seated on a bench in the left foreground. The boy does not seem to care; it is possibly his lack of initiative that has both teacher and parent concerned.

Data was hand-collected and varied wildly. For example, New Mexico did not provide any data in the first collection, because citizens voted against establishing any public schools that year. In Greeley, Colo., officials of a newly established school district complained that they were struggling to build a common curriculum “... with as great a variety of textbooks as there were number of pupils.”

“The quality of the data was very dependent on what the states were able to put together,” Snyder said of the agency’s early work. “A lot of states just weren’t able to provide data because they didn’t have the resources. In a way, our use of administrative data has come full circle today.”

Throughout the first part of the 20th century, the agency continued to expand in scope—though its staff never rose much above 120 people—as policymakers’ interest in education grew. After World War I, the agency began to study more vocational and career training, and after the G.I. Bill passed and returning war veterans began heading to colleges and universities, it added significantly more studies of postsecondary access.

The agency began its first major longitudinal studies in the late 1970s and '80s.

Emerson Elliott, the first NCES commissioner to get that title and a presidential appointment, took charge of the agency in 1985, just as NCES faced a blistering evaluation by the National Academy of Sciences. The Academy criticized NCES for having slow turnaround and not having established standards for its statistical practices.

“It was a wonderful report from the perspective that it said, in very authoritative terms, what we had been saying for years,” Elliott said. The academy report argued that if NCES could not be turned around, it should be “folded up, and the responsibility sent to the [National Science Foundation],” recalled Elliott, now the director of special projects at the Council for the Accreditation of Educator Preparation.

Instead, the agency developed explicit standards for data quality and privacy, as well as guidelines, still in large part used today, for how to plan studies to answer educators’ and policymakers’ questions. “By the time I left, there was a consistent feeling that the center was a respectable member of the federal statistical community,” he said. “You could trust the data.”

Building NAEP

Elliott also expanded the National Assessment of Educational Progress from a single long-term trend study to state-level studies of math and reading, and national and internationally benchmarked studies of academic subjects, from civics and social studies to technology and engineering.

“I do remember that perceptions were very different” in NAEP’s early years, Elliott said. He recalled standing next to Washington Gov. Booth Gardner, then president of the National Governors Association, just before the announcement of NAEP results in 1990. “He said, ‘Ah, this won’t make any difference because all the states [results] will be the same.’ He couldn’t possibly have been more wrong.”

Peggy Carr, the current acting NCES commissioner and an NCES staff member since the early 1980s, said the agency learned a lot as NAEP expanded. For example, anomalies in NAEP reading scores led to a massive investigation and panels with other researchers. In the end, officials learned that brown ink in one of the test booklets was being read incorrectly by the automatic grading machines, skewing the results. “The color of ink makes a difference. ... Things like that matter,” Carr said. “We have learned from those nuance errors that we have to be methodical.”

Today, the agency is working to move NAEP from pencil-and-paper to computer-adaptive testing, and to shift from rapidly shrinking surveys to studies that integrate more of the demographic, programming, and other data that schools collect for general reporting and accountability purposes—commonly called administrative data—into research surveys. But those changes, too, raise new issues for the agency.

Mark Schneider, a vice president and an institute fellow at the American Institutes for Research and the NCES commissioner from 2005 to 2008, recalled that in the mid-2000s, a CD of data was being shipped on a delivery truck that got into a wreck. The CD was lost, with potentially personally identifiable information for more than 17,000 students; NCES had a backup of the data, but had to contact every family to let them know what had happened. While merging electronic data can protect against physical losses like that one, he said he worries that large merged data sets could become bigger targets for hackers. It will be a constant balance between ensuring that policymakers and researchers have the information they need and protecting the privacy of the students who provide the data, he said.

“The key thing is the administrative data is unstructured; you have data collected for X and you need to repurpose it for Y,” Schneider said.

Schneider believes NCES will be grappling with how to best use administrative data for years to come. “We had 100 years of experience in making surveys better and better—there was a whole science around question ordering, question writing … and what do we have on administrative data? Ten years, 20, maybe? It’s a new world.”

A version of this article appeared in the November 29, 2017 edition of Education Week as Happy Birthday, NCES! Agency Turns 150
