Assessment Opinion

The Nation’s Report Card Could Be Education’s Data Gold Mine

Better support for educators, higher student achievement, and improved tests are among the potential outcomes
By Mark Schneider & John Whitmer — May 18, 2023 5 min read

ChatGPT feels like it’s everything, everywhere, all at once (repurposing a great movie title but inserting punctuation). How generative artificial intelligence—AI that creates new text or images (as you can see in ChatGPT, Bing, or DALL-E)—shakes out is unclear: Will we create an artificial superintelligence that displaces humans? Or will we harness its power to improve learning processes and outcomes?

Nobody can predict that future with certainty, but one thing we do know is that generative AI requires large quantities of high-quality, relevant data to be of any value. In the education sciences, we also know that such large-scale, high-quality data are neither everywhere nor all at once. However, the National Assessment of Educational Progress, often known as the Nation’s Report Card, provides carefully collected, valid, and reliable data with rich contextual information about learners while protecting student privacy. In short, NAEP can begin to fulfill the data needs of modern education research. And the National Assessment Governing Board—which sets policy for NAEP and meets this week—should prioritize the release of these data.

As is so often the case, the science is moving faster than the speed of government, but this is one area where we have everything we need to catch up. Given the potential these taxpayer-funded data have to improve support for educators and outcomes for students, there is a clear obligation to make the information available to researchers. As advocates for high-quality, high-impact research, we urge that step.

Since 1969, NAEP has measured student achievement in mathematics, reading, science, writing, arts, history, and civics. NAEP uses a mix of conventional forced-choice items; student essays; short, open-ended responses; and simulations. NAEP also collects “process data” about how students interact with items using the digital-based assessment platform. Further, NAEP collects detailed demographic and self-reported information, which includes the basics (for example, race/ethnicity, gender) and deeper information (for example, English-language-learner status, IEP status, disability accommodations). NAEP’s data mine holds hundreds of thousands of examples of student work coupled with detailed contextual information about students, their school, and their community. We need to use those data to improve AI algorithms that can in turn improve student outcomes.

Automated scoring is among the most widely researched and deployed uses of AI in education. But replicating human scoring is the floor, not the ceiling. Researchers could use NAEP data to explore complex constructs that have more far-reaching implications than scoring—such as categorizing math misconceptions, identifying ways to improve student writing, or understanding the key themes present in student writings about civic engagement.

With NAEP’s large samples and detailed contextual variables about the test-takers, their schools, and their families, we can also learn about the impact of many factors on student achievement.

Protecting student privacy is, of course, essential but also not a reason to delay the release of the data, as some argue. Many safeguards are already in place. Because NAEP reports results at the group level, protecting privacy is easier than it is for individual assessments: every result is a summary across many individuals. Further, NAEP’s long history and its procedures minimize risk. For example, the information that could identify a particular test-taker is removed before the data ever leave the school. There are known solutions for ensuring that individual student identities are not revealed when only a small number of students fall into a given subgroup. Open-ended responses are a bit trickier; NAEP doesn’t control what students put into these fields, and sometimes they write a bit off-topic, revealing personal data that need to be scrubbed (perhaps noting that “My uncle, Frank Johnson, who lives in Auburn, was once busted for DUI”).
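To make the subgroup safeguard concrete, here is a minimal sketch of small-cell suppression, a standard disclosure-avoidance technique in which estimates built from too few students are simply withheld. The column names and the threshold of 30 are illustrative assumptions, not NAEP’s actual reporting rules.

```python
# Minimal sketch of small-cell suppression (illustrative, not NAEP's rules):
# subgroup estimates based on fewer students than a threshold are withheld.
import pandas as pd

MIN_CELL_SIZE = 30  # hypothetical reporting threshold


def suppress_small_cells(results: pd.DataFrame) -> pd.DataFrame:
    """Mask average scores for subgroups below the minimum cell size."""
    masked = results.copy()
    too_small = masked["n_students"] < MIN_CELL_SIZE
    masked.loc[too_small, "avg_score"] = None  # withhold the estimate
    masked["suppressed"] = too_small
    return masked


if __name__ == "__main__":
    # Made-up subgroup summaries, for demonstration only.
    demo = pd.DataFrame(
        {
            "subgroup": ["State A / ELL", "State A / non-ELL", "State B / ELL"],
            "n_students": [12, 480, 95],
            "avg_score": [251.3, 263.8, 249.1],
        }
    )
    print(suppress_small_cells(demo))
```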

The Institute of Education Sciences, where we work, is scrupulously addressing privacy concerns in NAEP data. Our recently announced competition (with $100,000 in prizes) asks researchers to solve the difficult problem of using AI to replicate human-assigned scores for open-ended math items. Before NAEP math-assessment data were released to participants, the information was scrubbed for personally identifiable information and sensitive language using automated and human-based reviews. The reviews ensured that neither student identities nor other types of sensitive information such as a social media handle were disclosed. The dataset is being further processed through our internal controls to ensure it is sufficiently safe to release.
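For readers unfamiliar with the task, the sketch below shows the shape of the problem the competition poses: train a model on responses paired with human-assigned scores, then measure how closely its predictions agree with the human raters. The tiny made-up dataset and the simple TF-IDF baseline are purely illustrative; they are not the competition’s data or any approach endorsed by NAEP.

```python
# Illustrative baseline for automated scoring of open-ended responses:
# fit a simple text classifier to human-assigned scores and report
# agreement with the human raters. Data here are invented.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import cohen_kappa_score
from sklearn.pipeline import make_pipeline

# Hypothetical open-ended math responses with human-assigned scores (0-2).
responses = [
    "I divided both sides by 4 to get x = 3",
    "the answer is 12 because I guessed",
    "4x = 12 so x = 3, checked by substituting back",
    "I don't know",
    "x equals 3, since 12 divided by 4 is 3",
    "the answer is 7",
]
human_scores = [2, 0, 2, 0, 2, 0]

# TF-IDF features feeding a logistic-regression scorer.
scorer = make_pipeline(TfidfVectorizer(), LogisticRegression(max_iter=1000))
scorer.fit(responses, human_scores)

predicted = scorer.predict(responses)
# Agreement with human raters is often summarized with weighted kappa.
print(cohen_kappa_score(human_scores, predicted, weights="quadratic"))
```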

Decisions regarding data privacy should be weighed for the relative risk and reward. The value of tapping NAEP’s data gold mine is high, and, given its history and design, the risk to student privacy is low. In short, privacy concerns should not inhibit the release of NAEP data to qualified researchers.

Research using NAEP data could improve NAEP itself but, more importantly, answer questions about how students learn. For NAEP as an assessment, modern research methods could be used to help review and revise the questions, identifying items that specific groups of students find difficult due to wording or issues not related to the underlying construct. This would move beyond standard psychometric analyses through the incorporation of rich contextual data.

NAEP data could have much broader applicability, especially in the context of large language models, the underlying approach behind generative AI. Most existing large language models are based on data scraped from all over the web. While OpenAI, the company that created ChatGPT, does not disclose the specific data sources used for model training, ChatGPT is reportedly trained on web texts, books, news articles, social media posts, code snippets, and more. There are more than a few examples of ChatGPT providing questionable or toxic responses depending on the prompt it is given. An equally serious (and related) problem is that large language models do not have access to enough student academic work, leaving them severely anemic just where we need them most. NAEP data could help with fine-tuning these models, making them more accurate and more useful.
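As a rough illustration of what such fine-tuning involves, the sketch below adapts a small open-source language model (distilgpt2) to a file of de-identified student responses using the Hugging Face libraries. The file name, model choice, and training settings are assumptions for the sake of example, not an actual NAEP pipeline.

```python
# A minimal sketch of fine-tuning a small open language model on
# de-identified student writing. Everything file- and setting-related
# here is hypothetical.
from datasets import load_dataset
from transformers import (
    AutoModelForCausalLM,
    AutoTokenizer,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)

tokenizer = AutoTokenizer.from_pretrained("distilgpt2")
tokenizer.pad_token = tokenizer.eos_token
model = AutoModelForCausalLM.from_pretrained("distilgpt2")

# Hypothetical file: one scrubbed, de-identified student response per line.
dataset = load_dataset("text", data_files={"train": "student_responses.txt"})


def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=256)


tokenized = dataset["train"].map(tokenize, batched=True, remove_columns=["text"])

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="naep-tuned", num_train_epochs=1),
    train_dataset=tokenized,
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()
```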

We are only beginning to see how the future of education research will be transformed by generative AI—but one thing is crystal clear: NAEP data must be part of that future. Opening up NAEP’s gold mine of data is an easy call. Doing so will allow us to tap into the creativity of the research community to explore what insights we can derive from NAEP data that will be useful to education stakeholders.

NAEP is approaching a $200-million-a-year operation. While it produces invaluable insights into student achievement, it has not yet delivered on its full promise.

A version of this article appeared in the May 31, 2023 edition of Education Week as The Nation’s Report Card Could Be Education’s Data Gold Mine
