

Rick Hess Straight Up

Education policy maven Rick Hess of the American Enterprise Institute think tank offers straight talk on matters of policy, politics, research, and reform.


Students Are ‘Digital Natives,’ But Here’s Where They Struggle

A leading researcher argues ‘digital literacy’ can help
By Rick Hess — September 24, 2024 | 6 min read

Sam Wineburg is a co-founder of the Digital Inquiry Group (formerly the Stanford History Education Group), a national leader in providing free social studies curricula—including materials like Civic Online Reasoning, Reading Like a Historian, and Beyond the Bubble. The Margaret Jacks Professor of Education and professor of history, emeritus, at Stanford University, Wineburg has long focused on challenges involving civic education, curriculum, and technology. His most recent book is Verified: How to Think Straight, Get Duped Less, and Make Wise Decisions about What to Believe Online. Given the interest in misinformation, how we teach students to navigate social media, and the challenges of civic literacy, I thought it worth reaching out to Wineburg to get his take. Here’s what he had to say.
—Rick

Rick: You’ve become a leading authority on digital literacy and misinformation. Can you talk a bit about how you got into these issues?

Sam: Fortuitously. Back in 2015, I got an email from a program officer at Chicago’s McCormick Foundation. This person had seen our innovative history assessments, in which students analyze primary sources from the collection of the Library of Congress, and wanted to know if we could create an instrument that directly measured students’ ability to assess online sources. We accepted the challenge. The next year, Trump was elected, and “fake news” became part of the public discourse. During this time, the conventional wisdom, preached by Marc Prensky and others, was that adults were the digital knuckleheads but that young people—also known as “digital natives”—had game. But we weren’t so sure, so we set out to measure students’ abilities to sift fact from fiction, in many cases by having them analyze actual material from the web. After combing through nearly 8,000 responses from students in middle school through college, we found them to be just as confused as the rest of us. A Wall Street Journal reporter featured our study, which led to appearances on NPR, BBC, ABC, and countless other outlets. From that point on, there was no turning back.

Rick: Can you tell me more about that study? When you say you found the students were “just as confused as the rest of us,” what did you see?

Sam: One of the findings that the Wall Street Journal highlighted was that 82 percent of middle school students couldn’t tell the difference between an ad and a news story. What the Journal didn’t say was that in a study conducted by Edelman-Berland, a global communications firm, 59 percent of adults couldn’t tell the difference, either. Findings like these made us realize that we were all in the same boat—and that boat was rapidly taking on water.

Rick: Is there an appetite for schools taking this on?

Sam: There’s increased attention at the legislative level to issues of information literacy. States like Illinois, California, and New Jersey have passed curriculum mandates, and there’s legislative action in something like 15 other states. What’s heartening is that this concern spans the red state/blue state divide. Teaching students to be wise consumers of digital information can’t be a partisan issue. Without the ability to tell the difference between information backed by solid evidence and a sham, democracy doesn’t stand a chance.

Rick: I love the goal. But, as you know, we live in a time of sometimes intense disagreement about what’s fact and what’s “misinformation.” I mean, we’ve seen credible authorities vehemently denounce some statements as falsehoods, on topics like the origins of COVID or Hunter Biden’s laptop—only to later learn the statements were actually true. How do you navigate those tensions?

Sam: Listen, there are topics where authorities rushed to pronounce judgment—case in point, the COVID lab-leak hypothesis. To broach the idea in 2020 branded you a racist; today, the origin of the virus is an open question. But to generalize from this instance—to go from “authorities sometimes err” to “you can’t trust them at all”—leads to a crippling nihilism. Let’s stick with medical issues for a second: The rage on TikTok is a procedure called “mewing,” the idea that by doing repetitive jaw exercises, you can change your jawline and achieve a sleeker profile. There are hundreds of videos with millions of views attesting to the procedure, including endorsements from supermodels. But if you know how to separate signal from noise on the internet, you quickly learn that there are no medical studies that attest to the efficacy of the procedure and that the dentist who promoted it had his dental license stripped. You won’t die from mewing, but there’s a lot of scary medical advice floating around that can lead to serious illness or even death. When in doubt, it’s wise to go with authorities like the Mayo Clinic over sketchier places such as the [fictional] Dave and Tom’s Homeopathic Supplements.

Rick: How has the emergence of AI affected your work?

Sam: AI magnifies the challenge. We have a wondrous tool that’s been programmed to offer persuasive responses—accurate or not. In too many cases, the responses of large language models—LLMs—are the linguistic equivalents of a green smoothie—a phrase from a Facebook post combined with text drawn from a RAND report, abutting content from Wikipedia, and a sprinkling of text from The Onion. In fact, the now-famous “Elmer’s glue keeps cheese on pizza” LLM response originally came from a satirical Reddit post. AI weakens the most important bond we need to consider when evaluating information: the nexus between claim and evidence. In the words of cognitive scientist Gary Marcus, generative AI is “frequently wrong, but never in doubt.” Rather than rendering traditional search skills obsolete, AI has made the ability to verify information even more imperative. Letting kids loose on AI without establishing that they have search skills in place is like framing a house without first pouring a foundation.

Rick: Your book Verified, published last year, is a resource for helping to sort fact from fiction on the internet. What are a few key takeaways?

Sam: We think of our book as the driver’s manual for the internet that none of us ever received. It helps readers determine what’s true and what’s not. In the days of print, newspapers gave us tactile clues to decipher information: news on the front page, editorial content on the interior, advertisements set off in boxes, etc. The internet erases these clues. When a post appears in our feed, do we really know what it is? Imagine, for example, when searching for nutrition information, we land on the site of the “International Life Sciences Institute.” At first glance, this looks like a credible scientific organization. That sense increases as we spend more time on the site, examining the group’s refereed publications and perusing the impressive bios of its scientific advisers. Only when we leave the site and read laterally—i.e., using the internet to check the internet, as we explain in Verified—do we learn that the group receives the bulk of its funding from the food, chemical, and agribusiness industries. This is how public policy is transacted on the internet. Front groups, lobbyists, and partisan organizations portray themselves as “nonpartisan” or “grassroots” or “citizen-led.” In many cases, these sites are the handiwork of public relations firms that specialize in creating digital masquerades. With a few right moves, however, you can often detect these ruses in as little as 30 seconds, which we show how to do in Verified.

The opinions expressed in Rick Hess Straight Up are strictly those of the author(s) and do not reflect the opinions or endorsement of Editorial Projects in Education, or any of its publications.
