Rick Hess Straight Up

Education policy maven Rick Hess of the American Enterprise Institute think tank offers straight talk on matters of policy, politics, research, and reform.

The Stanford Scholar Bent on Helping Digital Readers Spot Fake News

By Rick Hess — April 08, 2021

Sam Wineburg is the Margaret Jacks Professor of Education at Stanford University, where his research examines how people judge the credibility of digital content. His work has appeared in prominent publications, including The New York Times, The Wall Street Journal, The Washington Post, USA Today, and Smithsonian Magazine. The digital document-based history curriculum he helped create has been downloaded 10 million times. Of note: Wineburg is the only scholar in the top 50 of this year’s RHSU EduScholar rankings whose primary focus is online learning or education technology. Especially in light of that, I was curious to ask him about fake news, digital learning, and how teachers and parents should help children navigate online content.

—Rick

Rick: So Sam, can you talk a bit about just what it is that you study and how you got into this field?

Sam: I started off as a history teacher, got curious about how kids learn, and landed in a Ph.D. program in Psychological Studies in Education at Stanford under Lee Shulman, my doctoral adviser and unpaid life coach. For most of my career, I studied how kids learn history, especially how they make sense of conflicting historical texts. Since 2014, my research team has focused on how people tell what’s true on the internet.

Rick: As you know, in this year’s RHSU EduScholar rankings, you were the only scholar in the top 50 who studies anything related to digital learning. Why is that?

Sam: Education researchers are pack animals. You’d think that independently minded scholars would look around and ask: “What are the pressing issues of the day that no one’s studying?” Instead, they look at their neighbors and, without a great deal of thought, follow in their footsteps. Disinformation has eaten away at the fabric of democracy. Yet, in schools of education, you can’t find more than a handful of scholars studying how students become informed citizens using the devices that occupy eight hours of their waking day.

Rick: How do you actually go about studying this stuff?

Sam: People have used a variety of methods: interviews, focus groups, even multiple-choice tests. But these methods are imprecise proxies. We think that if you want to know what people do on the internet, you shouldn’t ask them what they would do. Put them in front of a computer and watch them do it.

Rick: You’ve done some pretty cool experiments—can you describe a few?

Sam: In 2019, the Hewlett Foundation supported us in conducting the largest study to date of how teenagers evaluate digital sources. We provided 3,000 high school students with a live internet connection and had them solve a series of tasks. One task asked them to evaluate a website that rejects the scientific consensus about climate change. When you Google the group behind it, you learn that they’re funded by Exxon—a clear conflict of interest. Yet, 92 percent of students never made the link. Why? Because their eyes remained glued to the original site. But my favorite study was when Sarah McGrew—now an assistant professor at the University of Maryland—and I flew to New York and Washington, D.C., in 2017 to watch fact checkers at the nation’s most prestigious news outlets evaluate unfamiliar websites. We then observed a group of very smart people, Ph.D.s from five universities along with a group of Stanford undergraduates, solving the same tasks. Fact checkers uniformly saw through common digital ruses and arrived at truth in seconds. Academics and Stanford students, critical thinkers all, often spun around in circles, confused by the internet’s wiles.

Rick: Tell me a bit more—why did the fact checkers do a better job than the “smart” group?

Sam: The intelligent people we’ve studied are invested in their intelligence. That investment often gets them in trouble. Because they’re smart, they think they can outsmart the web. They land on a website that looks professionally prepared, with scholarly references and a list of research reports, and conclude, “Looks OK.” Basically, they’re reading the web like a piece of static print—thinking that they can determine what something is by looking at it. Unless you have multiple Ph.D.s in a half-dozen fields—immunology, virology, economics, physics, political science, and history—you’re kidding yourself. On the internet, hubris is your Achilles heel. Fact checkers have a different approach. They understand that online information demands a different kind of reading, a process we call “lateral reading.” Rather than dwelling on an unfamiliar site, they take a quick peek, leave it, and then open up multiple tabs to search for information about the group or organization behind the original site. They return to the original site only if it checks out. In other words, they learn about a site by leaving it to consult the broader web.

Rick: How does “lateral reading” compare to how we usually teach students to identify credible sources?

Sam: Teaching kids to do lateral reading goes against what they learn in school about judging a text: Read it thoroughly and only then render judgment. Yet, on the web, where attention is scarce, expending precious minutes reading a text, before you know who produced it and why, is a colossal waste of time. Lateral reading isn’t a cure-all. But research we’ve conducted shows that it can take a big chunk out of students’ most egregious errors. We saw this in a study we just published, in which we used examples of bogus nutrition information in the context of a college nutrition course at the University of North Texas. In videos that we integrated into the course, we modeled how to vet nutrition information by turning to the web and looking into who produced the information. The results were stunning: Over the semester, lateral reading went from the least used to the most used strategy for evaluating the trustworthiness of a site.

Rick: This is a hugely timely, hugely useful topic. How do you make sure this research actually gets to educators and parents?

Sam: The internet’s created lots of problems, but it’s also lowered the opportunity costs for academics who want to make a dent in society. My research group continues to put its work through peer review. But once an article’s accepted, that’s when the real work begins. How do we turn research studies into materials that busy teachers in challenging contexts find useful? Our document-based history curriculum has been downloaded 10 million times and adopted by LAUSD, the nation’s second-largest district. Our digital-literacy curriculum—full disclosure, work that was supported by Google.org—has 65 classroom-ready lessons and assessments, and a set of videos produced by John Green’s Crash Course that have been viewed over two million times. This summer, working with Justin Reich at MIT’s Teaching Systems Lab, we launched a “Civic Online Reasoning” MOOC. All of our materials have remained free. Anyone can download them just by registering at sheg.stanford.edu.

Rick: Are there any popular approaches to teaching students to determine the credibility of online content that aren’t actually credible themselves?

Sam: Unfortunately, there are a lot of approaches that treat web credibility like a game of twenty questions: “Is the site a .org?” If so, it’s good. “Is it a .com?” If so, it’s bad. “Does it have contact information?” If so, it’s good. “Does it have banner ads?” If so, it’s bad. The problem is that bad actors read these lists, too, and each of these features is ludicrously easy to game. Antiquated advice even appears on the websites of prestigious universities. One of them disseminates guidelines for web credibility written in 1996, the internet’s Paleolithic era.

Rick: How early should we begin teaching students these things?

Sam: Easy, the moment we give them a smartphone.

Rick: Last question: Is there anything you’d encourage policymakers or philanthropists to do in this area that would be especially helpful?

Sam: Whatever name it goes by, if teaching web credibility remains an add-on, its effect will be negligible—just another barnacle on the hull of the curriculum. We’re deluding ourselves if we think an elective can drag us out of this mess. The challenge is not to add a new feature to a bloated curriculum but to transform the curriculum we already have. How, in the face of our current digital assault, do we rethink the teaching of history, science, civics, and language arts—the basics? When we think about the high school curriculum, how much longer can we turn a blind eye when kids learn history from sites that claim that “thousands” of Black Americans took up arms for the Confederacy or that the Holocaust was a hoax? Or from pseudoscience sites that purport to show a link between vaccinations and autism? On every question we face as citizens—to raise the minimum wage, to legalize marijuana, to tax sugary drinks, to abolish private prisons, you name it—sham sources jostle for our attention right next to trustworthy ones. Failing to teach kids the difference is educational negligence. If the storming of the Capitol on January 6, an insurrection fueled by digital toxins, was not a Sputnik moment, I don’t know what is. Clawing our way out of this mess will require experimentation, lots of trial and error, and substantial investment. It won’t come cheap. Then again, neither does maintaining a flourishing democracy.

This interview has been edited and condensed for clarity.


The opinions expressed in Rick Hess Straight Up are strictly those of the author(s) and do not reflect the opinions or endorsement of Editorial Projects in Education, or any of its publications.
