It can be difficult to look at the increasing volume of online misinformation—and its consequences on our civic life—with anything other than despair.
But all is not lost. There’s a small beacon in new research concluding that students really can become more critical consumers of online information—a key skill in distinguishing legitimate news and sources of information from slickly produced ones designed to mislead.
The research, recently released by the Stanford History Education Group, is based on an empirical study of its own Civic Online Reasoning curriculum. SHEG made the curriculum freely available to all last December. (You must register to download it.)
Key elements of the curriculum include learning to use “lateral reading,” in the mold of professional fact-checkers, by opening up multiple browser windows to cross-check the source of the information. (Students could investigate, for example, whether something they’re asked to evaluate was produced by a legitimate news outlet or by an advocacy group or other source that might raise questions about its accuracy.)
You may remember that last fall SHEG found that high school students were, in general, terrible consumers of digital information. In a previous study, it found that over half of high schoolers it tested took at face value a video purporting to show ballot-stuffing, and concluded it was strong evidence of voter fraud in U.S. elections. (The video actually showed footage from Russia.)
But in the new study, students who were taught using the SHEG materials showed growth in their ability to evaluate online sources critically of about two-and-a-half points on a 14-point scale, compared to just over a half-point of growth in those who didn’t use the materials.
“I’ll just say we are experiencing a time of profound pessimism of our ability to do something about the rapid misinformation and disinformation that envelops us every time we turn on a device and look at the screen,” said Sam Wineburg, the founder of SHEG and an education professor at Stanford. “The idea we can move the middle with a fairly minimum investment is a finding we believe we can celebrate.”
Filling in Research Holes
One reason the study matters is that media literacy is still, all things considered, a pretty nascent field, and research is catching up, noted Cyndy Scheibe, a professor in the department of psychology at Ithaca College. She also runs Project Look Sharp, a media-literacy group at the college that offers curriculum and training. (Scheibe did not contribute to the SHEG research.)
In general, “I think the robustness of the [media literacy] research and the quality of the research varies a little bit. Some of it is qualitative in its assessment more than quantitative,” Scheibe said. “Unlike other things we measure that may be relatively easy to assess, the issue with media literacy is if what you’re trying to do is look at how people interpret media messages or analyze media messages, ... there isn’t one right answer, typically.
“What you’re really looking for is the depth and the probing of people’s responses and whether they can give evidence to back up their conclusions.”
To that point, outcomes like self-reports or multiple-choice questions don’t tend to do a good job of measuring students’ media-evaluation skills. And as my colleague Sarah Schwartz reported last year, when the RAND Corporation tried to look through the literature, it found that researchers defined media literacy in different ways and that there were few studies of specific teaching approaches or programs.
The SHEG research, on the other hand, tests its own curriculum, which explicitly teaches lateral reading and other skills. The study is based on a sample of about 460 high school juniors and seniors taking a civics or government class in six high schools in an unnamed Midwestern district. Researchers randomly assigned half the schools to use the SHEG materials.
Teachers in the treatment schools incorporated six of the lessons into their classes, while those in the comparison high schools received their normal civics and government programming. Students took a pre- and post-test at the beginning and end of the semester requiring them to evaluate online sources.
Each pair of schools generally had similar demographics, and the researchers controlled for characteristics that tend to impact measures of learning.
The study also found some preliminary evidence that black students and students who don’t speak English at home did not improve as much as their peers. (The civics education community is increasingly concerned about this so-called “civics gap.” Groups that have had to fight the hardest to exercise their civic rights in the United States are often the least likely to be taught about those rights and the tools they can use.)
Even though the study was not done by independent researchers, Scheibe praised it for being soundly designed and an important addition to the literature. Nearly all the existing research focuses on high school students, even though the consensus in the field is that students need to be taught media-literacy skills far earlier for them to become an automatic habit of mind.
“We have a long way to go, including how do you teach this effectively, at different grade levels, and different curriculum areas,” she said. “For us, we see media literacy as literacy. And therefore you can’t just start teaching it in high school—you have to start teaching it in preschool, and then every year.”
The study is currently being submitted for publication.
Clarification: This post has been updated to clarify that Sam Wineburg is an education professor at Stanford University, but holds a courtesy affiliation with its history department.
Image credit: Syahrir Maulana/Getty
A version of this news article first appeared in the Teaching Now blog.