Most of today’s middle school students weren’t even born when Apple first unveiled Siri, its AI-powered voice assistant.
But despite growing up with AI—and learning to write a five-paragraph essay at a time when ChatGPT can perform that task in seconds—middle schoolers are well aware of AI’s pitfalls, including its tendency to get facts wrong and its potential to stifle their own learning. That’s according to research conducted by professors at Texas Christian University, slated to be discussed at the ISTELive 25 + ASCD Annual Conference 25 in San Antonio, June 29 to July 2.
A lot of the focus on AI literacy has been on high schoolers and college students. But middle schoolers “really are thinking about this,” said Michelle Bauml, a professor of education at TCU and one of the study’s authors. “They’re not all in or all out at this point with generative AI.”
The study included 43 students entering grades 6 through 9 who attended a free, week-long civics summer camp. Researchers surveyed the students both before and after a lesson on the benefits, risks, and ethics surrounding generative AI.
The team also conducted two focus groups—one with six students, the other with seven students—asking about the potential positives, drawbacks, and reliability of GenAI.
Many students use AI, but largely without adult guidance
The survey found that 60% of students had used GenAI tools at some point, including 15% who used them at least once a week. More than half learned to use GenAI platforms on their own, while about a quarter learned from a teacher.
That finding stood out to Sue Anderson, another author and a professor of education at Texas Christian University.
“These kids are using it, but they’re not necessarily getting adult guidance,” she said.
After the AI lesson—which was informed by resources from Common Sense Media, a nonprofit organization focused on kids and technology—there was a 20% increase in the number of students saying they knew “a great deal” about AI. In both the pre- and post-lesson surveys, most students said they saw AI as “somewhat” accurate and trustworthy.
Students don’t fully trust AI, and they worry it could hurt their learning
None of the 13 students who participated in focus groups believed it was a good idea to trust every piece of information generated by AI. They worried AI would unearth biased, outdated, or just plain wrong information.
More strikingly, many worried about what relying on AI might do to their own skills.
Students in the focus groups worried that regular use of AI writing tools might “stunt” their learning and creativity and make them too dependent on technology. They were also wary of losing their authentic voice to chatbots.
“[Using AI] wouldn’t feel sincere. It wouldn’t feel like it was from you,” one student said, according to the report.
Some also said they’d steer clear of using AI’s help with a capstone project for the camp—a group presentation and public service announcement on a community issue, such as animal welfare or food insecurity.
AI couldn’t create a presentation as interesting as students’ own products, one student said. “I just feel like we’re at a stage [of our project] that’s more creativity-based. And AI isn’t really good at creativity,” the student told researchers.
Others raised concerns about accuracy. One told researchers they didn’t want to use AI because all the information in their final presentation needed to “be 100% true.”
Not all students are rejecting AI, but many are still unsure
Still, not every student ruled out using AI. Roughly 35% thought they would use GenAI for school assignments in the future, 37% were not sure if they would, and 28% did not plan to use it for school assignments.
Given how many middle school students are already using AI, schools need to step up their instruction on the technology, the report concluded.
“Knowing that they’re going to use it, whether teachers like it or not, it’s important for adults to provide guidance,” Anderson said. Middle school teachers need to help their students delve into questions such as: “How does AI work? When can you use it? What ethical considerations do you need to keep in mind when you use it?” Anderson said.
Teachers will need professional development to have the background knowledge to help students tackle those questions, she added.
“We’re still kind of in the wild west of AI, right?” Anderson said. “We’re just experimenting. Products are coming out all the time. Teachers don’t always feel like they have the knowledge to teach kids or guide kids to use this.”
But when presented with a lesson on AI, middle schoolers—at least the ones at the summer camp—have the capacity for nuanced thinking about the technology, Bauml said.
The students “demonstrat[ed] this critical thinking” about AI, Bauml said. “And isn’t that what we want? For our citizens not just to accept any source that comes in front of [them] or any tool, but to really weigh benefits and drawbacks?”