Artificial Intelligence Q&A

AI Makes Stuff Up. So How Can Teachers Use It in Instruction?

By Arianna Prothero — May 13, 2025 3 min read
Artificial intelligence is the future, and educators must incorporate the swiftly evolving technology into their instruction to both stay current and prepare their students for the jobs of the future. Or so say many experts in education and technology.

At the same time, experts caution that generative AI tools can be biased and “hallucinate” made-up answers to queries. Some recent research shows that the most sophisticated new versions of popular AI chatbots are hallucinating more than before, even as they get better at performing some tasks such as math problems.

It’s no wonder that many teachers remain doubtful about using the technology in their classrooms, especially for anything beyond composing emails to parents or creating grading guidelines.

So, then, how exactly are teachers supposed to responsibly incorporate this technology into instruction? Education Week put this question to Rachel Dzombak, a professor of design and innovation in Carnegie Mellon University’s Heinz College of Information Systems and Public Policy.

This conversation has been edited for length and clarity.

Why does AI ‘hallucinate’?

Rachel Dzombak

I think of hallucinations just as, simply, when a tool generates an inaccurate response. Recently, I said [to an AI tool], “Pull direct quotes from this article,” and it looked like they were real quotes, but then I tried to find the quotes in the article, and they did not exist.

There are other types of systems, software systems, that are designed to give the one right answer. That’s not how AI systems are set up. AI systems are complex systems. They’re not trained to give a single right answer. AI systems are trying to find patterns across data, and what comes out of that is what we call emergent effects. I think of it as 2+2=5: there are always going to be these unintended aspects. It’s what makes AI, and large language models specifically, really great at helping us brainstorm and do creative tasks. But the flip side of that is it gets things wrong.

The flip side of the hallucination is this massive capacity to pull information together in new ways that were previously not possible with traditional software systems. There are just trade-offs.

What I can say is that the systems are going to continue to evolve in good and bad ways over time. So, we could see, because it is a maturing tool, these swings where [AI] gets worse at some things and it gets better at some things.

How does this then affect teachers who are using AI to teach?

They need to maintain a sense of curiosity: What are these tools? What are they today, and how are they changing? There are no hard and fast rules.

A big challenge people have today in the education space is not thinking about how to use [AI tools] in a way that really fits with their intention. [Use them] where you have more wiggle room, where you’re not looking for that one exact answer, where variability and randomness are a good thing, a feature rather than a bug of the system. I think about this in my own classes: How do I encourage students to use generative AI tools in places where it makes sense, where it encourages them to be creative?

When should students use generative AI?

I see educators saying, “You’re not allowed to use a large language model to write a term paper in my class.” I personally think that students are going to find ways around that anyway. I think [teachers] should say, “If you’re going to, here are ways that you should think about using [AI], and here are the trade-offs.” In many ways, it’s forcing creativity on the part of educators to rethink, how are we really hitting the learning outcomes that we want to hit?

Education has needed change for a long time. People are at times blaming generative AI tools for shifting education in the current moment: It’s enabling cheating, it’s enabling all of these things. But maybe it’s just shining a spotlight on behaviors that were already there. I’m an engineer by training, and a colleague of mine did a study of engineering undergraduates, and the average engineering undergrad does 3,000 problem sets where there’s one right answer. But if you look at the skills that are needed most in the workforce, it’s comfort with ambiguity, creativity, and complex problem-solving.

That’s not what we’ve been assigning students.
