Artificial intelligence is the future, and educators must incorporate the swiftly evolving technology into their instruction to both stay current and prepare their students for the jobs of the future. Or so say many experts in education and technology.
At the same time, experts caution that generative AI tools can be biased and “hallucinate” made-up answers to queries. Some recent research shows that the most sophisticated new versions of popular AI chatbots are hallucinating more than before, even as they get better at performing some tasks such as math problems.
It’s no wonder, then, that many teachers remain doubtful about using the technology in their classrooms, especially for anything beyond composing emails to parents or creating grading guidelines.
So, then, how exactly are teachers supposed to responsibly incorporate this technology into instruction? Education Week put this question to Rachel Dzombak, a professor of design and innovation in Carnegie Mellon University’s Heinz College of Information Systems and Public Policy.
This conversation has been edited for length and clarity.
Why does AI ‘hallucinate’?
I think of a hallucination simply as when a tool generates an inaccurate response. Recently, I said [to an AI tool], “Pull direct quotes from this article.” It looked like they were real quotes, but when I tried to find them in the article, they did not exist.
Other types of software systems are designed to give the one right answer. That’s not how AI systems are set up. AI systems are complex systems; they’re not trained to give a single right answer. They’re trying to find patterns across data, and what comes out are what we call emergent effects. I think of it as 2+2=5: there are always going to be these unintended aspects. It’s what makes AI, and large language models specifically, really great at helping us brainstorm and do creative tasks. But the flip side of that is they get things wrong.
The flip side of the hallucination is this massive capacity to pull information together in new ways that weren’t previously possible with traditional software systems. There are just trade-offs.
What I can say is that these systems are going to continue to evolve in good and bad ways over time. Because it is a maturing technology, we could see swings where [AI] gets worse at some things and better at others.
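To make Dzombak’s pattern-matching point concrete, here is a minimal, purely illustrative sketch in Python. It is not any real model’s code: the continuations and their probabilities below are invented. It shows how a system that samples from a probability distribution, rather than retrieving a single stored answer, can produce fluent but fabricated output:

```python
import random

# A toy stand-in for a language model's next-token distribution.
# All continuations and probabilities here are invented for illustration;
# no real model or prompt is being reproduced.
continuation_probs = {
    '"district budgets rose"':    0.35,  # happens to be accurate
    '"district budgets rose 5%"': 0.30,  # fluent but fabricated detail
    '"district budgets fell"':    0.20,  # fluent but wrong
    '"the study did not say"':    0.15,
}

def sample_continuation(probs):
    """Draw one continuation at random, weighted by its probability."""
    options, weights = zip(*probs.items())
    return random.choices(options, weights=weights, k=1)[0]

# The system never "looks up" the right answer; it samples a
# likely-sounding one, so fabricated output appears alongside
# accurate output by design.
for _ in range(5):
    print(sample_continuation(continuation_probs))
```

Run repeatedly, the sketch returns different answers, some accurate and some not, which is the same trade-off Dzombak describes: the randomness that enables creativity also enables hallucination.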
How, then, does this affect teachers who are using AI to teach?
They need to have a continued sense of curiosity: What are these tools? What are they today, and how are they changing? There are no hard and fast rules.
A big challenge people in the education space have today is not thinking about how to use [AI tools] in a way that really fits their intention. [Use them] where you have more wiggle room, where you’re not looking for that one exact answer, where variability and randomness are a good thing, a feature rather than a bug of the system. I think about this in my own classes: How do I encourage students to use generative AI tools in places where it makes sense, where it encourages them to be creative?
When should students use generative AI?
I see educators saying, “You’re not allowed to use a large language model to write a term paper in my class.” I personally think that students are going to find ways around that anyway. I think [teachers] should say, “If you’re going to, here are ways that you should think about using [AI], and here are the trade-offs.” In many ways, it’s forcing creativity on the part of educators to rethink: How are we really hitting the learning outcomes that we want to hit?
Education has needed change for a long time. People at times blame generative AI tools for shifting education in the current moment: It’s enabling cheating, it’s enabling all of these things. But maybe it’s just shining a spotlight on behaviors that were already there. I’m an engineer by training, and a colleague of mine did a study of engineering undergraduates; the average engineering undergrad does 3,000 problem sets where there’s one right answer. But if you look at the skills that are needed most in the workforce, it’s comfort with ambiguity, creativity, and complex problem-solving.
That’s not what we’ve been assigning students.