Opinion Blog

Classroom Q&A

With Larry Ferlazzo

In this EdWeek blog, an experiment in knowledge-gathering, Ferlazzo will address readers’ questions on classroom management, ELL instruction, lesson planning, and other issues facing teachers. Send your questions to lferlazzo@epe.org. Read more from this blog.


Wondering How to Deal With AI in Your Classroom? Try These Ideas

By Larry Ferlazzo — February 05, 2025 6 min read

Today’s post is the latest in a two-year series on artificial intelligence in the classroom.

‘These Tools Aren’t Magic’

Jane Rosenzweig is the director of the Harvard College Writing Center, writes frequently about AI and education in her Writing Hacks newsletter and elsewhere, and publishes The Important Work, a newsletter written by high school and college writing instructors about teaching writing in the age of AI:

There’s a lot of discussion—and disagreement—about whether and how students should be using generative AI tools in the classroom. But as these tools become more and more widely available, what we do know is that our students are going to be using them and that we need to be able to talk to them about the role of generative AI in their education.

I’m wary of making grand pronouncements about how to talk to students about using generative AI tools when things are changing so quickly. But I do think there are some guidelines we can follow when deciding to use these tools in the classroom—or outside the classroom.

1. Teachers should understand the basics about how generative AI tools like ChatGPT actually work.

I’m not suggesting that every teacher has to become a tech expert—I’m certainly not one. But if you’re going to use these tools or encourage your students to use them, it’s helpful to understand how these tools are trained and how they generate output.

Here’s an example from my own class: On the first day of class in the fall, one of my students mentioned that she really liked using ChatGPT because it’s more objective than humans. If you believed that, it would definitely shape how you use ChatGPT. But it’s not actually true: AI tools like ChatGPT can only answer questions based on what’s in their training data, and that data is drawn largely from what’s available online—not from some objective or all-knowing source.

AI tools also “hallucinate”—meaning they sometimes just give you inaccurate information. Students find it interesting to learn how these tools generate output, and you can explain this in ways that are grade appropriate. Here are some resources that I’ve found helpful for learning how generative AI tools work.

This explainer from the Financial Times walks through how large language models work, with helpful examples.

If you want to take a deeper dive, try this article, “Large language models explained, with a minimum of math and jargon.”
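For readers who want to see the idea of "generating output from training data" made concrete, here is a toy sketch of my own (it is not how a real large language model works internally, just a deliberately simplified word-frequency model). It shows why such a system can only produce what its training text supports, and knows nothing beyond it:

```python
from collections import Counter, defaultdict

# A tiny "training corpus." A real model trains on vast amounts of text,
# much of it scraped from the web -- not from an all-knowing source.
training_text = "the cat sat on the mat . the dog sat on the rug ."
words = training_text.split()

# Count which word follows which word in the training data.
following = defaultdict(Counter)
for current, nxt in zip(words, words[1:]):
    following[current][nxt] += 1

def predict_next(word):
    """Return the most frequent next word seen in training, or None."""
    counts = following.get(word)
    if not counts:
        return None  # the model has no knowledge outside its training data
    return counts.most_common(1)[0][0]

print(predict_next("sat"))   # "on" -- the only word that ever followed "sat"
print(predict_next("moon"))  # None -- "moon" never appeared in training
```

The same limitation scales up: a model trained mostly on what is available online reflects that text, with all its gaps and biases, rather than any objective view of the world.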

2. Talk to your students about what you want them to learn, not just about what tools like ChatGPT can do or whether they are allowed to use them.

I think it’s helpful to look at the use of generative AI tools in terms of what problems you’re trying to solve in the classroom. (In fact, I teach a writing course called To What Problem is ChatGPT the Solution.)

I’ve found this framework to be helpful for myself—but also for my students. I talk to them about what problems they’re solving when they use AI: Is it the problem of not having time to do the work? Is it the problem of not having an idea? Or is it an interesting, knotty problem that generative AI might help them approach in a new way?

I also tell them that I’m not asking them to write papers because the world needs more papers; I’m asking them to write papers because it’s one way of thinking through a problem—and then we talk about how using AI at different points in the writing process may or may not get in the way of that thinking.

There’s a big difference between telling students to use or not to use generative AI and telling them why what you want them to do matters in the first place. Framing things this way may not always stop students from using these tools in ways you think are counterproductive—but it will help students understand where you’re coming from.


3. Be aware of the difference between useful and not useful ways of using these tools.

We’ve heard a lot about how AI tools like Khanmigo can provide personalized tutoring. But some teachers are finding that some students using these tools are not engaging with them or learning from them—and that sometimes the way Khanmigo helps students is different from what you’d do in your own classroom.

If you’re asking your students to use AI tools, it’s going to be helpful to be aware of how the same tool you’ve set up to enhance learning could get in the way of that learning. Dan Meyer offers a useful example of this over at his newsletter, Mathworlds.

4. Don’t remove the friction from the learning process.

Tools like ChatGPT are being marketed as efficiency tools—tools that will save us time so that, as OpenAI says, we can focus on other things. But learning requires time, and it requires friction.

If you’re going to use AI tools with your students, it’s useful to consider how you’re setting up assignments to allow for that productive friction.

When I made a chatbot to help my students practice counterargument, some of them were surprised that the chatbot didn’t enable them to do the work more quickly. But I wasn’t trying to help them be efficient; I was trying to help them learn something complicated.

I’ve written more about friction and learning here. This piece on friction and time-saving is a great overview of the conversation about friction and AI, with a focus on Magic School.


5. Beware of the hype.

It seems like new tools are being released every day, and I’m the first to note that tools like Google’s NotebookLM, which turns any text into a podcast, are pretty cool! But they were not designed to solve problems that we’re trying to solve in the classroom. They were designed to get people to use them.

I’ve found over the past few years that when I question the role of these tools in the classroom or express concerns about the hype, some people tell me that I must be anti-technology. But that’s not true at all—I was an early experimenter with GPT and I’m very interested in all of these tools.

However: It’s not our job as educators to adopt technology because it’s cool; it’s our job to ask hard questions and think about what will help our students learn. Which brings me back to my earlier question: When thinking about how to teach your students about AI, it’s useful to start by asking what problems you’re trying to solve in your classroom and how AI can help solve those (or whether it will create new ones).


We’ve entered an era in which new generative AI tools will arrive regularly, each promising to magically solve all the challenges we face as teachers. But it’s worth keeping in mind that these tools aren’t magic—and that the way you choose to use them—or not—should always be based on what you’re trying to do in your classroom.


Thanks to Jane for contributing her thoughts!

Today’s post answered this question:

“What are guidelines teachers should follow when teaching students to use or not use artificial intelligence?”

Consider contributing a question to be answered in a future post. You can send one to me at lferlazzo@epe.org. When you send it in, let me know if I can use your real name if it’s selected or if you’d prefer remaining anonymous and have a pseudonym in mind.

You can also contact me on Twitter at @Larryferlazzo.

Just a reminder: you can subscribe and receive updates from this blog via email. And if you missed any of the highlights from the first 12 years of this blog, you can see a categorized list here.

The opinions expressed in Classroom Q&A With Larry Ferlazzo are strictly those of the author(s) and do not reflect the opinions or endorsement of Editorial Projects in Education, or any of its publications.
