Opinion

AI in the Classroom Is Often Harmful. Why Are Educators Falling Prey to the Hype?

Teachers who rush to embrace chatbots are ignoring the essence of education
By Alfie Kohn — September 22, 2025

When powerful institutions announce their intention to impose—and profit from—a radical transformation of our schools, our workplaces, and our daily lives, we have an obligation to ask whether what they’re unleashing is really in our best interests. If, instead, we just shrug and accept it as inevitable, we are shirking our responsibility and, indeed, surrendering our autonomy.

This dynamic has never been clearer than in the case of AI. Educators who rush to adopt programs like ChatGPT may not just be overestimating the capabilities of artificial intelligence but underestimating the essence of education.

The people most receptive to this technology, according to a recent survey, are those who know the least about it. Specifically, they may not be aware of AI’s staggering energy requirements, allegations that it was built on stolen data, the fact that corporations are less interested in assisting workers than in replacing them, its contribution to mass surveillance and the disruption of elections around the world, and the troubling implications of normalizing relationships with counterfeit humans for therapy, friendship, and even romance.

Or perhaps they don’t realize that AI tools produce factual errors more than half the time, according to two studies, meaning that they require human fact-checking and don’t save much time. (It’s tempting to assume that AI’s accuracy will improve, but some experts are finding that the opposite is true.)

AI products merely generate statistically probable responses to prompts; they cannot think or “know” things. Using them to write for us is therefore particularly problematic since, as educator John Warner put it, “the fundamental unit of writing is not the sentence but the idea.” To write is to construct meaning and persuade a reader whose perspective you try to imagine. A chatbot can only sneeze out some words that resemble an essay or a summary of someone else’s. In one novelist’s apt analogy, students who use this software won’t learn to write any more than they’d become physically fit by bringing a forklift to the gym to lift weights for them.

Nor is it sensible to have AI read for you—not only because its summaries are often wildly inaccurate but because the intrinsic value of reading is lost. Imagine a world where people’s primary engagement with books and articles—including great literature—consists of having a computer boil them down to a tl;dr. Even when gleaning information is the primary purpose of reading documents, doing it yourself can lead in unexpected directions and yield serendipitous insights.

A few years ago, tech critic Cory Doctorow foresaw a time when half of us, being busy or lazy, would feed a few bullet points into AI software so it could inflate them into an impressively formal document. That document would be sent to the other half of us, who, also being busy or lazy, would use similar software to reduce it to a few bullet points.

Call it MOBS, for Machines On Both Sides, and it’s happening for real—in schools and elsewhere. First, with the encouragement of administrators, education publishers, and even unions, teachers are using AI to create lesson plans—which, according to one study of 310 such plans in social studies, tend to emphasize rote memorization. Students then turn to chatbots for help with the assignments. Since kids don’t make the rules, their use of the technology is called “cheating.” Teachers complete the cycle by using similar tech tools to grade the students’ work—and perhaps to catch those who relied on AI. Finally, students who derive no benefit from this exchange can seek extra help ... from chatbot tutors.

Data to demonstrate any educational advantages from AI are sparse and, some experts say, based on poorly designed or misleadingly reported experiments. Other investigations suggest that its effects may actually be harmful. A 2024 study found that high school math students tutored by ChatGPT initially scored better on tests, but the benefit soon evaporated and they ended up faring worse than those who hadn’t used AI, apparently because they failed to acquire conceptual understanding. A 2025 experiment discovered a clear “cognitive cost” to receiving AI help with writing essays, and a third study reported that more use of AI was “associated with lower critical thinking skills.”

So what’s its appeal? AI is seductive for those who regard education as a series of steps toward a credential; one merely completes graded tasks to collect credits and, eventually, a diploma. The point is to emit behaviors (such as producing essays) rather than to play with ideas. ChatGPT just reinforces this troubling transactional model.

If you’re convinced that the pros of the prose generated by chatbots outweigh the cons, then, sure, experiment with them. But don’t do so just because the corporations that have bet their (and our) future on them play on your fears of being left behind or insist that it’s too late to unwind their technology and that all we can do is tell people to use it “responsibly.”

At best, training students to use a chatbot is very different from helping them reason through a problem, read deeply, or organize and express their own thoughts. At worst, it’s teaching them how to avoid doing these things—not only irrelevant to a teacher’s primary objectives but inimical to them. Instead, teach students to analyze AI critically, to identify our tendency to anthropomorphize a text-extruding machine, to notice how the words it strings together are distinguished by an eerily insipid blandness coupled with absolute certitude (as it informs us that, say, Einstein invented the smoothie).

Anyone alarmed at turning chatbots loose in our schools should speak out and connect with other skeptics. You know those ads that tech companies insert into our correspondence (“Sent from my iPhone”)? What if we used the signature line to make a statement instead, ending our emails with “This message certified AI-free”? Now imagine a sign with that sentence tacked up on classroom walls, except with the word “message” replaced by “school” and ending with the assurance “Teaching and learning here are accomplished proudly by human beings.”

A version of this article appeared in the November 01, 2025 edition of Education Week as “The dangerous allure of AI in education.”
