Opinion

AI in the Classroom Is Often Harmful. Why Are Educators Falling Prey to the Hype?

Teachers who rush to embrace chatbots are ignoring the essence of education
By Alfie Kohn — September 22, 2025 5 min read

When powerful institutions announce their intention to impose—and profit from—a radical transformation of our schools, our workplaces, and our daily lives, we have an obligation to ask whether what they’re unleashing is really in our best interests. If, instead, we just shrug and accept it as inevitable, we are shirking our responsibility and, indeed, surrendering our autonomy.

This dynamic has never been clearer than in the case of AI. Educators who rush to adopt programs like ChatGPT may be not only overestimating the capabilities of artificial intelligence but also misunderstanding the essence of education.

The people most receptive to this technology, according to a recent survey, are those who know the least about it. Specifically, they may not be aware of AI’s staggering energy requirements, allegations that it was built on stolen data, the fact that corporations are less interested in assisting workers than in replacing them, its contribution to mass surveillance and the disruption of elections around the world, and the troubling implications of normalizing relationships with counterfeit humans for therapy, friendship, and even romance.

Or perhaps they don’t realize that AI tools produce factual errors more than half the time, according to two studies, meaning that they require human fact-checking and don’t save much time. (It’s tempting to assume that AI’s accuracy will improve, but some experts are finding that the opposite is true.)

AI products merely generate statistically probable responses to prompts; they cannot think or “know” things. Using them to write for us is therefore particularly problematic since, as educator John Warner put it, “the fundamental unit of writing is not the sentence but the idea.” To write is to construct meaning and persuade a reader whose perspective you try to imagine. A chatbot can only sneeze out some words that resemble an essay or a summary of someone else’s. In one novelist’s apt analogy, students who use this software won’t learn to write any more than they’d become physically fit by bringing a forklift to the gym to lift weights for them.

Nor is it sensible to have AI read for you—not only because its summaries are often wildly inaccurate but because the intrinsic value of reading is lost. Imagine a world where people’s primary engagement with books and articles—including great literature—consists of having a computer boil them down to a tl;dr. Even when gleaning information is the primary purpose of reading documents, doing it yourself can lead in unexpected directions and yield serendipitous insights.

A few years ago, tech critic Cory Doctorow foresaw a time when half of us, being busy or lazy, would feed a few bullet points into AI software so it could inflate them into an impressively formal document. That document would be sent to the other half of us, who, also being busy or lazy, would use similar software to reduce it to a few bullet points.


Call it MOBS, for Machines On Both Sides, and it’s happening for real—in schools and elsewhere. First, with the encouragement of administrators, education publishers, and even unions, teachers are using AI to create lesson plans—which, according to one study of 310 such plans in social studies, tend to emphasize rote memorization. Students then turn to chatbots for help with the assignments. Since kids don’t make the rules, their use of the technology is called “cheating.” Teachers complete the cycle by using similar tech tools to grade the students’ work—and perhaps to catch those who relied on AI. Finally, students who derive no benefit from this exchange can seek extra help ... from chatbot tutors.

Data to demonstrate any educational advantages from AI are sparse and, some experts say, based on poorly designed or misleadingly reported experiments. Other investigations suggest that its effects may actually be harmful. A 2024 study found that high school math students tutored by ChatGPT initially scored better on tests, but the benefit soon evaporated and they ended up faring worse than those who hadn’t used AI, apparently because they failed to acquire conceptual understanding. A 2025 experiment discovered a clear “cognitive cost” to receiving AI help with writing essays, and a third study reported that more use of AI was “associated with lower critical thinking skills.”

So what’s its appeal? AI is seductive for those who regard education as a series of steps toward a credential; one merely completes graded tasks to collect credits and, eventually, a diploma. The point is to emit behaviors (such as producing essays) rather than to play with ideas. ChatGPT just reinforces this troubling transactional model.


If you’re convinced that the pros of chatbots outweigh the cons, then, sure, experiment with them. But don’t do so because the corporations that have bet their (and our) future on them play on your fears of being left behind or insist that it’s too late to unwind their technology and all we can do is tell people to use it “responsibly.”

At best, training students to use a chatbot is very different from helping them reason through a problem, read deeply, or organize and express their own thoughts. At worst, it’s teaching them how to avoid doing these things—not only irrelevant to a teacher’s primary objectives but inimical to them. Instead, teach students to analyze AI critically, to identify our tendency to anthropomorphize a text-extruding machine, to notice how the words it strings together are distinguished by an eerily insipid blandness coupled with absolute certitude (as it informs us that, say, Einstein invented the smoothie).

Anyone alarmed at turning chatbots loose in our schools should speak out and connect with other skeptics. You know those ads that tech companies insert into our correspondence (“Sent from my iPhone”)? What if we used the signature line to make a statement instead, ending our emails with “This message certified AI-free”? Now imagine a sign with that sentence tacked up on classroom walls, except with the word “message” replaced by “school” and ending with the assurance “Teaching and learning here are accomplished proudly by human beings.”

A version of this article appeared in the November 01, 2025 edition of Education Week as The dangerous allure of AI in education
