Whether students should use AI in the classroom is a hotly contested issue. Will it boost their learning and prepare them for the workforce or will it lead to digital dependence and cognitive atrophy? Regarding teacher use of artificial intelligence, however, there’s a growing consensus that it might not be such a bad thing. That argument is appealing, intuitive, … and wrong.
Because teachers spend upward of 29 hours a week on nonteaching tasks, the argument runs, educators should offload that work to AI and focus on higher-impact aspects of instruction. For example, a 2023 Department of Education report recommends that teachers use AI for everything from providing feedback on student essays to planning their classroom routines, and a 2025 executive order seeks to help teachers “utilize AI in their classroom” in order to, among other things, reduce “time-intensive administrative tasks.”
Offloading that work to AI might streamline the teaching process, but it would be a grave mistake. Those seemingly inconsequential tasks are among the most fundamental and important aspects of the teaching profession. Far from being side tasks, they are the work of teaching itself. Consider a few examples.
According to an Education Week survey of 990 educators, a common use of AI among teachers is drafting emails. “I find it much quicker to type in the general idea,” one teacher wrote, “and receive an email I could have written, but it would have taken me 15 minutes or more.”
It’s certainly easier to have AI write that email, but is it better?
When I was an assistant principal, I had to write many sensitive emails—about poor behaviors, special education accommodations, restroom accidents, bullying, fights, abuse, and plenty more. Switching to a tab with ChatGPT open almost certainly would have made my job easier, but it would have made me a less effective leader and administrator.
Substantive reflection undergirds a well-written email. It’s why only a few lines can sometimes take us 15 minutes or even a whole afternoon to write. We’re not struggling to tap out letters but to understand the situation and write exactly what we want to say.
If I had to write an email to a parent about their child’s behavior, for example, I had to consider and think through the problem. Why was this student acting out? What’s the history behind this one incident? Should I loop in our special education coordinator? What should I as a school leader, the parent, and the teacher do going forward?
Drafting emails took time, but it necessitated that I think through those questions and develop potential solutions, which meant I would better manage whatever problem our school faced going forward. If I offload that task to Claude, I’m offloading that deliberate reflection, too.
This same cognitive tradeoff exists for just about any other use case that a teacher or ed-tech consultant could dream up.
If a teacher uses Grok to draft questions for an exam, they neglect the contemplation that undergirds a well-sequenced, thought-out unit or activity. When I was an English teacher, thinking through even an end-of-period reflection question forced me to consider how the day’s discussion fit into the broader unit and year, precisely what I was asking and why, and what kind of answers I hoped to see from students. Then, during the discussion, my earlier thoughts informed my follow-up questions, which rabbit holes I might pursue or avoid, and the passages from the day’s reading to which I wanted to draw attention.
The craft of teaching is refined through struggling with and fussing over these seemingly inconsequential tasks.
Beyond cognitive offloading, teachers’ use of AI also changes the relational dynamics of the classroom.
AI can obviously digest student work and provide feedback. But if I had chosen not to read all student work in order to “streamline my workflow,” I would have learned far less about my students and their academic strengths and weaknesses. Meanwhile, my students would have received feedback that wasn’t really from me. Would they have bothered to read it, trust it, or even complete their own work?
In the classroom, the interpersonal connection forged by shared academic work builds relationships essential to student engagement and academic success. Reliance on AI interrupts that connection.
The consequences of AI reliance may be more obvious in a physical analog: athletics. After all, athletes spend only a small portion of their careers in competition—the vast majority of their time goes to training in the gym or at practice. Why not let a machine run the routes or complete the reps, so athletes can focus on the more important work of competing? Because doing so would cause their muscles to atrophy and weaken the team bond built through months of shared struggle.
Much of the work of an athlete is the practice and the physical training. Likewise, much of the work of a teacher is the reflection and the planning. If educators outsource that to an AI assistant, we should not be surprised when classroom community and academic outcomes rapidly deteriorate.