Artificial Intelligence Opinion

ChatGPT Can Make English Teachers Feel Doomed. Here’s How I’m Adapting

Two ways I’m changing the point of my class this year
By David Nurenberg — October 16, 2024 5 min read

I’m in the midst of the most existentially dreadful fall of my 25-year English teaching career. Last year, too many of my high school students ChatGPTed their way through too many of my assignments—and they weren’t alone. According to one 2023 study, 20 percent of students reported using an AI chatbot to produce an entire paper, project, or assignment. And unlike a plagiarized paper, which can become the genesis of a conversation (and consequences) from which students can learn, AI use is essentially unpoliceable through school policies, and technology experts do not expect reliable AI-detection tools to emerge anytime soon.

Am I now doomed to just go through the motions of class, unsure if my students are really writing anything, or if there is even value any longer in them learning how? Recent Atlantic think-pieces have outright declared the end of high school English. As one writing professor bemoaned to the magazine, “With ChatGPT, everything feels pointless.”

To combat those narratives of doom and pointlessness, I hope to change the “point” of my class this year, in two major ways.

First, I plan to focus more on process than product. Since 2001, schools have operated in No Child Left Behind’s world of “measurable outcomes.” We threw out the student-centered, constructivist methods of the 1970s, which focused on the journey of the learner and, instead, adopted backward design, aiming our teaching toward students meeting performance goals on all-important assessments. Products and scores became the proof of learning.

But today, AI has broken that fundamental equation; it produces products instantly, no learning required.

Therefore, I’ve decided to spend more time assessing my students’ process of drafting multiple iterations of their work, with and without AI assistance. I’ll depend increasingly on students’ structured in-class reflections, rather than on the finished essay itself, to demonstrate learning.

Yes, I could just have my students write everything in class, by hand—but we don’t write essays just to produce essays. Rather, we write them as a means of developing our faculties for analysis, evaluation and presentation of evidence, and clear communication of ideas.

Assessing the more complex aspects of an essay—originality of ideas, incisiveness, sophistication of argument—takes longer, is more nuanced, and raises perhaps legitimate concerns about consistency, equity, and subjectivity in grading. But I don’t think English teachers now have any other choice. I’m considering jettisoning those “objective” but oh-so-limited rubrics for assessing an essay’s basic structural components. Instead, I’m experimenting with letting students have ChatGPT instantly create those “five-paragraph essays” for them and then helping them examine what’s worth keeping, what they might want to modify, and why, in order to make the writing more ambitious, more distinctive and personal to each of them.

Many of us have always pushed our students toward “big-idea critical thinking,” but making the fuzzier aspects of writing the core of what we assess and grade will present challenges. It may become harder to compare students’ progress against one another, which will alarm those who depend on such comparisons and rankings for the purposes of everything from college admissions to identifying equity concerns.

Then again, by looking at the process of how students develop their thinking, perhaps we can shift the emphasis to comparing each student against their own past progress, which is both more pedagogically useful and, I believe, more humane.

But I also want to reduce the role that writing plays in my classroom in general. This may seem anathema to our profession, but only because the last 30 years have moved English away from exploring what great literature can teach us about the human condition and toward teaching students “job useful” writing skills. Yet my friends with office jobs routinely outsource their memos, annual reports, and grant proposals to ChatGPT.

Since AI has automated much “practical” writing, while simultaneously raising enormous questions about what it means to be human, perhaps it’s time for English teachers to return to the less measurable—but arguably more important—philosophical work we used to do.

For centuries, authors from Plato to Mary Shelley to Aldous Huxley have written about how humans grapple with society-changing technologies; an even wider range of authors have explored love and rejection and loss, what constitutes a meaningful life, how to endure despair and face death.

And we don’t just read these books for prescriptive advice—especially in an isolating age like ours, we read to know we’re not alone.

Additionally, numerous studies have linked reading with empathy; immersing students in fictional worlds is vital preparation for navigating the highly diverse, highly polarized communities in which we live.

So, too, is class discussion. Although I still plan to use short, in-class writing assignments as one means to assess student thinking, I am substantially increasing the role of discussions: paired, small group, and full class.

This will require including more support and scaffolding for students whose social-emotional or linguistic needs might create barriers for them, but that only makes the practice more necessary. Learning how to be good speakers and listeners, how to actively engage, how to respectfully disagree—these skills have atrophied in our post-pandemic, digitally mediated and politically divided world.

The English classroom may now be the only place where many students can get practice in real-time social interaction and discourse that is at the heart of a functioning democracy. I’ll be spending less time on grammar and mechanics and more on analyzing and synthesizing competing narratives, current and historical (that several states now try to ban such historical analysis only reaffirms its necessity).

The humanities have spent the last three decades desperate to prove that we’re of “practical use.” Well, these skills are the “new practical.”

I’m hardly alone; many of my colleagues are making similar shifts. But many more are afraid to do so because those all-important state standardized tests reward rote skills more than complex, critical thinking. While the pandemic briefly engaged policymakers’ creativity around alternate assessment methods, their support for those traditional multiple-choice tests has since come roaring back in both K-12 and higher ed. If COVID wasn’t enough to force policymakers to realize the futility of continuing with accountability as we currently know it, maybe AI will be.

Focusing on process, and on “big picture” issues, will make grading messier. But I can get behind a future where the bots take care of anything requiring simple-to-measure skills, leaving teachers and students alike to focus on the importantly messy work of figuring out how to be amazing humans. I admit I don’t have a crystal-clear vision of what this new shape of English class will eventually look like. But then, neither does ChatGPT. That’s the whole darned idea.
