Since ChatGPT’s release, we have witnessed generative AI’s potential to push our 9th grade English students in ways that have surprised us. With ethical safeguards and guidance, they rise to the challenge of analyzing and questioning AI’s responses. Rather than viewing the technology as a way to cut corners or as an unquestionable authority, our readers and writers use its responses to deepen their writing and thinking.
For a number of years, we have used the workshop model in our classrooms, an approach where the emphasis is always on process and discovery. Whether our students are drafting a personal narrative or crafting a polished argument, we guide them to embrace the messiness of writing. The workshop model is not just a structure for our classes but an educational philosophy that centers student ownership and authentic voice. In that frame, we present AI as another tool within the workshop, one that offers insights that may or may not be applicable.
Our students don’t write in isolation. They are part of a vibrant classroom community where feedback comes from many sources: teachers, peers, and yes, artificial intelligence.
As the leaders of the workshop, we have established a set of rules for writing with AI. (These can be found in our book AI in the Writing Workshop: Finding the Write Balance.)
Rule 1 is to write without any AI first. We want students to connect to their writing and to know what they are asking for feedback on rather than just asking AI to do the writing for them. Rule 2 is to struggle on your own before turning to AI.
One of our students, Lexi, spent the weekend mulling over what to do with the end of her poem, and when she returned to school, she was still undecided. We loaded her poem into an AI bot (Rule 3 is to prompt the bot), asking for feedback on how she might end it. After going over the list of suggestions provided by AI (Rule 4 is to question your AI results), we picked our favorite words or phrases to apply to her ending.
When Lexi commented in her reflection on the use of AI in the development of her poem, she wrote: “AI did help with my poem’s title and ending suggestions. I do not like to use AI, especially if I’m copying exactly what it states because, for lack of a better term, it feels too robotic and not organic. Whenever I use AI, I use it for inspiration and not to complete the assignment because I like how I write.”
When teachers give suggestions during a writing workshop, they don’t expect the student to accept every comment. In the same manner, suggestions from AI are just that: suggestions. We want our students to be aware of themselves as writers. Many, like Lexi, like how they write and don’t want to end up with what she calls a “robotic” voice.
In our classrooms, we also challenge the misconception that AI tools serve merely as shortcuts, bypassing critical thinking and creativity. We don’t seek to pit human authorship against AI; rather we aim to show how the two can work together. We can embrace AI as a thought partner, a means to enhance critical thinking and encourage deeper engagement with texts.
In her essay on “Macbeth,” another student, Cadence, illustrates this. When provided with an AI-generated essay on liquid imagery in the play, Cadence noted both strengths and gaps in the response. One surprising insight for her came when ChatGPT connected the witches’ cauldron with the play’s themes of chaos. “I didn’t have any notes about the witches’ scene with the cauldron,” she reflected, “but ChatGPT noted the liquid imagery ... and what it could symbolize.” This prompted her to expand her analysis, recognizing how liquid imagery in the witches’ scenes further conveyed Macbeth’s unraveling fate.
Our students’ ability to navigate feedback from AI reflects a shift in how we frame the role of technology in education. Even advanced technology is not a replacement for human thought but a complement to, and sometimes even a catalyst for, profound learning.
Other students haven’t embraced AI. One of them, Lauren, feels strongly about holding onto her own writing voice. In a class survey, she stated, “To me, writing is something that is purely made from passion and interest. I feel that this is what AI misses … . Although AI can be helpful in minor ways, such as finding a word that sounds just right in an assignment, I feel that I would be just fine without AI.” Lauren’s sentiments are valid. While some students thrive with AI’s push, others remain skeptical and choose to push back. Both responses are acceptable in our classrooms.
By shifting the image of AI from shortcut to scaffold, we encourage students and educators to engage with technology in ways that deepen learning and elevate authorship. Students can question, challenge, and ultimately refine what AI provides.
Similarly, teachers must resist the siren song of instant grading using AI. If we teach our students, throughout their writing lives, that what the grading robot says matters most, then we are teaching them that their audience doesn’t matter. The cycle of students handing in bot-written essays and teachers using AI to grade them is not an educational system either of us hopes to see. Just as we ask students to write first, teachers need to engage with student work before they consider using AI to help them provide feedback.
When we teach our students that their voices matter, their perspectives are unique, and their work is their own, we empower them to push their thinking, defend their ideas, and refine their craft. Even the most advanced technologies cannot rival the authenticity and depth of human authorship. Meaningful integration of AI into our classrooms challenges us to prioritize critical thinking and creativity over mere convenience. It fosters a generation of thoughtful and innovative learners, learners who will go beyond the basics when they find the balance between technology and humanity.