Opinion

I’m Not Worried That AI Helps My Students Cheat. I’m Worried About How It Makes Them Feel

Schools are asking the wrong question about artificial intelligence
By Stan Williams — February 12, 2026 4 min read
Photo illustration of high school students with pixelated headshots masking their faces.

I recently displayed a photo to a 9th grade humanities class and asked what should have been an easy question: “Is this real?”

The students voted, and half of them believed the image was AI-generated. It was not. It was a photo I had taken at a local Vermont beach.

I tried again, this time with a photo of my cat, Rafiki. The results were even more stark. Almost all the students thought it was an AI-generated image. Nope. It was real, taken in my kitchen.

When we talk about artificial intelligence in schools, we usually focus on cheating and plagiarism. But what I saw that day wasn’t about academic integrity. It was about trust.

When students can’t be sure whether a picture of their teacher’s cat is real, we are facing something much bigger than a student using ChatGPT on a history assignment. We are facing a world where certainty itself feels unstable and school suddenly feels like just another place where students aren’t sure they can trust the version of reality being offered.

For most of modern schooling, facts might get debated, but we didn’t question whether they existed. We argued over interpretation and meaning, not whether the basic thing in front of us was real. Our shared reality was the starting point. You could trust your eyes.

Many of us grew up inside that shared reality. I remember the World Book Encyclopedia. If someone had the “G” volume, you waited your turn before starting your report on Greenland or germs. Once you got the book, you trusted it. Information was scarce, and that book represented a version of the truth we generally agreed on, even when it was flawed and incomplete. It still gave us a shared starting point.

Today, that common floor has dropped out. Our students don’t have that shared reality. They exist within a digital feed that doesn’t stop and never seems to agree with itself. This erosion isn’t new. The internet had already made it easier to question everything and trust nothing, but the widespread use of generative AI has accelerated it dramatically.

Truth used to feel like something we could find if we searched hard enough, but that’s not the experience our students are having. In fact, the more they search, the harder it can become to tell fact from fiction. In this new AI landscape, we’re all sorting through endless versions of reality and deciding which one we’re willing to live with. It’s exhausting, and it’s exactly what I saw in my students when they looked at those photos.

This isn’t just an AI problem. If you can’t trust an image of a cat in a kitchen, it becomes harder to trust the larger promises society makes about the future. Gen Z financial commentator Kyla Scanlon calls this the “end of predictable progress.” For decades, the path was clear for many of us. You went to school, got a job, and eventually bought a house. But that path has dissolved into the same fog as AI-generated images.

Our students feel this instability everywhere. They are told AI may replace careers before they even start them. They see a housing market that feels permanently closed. They live in what Scanlon calls a “casino economy,” where a viral moment can feel more valuable than years of steady work.

The version of learning we’re offering no longer matches the world our students are trying to survive. When even a teacher’s photo doesn’t feel stable, the old model of school cracks. If students are taught to question every headline and doubt every promise of the future, why would they walk into a classroom and trust us? School can start to feel like just another simulation, a game of compliance disconnected from the physical world they actually have to navigate.

School is too important to be a game. We have to stop asking the small questions. We spend so much time debating whether AI can do a student’s work, but the students are stuck on much more existential questions. They are trying to figure out if the work still matters, if school still matters, and honestly, if they still matter.

If school is going to mean anything in this world, maybe it’s time to shift from learning as a way to prepare for the future to learning as a way to understand and change the present. Students are demanding relevance. We can’t just hand them information anymore, or tell them to trust us that what we’re teaching them today will matter in the future. We have to give them work that carries real and immediate consequence.

We need students creating things they can touch and solving problems in their own schools and neighborhoods that won’t get fixed unless they are there to do it. We need them grappling with what it means to be human, what it means to be needed, to be necessary. AI can write a report, but it can’t stand in the cold Vermont snow to help a neighbor. It can’t make students feel like they matter. That’s what will actually make school feel real.

With AI reshaping everything we see, showing our students how we live with uncertainty may be the most honest thing we can do. We have to stop pretending we have the answers and start making our own questions visible. When a source feels unreliable, we should think out loud. We need to model how we weigh evidence and how we decide what actually deserves our trust. No handbook or district policy can do that for us.

Trust is built by showing up for a student day after day. A chatbot can generate a perfect answer, but it can’t recognize the moment when a teenager finally starts to understand who they are, and it can’t understand what it takes to keep showing up when everything feels uncertain. That is human work, and it is where teachers matter more than ever.

The goal is no longer just to teach the curriculum. The goal is to give students something real.
