
AI in the Classroom: What a Skeptic and an Optimist Can Both Agree On

4 steps to embrace chatbots—with the right guardrails
By Pedro A. Noguera & Enrique Noguera — January 15, 2026 5 min read

Artificial intelligence has already entered K–12 classrooms, whether schools are ready for it or not. From lesson planning and grading to essay writing and research, AI tools are changing how teachers teach and how students learn. Some educators see endless possibilities for innovation, while others worry that these same tools could weaken students’ ability to think critically, write clearly, and solve problems independently.

As an educator who trains future teachers (Pedro) and one who works with community college students transitioning from high school (Enrique), we find ourselves both hopeful and uneasy about AI’s potential. Uncle and nephew, members of two generations, we are bullish and bearish at the same time, and we believe this tension reflects where K–12 education stands right now.

The Skeptic’s View (Pedro, the uncle)

AI may be impressive, but we risk letting it replace the very cognitive and social skills that schools are meant to cultivate. Many teachers already report that students use ChatGPT to finish assignments without reading the material or developing their own ideas. By outsourcing the work, these students skip the thinking, drafting, and revising that are essential parts of learning.

New research from the University of Southern California’s Center for Generative AI and Society suggests that convenience-driven technologies like generative AI can erode essential skills. Many people can no longer navigate without GPS or do even simple arithmetic in their heads. Why assume AI will be different?

AI promises efficiency, but in education, efficiency can come at a cost. When students rely on algorithms to generate answers, they lose the opportunity to wrestle with ideas, make mistakes, and build understanding through effort. Teachers know that deep learning often happens in moments of confusion or struggle.


That’s why schools should move cautiously before fully integrating AI tools. Instead of banning them outright or embracing them wholesale, educators should design assessments that ensure students can demonstrate original thought. Oral presentations, Socratic seminars, and project-based learning allow teachers to see what students know and how they think. These methods preserve the human elements of curiosity, originality, and critical reasoning—qualities no machine can replicate.

If we’re not intentional, the “efficiency” AI offers could hollow out the learning process itself.

The Optimist’s View (Enrique, the nephew)

I share the concern about shortcuts, but I’ve also seen how AI can enhance learning when used thoughtfully. In my research on community college faculty members’ use of generative AI, I found that teachers who frame AI as a “cognitive companion” rather than a replacement see meaningful gains in student engagement and reflection. I believe that insight applies across grade levels.

For example, some teachers ask students to write an essay draft, use AI to revise it, and then submit a reflection explaining what changed and why. Others have students use AI to brainstorm ideas, generate questions, or fact-check responses. These strategies don’t weaken critical thinking; they strengthen it. Students learn to analyze, critique, and improve their work while developing awareness of how AI tools function.

What makes this moment promising is that many K–12 teachers are already experimenting on their own. They’re discovering ways to integrate AI to personalize instruction, support English learners, and provide real-time feedback. But most are doing this without much institutional guidance or professional development.

That’s why I’ve developed a framework I call AI Pedagogical Literacy, a practical approach that helps educators understand how to integrate AI responsibly. It’s not about teaching students how to use AI tools; it’s about helping teachers design learning experiences where AI amplifies, rather than replaces, human thought. This means knowing when to use AI, how to verify its output, and how to keep human reasoning at the center of every task.

Rather than fearing AI, we should prepare teachers and students to use it wisely. The real danger isn’t the technology itself; it’s a lack of guidance and support.

Finding Common Ground

AI is here to stay, and ignoring it won’t make it go away. But integrating it into the classroom without creating guardrails could lead to a nightmare scenario where teachers defer to technology and students stop thinking for themselves. The path forward lies between fear and blind enthusiasm.

We agree on several steps K–12 schools should take right now:

  • Invest in teacher training. Every educator should understand how AI works, its limitations, and its ethical implications. Teachers who are comfortable with AI will be better positioned to guide students in using it responsibly.
  • Design authentic assessments. When assignments require oral defense, collaboration, or reflection, students must engage deeply with the material—AI can’t do that for them. Research suggests that assessment designs emphasizing higher-order thinking (analysis, evaluation, creation) are more resistant to AI misuse than traditional recall tasks.
  • Teach dual literacies. Students need both traditional skills, like writing and problem-solving, and new literacies, like verifying sources, detecting bias, and prompting AI effectively.
  • Foster collaboration between educators. Schools should create spaces for teachers to share AI-integrated lesson plans and discuss what works, avoiding fragmented experimentation.

We may differ in how quickly schools should adopt AI, but we share a conviction that educators, not algorithms, must determine what learning should look like in an AI age.

AI’s promise and peril are two sides of the same coin. Used carelessly, it can erode student effort and grit. Used wisely, it can open new pathways for creativity and deeper understanding. The question for K–12 educators is not whether to use AI but how to ensure it strengthens rather than diminishes what makes learning human.
