Artificial Intelligence Opinion

AI Is Trained to Avoid These 3 Words That Are Essential to Learning

What type of “thinking” do chatbots model for students?
By Sam Wineburg & Nadav Ziv — October 15, 2025 | 5 min read

AI chatbots are like students who don’t do the reading and raise their hand anyway.

A new paper from researchers at OpenAI, the company behind ChatGPT, finds that a major reason AI hallucinates is that large language models are engineered to eliminate uncertainty. An AI chatbot that says “I don’t know” gets the same training score as one that offers incorrect information. As a result, AI provides confident-sounding answers that are frequently wrong. Many people are swayed by AI’s display of certainty, conflating the presentation of information with its quality.
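To see why that scoring pushes models toward guessing, consider a schematic sketch of binary accuracy grading (our illustration, not code from the OpenAI paper): a wrong answer and an honest “I don’t know” both score zero, while a guess occasionally scores a point, so a model optimized on this metric has nothing to lose by guessing.

# A schematic illustration of binary accuracy grading, not the paper's actual code.
def grade(response, correct_answer):
    # A correct answer earns 1 point; everything else, including an abstention, earns 0.
    return 1 if response == correct_answer else 0

print(grade("Paris", "Paris"))          # 1
print(grade("Lyon", "Paris"))           # 0
print(grade("I don't know", "Paris"))   # 0: honesty scores no better than a wrong guess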

Education should teach students to grapple with complexity. AI is designed to avoid it. This mismatch is yet another reason to slow the rush to put AI in students’ hands. It also points to a question teachers should ask themselves when evaluating how to integrate AI into the classroom: Can I use this tool in a way that models the thinking I want to teach my students?

For more than two decades, the Digital Inquiry Group and its earlier iteration at Stanford University have created curricula that teach students to read and think like historians, centered on document-based inquiry. After a 2016 study showed that young people struggle to evaluate online sources, our research group developed a curriculum to help students separate fact from fiction on the internet.

What historical thinking and online reasoning share is the imperative to look beyond the surface of information and instead seek a broad context before diving in. Information doesn’t come out of nowhere: It’s authored by someone, somewhere, for some purpose. These considerations are essential in deciding what to trust.

Historians approach a document by sourcing it, glancing briefly at its contents before darting to the bottom to ponder its date, author, and relationship to the events it describes. These crucial details frame historians’ subsequent reading.

Similarly, when we studied how professional fact-checkers at the nation’s leading news outlets approach an unfamiliar website, we noticed that they almost immediately opened new tabs and read laterally to gain context. To investigate an unfamiliar digital text, a savvy reader, paradoxically, first needs to leave it.

The approach of both historians and fact-checkers differs from how students interact with historical texts, social media posts, and, more recently, AI chatbot responses.

Many students see historical documents as vessels of information, not altogether different from their textbook, and are oblivious to authorship and inattentive to historical context. Likewise, students and adults alike are often swayed by the appearance of a video on social media or the authoritative tone of a website’s About page. Our most recent data at the Digital Inquiry Group suggest the same pattern may be emerging when it comes to AI—and that the chatbot’s confident tone is a likely culprit.

In a pilot study in which students used AI to search the internet, we asked them to evaluate an answer from ChatGPT that failed to cite sources. “It touches on everything you could think of to ask,” said one student. “It gave a detailed response,” wrote another. We don’t want students to approach any source as the sole arbiter of truth, be it a traditional textbook or a shiny new chatbot. We never want them to rely on fabricated citations when they can’t find a source to support their argument—something that even professionals have been caught doing.

What we want is for students to weigh evidence. To recognize the challenging, fascinating, and rewarding process of piecing together a coherent account from multiple sources. To learn that admitting what we don’t know is its own achievement, its own unique form of knowledge.

And this is what concerns us about the findings of OpenAI’s researchers. Chatbots, the researchers admit, are programmed to provide authoritative responses to complex, thorny, and often unanswerable questions. The companies designing chatbots disincentivize the very expression of uncertainty that is so crucial in the classroom.

We shouldn’t hold our breath waiting for AI companies to fix their models. The good news is that AI is malleable enough, and individual users capable enough, that educators can take immediate action to apply lessons we already have about good thinking, good research, and good education.

Take, for example, an AI response that doesn’t cite sources. Even a few hours of instruction can get students to pay more attention to where information comes from. Our research group is now experimenting with teaching students how to prompt a chatbot to cite its sources. Our goal is to nudge students to be skeptical of AI answers pulled from Reddit threads and random blog posts and instead direct the model to sources that reflect subject-matter expertise. Students need to see their interaction with a chatbot as a process, much like knowledge creation, rather than a one-and-done exchange.
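One version of such a prompt (a hypothetical example, not the exact wording from our lessons) might read: “Cite the sources you drew on, with links, and favor work by subject-matter experts over forums and blog posts.”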

Despite its flaws, AI can serve as a powerful contextualization portal. But only if the people using it recognize how fallible it can be and how much we still don’t know about how it works, and learn how to prompt it so that it produces quality responses.

Information expert Mike Caulfield, for example, has illustrated how asking a chatbot to weigh the evidence for and against a claim can produce significantly better responses than just asking for a simple answer. With a more specific prompt, chatbots will often include qualifications about expert disagreement or lack of scholarly consensus.
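In practice, the shift can be as small as replacing “Is this claim true?” with something like “What is the evidence for and against this claim, and where do experts disagree?”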

Good educators don’t punish their students for uncertainty. And that means good educators should be cautious about placing a technology in students’ hands that’s trained to avoid saying “I don’t know.”

Our research group has long advocated digital literacy instruction and criticized approaches that tell students to stay away from search engines and shelter in the safety of peer-reviewed databases. But there is a difference between teaching students how to drive safely and throwing them into a Formula 1 race car before they have a license. Too much AI instruction right now looks like the latter. When it comes to AI in schools, all of us need a dose of humility that AI, at least for now, clearly lacks.
