Special Report

No, AI Won’t Destroy Education. But We Should Be Skeptical

Artificial intelligence is a reminder of the importance of teaching students how to learn
By Lauraine Langreo — August 31, 2023

Type “artificial intelligence news” into any search engine and you’ll likely see the same doomsday headlines I’ve seen over the past few months: “How Could A.I. Destroy Humanity?”; “Goldman Sachs Predicts 300 Million Jobs Will Be Lost or Degraded by Artificial Intelligence”; and “The AI-Powered, Totally Autonomous Future of War Is Here.”

Even search engines’ AI-powered autocomplete features finish the sentence “Will artificial intelligence …” with similar apocalyptic endings: … replace humans? … take away jobs? … take over the world?

Artificial intelligence is not new. Computer scientists have been improving the technology for decades, and many of the tools we use daily—navigation apps, facial recognition, social media, voice assistants, search engines, smartwatches—run on AI. Beyond that, most—if not all—industries already use AI in one way or another. It’s in health care, transportation, the military, finance, telecommunications, drug research, education, and more.

But since the arrival of ChatGPT almost a year ago, AI has captivated the public’s attention and reignited discussions about how it could transform the world. In the K-12 space, educators have been debating what role, and how large a role, AI should play in instruction, especially as AI experts say today’s students need to learn how to use it effectively to be successful in future jobs.

Right now, educators are unsure what to make of AI’s superpowers. When the EdWeek Research Center asked a nationally representative group of 1,301 educators over the summer what they thought the impact of AI would be on teaching and learning in their school or district over the next five years, 49 percent said AI will have an “equally negative and positive” impact, 28 percent said “mostly negative,” 13 percent said “mostly positive,” and 10 percent said “no impact.”

About This Project

This story is part of a special project called Big Ideas in which EdWeek reporters ask hard questions about K-12 education’s biggest challenges and offer insights based on their extensive coverage and expertise.

Even with that healthy dose of skepticism, several educators have told me that schools need to accept that ChatGPT and other AI tools like it are here to stay. Schools, they argue, need to find ways to use the technology for the benefit of teaching and learning while staying aware of its potential downsides.

“AI is calling for a fundamental reevaluation” of what the goal of education is, said Chad Towarnicki, an 8th grade English teacher in the 4,800-student Wissahickon school district in Pennsylvania.

What’s different with the arrival of ChatGPT

AI technologies mimic human thinking by training computer systems to do tasks that simulate some of what the human brain can do. They rely on systems that can actually learn, usually by analyzing vast quantities of data and searching out patterns and relationships. These systems can improve over time, becoming more capable and accurate as they take in more information. (Or they can become less accurate if they’re learning from faulty data.)
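
To make the idea of “learning from data” concrete, here is a minimal, illustrative sketch in Python using the scikit-learn library. The helper function, synthetic dataset, and numbers are invented for illustration; this is not how ChatGPT or any particular product is built. It simply shows the pattern described above: a simple model typically gets more accurate when it is trained on more examples, and less accurate when a large share of its training data is faulty.

    from sklearn.datasets import make_classification
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import accuracy_score
    from sklearn.model_selection import train_test_split
    import numpy as np

    def train_and_score(n_samples, label_noise=0.0):
        # Generate a synthetic yes/no classification dataset of the requested size.
        X, y = make_classification(n_samples=n_samples, n_features=20, random_state=0)
        X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
        # Simulate "faulty data" by flipping a fraction of the training labels.
        rng = np.random.default_rng(0)
        flip = rng.random(len(y_train)) < label_noise
        y_train = np.where(flip, 1 - y_train, y_train)
        # Train a simple model and measure its accuracy on data it hasn't seen.
        model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
        return accuracy_score(y_test, model.predict(X_test))

    print("trained on a small dataset:  ", round(train_and_score(200), 3))
    print("trained on more data:        ", round(train_and_score(5000), 3))
    print("trained on more, faulty data:", round(train_and_score(5000, label_noise=0.4), 3))

In most runs, the three printed accuracy scores illustrate exactly that trade-off: more data helps, and corrupted data hurts.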

ChatGPT is an AI-powered tool from the research laboratory OpenAI that can hold humanlike conversations and instantly answer seemingly any prompt. And instead of learning a computer programming language to talk to the chatbot, people can simply communicate with it in their natural language.

“People began to realize something is different,” said Glenn Kleiman, a senior adviser at the Stanford Graduate School of Education whose research focuses on the potential of AI to enhance teaching and learning. “Suddenly, the capabilities became available to everybody and easily accessible.”

Now, people are using AI to plan trips, draft emails, organize essays, summarize research papers, and write code. In K-12 classrooms, teachers have used ChatGPT to plan lessons, put together rubrics, provide students feedback on assignments, respond to parent emails, and write letters of recommendation.

It’s easy to get wrapped up in the hype surrounding the transformative powers of this next generation of AI—many technology CEOs have been quick to talk up its groundbreaking potential. With its new capabilities, AI can become our co-author, co-pilot, or personal assistant. Sam Altman, the CEO of OpenAI, believes the technology will help people become way more efficient and productive at their jobs. He sees it as an engine for new job creation.

Doomsday scenarios aren’t likely but ‘not impossible’

But many people are also raising cautionary flags about generative AI. Thousands of executives, researchers, and engineers who work in the AI field have sounded the alarm more than once in recent months that AI poses a “risk of extinction” of the human race and have called for a moratorium on its development. Even Altman said he’s a “little bit scared” of AI and conceded that it will eliminate many jobs.

What happens when the superpowers of AI fall into the wrong hands? Or what if militaries around the world—which already have some autonomous weapons—give in to competitive pressure to build ever more sophisticated autonomous weapons, to the point where those weapons become uncontrollable and unpredictable?

Hal Abelson, a professor and researcher of computer science and artificial intelligence at the Massachusetts Institute of Technology, told me that while many of the doomsday scenarios we hear about in the media about AI aren’t likely, they’re also “not impossible.”

And beyond those scenarios, “there’s a whole long list of concerns,” Abelson said. “We don’t even know what they are yet because this is merely just starting.”

Generative AI tools are trained at a fixed point in time, and the datasets they are trained on are not updated regularly, so these tools can provide outdated information or fabricate facts when asked about events that occurred after they were trained. For instance, the free version of ChatGPT wasn’t trained on events and information from after 2021.

We’re at a point in time where “it’s very hard to identify what’s false, and a lot of people believe it,” Abelson said. “Does that mean that as a society we are no longer even aware that there’s such a thing as [objective] truth? What does that mean for our society?”

And because the datasets that AI tools are trained on contain racist, sexist, homophobic, violent, and otherwise biased material, those biases show up in the responses the tools generate. In fact, when you first log into ChatGPT, it warns you that it may “occasionally generate incorrect or misleading information and produce offensive or biased content.”

These AI tools—if left unchecked—could amplify harmful stereotypes about people who are already disadvantaged, according to Yeshimabeit Milner, the founder and CEO of Data for Black Lives, a nonprofit organization whose mission is to use data and technology to improve the lives of Black people.

To combat the inaccuracies that come with using these AI models, some education organizations are focusing on a version of the technology some are calling “walled-garden AI.” A walled-garden AI is trained only on content vetted by its creator, rather than on the unchecked content scattered across the internet. One example is Stretch, a (not-yet-publicly-available) chatbot trained only on information that was created or vetted by the International Society for Technology in Education and the Association for Supervision and Curriculum Development. There’s also Khanmigo, a chatbot developed by the nonprofit Khan Academy that acts like a tutor.

These more focused bots could be an excellent model for K-12 schools to use because they’re (theoretically) more tailored to the needs of educators and students. But some experts warn that these models will still have to work to keep bad information out.

What students need to succeed in an AI-powered world

With all that in mind, it’s imperative for the K-12 system to prepare students to be successful in the age of AI.

ChatGPT has made it “painfully obvious that teaching the old ways and teaching the old curriculum is going to be out of date,” said Hadi Partovi, the CEO of Code.org, a nonprofit organization dedicated to expanding access to computer science education in schools. “How we work is going to change, and it also means how we prepare students for living in a digital world is going to change.”

To prepare for a future where AI is everywhere and in everything, students will still need foundational skills in reading, math, science, and history. But schools will also need to be more explicit about teaching students how to learn, rather than just what to learn, because that will help them become much better problem-solvers.

“We need [education] to evolve for a world of lifelong learning,” Partovi said. “And knowing that in every job and every year you’re going to be learning new things using digital access to information and AI tools to help you along the way. That’s really a different format of learning than what most of us learned in K-12.”

Students will need to examine information with a critical and skeptical lens. If a chatbot says “something that sounds fishy, students should be able to say, ‘Well, maybe it’s not true,’” Abelson said. “That’s a skill that everybody’s going to need, as these AI systems permeate the environment.”

Students will also need to learn how to use AI as a tool, as an assistant and an adviser, in order for them to be better decisionmakers. Schools already teach the importance of teamwork and collaboration among students, but “the tweak I would make to that is that teams now should include some computer programs and some people,” said Tom Mitchell, a professor and researcher in machine learning and artificial intelligence at Carnegie Mellon University.

AI should be high up on school districts’ priority lists

The K-12 education system tends to be a slow-moving bureaucratic machine that doesn’t respond quickly to change. In the field, change is careful and methodical, and some would argue that’s a good thing because it keeps schools from jumping on every trendy idea or bandwagon that comes along.

And, sure, K-12 education has a lot on its plate right now. Staffing shortages have worsened. Student academic achievement and mental health have plummeted. Staff morale and student motivation are low in many schools across the country.

But this isn’t the first time the education system has had to deal with big changes. The most recent example is the pandemic, when schools had to suddenly switch to remote learning. District leaders had to adapt quickly to attend to the needs of their students and staff.

This moment with AI shouldn’t be any different. Every day, there are new, groundbreaking developments in AI. It is in our faces, forcing all of us—particularly schools—to take a hard look at what education in the age of AI should look like and what it shouldn’t. And schools need to do this now, before they fall further behind and risk leaving kids unprepared for their futures.

Think about it. Big existential questions are already being raised about the role of AI in education, such as: Does the use of AI defeat the purpose of education?

The answer is a resounding no—at least from me and the people I talked to. In fact, everyone I interviewed argued that the existence of AI makes the purpose of education even clearer: to learn how to learn in an ever-changing, increasingly complex world.

“If we’re questioning the whole point of education, then it’s like we’re just sitting back and letting AI take control of everything instead of being the ones that are able to control it,” said Stephanie Harbulak, the curriculum, instruction, and assessment specialist at Meeker & Wright Special Education Cooperative in Minnesota. “Education doesn’t go away. It just needs to change.”

A version of this article appeared in the September 13, 2023 edition of Education Week as No, AI Won’t Destroy Education. But We Should Be Skeptical
