Classroom Technology

More Teachers Are Using AI-Detection Tools. Here’s Why That Might Be a Problem

By Arianna Prothero — April 05, 2024 7 min read
Close-up stock photograph of a woman’s hand on a touchscreen monitor displaying responses from an AI chatbot.

As ChatGPT and similar technologies have gained prominence in middle and high school classrooms, so, too, have AI-detection tools. The majority of teachers have used an AI-detection program to assess whether a student’s work was completed with the assistance of generative AI, according to a new survey of educators by the Center for Democracy & Technology. And students are increasingly getting disciplined for using generative AI.

But while detection software can help overwhelmed teachers feel like they are staying one step ahead of their students, there is a catch: AI detection tools are imperfect, said Victor Lee, an associate professor of learning sciences and technology design and STEM education at the Stanford Graduate School of Education.

“They are fallible, you can work around them,” he said. “And there is a serious harm risk associated in that an incorrect accusation is a very serious accusation to make.”

A false positive from an AI-detection tool is a scary prospect for many students, said Soumil Goyal, a senior at an International Baccalaureate high school in Houston.

“For example, my teacher might say, ‘In my previous class I had six students come up through the AI-detection test,’” he said, although he’s unsure if this is true or if his teachers might be using this as a scare tactic. “If I was ever faced with a teacher, and in his mind he is 100 percent certain that I did use AI even though I didn’t, that’s a tough scenario. [...] It can be very harmful to the student.”

Schools are adapting to growing AI use but concerns remain

In general, the survey by the Center for Democracy & Technology, a nonprofit organization that aims to shape technology policy, with an emphasis on protecting consumer rights, finds that generative AI products are becoming more a part of teachers’ and students’ daily lives, and schools are adjusting to that new reality. The survey included a nationally representative sample of 460 6th through 12th grade public school teachers in December of last year.

Most teachers—59 percent—believe their students are using generative AI products for school purposes. Meanwhile, 83 percent of teachers say they have used ChatGPT or similar products for personal or school use, representing a 32 percentage point increase since the Center for Democracy & Technology surveyed teachers last year.

The survey also found that schools are adapting to this new technology. More than 8 in 10 teachers say their schools now have policies outlining whether generative AI tools are permitted or banned, and that they have received training on those policies, a drastic change from last year, when many schools were still scrambling to figure out a response to a technology that can write essays and solve complex math problems for students.

And nearly three-quarters of teachers say their schools have asked them for input on developing policies and procedures around students’ use of generative AI.

Overall, teachers gave their schools good marks when it comes to responding to the challenges created by students using generative AI—73 percent of teachers said their school and district are doing a good job.

That’s the good news, but the survey data reveals some troubling trends as well.

Far fewer teachers report receiving training on appropriate student use of AI and how teachers should respond if they think students are abusing the technology.

  • Twenty-eight percent of teachers said they have received guidance on how to respond if they think a student is using ChatGPT;
  • Thirty-seven percent said they have received guidance on what responsible student use of generative AI technologies looks like;
  • Thirty-seven percent also say they have received guidance on how to detect whether students are using generative AI in their school assignments;
  • And 78 percent said their school sanctions the use of AI detection tools.

Only a quarter of teachers said they are “very effective” at discerning whether assignments were written by their students or by an AI tool. Half of teachers say generative AI has made them more distrustful that students’ schoolwork is actually their own.

A lack of training coupled with a lack of faith in students’ work products may explain why teachers are reporting that students are increasingly being punished for using generative AI in their assignments, even as schools are permitting more student use of AI, the report said.

Taken together, this makes the fact that so many teachers are using AI detection software—68 percent, up substantially from last year—concerning, the report said.

“Teachers are becoming reliant on AI content-detection tools, which is problematic given that research shows these tools are not consistently effective at differentiating between AI-generated and human-written text,” the report said. “This is especially concerning given the concurrent increase in student disciplinary action.”

Simply confronting students with the accusation that they used AI can lead to punishment, the report found. Forty percent of teachers said that a student got in trouble for how they reacted when a teacher or principal approached them about misusing AI.

What role should AI detectors play in schools’ fight against cheating?

Schools should critically examine the role of AI-detection software in policing students’ use of generative AI, said Lee, the professor from Stanford.

“The comfort level we have about what is an acceptable error rate is a loaded question—would we accept one percent of students being incorrectly labeled or accused? That’s still a lot of students,” he said.
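To make Lee’s question concrete, here is a hypothetical back-of-envelope calculation. The rates below are illustrative assumptions, not figures from the survey: a detector that wrongly flags 1 percent of honest work, applied to 1,000 students of whom 100 actually used AI.

```python
# Illustrative base-rate math for a hypothetical AI detector.
# None of these numbers come from the CDT survey; they are assumptions.
students = 1000
actual_ai_users = 100
false_positive_rate = 0.01   # share of honest work flagged as AI
true_positive_rate = 0.90    # share of AI-assisted work correctly flagged

honest_students = students - actual_ai_users
false_accusations = honest_students * false_positive_rate
true_detections = actual_ai_users * true_positive_rate

# Of all students the tool flags, what share is wrongly accused?
share_wrong = false_accusations / (false_accusations + true_detections)
print(f"{false_accusations:.0f} honest students falsely flagged")
print(f"{share_wrong:.1%} of flagged students are innocent")
```

Even a seemingly small 1 percent false positive rate flags nine innocent students in this scenario, and roughly one in eleven accusations lands on a student who did nothing wrong, which is the arithmetic behind Lee’s point that “one percent is still a lot of students.”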

A false accusation could carry wide-ranging consequences.

“It could put a label on a student that could have longer term effects on the students’ standing or disciplinary record,” he said. “It could also alienate them from school, because if it was not AI produced text, and they wrote it and were told it’s bad, that is not a very affirming message.”

Additionally, some research has found that AI detection tools are more likely to falsely identify English learners’ writing as produced by AI.

Low-income students may also be more likely to get in trouble for using AI, the CDT report said, because they are more likely to use school-issued devices. Nearly half the teachers in the survey agree that students who use school-provided devices are more likely to get in trouble for using generative AI.

The report notes that students in special education use generative AI more often than their peers, and that special education teachers are more likely to say they use AI-detection tools regularly.

Research is also finding that there are ways to trick AI-detection systems, Lee said. And schools need to think about the tradeoffs in time and resources of keeping abreast of inevitable developments in AI, in AI-detection tools, and in students’ skills at getting around those tools.

Lee said he sees why detection tools would be attractive to overwhelmed teachers. But he doesn’t think an AI-detection tool’s verdict should alone determine whether a student is improperly using AI to do their schoolwork. Instead, it could serve as one data point among several used to determine whether a student has broken rules that should be clearly defined.

In Poland, Maine, Shawn Vincent is the principal of Bruce Whittier Middle School, which serves about 200 students. He said he hasn’t had many problems with students using generative AI programs to cheat. His teachers have used AI-detection tools as a check on their gut instincts when they suspect a student has improperly used generative AI.

“For example, we had a teacher recently who had students writing paragraphs about Supreme Court cases, and a student used AI to generate answers to the questions,” he said. “For her, it did not match what she had seen from the student in the past, so she went online to use one of the tools that are available to check for AI usage. That’s what she used as her decider.”

When the teacher approached the student, Vincent said, the student admitted to using a generative AI tool to write the answers.

Teachers are also meeting the challenge by changing their approaches to assigning schoolwork, such as requiring students to write essays by hand in class, Vincent said. And although he’s unsure about how to formulate policies to address students’ AI use, he wants to approach the issue first as a learning opportunity.

“These are middle school kids. They are learning about a lot of things this time in their life. So we try to use it as an educational opportunity,” he said. “I think we are all learning about AI together.”

Speaking from a robotics competition, Goyal, the high school student from Houston, said that he and his friends sometimes trade ideas for tricking AI-detection systems, although he said he doesn’t use ChatGPT to do the bulk of his assignments. When he uses it, it’s to generate ideas or check grammar, he said.

Goyal, who wants to work in robotics when he graduates from college, worries that some of his teachers don’t really understand how AI detection tools work and that they may be putting too much trust in the technology.

“The school systems should educate their teachers that their AI-detection tool is not a plagiarism detector [...] that can give you a direct link to what was plagiarized from,” he said. “It’s also a little bit like a hypocrisy: The teachers will say: Don’t use AI because it is very inaccurate and it will make up things. But then they use AI to detect AI.”

