School Climate & Safety Opinion

Handcuffed for Eating Doritos: Schools Shouldn’t Be Test Sites for AI ‘Security’

Why are unproven surveillance tools being piloted in schools?
By J.B. Branch — November 12, 2025

When a bag of Doritos triggers a police response, something in our approach to school safety has gone terribly wrong. Last month in Baltimore, an artificial intelligence system mistook a bag of chips for a gun, leading officers to handcuff a student at gunpoint and force him to his knees. Only after the arrest did they notice the crumpled chip bag.

This was a moment that could have ended in tragedy. Many Black families would have been reminded of Trayvon Martin, who was killed while holding Skittles, or closer to home in Baltimore, Freddie Gray, who died in police custody.

This incident isn’t an anomaly or a glitch. It’s a symptom of a growing, and troubling, trend: schools becoming test sites for unproven AI technology.

From safety plans to surveillance systems

I began my career as a teacher.

I have seen how quickly “safety initiatives” can turn into surveillance. I have witnessed students body-slammed, arrested, and suspended for behavior that should have been met with support. Over time, metal detectors, school resource officers, and now AI-based “threat detection” tools have transformed too many school hallways into something resembling a security checkpoint, not a place of learning.

Security companies promise that AI can remove human bias, and police are expanding their use of AI. But these programs often reproduce and accelerate institutional biases. They are trained on the same biased data that produced well-known policing injustices; they absorb our societal prejudices and encode them into the foundations of their algorithms. The result is systems that sanitize bias by translating it into “just numbers.”

When one digs further into the research behind many of these “unbiased” systems—often billed as fail-safes—their credibility starts to crumble. Gun-detection tools are prone to false positives, especially in varied lighting or crowded spaces. This is alarming for anyone who has set foot in a crowded school hallway during class changes, with students darting left and right.

Meanwhile, facial-recognition systems misidentify people with darker skin at significantly higher rates. Again, this should be concerning from both an equity and a safety standpoint.

It is difficult to challenge these disparities when we treat the biased algorithms that produce them as “neutral.” That was the case in Baltimore, where the developer, Omnilert, insisted the system “functioned as intended.” If pointing guns at a child holding a snack counts as success, we must question what these systems are for.

The opportunity cost of tech-first safety

District leaders are under extraordinary pressure from rising concerns about gun violence, community calls for safer schools, and federal and state grants that make new security technology appealing. But we should ask: Why are unproven surveillance tools being piloted in schools before they are tested anywhere else?

We should be especially wary of these tools being piloted in schools serving Black and Latino students. It continues a long history of perpetuating the myth that children of color are more violent and require enhanced security measures to prevent them from becoming “super predators.”

Schools are supposed to be places where young people learn, feel supported, and make mistakes without fear of criminalization. But every dollar spent on an AI camera system is a dollar not spent on counselors or restorative-justice programs, school safety solutions that have real evidence behind them.

Educators must ask: Why are government grant dollars being eagerly thrown at companies pitching AI counselors, AI therapists, and now AI security systems before districts hire more human counselors and therapists?

Technology cannot build relationships with students. It cannot mediate conflict on the playground. And it cannot heal historical trauma or repair trust. At the end of the day, school staff will always know their students’ quirks and personalities, and will build trust far quicker than any AI system.

A safer path forward

School leadership should treat AI safety tools, and the hype that surrounds them, with a healthy dose of skepticism. Ideally, these systems should be a last resort, and communities should instead prioritize alternatives like increasing school counselors and support staff. Further, any AI security system requires public transparency and community input. There must also be human oversight in real time to review life-threatening alerts, assess bias and civil rights implications, and follow clear protocols to prevent unnecessary escalation. In short, communities must demand responsibility before being sold unproven technology.

This isn’t a story about a malfunctioning camera, a single bad decision, or one school. It’s about the moral cost of turning the schoolyard into a marketplace for security companies and treating students, especially students of color, not as learners but as threats.

Schools keep children safe not through algorithms but through adults who know them, support them, and believe in their potential.

When technology becomes a substitute for human empathy and accountability, we are not protecting students. We are abandoning them.
