Facial-Recognition Systems Pitched as School-Safety Solutions, Raising Alarms
On the heels of horrific school shootings and mounting concerns about student safety, technology companies and K-12 leaders are starting to consider a new strategy: facial recognition.
In Arkansas and New York, districts already have laid out significant investments for systems that promise to combine surveillance cameras with machine learning algorithms to identify people, objects, and even behaviors that could present safety threats.
Consumer companies such as Face-Six, Suspect Technologies, and FaceFirst—most of which initially deployed their technologies in law enforcement, public safety, and retail settings—are exploring the education market, from day-care centers to universities.
And on Tuesday, online streaming giant RealNetworks released new facial-recognition software called SAFR (pronounced “safer”) that it says is freely available, effective immediately, and can be used with existing security-camera installations.
“School safety has become one of the top national issues in the United States in 2018,” said Rob Glaser, RealNetworks’ chairman and CEO, in a press release. “We are proud to give our … technology solution to every school in America.”
The blitz has begun even as a wide range of school security experts, privacy advocates, and technology leaders are issuing sharp warnings about facial recognition’s potential downsides.
Trying to prevent school shootings, deny campus access to parents who have been banned, and monitor students inside the school building are different safety challenges that require different approaches, experts say.
Even if facial recognition can be deployed in a manner that is well-aligned to such a challenge, there remain significant questions about its effectiveness and accuracy, particularly when it comes to identifying people of color.
And then there are the massive concerns about potential privacy and civil-liberties violations. Just last week, for example, Microsoft President Brad Smith called for government regulation of facial recognition, writing that the tools now entering the market raise “issues that go to the heart of fundamental human rights protections like privacy and freedom of expression.”
Add it all up, and K-12 leaders and policymakers should think twice before looking to facial recognition, said school-safety consultant Kenneth Trump.
“There’s a strong drive by the security industry to push their products under a target-hardening approach to school safety,” Trump said. “But is this the best use of time, energy, and resources right now? My answer is ‘no.’”
From Consumers to Schools
In an interview, Glaser of RealNetworks said he shares some critics’ concerns. But rather than put their heads in the sand, he said, schools and companies should start working together now to figure out the best uses—and unresolved challenges—of facial recognition.
“We didn’t want to be overly cautious in how we entered the market,” Glaser said.
“You can say, ‘This hasn’t all been figured out yet, so let’s not start the deployment-slash-experimentation yet.’ Or you can say, ‘The people who care about this issue the most are exactly the ones who should engage up front.’”
Like many other facial-recognition companies circling the K-12 market, RealNetworks’ primary base is elsewhere.
Since its founding in 1994, the company has built a steady stream of consumer and commercial digital media applications. One of the more popular is RealTimes, a mobile app that creates slideshows out of the photos and videos on users’ smartphones.
The app’s popularity also had a spinoff benefit, Glaser said: It provided the company with a database of 20 million faces, half of which RealNetworks has been able to use as a “training set” to teach machine-learning algorithms how to recognize individual faces.
In late 2016 and early 2017, Glaser said, his team started to realize the power of the algorithms they were developing. They started thinking about potential commercial applications.
Then, last December, Glaser was dropping his own children off at the independent University Child Development School in Seattle. He noticed the school’s new security system, which involved a gate and a security camera intended to better control access to the campus.
“I went to the head of school and said, ‘My company is working on a new technology, do you want to see if we can help you out here?’” Glaser said.
The school, concerned about the increasing density and traffic of the surrounding neighborhood, agreed. RealNetworks designed a system that let staff members, parents, and regular visitors opt to have their images and identities stored in a database. During the school day, when the gate to UCDS’s campus is locked, the security camera scans the faces of those who want to enter. The company’s algorithms scan the visitors’ images, looking for matches to those in the database.
At first, the software was just a supplement: If it found a match, SAFR would alert a school staff person, who would manually open the gate.
But over a couple of months, the school community grew comfortable with the algorithms’ accuracy—and appreciative of the increased convenience the system afforded.
Now, the software can unlock the gate itself. Almost every staff member, as well as the families of about one-third of the school’s 325 students, has opted in, said Paula Smith, UCDS’s head of school. Such interest helped convince RealNetworks to quickly try to take its tool national.
To spur interest, the company is making its software free. (“We’ll make our money elsewhere,” Glaser said.) The state of Wyoming has already expressed interest.
“There’s a national debate raging on school safety, and it’s very polarizing,” Glaser said. “But we’re here sitting on something that’s not polarizing to this community.”
Can the Technology Prevent School Shootings?
Other facial-recognition companies describe similar enthusiasm.
“The number of concerned administrators, security personnel, and even parents contacting us has grown over the past year,” said FaceFirst CEO Peter Trepp via email. “Some see face recognition as an answer to daily concerns such as identifying expelled students, banned parents, and other unwanted campus visitors. We have also been contacted by family members of children who have been killed in school shootings, in hopes of preventing future tragedies.”
The technology’s ability to achieve the latter objective “depends on the circumstances,” Trepp added.
Consider the February massacre at Marjory Stoneman Douglas High in Parkland, Fla., he wrote. There, expelled student Nikolas Cruz—whose troubled and violent history was well-known to both school and law-enforcement authorities—came onto campus armed with an AR-15 rifle, which he used to kill 17 students and staff members.
“A situation such as this one is tailor-made for face recognition, which can send alerts to on-campus personnel when such an individual appears on campus,” Trepp wrote.
But the reality, said Trump, the school-safety consultant, is that most school shootings involve someone “who is already inside the school on a legitimate basis.” It’s unclear how facial recognition might help in those scenarios.
And trying to prevent a school shooting is hardly the same as enforcing everyday school discipline. That’s what Jeffrey Rabey, the superintendent of New York’s Depew school system, told the Buffalo News he hopes to do with the facial-recognition technology his district is considering.
“If we had a student who committed some type of offense against the code of conduct, we can follow that student throughout the day to see maybe who they interacted with, where they were prior to the incident, where they went after the incident, so forensically we could also use the software in that capacity as well,” Rabey told the News in May.
In general, schools don’t appear to have really thought through their rationales for deploying facial recognition, said Faiza Patel, the co-director of the Liberty & National Security Center at the Brennan Center for Justice.
Nor have they fully considered the nuts-and-bolts details of what such a deployment might entail, Patel said, such as:
- Will surveillance cameras and facial-recognition software be deployed only at an entrance or all over school?
- Will schools use “whitelist” systems, in which the database includes those who are allowed access to a school, or “blacklist” systems, which focus on those who are banned?
- Will they use facial recognition to look only for adults’ identities, or also to examine students’ movements, dress, and behavior?
- And how are schools—where minority children already often face disproportionately severe discipline—accounting for research indicating that most facial-recognition software is better at identifying white men than women, people of color, or children?
“There’s a fantasy element to all this right now,” Patel said. “Districts shouldn’t deploy facial recognition without first figuring out what problem they’re trying to address, whether the technology is a fit, and how they’re going to manage the well-known risks.”
Facial Recognition and Privacy Concerns
And such practical considerations don’t even get to broader concerns around privacy and surveillance, experts said.
It remains unclear what data, exactly, facial-recognition systems will be storing. Some approaches may not even be legal in a place like Florida, where state legislation prohibits public schools from collecting, obtaining, or retaining biometric information on students.
Questions about how, where, and for how long the data from facial-recognition systems will be stored are also likely to be contentious.
And who will have access to the information? Will it be shared with law enforcement agencies or immigration officials? What about third-party vendors? Can companies use the data collected by facial-recognition systems in schools to improve their algorithms for other commercial uses?
Glaser said RealNetworks has already worked through some of those issues. All of the data collected via SAFR is encrypted and stored locally, on district servers, he said. If a government agency requests (or subpoenas) information, RealNetworks will have nothing to share. The company has “architected SAFR for privacy to ensure we don’t have any access to K-12 face data for training the algorithm or any other purpose,” he said.
But significant privacy-related questions remain, he acknowledged, and others are sure to emerge as facial recognition is deployed in more school settings.
And some privacy groups say that issues of technical compliance with laws and best practices should not be the only considerations.
The New York Civil Liberties Union, for example, has been challenging the decision by the Lockport school district to use more than $4 million in state bond money to purchase a new facial-recognition system.
Students and families shouldn’t have to worry that their every move is being monitored, or that their picture might end up in a law-enforcement or immigration database just because they showed up to campus, said Stefanie Coyle, the education counsel at the New York Civil Liberties Union.
And big-picture, Coyle said, schools need to be aware of the ways in which even well-intended technology deployments may contribute to a growing culture of constant surveillance.
“We don’t think facial recognition belongs in schools,” she said. “So many questions are up in the air, and there’s so much potential for abuse.”