IT Infrastructure & Management

Privacy Group Cautions Schools on Technology That Flags Children at Risk of Self-Harm

By Benjamin Herold — September 27, 2021

Popular software tools that scan students’ online activity and flag children at risk of self-harm and mental-health crises are “unproven” and come with significant risks, a new report warns.

“No independent research or evidence has established that these monitoring systems can accurately identify students experiencing suicidal ideation, considering self-harm, or experiencing mental-health crises,” according to the Future of Privacy Forum, a Washington-based think tank. “Self-harm monitoring systems introduce greater privacy risks and unintended consequences for students.”

The report, titled “The Privacy and Equity Implications of Using Self-Harm Monitoring Technologies: Recommendations for Schools,” comes on the heels of numerous news media investigations of such tools. In 2019, for example, Education Week published an in-depth look at how digital surveillance systems led schools to flag students for sending files containing the word “gay” and for the content of personal photos accidentally uploaded to their school-issued devices.

The reach of such systems continues to grow, thanks in large measure to the COVID-19 pandemic, which forced more students online and sparked an apparent rise in student suicides and mental-health crises. Popular ed-tech company Gaggle, for example, now claims 1,500 school district clients and counting.

“In a school setting—whether virtual or in person—adults have a legal obligation to keep kids safe,” Gaggle CEO Jeff Patterson said in a statement. “Gaggle believes firmly in the importance of protecting student privacy and is a long-standing supporter of the Future of Privacy Forum’s Student Privacy Pledge 2020 and would welcome opportunities to continue to collaborate with FPF.”

Amelia Vance of the Future of Privacy Forum stopped short of saying K-12 leaders should forgo such systems altogether but warned educators to do extensive due diligence before adopting them.

“Schools should not employ self-harm monitoring unless they have robust mental-health resources established and common-sense data protections in place,” said Vance, the director of youth and education privacy for the group.

Self-harm monitoring systems raise privacy, equity, legal concerns

The new report describes self-harm monitoring systems as “computerized programs that can monitor students’ online activity on school-issued devices, school networks, and school accounts to identify whether students are at risk of dangerous mental-health crises.”

Such systems typically collect and scan digital information ranging from students’ web-browsing histories to the contents of their documents and email messages, using algorithms and sometimes human reviewers to search for keywords that might indicate trouble. When content is flagged, alerts are typically sent to school or district administrators, who sometimes take the information to third parties such as law enforcement.
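The report does not publish any vendor's code, but the pipeline it describes, scan text for watchlist terms, pass matches to a human reviewer, then alert administrators, can be illustrated with a minimal, hypothetical sketch. The keyword list and function names below are invented for illustration and do not reflect Gaggle's, GoGuardian's, or any other vendor's actual implementation; commercial systems typically combine far larger lexicons, machine-learning classifiers, and trained review teams.

```python
# Hypothetical sketch of a keyword-based flag-and-alert pipeline.
# Not any vendor's actual system; names and the watchlist are invented.

from dataclasses import dataclass

# Illustrative watchlist; real systems use much larger lexicons and/or ML models.
RISK_KEYWORDS = {"suicide", "self-harm", "kill myself"}


@dataclass
class Flag:
    document_id: str
    matched_terms: list[str]


def scan_document(document_id: str, text: str) -> Flag | None:
    """Return a Flag if any watchlist term appears in the student's text."""
    lowered = text.lower()
    matches = [term for term in RISK_KEYWORDS if term in lowered]
    return Flag(document_id, matches) if matches else None


def route_alert(flag: Flag, reviewer_confirms: bool) -> str | None:
    """Send flagged content to a human reviewer; only confirmed flags
    generate an alert to school or district administrators."""
    if reviewer_confirms:
        return f"ALERT admins: document {flag.document_id} matched {flag.matched_terms}"
    return None  # judged a false positive and discarded


# Example: a student email is scanned, reviewed, and escalated (or not).
flag = scan_document("email-123", "I can't take it anymore, I want to kill myself")
if flag is not None:
    print(route_alert(flag, reviewer_confirms=True))
```

The human-review step in this sketch is the point the report presses on: substring matching alone produces the kind of "false flags" described below, so what happens after a match, and who sees the data, matters as much as the matching itself.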

The companies that make such tools regularly tout hundreds or thousands of lives saved and catastrophes averted.

A Gaggle spokeswoman, for example, said in a statement that the company saved 1,408 lives last year alone. That number is based on reports back from district clients, on flagged content that contained a “clear and definitive” suicide plan, or both. Gaggle is among the companies that use trained human reviewers to determine which flagged content merits an alert to school officials.

Still, the Future of Privacy Forum suggests it’s unclear whether self-harm monitoring systems can accurately identify a high percentage of at-risk students while avoiding “false flags” of children who are not really considering harming themselves or others.

And even when self-harm monitoring systems do work as advertised, it’s not clear that merely flagging students’ digital content reliably leads to an appropriate mental-health intervention.

The group’s new report also details a range of other potential problems:

  • Legal violations: While schools are required by the Children’s Internet Protection Act to block obscene or harmful content on their networks and devices, it remains unclear whether the federal law clears the way for self-harm monitoring technologies to be used as filters, the Future of Privacy Forum says. It’s also unclear how the Family Educational Rights and Privacy Act applies to the information such technologies gather on students, and surveilling and flagging students’ off-campus online activity may in some circumstances violate Fourth Amendment protections against unreasonable searches and seizures.
  • Equity concerns: Vulnerable children and students from “systematically marginalized groups” may face an elevated risk of harm from monitoring technologies, the new report maintains. Poor students who lack their own personal devices may have more of their online activity surveilled because they’re forced to rely on school-issued computers, for example. Students who are gay, lesbian, bisexual, or transgender may also be targeted for harassment and stigmatization based on how their online activity is scanned and flagged.
  • Privacy concerns: Overcollection and oversharing of information on students’ mental-health status could expose students to sanction by school staff or law-enforcement personnel who are not properly trained to interpret the information in context, the group warns. Sensitive student data that are not deleted in a timely manner also pose a risk.
  • Curtailing intellectual freedom: Some researchers also warn of a “chilling effect,” in which students are hesitant to search for needed information or resources for fear of being watched.

Among those sharing such concerns is the National Association of School Psychologists, which has not taken an official position on whether schools should use self-harm monitoring technologies.

“We would raise cautions about the possibility of wrongly identifying students or misuse of data,” a spokeswoman for the group said via email.

Monitoring systems not a substitute for mental-health services

NASP and the Future of Privacy Forum were also aligned in recommending that K-12 districts ensure they have an adequate number of school psychologists, counselors, and social workers to support the needs of students who are at risk.

“Monitoring systems cannot serve as a substitute for robust mental-health supports provided in school or a comprehensive self-harm prevention strategy rooted in well-developed medical evidence,” the report says.

Other recommendations include working with parents and community members to develop a shared understanding of values and priorities before adopting monitoring technology; developing clear policies about what information is collected, who has access to it, and how long it is stored; and clearly communicating those policies to school staff and parents alike.

“It is imperative that school districts approach any self-harm monitoring system holistically, taking into account the totality of harms that could arise from hastily adopting technology without well-developed implementation policies and the necessary accompanying school-based mental-health resources,” the report concludes.

GoGuardian, maker of widely used filtering and monitoring services deployed in roughly 14,000 schools and districts nationwide, applauded the recommendations as “thoughtful.”

“We recognize the important role that school leaders play in balancing student privacy and safety in the digital age and are committed to building solutions that support that balance,” a company spokesman said in a statement.
