A whopping 96 percent of the apps schools require or recommend aren’t safe for children, primarily because they share information with third parties or contain ads, concludes a report on K-12 edtech by Internet Safety Labs.
Apps that allow tech providers, marketers, and advertisers access to personal information about children and their families can, at minimum, be used to create highly targeted ads aimed at kids, says the report, which was released Dec. 13.
These apps are “monetizing your data, selling it to data brokers that are building these ever-growing portfolios on you,” explained Lisa LeVasseur, the executive director of Internet Safety Labs. She is a former software engineer and an author of the report.
Worse, when personal information is abused, it can expose kids to predators, cause emotional trauma, and, if location information is shared, perhaps even put them in physical danger, the report warns.
To get their arms around the sheer number of apps used in schools, researchers examined a random sample of 13 schools in each state and the District of Columbia, for a total of 663 schools serving about 456,000 students. Those schools collectively used 1,722 apps.
Apps that get the ‘Do Not Use’ label
The researchers labeled an app ‘Do Not Use’ if it contained any advertising, had deeply embedded software registered to a data broker, or shared information, whether explicitly or in ways that are difficult to detect, with any of several big tech companies that profit from advertising and internet sales, including Amazon, Facebook, and Twitter.
Seventy-eight percent of the apps studied fell into that category. Another 18 percent were considered “high risk” because of similar, though slightly less pronounced, privacy and information-sharing problems; LeVasseur would not recommend those apps for schools either.
While those criteria may seem to set a high bar, LeVasseur said it is an appropriate one where children are concerned.
LeVasseur said people often joke that these days, because of technology, “we all have no privacy. Haha, isn’t this funny? It’s really not funny. It’s really gross. It’s really harmful. And, you know, it’s really quite damaging.”
What’s more, the custom-built apps that districts use to communicate with families often have even more potential privacy red flags than off-the-shelf apps. And some of the educational apps that districts recommend weren’t built with kids and their privacy needs in mind, LeVasseur said.
In fact, more than a quarter of the apps that districts recommend, 28 percent, weren’t developed primarily with children in mind.
Another eyebrow-raising finding: More than two-thirds of the apps the organization studied send data to Google.
LeVasseur’s advice to school districts? “Fight the urge” to take on dozens of apps. “Less is more,” she said. “You really have to scrutinize this stuff. And you have to vendor manage. You have to get in there and demand a lot more information” from companies selling apps.
And echoing educators, she said school districts also need to build capacity for vetting technology used in schools, considering just how much is out there and how difficult it can be to figure out what privacy protections a particular platform has.
Schools “don’t have the resources that they need,” LeVasseur said. “If this is the scale of the thing, they need more support.”