Users signing up for an Instagram account are now required to provide their date of birth, rather than simply affirming that they are above the age of 13, according to a Reuters report.
The Facebook-owned social media company has pitched the change as an effort to limit children’s exposure to content meant for adults, including advertisements for gambling and birth control.
Instagram won’t verify the ages that new users provide, nor will it require existing users to supply a birthdate, which means young users could still fib to gain access. The company will, however, continue to use machine learning to improve its predictions of a user’s age and gender. Current users who signed up via Facebook have already provided Instagram with enough age data to make informed predictions, the company claims.
Instagram is one of the most popular social media platforms among teenagers, according to a 2018 survey from Common Sense Media. The app, which allows users to alter and post pictures and scroll through a feed of photos from accounts they follow, has drawn concerns related to cyberbullying, racist and offensive content, and mental health issues.
Instagram is presenting the change as a positive step toward protecting young users, but not everyone sees it that way.
Lisa Hinkelman, a researcher who founded and serves as executive director of the nonprofit Ruling Our Experiences, thinks social media companies’ efforts to manage young users’ behavior won’t be effective unless educators and adults proactively help students navigate social media.
“Regardless of the controls that we put in place or the mechanisms for access or parts of certain sites, kids are savvy and figure out the workarounds,” Hinkelman said. “I think it’s less about how we restrict and more about how we equip.”
She’s concerned that Instagram’s latest change will improve the company’s ability to track children’s interests and micro-target advertisements to them. Rather than focusing on tracking students or simply warning of digital dangers, Hinkelman recommends taking a “more systematic approach.”
“How do we walk alongside kids as they’re accessing and learning new technologies, helping them navigate challenges and pressures they’re going to have? What do they do when they encounter inappropriate content, what do they do when inappropriate pictures are AirDropped to them? What do we do when we find out that they have a [separate, more casual] ‘finsta’ account?” Hinkelman said. “It’s much bigger than how we set controls.”
A version of this news article first appeared in the Digital Education blog.