Classroom Technology

Teachers Demand Answers From Social Media Companies: ‘What’s It Going to Do to Our Kids?’

By Alyson Klein — July 14, 2022 4 min read

Bullying that spreads on social media through a device many kids keep right by their bedsides. “Challenges” that encourage the destruction of school property. Violent threats to commit a massacre at an elementary school.

Social media has transformed childhood, and given the adults who work with or care for kids a litany of concerns that previous generations of educators and parents could never have imagined.

Now, many educators want to know how Meta, the company that owns some of the oldest and most popular social media platforms—Instagram, Facebook, and WhatsApp—plans to help schools handle those challenges, or whether it will even acknowledge its own role in creating them.

Many don’t feel that they’ve gotten a clear answer yet.

The issue has become particularly heated since documents released last year by a whistleblower revealed that Meta had conducted extensive research on its platforms’ negative impact on children’s mental health and their role in spreading false information, but failed to act on those findings.

“I do think that they owe an explanation. I think they owe it not just to the parents and the educators but to the world,” said Bill Bass, the innovation coordinator for the Parkway School District in Missouri, in an interview after Meta’s head of global safety, Antigone Davis, spoke to a room full of educators at the International Society for Technology in Education’s annual conference last month in New Orleans. “I think there is a lack of trust that is inherent now” between educators and social media companies, even those companies that are working to secure student data and think about student mental health.

‘A whole slew of bullying that we couldn’t look at’

During the ISTE panel, Davis outlined her company’s lesson plans for parents and teachers, parental management tools, and Meta’s efforts to “build up social learning tools within that digital literacy.”

And the company has tools on its platforms to identify “potentially bullying content and to remove it if it violates our policies,” Davis said. But, she added, “there’s a whole slew of bullying that we couldn’t look at and tell what is happening.”

For instance, she cited the example of classmates mocking a fellow student’s skirt by commenting “nice skirt” on a picture. “There’s zero way for us to know that is bullying without additional context,” Davis said. “Sometimes, we can get that additional context but generally we’re not going to.”

Meta has tools that allow teens—and other users—to flag words that might be used to bully them or that they don’t want to see in their feeds.

While he was pleased to hear that Meta has teacher resources, Matthew Winter—an instructor for the Utah Education Network, which works with districts throughout the Beehive State on technology needs—wishes all prominent social media companies could somehow figure out how to give educators a tutorial on their many features so they can help kids and parents.

He wants to know, “this is what happens on Snapchat when a kid logs in and this is what happens when they get into Instagram. This is what Instagram Live is. This is what TikTok is,” he said.

Right now, “we have to go out and explore it. We have to figure it out first.” And that can be time-consuming for teachers who have a lot on their plates, he pointed out, and especially for those who are not tech savvy.

‘A Band-Aid that still keeps kids in that ecosystem’

Educators at ISTE pressed Davis and Jacqueline Beauchere, the global head of platform safety for Snap Inc., the company behind Snapchat, on how the companies aim to ensure the safety of kids under the age of 13, who aren’t legally allowed to use their flagship platforms but often sign up anyway.

The educators heard a stark difference between the two companies’ answers.

Beauchere said Snapchat just isn’t for younger users.

“We are not designed for children under the age limit,” Beauchere said. “I can’t emphasize that enough. Snap is 14 plus. Those rules are there for a reason, and they really need to be abided by.”

But Davis suggested her company could find a way to safely offer younger kids access to social media platforms.

“Opening the door for the ability to have some degree of much more monitored technology for younger people may be part of what we need to do,” Davis said. Meta, she said, already has “Messenger Kids,” a platform with what she described as stringent parental controls.

Bass found that answer “shortsighted. I don’t think it’s a solution. It’s a Band-Aid that still keeps kids in that ecosystem.”

One thing Davis did not mention: Instagram for Kids. Meta was initially planning to develop a version of the social media platform for younger children, but paused its plan after pushback from critics who, like Bass, saw it as just another way for the company to hook kids early.

And Maureen, a former teacher from Western Canada who now works for an education nonprofit and did not want to give her last name because speaking to reporters is not part of her job, said she didn’t “hear a corporate responsibility to do something other than a business model of trying to get kids into the apps,” in a brief interview after the panel ended. The platforms are “doing a lot of damage to kids’ mental health. There are great things about social media, there are places to use it, but these corporations need to step up.”

Winter, from the Utah Education Network, agreed. “I think there needs to be a little bit more foresight [from companies] about what they put out in the future. What is it going to do to our kids?”

