Student data privacy advocates say the storm clouds around Facebook from the evolving Cambridge Analytica scandal are a reminder that schools, educators, and students should be asking tough questions about how their data is being protected or used.
Whether individuals should trust Facebook “is a question that a lot of folks are waking up” to and asking themselves, said John Verdi, the vice president for policy at the Future of Privacy Forum, in an interview.
And should schools trust the tech company?
That question underlies a few calls district technology leaders have made this week to Linnette J. Attai, the project director for the Consortium for School Networking’s privacy initiative, asking her to unpack the situation for them. She is fielding inquiries from districts that use Facebook—often as a way for educators or schools to connect with parents, and sometimes when teachers use the social media platform for their lessons. Some ed-tech companies also allow users to authenticate their login via their Facebook accounts.
“I think the world is waiting for this to shake out,” Attai said in an interview. “I see people trying to digest what is happening and understanding how it may—or may not—impact their policies on social media.”
At the center of the controversy is the revelation, from a March 17 New York Times article, that a British voter profiling firm gained access to personal information from about 50 million Facebook users’ accounts, apparently without their knowledge.
The London-based company, Cambridge Analytica, was hired by President Trump’s campaign in 2016. Cambridge Analytica used Facebook data to try to sway voters to support Trump.
The starting point of that data trail was a Cambridge University researcher named Aleksandr Kogan, who created a personality quiz app. About 300,000 Facebook users downloaded the quiz, according to a post on Facebook by Mark Zuckerberg, the company’s founder and CEO. That gave Kogan access to those users’ data and, because of Facebook’s permissions at the time, to the data of all of their friends, ballooning the number of people affected to nearly 200 times the initial group.
In response to this controversy, a high-profile #deleteFacebook movement has started, but it is too soon to determine how much momentum it will have in the long run.
“This was a breach of trust between Kogan, Cambridge Analytica, and Facebook,” said Zuckerberg, in his first response issued yesterday. “But it was also a breach of trust between Facebook and the people who share their data with us and expect us to protect it. We need to fix that.”
Zuckerberg’s response, posted on Facebook, came after a long silence since the story broke on Saturday.
Steps to prevent this from happening again were taken several years ago, Zuckerberg said, and going forward the company agreed to:
- Audits: Investigating all apps that had access to large amounts of information before the company changed its platform to dramatically reduce data access in 2014, and conducting a full audit of any app with suspicious activity.
- Restricting developers’ access to data: This step, Zuckerberg said, would further prevent other kinds of abuse. For example, Facebook will remove developers’ access to the data of a user who hasn’t used an app in three months and will reduce the amount of data shared on signing in to an app to the user’s name, photo, and email address.
- Increased transparency about privacy permissions: In the next month, the company will put a tool at the top of each user’s News Feed highlighting the apps he or she has used, along with an easy way to revoke those apps’ permission to use his or her data. A version of that tool already exists, but it is less visible.
Zuckerberg’s announcement comes in the wake of myriad calls for the company to account for its role in what happened, with legislators on both sides of the Atlantic vowing to hold hearings on the issue. The Federal Trade Commission has launched an investigation, according to the Washington Post, and lawsuits are likely to be filed against Facebook, Kogan, and Cambridge Analytica, according to the LawFare blog.
“We’ve seen schools and districts use Facebook as a primary method of communication,” said Doug Levin, the founder and president of EdTech Strategies, LLC. They often do this to add interactivity to their websites, and to show their connection to the community, he said.
Although Facebook is offered for free, he cautioned that it comes at a price. “Privacy experts have long been concerned about schools pushing parents onto the third-party platforms that are based on selling advertising and user data,” he said.
In his recent report, Tracking: EDU—Education Agency Website Security and Privacy Practices, Levin found that “Facebook ad trackers were found on over 25 percent” of the 159 school websites he studied, as well as on 10 state websites. Levin identified user-tracking tools such as Facebook Connect, a single sign-on service that lets users interact on other websites through their Facebook accounts; Facebook Social Graph; and Facebook Social Plugins, which facilitate deep integration with Facebook.
“This just makes it so much easier for Facebook to understand what the user is interested in, who else they know, and even where they are,” Levin said. In most cases, the districts’ website privacy policies did not acknowledge that sort of data sharing is occurring, and “it’s likely not appropriate for school districts to be embedding those sorts of third-party trackers on their sites,” he said.
Attai, who is the founder of PlayWell, LLC, a privacy and safety consultancy, said schools aren’t the only ones questioning what’s happened with Facebook. “Everyone is looking and saying this was unexpected, even for a company that is fairly aggressive around its data use.”
The question of how schools will use Facebook in the future will “shake out over time, and it will be driven as much by local community norms, expectations, and concerns, as by the overall regulatory climate that is forming around Facebook on this issue.”
Facebook and the Chan Zuckerberg Initiative also supported the development of the Summit Learning Platform, a technology system for personalized learning. Last December, two school districts suspended using the platform out of concerns about content, the alignment of assessments and curricula, and students’ data privacy. However, of all schools that used the platform last year, 93 percent continued to use it during the 2017-18 school year, according to Summit Public Schools. The platform is used in about 330 schools across the country, the organization said.
For Verdi at the Future of Privacy Forum, “this entire ecosystem in the education space is based on trust.”
Regardless of Facebook’s policies about data use and data sharing, the question is, “What are educators comfortable with?” Districts need to ensure that they are good actors and protectors of data privacy if they choose to use the social media platform, he said.
A version of this news article first appeared in the Digital Education blog.