Schools issue permission slips to get parent approval for students to take field trips, learn about sexual health, or play sports.
But some experts say school leaders should consider adding a technology-driven concern to that list: using ChatGPT and similar tools powered by artificial intelligence.
School districts that had previously banned ChatGPT—including New York City, the nation’s largest—are now puzzling through how to use the tool to help students better understand the benefits and limitations of AI.
But when every question a ChatGPT user asks is incorporated into the software program’s AI training model, privacy concerns come into play, experts said. And the same goes for other generative AI products available to students.
Allowing ChatGPT to collect information from students that is then used to develop the tool itself would appear to run up against the Family Educational Rights and Privacy Act (better known as FERPA), which prohibits the collection or analysis of identifiable student data for purposes other than education, said David Sallay, director of youth and educational privacy for the Future of Privacy Forum, a nonprofit organization.
And ChatGPT’s privacy policy states that the tool isn’t intended to be used by anyone under the age of 13 and that those between the ages of 13 and 18 should get permission from a parent.
Still, Sallay expects that many districts haven’t taken the step of getting formal permission from parents. “I think a lot [of schools] are just using it and not telling anyone,” he said. “That’s what happens with a lot of ed tech.”
Last school year, the Peninsula School District near Seattle collected permission slips to allow students to use AI tools like ChatGPT in the classroom, Kris Hagel, the district’s executive director of digital learning, said during a Nov. 1 Education Week webinar on AI.
But this school year, “we’ve kind of been a little bit more loose,” Hagel said.
Instead of requiring permission slips for each student, “we let parents know at the beginning of the year that our 8th grade and above students would most likely be using AI,” he said. “I think it’s a good idea to just let parents know what’s going on in the classroom, what tools you’re using.”
Getting parental approval for students to use AI tools is a smart move, said Tammi Sisk, an educational technology specialist for the Fairfax County Public Schools in Virginia, who also served as a panelist for the Education Week webinar. Her school district is still developing its AI policy.
“I don’t see how we get around parent permission, especially if it’s a consumer product, like ChatGPT,” Sisk said. The tool is “also not super transparent as to what [it’s] ingesting.”
Students using an AI tool specifically designed for education—think Khan Academy’s Khanmigo chatbot, for instance—might experience more of a protected environment, but teachers and school leaders should check each tool’s privacy guidelines before deciding what to do, experts said.
Permission slips provide another benefit for schools: helping parents better understand how AI is being used in the classroom, said Stacey Hawthorne, the chief academic officer for Learn21, a nonprofit organization that works with schools on their use of education technology.
“This is a really, really good opportunity to have conversations with parents about AI,” Hawthorne said during the Education Week webinar.
Potential data privacy problems still exist with permission slips
But schools shouldn’t just get the permission slip and call it a day, said Amelia Vance, the president of the Public Interest Privacy Center, a nonprofit that works on child and student data privacy issues.
No matter a student’s age, the best thing educators can do “from an actual safety perspective and well-being perspective is to also teach kids how to limit or minimize the amount of personal information that they’re putting into the service,” Vance said.
Vance recommends that schools advise students to “turn off their history,” a feature ChatGPT added in the spring that allows users to ask questions without the conversation later being used as training data for the tool.
Students should also be cautioned not to input essays about personal trauma, or even information as simple as the name of their school, their age, where they live, or their birthdate, Vance added.
She likened that advice to the warnings many adults now in their 20s and 30s heard in middle and high school about not giving too many specifics to strangers they talked to in chatrooms.
“It’s going to be important to make sure kids know what could be personally identifiable and what they probably shouldn’t put in even when [ChatGPT] says they’re not going to keep the information,” Vance said.