Ed-Tech Policy

What Educators Should Know About Biden’s ‘AI Bill of Rights’

By Alyson Klein — October 24, 2022 5 min read

Tracking student progress and flagging kids at risk of failure. More customization of lessons to meet individual students’ needs. Professional development tailored to individual teachers. Automatic essay grading.

Those are just some of the tasks in K-12 education that experts say could be performed, or already are being performed, with the help of artificial intelligence. AI is already transforming retail, agriculture, medicine, and other industries, and its impact on K-12 education, as on nearly every other part of the economy, is only expected to grow.

With that in mind, the White House released a bill of rights for AI earlier this month. Here are some critical facts educators should know about it.

The AI bill of rights is centered around five principles

You should be protected from unsafe or ineffective systems. That means, among other things, that AI systems need to be tested before they are rolled out, and then carefully monitored to make sure they’re working as intended.

No one should face discrimination by algorithms. Systems should be used and designed in an equitable way. AI systems reflect the biases of the people who program them. That’s why, for instance, an algorithm that’s designed to decide who gets a financial loan may inadvertently disadvantage Black borrowers. Having people from different backgrounds design AI-powered systems is one possible solution.

You should have protections against abusive data practices and agency over how data about you is used. AI systems rely on data, and student data privacy is obviously a big issue in any tech that’s powered by AI.

You should know that an automated system is being used and understand how and why it influences outcomes that impact you. K-12 schools could play a big role here too, in helping students understand the technology and how it impacts the world around them.

You should be able to opt out, where appropriate, and have access to a person who can quickly consider and fix problems you encounter. That would seem to imply that companies that create learning software with AI would have to respond quickly to any problems educators or parents raise.

The AI guidance has no real legal authority

The bill of rights is simply guidance for the parts of the economy that rely on AI, which increasingly means nearly every part of it. If anything, its principles may apply to AI use by the federal government, according to an analysis in Wired magazine. But it's not going to force Facebook or Netflix or even a state criminal justice system to change the way they use AI, unless they voluntarily decide to embrace the principles.

The U.S. Department of Education is one of a number of agencies that is supposed to follow up on the bill of rights. It is expected to release specific recommendations for using AI in teaching and learning by early 2023. The recommendations should include guidelines for protecting student data privacy when using AI.

What data privacy experts see as problematic

Amelia Vance, the founder of Public Interest Privacy Consulting and an expert on schools and data privacy, thought the general tenor of the document was the right one, but she wondered just how much outreach the White House had done to K-12 education groups, given some of the examples used in the guidelines.

For instance, in elaborating about data privacy, the document said that ideally, data should be most accessible to those who work directly with the people that the data pertains to. And it gives, as one example, a teacher getting more access to their students’ data than a superintendent.

“There are many school districts who have decided that they want the superintendent or principal to have access and be able to see across the schools [how] the teachers are serving their students,” Vance said. “It just raises some really serious questions again about who they talked to” in making recommendations for K-12.

What’s more, it might not be practical for schools to always get parental permission before allowing students to use learning technology that relies in part on AI, she said. But that’s how some might interpret the guidelines.

“I think it’s largely the same reason that many superintendents and teachers are struggling with parents wanting to be able to individually approve the reading their kid has to do,” she said, referring to a push by parents in some communities to review curricular materials before they are used with students. “It’s often impractical. It is difficult for teachers to build their curriculum. It’s difficult for the school to move forward to make sure that everyone is learning the same things and that learning is provided in an equitable way.”

What do people from companies that create tools for student learning think?

Having guidelines can be helpful to companies, particularly those that want to reassure schools they will safeguard data and root out bias.

“If somebody wanted to build an AI system, there’s some nice guardrails there to help you build a better system,” said Patricia Scanlon, the founder and executive chair of SoapBox Labs, which designed natural language processing technology specifically for children’s voices that is used in educational products developed by McGraw Hill and other companies.

Like other international companies, SoapBox Labs, which is based in Ireland, will have to comply with pending European guidelines for AI, which may be stricter. What’s more, unlike the White House AI bill of rights, those guidelines may come with an enforcement mechanism.

Earlier this month, SoapBox Labs became the first company to receive the Prioritizing Racial Equity in AI Design Product Certification, developed by two education nonprofits, Digital Promise and the EdTech Equity Project.

School districts may feel more comfortable using certain products if an outside evaluator confirms that they meet certain privacy and bias mitigation standards, Scanlon added. “It can give some confidence so not everybody has to be an expert in AI,” she said. “I think the stakes are just higher in education than they are for your Netflix recommendation,” which can also be driven by AI algorithms.

