Personalized Learning

Artificial Intelligence in K-12 Education: Unintended Consequences Lurk, Report Warns

By David Saleh Rauf — May 28, 2020 3 min read

Artificial intelligence has the potential to broadly reshape education inside and outside the classroom, but district officials need to consider several critical questions before adopting AI-powered tech tools to avoid unintended consequences, concludes a recently released report from the Consortium for School Networking.

The CoSN report highlights some of the most promising applications for artificial intelligence in the K-12 landscape, namely providing adaptive and personalized instruction that could give teachers the chance to create more constructive one-on-one learning opportunities. It also lays out some of the top concerns facing educators as AI is adopted in more districts, specifically data bias, privacy, and the potential to “perpetuate education inequity.”

Those themes were also the focus of a recent Education Week Special Report that examines the use of AI in schools and how the uptick of tech use in education during the coronavirus school closures could affect AI’s future role in the classroom.

The promise and potential “over promise” of AI in education are explored in the CoSN report, which notes that the ever-evolving technology can help teachers to “solve administrative problems, automate certain tasks, afford teachers the time to construct more meaningful face-to-face learning opportunities, and realize that promise of personalization at scale through the implementation of adaptive assessments, intelligent tutoring systems, and platforms that support adaptive learning.”

But artificial intelligence also could lead to “greater achievement gaps,” the report warns:

“Given teacher shortages and retention issues across the country, AI could be viewed as a means of providing ‘something’ where ‘nothing’ otherwise exists. And yet, this approach has the potential to further perpetuate education inequity by advancing a cheaper but inferior education system. For example, if students spend more time interacting with or through technology than in face-to-face settings, it could negatively impact their ability to interact socially. More concerning, if students only, or predominately, learn via AI ... they may not have opportunities to develop higher-order thinking skills. Therefore, education leaders need to be aware that AI could lead to greater achievement gaps.”

Four notable takeaways from the CoSN report:

1. AI is already in the classroom, whether you recognize it or not. Whether built into an online platform tracking student metrics, used in a more obvious virtual assistant, or deployed as a security system combining cameras with facial and object recognition, artificial intelligence technology is already baked into many schools in a variety of ways.

"In education, AI can be found in learning analytic platforms, online courseware, voice assistants, and support structures within other apps. For example, consider the AI in Microsoft Office that might recommend a PowerPoint layout, serve as a Presenter Coach, suggest a formula in an Excel spreadsheet, or allow a student to dictate and translate a paragraph. More obvious applications of AI in education include adaptive or intelligent platforms to support student learning and teaching as well as an emerging sector of facial recognition and sensor systems to address school security."

2. AI tools have not been built with student privacy in mind. The report notes that most AI to date has been designed for commercial purposes, not educational environments. That means most AI on the market does not support school system compliance with federal and state data privacy laws, including state-specific student data privacy laws.

"As students interact with more intelligent systems, everything from their voice to their handwriting to their browsing habits could be viewed as forms of digital fingerprints," the report reads, warning of the potential for "long-term profiling" ramifications if a district is using an AI tool not tailored for education. "Despite the promise of using data to better inform instruction, create personalized learning experiences, and provide a more holistic picture of each student as a learner, district leaders need to consider how they will protect student privacy. The use of AI-powered applications may put students, especially those from vulnerable populations, at risk of future discrimination. In addition, they may give students the perception that they are learning and living in a constant state of surveillance."

3. There is ongoing concern about flaws and biases in the data used by algorithms to power AI. That means school district leaders should ask how the data sets were formed and be aware of the risk for bias or discrimination. It also showcases the need for transparency about the data sets and the effectiveness of an AI tool, a point also emphasized by education researcher Robert Murphy in this Education Week interview.

"Given the prevalence of racial, religious, and gender bias in society, AI algorithms that leverage books, media, and even news articles could perpetuate this bias," says the CoSN report. "Bias could manifest in how the platform teaches particular skills, corrects different answers, represents different individuals, and even attempts to interpret students' emotions."

4. Educators need to identify the ultimate educational goal for using AI. The most critical question for educators thinking about integrating AI into the classroom is a basic one: Why?

"What learning opportunity might AI create or facilitate? What is the educational goal, and how might AI help to achieve it?," the report asks. "Put directly, if AI is the solution, what is the problem that it is trying to solve?"

A version of this news article first appeared in the Digital Education blog.