Special Report
Artificial Intelligence

Will COVID-19 Spur Greater Use of Artificial Intelligence in K-12 Education?

May 19, 2020 | 6 min read

The rush to adopt new technology during coronavirus-driven remote learning could lead educators to use more tools powered by advanced artificial intelligence. But that more optimistic vision for AI could be tempered by budget shortfalls resulting from the virus outbreak that “may seriously delay” school districts from making those types of investments in the near future.

That’s how Robert F. Murphy sees it. He is an independent education consultant with more than two decades of research experience, including positions as a senior policy researcher for the international think tank RAND Corp. and as the director for evaluation research at SRI International, a scientific research center.

In a paper he wrote for RAND last year, Murphy focused on AI applications in K-12 classrooms. And as the featured speaker during a webinar hosted by a company developing AI-based language-learning tools, he cautioned that artificial intelligence is not likely to transform education the way it already has other high-profile industries such as transportation, drug discovery, and health care.

Instead, he has argued, AI will continue to play a supporting role in the classroom, assisting teachers with second-language learning, feedback on writing drafts, early diagnosis of reading problems, and adaptive instruction for remediation.

“The recent phase of remote learning does not change my feelings about AI’s prospects for disrupting education relative to its potential to disrupt other fields,” Murphy said in an email interview with Education Week. “However, I do believe the current COVID-19 distance learning situation will bring renewed attention to the need for online instructional systems that support adaptive instruction and provide automated feedback and support to students when teachers and parents cannot be present.”

Murphy recently had an email conversation with Education Week Contributing Writer David Saleh Rauf about the evolution of advanced AI tools for education. This interview has been edited for brevity and clarity.

How can teachers, parents, and students trust that AI is going to be making the best recommendations to improve an educational outcome or that it’s even making good recommendations? Has there been sufficient research to determine what value teachers and students could derive from an AI solution in the classroom?

Robert F. Murphy

Currently, there is a lack of research and vendor reporting in both areas—in our understanding of the accuracy of statistical AI-based applications in education [with the exception of some automated writing-scoring applications] and in the extent to which they provide value beyond similar applications that don’t use advanced AI techniques. As more of these applications enter the market, there will likely be discussions about establishing industry standards for vendor disclosure of certain information about the products, including product performance. This information might include a description or ranking of the “knowability” of system decisions; the limitations and accuracy of model predictions; the consequences of inaccurate decisions for students and teachers; and how the models were trained, including details of the data sets used and how the learned models were evaluated for potential bias.

The accuracy of statistical AI systems is highly dependent on access to large sets of data. In some cases, those data can be biased and can reinforce racial, gender, or other stereotypes that the AI tool then learns. What are the implications of biased data creeping into AI systems used for education?

Concerns over algorithmic bias will depend on the application, its role in the school and classroom, and the consequences of system decisions for students and teachers. For example, the consequences of bias creeping into the recommendations of a curriculum-recommendation engine for teachers are likely to be fairly minor compared with the possible consequences of a biased AI-based early-warning system that disproportionately and incorrectly flags one group of students for remediation based on gender or race while missing students with real needs for the same reason. That is why, for AI applications such as early-warning systems where the consequences are significant, I advocate that the system’s output be used as only one data point in the decision-making process, alongside the professional judgment of teachers and administrators based on their experience and knowledge of individual students.

What’s the biggest obstacle preventing some of these advanced statistical AI systems from being used in classrooms around the country on a large and meaningful scale? Is it funding and product development, the lack of acquired data sets needed to build and train the machine-learning algorithms, trust issues regarding privacy, or a combination of all those factors?

If I had to rank these obstacles by their strength of influence on the development of advanced AI solutions for education, it would be (1) lack of appropriate data sets to train algorithms, (2) funding for development, and (3) data-privacy issues. The vast amounts of data needed to train sophisticated and unbiased AI learning applications across a host of subject areas and grade levels are simply not available in education. The only groups with easy access to the type of fine-grained data that might be useful for training advanced AI algorithms are the current publishers and developers of online learning platforms and applications in use at scale (for example, some online learning platforms currently in use in the U.S. and English-language learning apps serving the Chinese market). This is a very small group. Without such data, advanced AI capabilities are not possible. But even if the required training data were widely available, funding for the development of AI-based solutions for the K-12 education market will only ever be a small fraction of what is being invested in other markets, such as health care, autonomous vehicles, and the military.

There’s a ton of funding, ranging from venture capital for startups to big investments by public corporations, going into AI that is shaping technology all around us. Why isn’t the investment in AI for the ed-tech sector as robust? Could that change following the COVID-19 school closures and the uncertainty about when brick-and-mortar classes will resume?

The K-12 education market is a notoriously difficult and costly market for product vendors due to a number of factors: small discretionary budgets, compliance requirements, long sales cycles involving committee approvals, etc. Unfortunately, at this time, it’s difficult to envision large new investments by venture-capital firms and public corporations in the development of new products and services for the U.S. K-12 market.

In a webinar you participated in last year, you said that systems outside of education are ultimately going to influence the public’s trust of AI in education. What did you mean?

The general public, including the parents of school-age children, will have their most significant experiences with AI applications outside of education, such as in health care or in their automobiles. And the quality of those experiences will likely shape public attitudes toward AI’s use in education. Media coverage, good and bad, will also play a major role. The next news story involving an AI application that garners wide media attention—a major personal-data breach, the discovery of a COVID-19 vaccine, a fatal autonomous-driving accident, more productive agricultural yields—will likely have a lasting impact on the public’s general perception of the safety and reliability of all AI applications, including those developed for the education market.
