Bring up the idea of even the possibility of artificially intelligent robots replacing some of what teachers do, and you are likely to spark a tornado of anger among many educators. Intelligent machines could never match human interactions, they argue. Such moves would be a giant step toward a digital dystopia in education.
That kind of reaction to the role of AI robots in education clearly played out in our recent Big Ideas survey of K-12 teachers, which featured questions about robotics. The vast majority of teachers, 84 percent, disagreed with the suggestion that student learning would likely improve if more K-12 teachers had AI-powered robots working with them as classroom assistants. More than 90 percent did not think that student learning would improve in classrooms where chronically low-performing human teachers were replaced by artificially intelligent robots.
It makes sense that teachers might think machines would be even worse than bad human educators. And the very idea of a human teacher being replaced by a robot is likely too much for many of us, educators especially, to accept at this point.
But consider the case of a computer science professor at Georgia Tech. According to the Global Education & Skills Forum, the professor fielded a team of online teaching assistants, all of them human except one. The teaching assistants were available via email to answer questions. Only one student in the class suspected that one of the assistants was not a human being, because that assistant tended to answer questions much faster than the others. That student was right.
The forum—part of the London-based Varkey Foundation, which brings together leaders from public, private, and social sectors from around the world to show how improving education can help solve global problems—posed a provocative question on its site that caught my attention: “Robots replacing teachers is a good thing—yes or no?”
The better question might have been: Can robots help teachers improve classroom learning?
In China, they are testing that question. Hundreds of kindergarten classes there are now using a small robot named KeeKo, which tells stories, poses logic problems, and reacts with facial expressions when students master content. The robots are part of China's broader push to become the world leader in AI-powered technologies.
“Technology is a wonderful tool, and it can help with many individual tasks,” said Darrell Billington, a 25-year veteran social studies teacher at Fairview High School in Boulder, Colo., who responded to our national survey of teachers. “But in education, there needs to be some sort of relationship. I don’t think artificial intelligence is there yet.”
But researchers are trying to get there.
Consider the work of Cynthia Breazeal, an associate professor of Media Arts and Sciences at the MIT Media Lab, who leads the Personal Robots group.
The group is conducting randomized controlled trials of an AI-powered, teddy-bear-sized robot named Tega in Boston-area schools with large English-language-learner populations. The goal is to improve the language and literacy skills of 5- and 6-year-olds. Researchers are tracking gains in the youngsters' vocabulary and oral language development to determine how instruction by human teachers and artificially intelligent robots working together compares with instruction without robots.
“We’re starting to see some exciting and significant learning gains,” Breazeal said. “I am very encouraged.” But she conceded that a longer, bigger study is the next step.
What is particularly interesting is the research Breazeal and her colleagues are doing around social robots. In their study “Growing Growth Mindset With a Social Robot Peer,” young children played a puzzle-solving game with a peer-like robot. The social robots were fully autonomous and programmed to either exhibit a “growth mindset” (modeled after the work of Carol Dweck and Angela Duckworth) or a “neutral” mindset. Breazeal found that children who played with the growth-mindset robot were more persistent when trying to solve the puzzles compared with the kids working with the neutral robot.
And Breazeal points out that it is not just young children who respond positively to social robots. The team has used social robots with MIT undergraduates and older adults. “We see a social-emotional benefit across age groups,” she said.
That social connection also seems to be much stronger with physical robots rather than intelligent tutors or agents students view on computer screens. Jamy Li, an assistant professor in the Human Media Interaction group at the University of Twente in the Netherlands, conducted a review of 33 studies that examined how adults and children interact with physical versus virtual robots. The analysis, published in 2015 in the International Journal of Human-Computer Studies, found that adults and children tend to have more positive interactions with physical robots and find them more believable than virtual robots.
Now, of course, all kinds of red flags go up when you start talking about artificially intelligent robots playing a bigger role in teaching. Data privacy is a big one: there are real fears that kids would share personal information with an artificially intelligent robot they trust, and that the information could get into the hands of people who should not see it. And if the data used to train the robots is biased or skewed, the robots' judgments will be flawed as well.
And there is the value of human connections. If students grew much more comfortable interacting with robots than with human beings, and came to prefer the machines, they might lose some of their willingness and ability to have meaningful conversations and relationships with other people. In some ways, those troubling signs are already visible in how many young people (and even some older folks!) prefer texting back and forth to having a face-to-face conversation.
Breazeal recognizes those downsides. For starters, the AI field right now is not diverse or inclusive, and that could shape the kinds of technologies being developed and fuel potential biases in the software. And, “we need to be thinking more deeply around ethics,” she said, “particularly with AI with children.”
But that’s exactly why educators should not bury their heads in the sand and hope they are never replaced by an AI-powered robot. They need to play a big role in the development of these technologies so that whatever is produced is ethical and unbiased, improves student learning, and helps teachers focus on the priorities that matter most. Designed with educator input, these tools could free teachers to do what they do best: inspire students to learn and coach them along the way.
And developers of these technologies may need to remember that what matters most is often in the eye of the beholder.
In our survey, we also asked teachers to rank the duties they think AI robots could take over to help them do a better job teaching. The top response, from 44 percent of teachers, was “taking attendance, making copies, and other administrative tasks”; 30 percent said “grading,” and another 30 percent said “translating/communicating with emerging bilinguals.”
But Billington, the Colorado teacher, takes exception to turning attendance over to robots. That is often the one time he has a face-to-face interaction with some students. “Do they look happy? Are they sad? What is their mood? I would be sad if I had to give that one up.”
On the other hand, when we spoke, Billington began to calculate aloud the time it takes to grade essays: “If I take three minutes per student, and there are 120 students, that’s six hours of work. And most assignments take longer than that to grade.”
He paused, adding: “If AI could help us figure out a way to help us grade faster, that would be amazing.”
As it is, Billington remains deeply skeptical that AI-powered robots will become a regular feature in U.S. classrooms in the foreseeable future. But he also cautions educators never to say never. It would be “stupid,” he said, “to think it can’t happen.”
A version of this article appeared in the January 08, 2020 edition of Education Week as How Come Robots Can’t Be Teachers?