There’s extraordinary interest in AI, fueled by waves of enthusiastic declarations that “everything is about to change.” Dan Meyer, the vice president at Amplify, is skeptical of such claims. Meyer, who’s charged with overseeing how Amplify’s curricular and technology offerings work for students and teachers, was previously the chief academic officer at the learning platform Desmos. A former math teacher, he’s got a doctorate from Stanford in math education and can be found discussing math instruction on CNN, Good Morning America, and TED.com. After a recent AEI debate where he raised big questions about classroom AI, I was curious to hear more. Here’s what he had to say.
—Rick
Rick: You’ve been working at the intersection of math and digital technology for a long time. What have you learned?
Dan: Most digital education tools were designed for two purposes: distributing information and evaluating understanding. Over the last decade, I think we have seen the limits of those tools. Laurence Holt found that they work well for the 5% of kids who use them as intended. But there are other purposes for technology in classrooms. Kids love to be creative and connect socially with each other. Digital tools are great for both purposes. Rather than submitting answers to a machine for judgment, students can use technology to create sketches, explanations, estimates, graphs, and math art and share them with each other to benefit everyone’s engagement and learning. In short, if kids seem disengaged when they’re learning with computers, it is likely because adults have programmed the computers to do boring things. We can program them to do interesting things instead.
Rick: How has AI changed how curriculum providers work with teachers and schools?
Dan: I have a conflict of interest here, but I think the sloppy, incoherent nature of AI-generated lesson materials is a great advertisement for materials developed by teams like Amplify—people who are skillful, discriminating, and fantastic party guests. Seriously, I hear from teacher after teacher that the lessons generated by AI tools are less of a lesson and more of a lesson sketch, one which leaves out too many details to be useful. I may live to regret this assessment, but before AI comes for all of us at Amplify, I believe we’ll see it come for Teachers Pay Teachers and other supplemental print products first.
Rick: How do lessons learned from earlier digital resources carry over to AI?
Dan: Good curriculum materials—digital or otherwise—are conversation starters. They get a conversation going between kids and teachers about what they both know and how those ideas connect. Those conversations require vulnerability. They’re affirming and demanding simultaneously. “What you know is valuable,” the teacher says, “and there is more for you to know.” Those conversations flourish in high-trust social relationships. AI chatbots might be fine at rebooking my flight or letting me know what interest rates are currently available on a savings account. They seem fine, even, at forming sycophantic, flattering relationships with a small but growing number of people. I see no evidence, however, that they can foster the kind of relationship that 95% of kids need for learning.
Rick: An Education Week survey recently found that only 3% of teachers report using AI “a lot.” What should we make of that?
Dan: That figure was 2% last year. It has barely budged. From my time interviewing teachers and working in classrooms, it seems there is a growing number of teachers finding point solutions for AI, especially before and after class—emails to parents; recommendation letters; an idea for an unfamiliar lesson topic; a goal for a student’s individualized education program—but nothing that transforms the great in-class challenges of teaching. How does AI help teachers invite and develop the diverse ideas that show up in the average classroom? Teachers haven’t seen answers they like here. By contrast, teachers know how a digital projector solves those challenges. I imagine north of 90% of teachers would say they use their digital projector “a lot.” There are technologies you couldn’t pry from a teacher’s hands, and AI isn’t one of them.
Rick: What do you think that signals about AI’s potential role in classrooms?
Dan: Where AI is most successful, you’re going to see it recede invisibly into the background. If a teacher has to go to an AI tool, it is likely that the tool has already failed. That’s because the teacher must turn to other tools—presentation software, a writing workspace—to turn the AI output into something classroom-ready. Successful AI will embed itself into existing tools in ways that understand the teacher’s deepest needs—to invite and develop ideas from diverse kids. As one example, we are piloting an AI tool that runs beneath our curriculum, analyzing student ideas at the prompting of our curriculum experts, pulling out exemplar responses, and framing them with classroom-ready discussion resources. It is very possible that a teacher won’t even identify this process as AI, but they will identify their classroom discussions as easier and richer.
Rick: You’ve argued that AI couples high opportunity costs with weak evidence of impact. Can you unpack that?
Dan: “Scores are down! We need to do something!” True as that statement might be, AI has not yet proved its value as that something. Proponents of AI would have us believe that we have no other choice here. Nothing else works. Why not try AI? But we have paid dearly for lessons about what does work. For example, through hundreds of millions of federal dollars, we have learned that many tutoring interventions don’t work. The ones that do work share similar features: Students a) working with trusted community members (parents, paraprofessionals, etc.), b) in person, c) multiple times per week, d) on work that is aligned to their curriculum. If a school system wants to encourage teacher experimentation with AI, I find no fault with that. But AI is not a school improvement plan, especially compared with other more evidence-based interventions.
Rick: You’ve argued that research claiming big benefits from AI in schools is overstated because the studies tend to conflate AI with other interventions. Can you say more?
Dan: I’m sure you and your readers have seen the happy headlines about the benefits of AI in schools. But scratch even a millimeter beneath the surface of those studies and you’ll find very weak evidence. You’ll often find that the control group gets nothing while the experimental group gets AI plus something extra—from extra time and tutoring to extra-rich parents. In other studies with tighter controls, the effect size diminishes considerably. In some cases, it even reverses, with kids who used AI to support their essay writing effectively de-skilled—experiencing diminished cognitive activity and an inability to recall the arguments they’d made compared with students who didn’t use AI. Generative AI is still new, by the standards of research, and I want to hold it to the right standard of evidence. Too many people are declaring victory for education when we are still in the top of the second inning.
Rick: Personal tutors and chatbots have the potential to change student-teacher relationships, for better or worse. Any tips on how schools can get this right?
Dan: In addition to everything I’ve said about chatbots already, I’ll add one more challenge. If every student is having a different conversation with a different chatbot about a different area of math, for example, what value then is the classroom as a group? Most teachers and students will tell you that some of their most meaningful, productive moments in a classroom occur when they learn from one another in whole-class activity: when a teacher nods and affirms a tentative idea you offered, and other kids build on it. The more education becomes “personalized” behind 1:1 chatbot tutors, the harder it is for teachers to take advantage of the “socialized” classroom environment. That’s a challenge for ed-tech tool builders. And school leaders would do well to survey their students throughout an AI implementation on feelings of belonging and safety, and to take early action on downward changes.
Rick: Some AI enthusiasts argue that the traditional academic curriculum is now obsolete. They urge schools to prioritize “21st-century skills” and labor-market preparation. As someone who’s spent years noodling on math curricula, what’s your take?
Dan: Looking at the sweep of research at this point, it seems clear that generative AI is benefiting experts much more than novices. Novices struggle to formulate the right questions and evaluate a large language model’s answers. For that reason, I suspect AI will change how kids learn much more than what they learn. You’ll need to know a lot in order to make good use of generative AI. I also look to research from David Autor and David Deming indicating that the value of social skills is increasing due to technological changes. Math education would do well to continue the standards work of the last decade, emphasizing skills like argumentation, modeling, and communication.
Rick: What’s the one big thing a school or system leader should know when it comes to AI and schooling?
Dan: The introduction of generative AI has not changed the fundamentals of your work: getting absentee students back to school; making sure kids feel supported and known; communicating results and challenges with parents; creating a positive working environment for teachers; and keeping kids safe. For all of its power in the world outside of schools, generative AI has not transformed the reality of any of those challenges and may, in the case of student mental health, exacerbate them. The work is still the work.
This conversation has been edited for length and clarity.