At the Holmen school district in Wisconsin, school leaders and staff who are part of the behavior intervention team are now experimenting with generative artificial intelligence to address student behavior challenges.
The district’s student services director used Gemini, Google’s generative AI model, to create a custom AI assistant that helps school leaders and behavior interventionists brainstorm and align student support with the district’s social-emotional learning intervention playbook, said Ben Tashner, the associate principal at Holmen Middle School. (The school isn’t inputting personally identifiable information when using Gemini to brainstorm ideas—just generic situational information, Tashner said.)
The school, which has about 900 students, has one behavior interventionist who works with the administrative team to support students’ specific needs, Tashner said. Together, they’ve found that the custom AI assistant is helpful in generating ideas to begin developing interventions, he said.
The generative AI tool provides another layer in the intervention team’s discussion to ensure they’re providing the best support for students, Tashner said. The output can sometimes make him think, “that makes sense—why didn’t we think of that? But it does it in a simplified way, and you just edit what you get,” he said.
In a video conversation with Education Week, Tashner discussed how he and his staff use generative AI for behavior interventions and what effects it has had so far.
This interview has been edited for length and clarity.
What prompted your use of AI to address student behavior?
I meet with teachers every week. We’re hearing a lot of similar things, like, “This student struggles with putting effort into work,” “This student doesn’t do any of the work to learn,” or “These students have certain tendencies to avoid work.”
What is our next step? We need to be solution-focused. As with all things, behaviors surface because kids are trying to communicate in a certain way and can’t, or they lack the skills to communicate effectively.
Our belief is that we have skills that we need to teach students. How do we help students find success when these barriers are getting in the way? So we’ve kind of been playing around [with Google Gemini] to do this.
Could you give an example of how this works?
Gemini, in the background, has our SEL intervention playbook. Phase one would be getting the student buy-in that we need to work together on a skill. Phase two would be, we’re going to practice this skill in isolation, one on one. Phase three is the student applying it in the setting.
If I have an 8th grade student who keeps throwing food during lunch—I’ve talked to them eight times, they’ve gotten lunch detention, all these things, this is dragging on. I go to Gemini to ask questions about next steps and skills to work on and conversation starters based on the SEL intervention playbook. [For instance, I could say to the student], “You find joy in the reaction from peers when you throw food. How do you get peers’ attention without throwing food?”
We’re still working out what this looks like, but it helps align [what we’re doing] with the skills students are lacking.
What was this process like before using AI?
You would have those conversations and try different things. They might not work, and we might revert back to, “OK. Now what’s next?” You might reach out to other resources, networking with other people: “Here’s a situation I have. Do you have anything similar?” With the AI tool, the researching I used to do online myself, or the networking with other people, the tool pulls it all together with the focus we’ve set up in the background. That part has been nice. You see it and think, “Oh, yeah. That’s a great idea. Why don’t we have the conversation that way?” And it’s able to pull that up right away.
How has this tech-oriented approach been most helpful?
It has really highlighted [specific skills gaps in students] that we might not have noticed right away. So we talk with the students and the family about the intervention, and they’re all like, “Yeah, this is an issue. How do we help with this?” We get that buy-in and investment to help the student with that skill.
Do you have any concerns about using AI for this purpose?
With it being programmed for our specific [SEL intervention] phases, a downside could be that the variety of responses becomes more limited the longer we use it within our system. It’s so geared toward how we wanted it built that it might not be able to branch off, and we’ll get the same responses every time.
Do you have any advice for school leaders?
Learn what’s out there for you to utilize. There’s some fear of [AI] taking over or being used the wrong way. A “why” for me is how to use [AI] appropriately so then we can also teach students how to use AI effectively and appropriately—so they’re not using it just to get an answer but to learn.