A growing number of special education teachers say they use artificial intelligence platforms to draft all or part of students’ individualized education programs, even as many districts lack policies about how the rapidly evolving technology can be used.
Educators have long reported struggling to keep up with the paperwork associated with IEPs. Now, they say AI platforms can help them write the federally mandated, personalized documents, which spell out goals for students with disabilities, more quickly and in greater detail, allowing them to devote more time to instruction.
But using the technology to create IEPs raises a host of practical, ethical, and legal questions that, given the dearth of official guidance, remain largely unanswered.
“Teachers, and especially special education teachers, are overwhelmed by paperwork, and it’s crowding out time for instruction and collaboration,” said Olivia Coleman, an assistant professor of exceptional student education at the University of Central Florida and a co-principal investigator at the Center for Innovation, Design, and Digital Learning. “Many report that they just don’t feel prepared to independently write IEPs after completing their teacher-preparation programs.
“Obviously, AI is extremely appealing as it can save time and reduce that cognitive load,” she said.
Coleman, who previously worked as a special education teacher, is optimistic that the technology could help teachers create stronger IEPs, but only if it’s used responsibly—serving as a “writing partner” and not an autopilot to generate the important documents.
Among the biggest landmines she and others have identified: Large language models like ChatGPT are trained on literature that doesn’t adequately or accurately reflect the experiences of people with disabilities, creating the threat of bias in their outputs; educators violate student privacy laws if they input data like test results into unsecured platforms; and AI apps sometimes fabricate studies or misrepresent their findings.
Use of AI to write IEPs is growing, often without guidance
Fifty-seven percent of special education teachers who responded to a recent survey said they used AI to help them with IEPs or plans to accommodate students’ disabilities under Section 504 of the Rehabilitation Act of 1973 during the 2024-25 school year, up from 39% in 2023-24.
The survey of 1,018 parents and 806 teachers, including 275 special education teachers, was conducted by the Center for Democracy and Technology, a nonprofit organization that researches technology and civil rights, between June and August.
The survey found that 15% of respondents used AI to write IEPs or 504 plans in full, up from 8% the previous year. Thirty-one percent said they used AI to identify trends in student progress for goal setting, 30% said they used it to summarize IEPs and 504 plans, and 28% said they used AI to help choose accommodations.
As in other parts of education, the use of AI in special education comes as districts struggle to draft policies and create professional development to train teachers in appropriate use of the technology. Meanwhile, platforms continue to rapidly evolve, making the task even more difficult—and more urgent—as the ground shifts under administrators’ feet.
Just two states—Ohio and Tennessee—have adopted requirements for districts to create AI policies, according to an Education Week tracker. Thirty-three states have guidance on AI in schools, according to AI in Education, an organization that provides resources for educators. That guidance largely focuses on student use of AI rather than teacher use and varies widely.
While many states’ guidance addresses concerns about student data privacy, Georgia’s is among the few to mention IEPs specifically. The state’s guidance says educators shouldn’t use AI for “high stakes” purposes, like IEPs.
“Streamlining administrative processes at the detriment of the human element can lead to mistrust and challenges associated with AI’s ethical use in the classroom setting,” says the document, issued in January 2025. “For example, a teacher may find using AI to write IEP goals as a benefit to save time, but the parent/guardian of a student with a disability might view the use of AI as disconnected from the individual needs of his or her child.”
Across disciplines, just 22% of the 806 teachers of grades 6 through 12 who responded to the CDT survey said they’d received any training or guidance on the risks of AI, like inaccuracy or bias in outputs.
Teachers trade tips on how to use AI for IEPs
There’s limited data on how teachers use AI for IEPs. Reports of usage range from individuals using consumer platforms, like ChatGPT or Claude, on their own to entire districts vetting and purchasing AI platforms to aid in writing the documents.
On online message boards, teachers trade tips and prompts. Some educators have created IEP tools on ChatGPT that explain requirements for the document and help draft each component. For example, a user-created bot trained on guidance from New York’s state education department asks teachers to describe their students’ disability category, behaviors, and strengths, adding follow-up prompts to further develop a narrative.
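Such bots are, at bottom, structured prompt templates. For illustration, here is a minimal sketch, in Python, of how a tool like this might assemble a drafting prompt from the inputs described above. The function name, field labels, and wording are hypothetical, not the actual bot’s design, and, given the privacy concerns noted earlier, only de-identified or fictional details should ever be entered.

```python
# Hypothetical sketch of how an IEP-drafting bot might assemble a prompt.
# Field names and template wording are illustrative assumptions, not any
# real product's design. Never enter real, identifiable student data.

def build_iep_goal_prompt(disability_category: str,
                          behaviors: str,
                          strengths: str) -> str:
    """Assemble a structured drafting prompt from teacher-provided,
    de-identified descriptions (mirroring the inputs the article describes)."""
    return (
        "You are assisting a special education teacher in producing a DRAFT "
        "IEP goal for human review. Do not invent assessment data.\n"
        f"Disability category: {disability_category}\n"
        f"Observed behaviors: {behaviors}\n"
        f"Strengths: {strengths}\n"
        "Propose one measurable, achievable, time-bound annual goal, and "
        "list two follow-up questions the teacher should answer to "
        "individualize it further."
    )

# Example with entirely fictional details:
print(build_iep_goal_prompt(
    disability_category="specific learning disability (reading)",
    behaviors="loses focus during independent reading; strong verbally",
    strengths="enjoys graphic novels; responds well to structured routines",
))
```

However the prompt is built, the output is a starting point: as Coleman’s “writing partner” framing suggests, the draft still requires a teacher’s review and revision.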
On a more formal level, ed-tech companies like MagicSchool AI and Playground IEP have developed specialized platforms to help write IEPs, coordinate meetings, track data, and adapt lessons to fit students’ goals and accommodations. Those programs can be vetted and adopted districtwide.
“District-approved tools are going to be safer because, when the districts enter into those agreements, they are doing their due diligence to make sure their students’ information is being safeguarded,” Coleman said.
Parents’ concerns about the quality of IEPs are longstanding
The CDT survey included responses from 336 parents of a child with an IEP or 504 plan, and 64% said it is a “good idea” for teachers to use AI in “developing or informing” the special education plans.
That finding may surprise some educators because special education plans involve sensitive, private student data, and parents have long complained that, too often, they amount to check-the-box, boilerplate documents that don’t provide meaningful objectives to help their children grow.
“Those are valid concerns,” Coleman said. “But they’ve been concerns long before AI.”
The software many teachers have long used to write IEPs offers drop-down menus of pre-written goals they can insert into the documents with the click of a mouse.
Coleman is currently reviewing the quality of 1,100 anonymized IEPs written before the advent of consumer AI platforms, and she can often tell when a cluster of plans was written by the same teacher because they use repetitive language and objectives, she said.
Coleman and Danielle A. Waterfield, a doctoral student at the University of Virginia, published research on how teachers use AI for IEPs and the quality of resulting documents in February in the Journal of Special Education Technology. Their theory: that AI could help carry some of the cognitive load for stressed teachers, freeing up time and mental capacity to teach more effectively.
They found no statistical difference in ratings when experienced special educators compared IEP goals written by teachers alone with goals written with the help of ChatGPT, evaluating factors like whether the goals could be adapted and used by teachers in various classes.
In a similar 2024 study by researchers at the University of North Carolina, goals written by teachers who’d been advised how to use ChatGPT were reviewed more favorably than those written by a control group of teachers who did not receive training.
Teachers see promise in using AI to write better IEPs
Teachers told researchers that creating individualized goals that are measurable, achievable, and time-bound is one of the most challenging parts of writing IEPs.
That’s why they see promise in AI as a tool, not a replacement for their professional judgment. The technology can help them by synthesizing notes in a student’s file, analyzing data, and suggesting ways to measure progress.
In follow-up interviews, teachers also indicated a need for more training on how to use AI responsibly and effectively.
Organizations like the Center for Innovation, Design, and Digital Learning offer online “office hours” when educators can speak with professors and wrestle with practical and ethical questions.
Coleman and Waterfield have also developed a framework for the ethical use of AI in IEPs that is currently under review for publication. It includes a decision tree to help educators weigh ethical considerations. Among its recommendations:
- Plans should be reviewed and revised to ensure they are personalized and accurately reflect student data.
- Educators should use a checklist, included in the framework, to check for bias in their drafts.
- Educators should disclose the use of AI to parents, students, and other members of the IEP team, indicate which content is AI-generated, and document edits to show how they arrived at the final document.
- Educators should document and track the effectiveness of the prompts they use to create learning goals.
“We don’t want the loss of individualization,” Coleman said. “AI can help you produce a draft, but it should not be your final draft.”
Special education teacher workload concerns drive rising interest in AI tools
Special education teachers’ concerns about heavy workloads and inadequate support contribute to high turnover, said Elizabeth Bettini, an associate professor of special education at Boston University who studies special education teachers’ working conditions and morale.
“It’s not surprising” that those teachers are turning to AI, Bettini said. “Special education teachers are overwhelmed. They have pretty extensive paperwork obligations, and there’s no time in the day set aside for that.”
Bettini has found special education teachers work about 10 hours a week outside of the school day, largely on paperwork and case management. And, though IEPs are a significant part of their work, many schools don’t set aside any time to create them, said Bettini, who has not conducted research on AI.
Some AI experts have advocated for using the technology for more than drafting IEPs. In a 2024 “hackathon” hosted by the Stanford University Accelerator for Learning, developers proposed AI tools to simplify the complex language in IEPs for families, translate the documents into other languages, and record and transcribe IEP meetings. General education teachers could also use AI to tailor lessons and activities to incorporate a student’s goals and accommodations, developers said at the event, which was documented in a white paper.
But there is no silver bullet to solve teacher burnout, Bettini said. Administrators concerned about the burden of paperwork should also carve out protected time in the school day for teachers to complete it, provide adequate professional development to support the work, and have classified employees, like office managers, handle tasks like scheduling IEP meetings, contacting parents, and making copies, she said.
Administrators who aren’t comfortable with AI may be tempted to ignore its use, Coleman said, but doing so bypasses an important opportunity to create guardrails that help teachers avoid ethical, legal, and accuracy pitfalls.
“IEPs are supposed to be the cornerstone of a free, appropriate public education for students with disabilities,” Coleman said. “So if they have issues or are not high quality, what’s happening with actual implementation if the document you are using is poor, in and of itself?”