Keeping humanity at the center: Accessibility and AI in education
In an upstate New York classroom, students with cerebral palsy and autism start a writing activity without touching a keyboard. Tobii eye-tracking is paired with AI-enhanced augmentative and alternative communication (AAC). With a blink or glance, phrases complete, questions form, and jokes land. What used to take minutes per word now feels like a conversation, and the teacher's attention stays on connection, not data entry.
Across Alaska, a speech-language pathologist runs small groups with ChatGPT, Perplexity, NotebookLM, and SchoolAI. One tab generates prompts, another curates vocabulary, another simulates dialogue. The tech doesn't replace her judgment; it extends it. Practice becomes personal, engaging, and measurable.
How educators are using AI for students with disabilities
These stories are playing out across the country. A recent report from the Center for Democracy and Technology found nearly 6 in 10 special educators used AI to help develop individualized education and accessibility plans during the 2024-2025 school year.
Under federal law, the Individuals with Disabilities Education Act (IDEA) and Section 504 of the Rehabilitation Act, students with disabilities are entitled to specialized instruction, services, and accommodations. The same report noted that 73% of students with disabilities engaged in back-and-forth conversations with AI, compared to 63% of peers without individualized plans.
Opportunities and real risks
AI can open access: faster communication, personalized practice, and assistive technology that adapts in real time. But it can also create new problems if we move too fast or skip the guardrails.
- Social connection: 57% of students with individualized plans said AI in class made them feel less connected to their teacher.
- Academic skills: 64% of students believe AI weakens core skills like writing, reading, and research if it's used without purpose.
- Mental health: More than 40% of students say they or a friend used AI for mental health support, an area that requires trained humans, not chatbots.
- Privacy and security: Biometric and medical data (e.g., retinal scans) can be exposed if protections are weak. Schools are frequent ransomware targets.
- Bias and labeling: Students with disabilities are suspended and expelled at higher rates and represent 65-75% of youth in the juvenile justice system. Feeding that data into AI can amplify unfair labels, especially when school data flows to law enforcement.
What the NEA emphasizes
In 2024, NEA released guidance on responsible AI use with a clear theme: keep humans central. Their lens is equity, accessibility, and safety, practical rather than theoretical. You can explore their resources here: nea.org/ai.
Five principles to anchor your decisions
- Leadership by those most impacted: Involve disabled students, educators, and families in design, selection, and evaluation. Nothing about us without us.
- Account for different levels of AI literacy: Provide scaffolded training and simple workflows so everyone can participate-enthusiasts and skeptics alike.
- Keep humanity at the center: Use AI to support relationships, communication, and creativity-not replace them.
- Prioritize data protection and bias mitigation: Demand transparency, strong privacy practices, and diverse decision-making.
- Include educators, students, and families as experts: Policy should reflect real classroom needs and lived experience.
Practical guardrails you can implement now
- Define the purpose: For every AI use, write a one-sentence goal (e.g., "speed up AAC phrasing without reducing student choice"). If the purpose isn't clear, pause.
- Protect privacy first: Avoid sending biometric or medical data to third-party systems. Prefer local processing or vendors that contractually ban data retention and model training on student data.
- Get consent: For sensitive use (AAC, biometrics, behavioral data), secure informed family consent and offer a meaningful alternative.
- Set classroom norms: Label what is AI-okay (idea generation, practice prompts) and AI-off-limits (final drafts, mental health advice). Teach students how to cite AI assistance.
- Keep the human loop: Pair AI activities with face-to-face discussion, oral defenses, or reflection. Connection comes first.
- Audit bias: Spot-check outputs for harmful assumptions about disability, race, or behavior. Document issues and escalate to your vendor.
- Minimize data exhaust: Turn off chat history where possible. Use privacy-friendly modes. Delete data you don't need.
- Plan for outages and breaches: Maintain offline or no-login alternatives so learning doesn't stall and data stays safe.
High-impact use cases that respect access and autonomy
- AAC acceleration: Context-aware phrase prediction with clear user control and the ability to edit or reject suggestions.
- Language practice: Conversational role-plays for SLP sessions with personalized vocabulary lists and saved exemplars for progress monitoring.
- IEP/504 drafting support: Draft goals or accommodations from educator notes, then refine as a team. AI suggests; humans decide.
- Multimodal access: Generate multiple formats (visual schedules, audio summaries, simplified texts) without sacrificing rigor.
Quick classroom checklist
- We can explain why we're using AI in one sentence.
- A human verifies all critical outputs before use with students.
- Students know what help from AI is allowed-and how to disclose it.
- Family consent is on file for sensitive data or use cases.
- We log issues (bias, errors, privacy concerns) and share them with admin or the vendor.
Action steps for this month
- Map one lesson where AI can reduce a barrier (e.g., time to communicate) without eroding core skill-building.
- Run a 20-minute PD on your team's AI norms and a shared definition of "assist vs. replace."
- Review vendor policies for data retention and training. If unclear, ask for a signed addendum.
- Co-create classroom guidelines with students: what's allowed, what needs citation, and when to close the laptop.
The bottom line
AI can widen access, but only if we keep people first. Put disabled students and their families in the lead, protect their data, and keep educators in the driver's seat. With clear purpose and strong guardrails, technology can support high-quality learning where every student has a voice and a path forward.
Want to build staff AI literacy?
If you're planning structured PD by role, explore curated options here: Complete AI Training: Courses by Job.