Illinois' new education laws: immigrant student rights and AI in classrooms - what school leaders need to do now
Two statewide measures took effect Jan. 1, 2026, and they point in one direction: protect every student's access to school and set clear guardrails for artificial intelligence in learning.
Here's the short version. Illinois clarified that K-12 public schools must enroll and serve students regardless of immigration status. It also set expectations for how schools adopt AI - with transparency, data protections, and training for staff.
What changed for immigrant students
Illinois reinforced a long-standing principle: every child has a right to a free public education, period. This aligns with the U.S. Supreme Court's Plyler v. Doe decision, which prohibits states from denying students a public education based on immigration status.
If you lead enrollment, counseling, or front office operations, these are the priorities:
- Do not ask about or record a student's or family's immigration status. It's unnecessary for enrollment and creates a chilling effect.
- Accept common residency and age documents (lease, utility bill, affidavit, birth certificate, prior school records). Offer alternatives when families lack standard paperwork.
- Train staff to avoid requests that could expose status (e.g., asking for a Social Security number) unless a specific program requires it, and always offer an alternative identifier.
- Ensure all communications (letters, forms, website pages) state clearly: immigration status does not affect enrollment or services.
- Protect student data. Limit access to sensitive information and have a protocol for outside requests. Route any law-enforcement request through district counsel.
If you need a quick legal reference for staff training, see Cornell Law's summary of Plyler v. Doe (1982).
What changed for AI in the classroom
The new law pushes districts to use AI with transparency, human oversight, and respect for privacy. Expect local policies that address disclosure to students and families, staff training, procurement standards, and incident reporting.
- Transparency: Tell students when AI tools are used, what they do, and any data they collect.
- Human-in-the-loop: Keep grading, placement, and discipline decisions under human review. AI can assist, not decide.
- Data privacy: Vet vendors for student-data use, retention, and sharing. No hidden model training on student work without explicit approval.
- Equity and bias: Require bias testing and accessibility checks before classroom use.
- Academic integrity: Give teachers clear guidance on AI-assisted work, citation norms, and misconduct responses that go beyond "detect-and-punish."
- Professional learning: Provide role-based training for teachers, counselors, and administrators - both on classroom use cases and on legal and ethical guardrails.
For context and training materials, the U.S. Department of Education's recommendations on AI in teaching and learning are a useful starting point.
Action checklist for superintendents and principals
- Update enrollment procedures and scripts. Remove any status-related questions and add clear language about the right to enroll regardless of immigration status.
- Run a 45-60 minute in-service for registrars, secretaries, and counselors on compliant enrollment and data requests.
- Publish a short AI statement. Include transparency, privacy, human oversight, and an incident-reporting process.
- Audit current edtech. Flag any app that trains models on student data or lacks a clear data-deletion path (see the sketch after this list).
- Create teacher-friendly AI guidelines with examples: lesson planning, feedback, differentiation, and how to disclose AI use to students.
- Stand up a quick-review committee (instructional tech, curriculum, legal) to approve AI tools before classroom pilots.
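If your district already tracks edtech tools in a spreadsheet, a short script can turn the audit step into a repeatable report. This is a minimal sketch, assuming a hypothetical inventory file named edtech_inventory.csv with illustrative columns (tool, vendor, trains_on_student_data, data_deletion_policy); the file name and column names are assumptions for the example, not anything specified by the law.

```python
import csv

# Minimal sketch: flag edtech tools that need review before continued classroom use.
# Assumes a hypothetical CSV "edtech_inventory.csv" with columns:
#   tool, vendor, trains_on_student_data (yes/no), data_deletion_policy (yes/no)
# Column names and layout are illustrative only.

def flag_tools_for_review(path="edtech_inventory.csv"):
    flagged = []
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            reasons = []
            if row.get("trains_on_student_data", "").strip().lower() == "yes":
                reasons.append("trains models on student data")
            if row.get("data_deletion_policy", "").strip().lower() != "yes":
                reasons.append("no documented data-deletion path")
            if reasons:
                flagged.append((row.get("tool", "unknown"),
                                row.get("vendor", "unknown"),
                                reasons))
    return flagged

if __name__ == "__main__":
    for tool, vendor, reasons in flag_tools_for_review():
        print(f"REVIEW: {tool} ({vendor}) - {'; '.join(reasons)}")
```

The output is just a starting list for the quick-review committee; the committee, not the script, decides whether a tool stays.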
What to communicate to families
- Enrollment rights: "Your child can enroll in our schools regardless of immigration status." Provide this message in the top five languages in your community.
- AI transparency: What tools the district uses, where, why, and how student data is protected. Offer an opt-out when feasible.
- Academic integrity: What's acceptable AI assistance and how students should cite or disclose AI use.
Professional learning: fast options
If your staff needs hands-on AI training aligned to classroom practice, explore role-based course paths. Focus PD on practical workflows: feedback at scale, UDL-aligned differentiation, rubric design, and safe data practices.
Common pitfalls to avoid
- Requiring Social Security numbers or immigration documents for enrollment.
- Letting AI tools auto-grade or auto-decide without human review, especially for high-stakes decisions.
- Using AI tools that train on student submissions without explicit, written approval and a data-deletion timeline.
- Rolling out AI to classrooms before teachers have clear examples, guardrails, and disclosure templates.
Bottom line
Illinois sent a clear message: keep doors open for every student and use AI with care. If you update enrollment, secure student data, and give teachers practical guidance, you'll be on solid ground this semester.