How Louisiana Schools Can Implement AI: Practical Steps While the State Builds a Framework
Louisiana is assembling a nearly 30-person work group of educators, state leaders, and national experts to recommend how K-12 schools should use artificial intelligence. The charge: define age-appropriate lessons, set guardrails, train staff, and publish clear warnings about risks. The team, led by Louisiana Tech University president Jim Henderson, will present recommendations to the state board on March 10.
The goal is simple: prepare students for an AI-driven workforce without trading away privacy, safety, or real learning. That balance will require clear policy, steady professional development, and disciplined tool selection.
What the State Is Doing
The new committee will study classroom use across grade levels, establish guidance for responsible and ethical use, and propose policy updates as the technology shifts. The work follows an August board directive calling for a statewide plan for AI in classrooms.
It also arrives alongside an executive order from Gov. Jeff Landry that restricts public schools from using AI tools developed by certain countries, including China, citing state security concerns. Expect the group to align implementation guidance with that order.
What Research and Classrooms Are Signaling
Adoption is already widespread. A recent report from the Center for Democracy and Technology found that 85% of teachers and 86% of students used AI last school year. Students reported feeling less connected to teachers during AI use, and many educators worry about weaker critical thinking and research skills.
The evidence that AI improves learning outcomes is mixed, and AI tools still produce incorrect information at times. Any rollout must pair access with guardrails and tight feedback loops.
Immediate Steps Districts Can Take (Next 90 Days)
- Appoint an AI lead at district and school levels; form a small review team (curriculum, IT, legal, special education).
- Inventory current AI use (teachers, students, central office); freeze high-risk tools until vetted.
- Publish a plain-language AI use policy for families and staff (what's allowed, what's restricted, why).
- Pilot with purpose: choose 1-2 tools for clear use cases (e.g., reading fluency, formative feedback), run opt-in pilots, gather metrics, and report results.
- Stand up PD that focuses on pedagogy first: lesson planning supports, differentiation, and feedback, not shortcuts that erode thinking.
- Set a data baseline: track student outcomes, engagement, teacher workload, and academic integrity incidents before and after the pilot (a minimal tracking sketch follows this list).
- Create an "AI disclosure" routine: students and staff label AI assistance in assignments and communications.
Curriculum Ideas by Grade Band
- K-2: What AI is (pattern finders), simple examples (voice assistants), digital citizenship, and who to ask for help online.
- 3-5: Prompts and outputs, spotting mistakes, bias basics, and classroom rules for acceptable use.
- 6-8: Credibility checks, citation norms for AI-assisted work, data privacy, and prompt iteration for studying.
- 9-12: AI for research planning (not source fabrication), structured brainstorming, code assistance with comments, model limits, and ethics case studies.
Teacher Training That Respects Pedagogy
- Micro-PD: 30-45 minute sessions on safe prompts, error checking, and lesson planning support.
- Model the practice: use AI to create exemplars, rubrics, and differentiation plans, then refine them with teacher judgment.
- Assessment integrity: design tasks that require process evidence (notes, drafts, oral checks) and personal context.
- Community of practice: share prompts that worked, pitfalls, and outcomes monthly.
Vetting AI Tools: A Simple Rubric
- Purpose fit: Clear instructional goal aligned to standards; evidence or pilot data available.
- Data handling: FERPA compliance, data minimization, deletion timelines, and clarity on whether prompts or student data train vendor models.
- Accuracy and bias: Error rates disclosed, educator controls, and transparency on content sources.
- Access and equity: Works on low-end devices, accessibility features, multilingual support, offline or low-bandwidth mode if possible.
- Controls: Age gating, content filters, audit logs, admin dashboards, district SSO.
- Cost and contracts: Clear pricing, cancellation terms, and parent consent flows.
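To turn the rubric into a comparable score across vendors, a review team can weight each criterion and rate every tool on the same scale. The Python sketch below is one minimal way to do that; the weights and the 0-3 rating scale are assumptions a district would tune to its own priorities.

```python
# A lightweight sketch for scoring a tool against the rubric above.
# Weights and the 0-3 rating scale are assumptions a district would tune.
RUBRIC_WEIGHTS = {
    "purpose_fit": 0.25,
    "data_handling": 0.25,
    "accuracy_and_bias": 0.20,
    "access_and_equity": 0.15,
    "controls": 0.10,
    "cost_and_contracts": 0.05,
}

def score_tool(ratings: dict[str, int]) -> float:
    """Combine 0-3 reviewer ratings into a weighted 0-100 score."""
    missing = set(RUBRIC_WEIGHTS) - set(ratings)
    if missing:
        raise ValueError(f"Unrated criteria: {sorted(missing)}")
    raw = sum(RUBRIC_WEIGHTS[k] * ratings[k] for k in RUBRIC_WEIGHTS)
    return round(raw / 3 * 100, 1)  # normalize: max rating is 3

# Example review. A district might also treat a low data_handling
# rating as an automatic rejection regardless of the total score.
print(score_tool({
    "purpose_fit": 3, "data_handling": 2, "accuracy_and_bias": 2,
    "access_and_equity": 3, "controls": 2, "cost_and_contracts": 3,
}))
```

A numeric score does not replace the review team's judgment; it just makes disagreements visible and decisions easier to document.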
Student Data and Privacy Checklist
- Run a data protection impact assessment before classroom use.
- Use district-managed accounts; block personal sign-ups.
- Collect the minimum data required; avoid uploads of sensitive info.
- Clarify retention periods, deletion rights, and third-party sharing.
- Disable model training on district data unless explicitly approved.
- Log usage and conduct periodic audits with IT and legal.
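Periodic audits are easier when they are scripted. The sketch below shows one hypothetical pass over a vendor's usage export, flagging personal sign-ups and unusually heavy data access. The column names, district domain, and threshold are all assumptions; real vendor exports vary, so map fields to whatever your tool actually provides.

```python
# A minimal sketch of a periodic audit pass over a vendor usage export.
# The CSV columns (user_email, records_accessed), the domain, and the
# threshold are assumptions; adapt them to your vendor's actual export.
import csv

DISTRICT_DOMAIN = "@exampleparish.k12.la.us"  # hypothetical district domain

def audit_usage(export_path: str) -> list[str]:
    """Return findings worth review: personal sign-ups and heavy data access."""
    findings = []
    with open(export_path, newline="") as f:
        for row in csv.DictReader(f):
            email = row["user_email"].strip().lower()
            if not email.endswith(DISTRICT_DOMAIN):
                findings.append(f"Personal account in use: {email}")
            if int(row["records_accessed"]) > 500:  # threshold is an assumption
                findings.append(f"Unusually high data access: {email}")
    return findings

for finding in audit_usage("vendor_usage_export.csv"):
    print(finding)
```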
Academic Integrity and Assessment
- Teach students how to cite AI assistance and verify claims with credible sources.
- Avoid reliance on AI "detectors"; they produce false positives and negatives.
- Use process-based assessment: planning notes, drafts, source lists, and brief oral defenses.
- Update plagiarism policies to include AI use, disclosure expectations, and consequences.
From Consumers to Creators
State leaders want students building with AI, not just using it. That can start with computer science pathways and real projects.
- Introduce prompt design, data collection ethics, and simple model concepts in middle school.
- Offer high school tracks in Python, APIs, and small-scale AI projects tied to community problems (a starter sketch follows this list).
- Run hack days or capstones where students design a tool, define risks, and present safeguards.
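As one concrete starting point for a capstone, the sketch below uses scikit-learn to route community complaints by topic, the kind of small-scale AI project a high school track could build on. The sample data is invented for illustration; a real project would gather and label far more, and the students would present the tool's risks and safeguards alongside it.

```python
# One possible capstone starter: route community 311 complaints by topic.
# A toy sketch using scikit-learn (pip install scikit-learn); the sample
# data below is invented for illustration only.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

complaints = [
    "huge pothole on my street near the school",
    "streetlight out on the corner all week",
    "road cracked and full of holes after the storm",
    "the lamp by the park flickers at night",
]
labels = ["roads", "lighting", "roads", "lighting"]

# Bag-of-words features feeding a Naive Bayes classifier, in one pipeline.
model = make_pipeline(CountVectorizer(), MultinomialNB())
model.fit(complaints, labels)

print(model.predict(["deep hole opened up on Main Street"]))  # -> ['roads']
```

A project like this also teaches model limits firsthand: students see exactly where tiny training sets fail and can discuss what data collection ethics require before scaling up.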
Current Pilots to Watch
The state has begun trying tools such as Khanmigo and Amira to see how they perform for teachers and students. Districts should observe pilot outcomes, ask for data, and compare results against local needs.
Timeline and What to Prepare by March 10
- Run a limited pilot with clear metrics (learning outcomes, engagement, time savings, and integrity incidents).
- Draft or refine an AI acceptable use policy and parent communication.
- Document wins, issues, and costs; share with the state to inform the work group's recommendations.
"AI is here," Henderson said. The task now is practical: teach responsible use, protect students, and measure what actually helps learning-so policy keeps pace and classrooms keep quality.