DepEd unveils human-centered AI policy for schools, putting teachers and learners first

DepEd's new AI policy keeps teachers and students in charge: AI can help, but it doesn't take the wheel. It lays out risk-based rules, privacy checks, and equity steps for safe use.

Published on: Feb 26, 2026

DepEd issues AI policy for basic education: Keep humans at the center

The Department of Education (DepEd) has released DepEd Order No. 003, s. 2026, setting national guidelines for how artificial intelligence should be used across basic education. The order puts a clear stake in the ground: AI can support teaching and learning, but teachers and learners stay in control.

The policy responds to the rapid growth of AI use in classrooms and DepEd offices, and to the lack of unified rules on data privacy, content quality, and pedagogy. It introduces principles anchored in international frameworks, including ASEAN and UNESCO guidance, and adopts a risk-based approach in line with the EU AI Act.

What the policy emphasizes

  • Human-centered, pedagogically sound use: AI is a tool to enhance instruction, not replace teacher judgment or student thinking. Teachers and learners remain at the center of the process.
  • Developmentally appropriate tools: Use AI in ways that meet real learner needs and keep humans in control of the output and decisions.
  • Inclusion and equity: AI access must extend to the poorest and most disadvantaged learners, with attention to linguistic and cultural diversity.
  • Protect and strengthen human agency: AI use must not erode students' intellectual or relational skills. It should build 21st-century skills: analyzing, evaluating, innovating, communicating, and applying learning in real contexts.
  • Risk-based governance: AI applications are categorized by risk, with stricter controls for high-risk uses and space for safe innovation with minimal or limited-risk tools.

Why this matters for schools

Without clear standards, schools face real risks: data privacy breaches, misleading or incorrect instructional content, and uneven AI practices that widen gaps between classrooms. The order provides a structure to reduce those risks while making space for responsible innovation.

Action steps for school leaders and teachers

  • Publish a school AI use policy aligned to DepEd Order No. 003, s. 2026. Define acceptable classroom and administrative uses, boundaries, and accountability.
  • Inventory AI tools currently in use (classroom, assessment, admin). Classify each by risk level and set approval paths for higher-risk uses.
  • Strengthen data privacy: avoid entering sensitive personal data; vet vendors for compliance; apply data minimization; conduct privacy or impact checks for higher-risk tools; get appropriate consent.
  • Guard pedagogy and integrity: specify what AI assistance is allowed; require students to show their process (drafts, prompts, reflections); prefer AI for feedback, differentiation, and accessibility, not for replacing original work.
  • Verify content: double-check AI outputs, cite sources, and keep teacher review in the loop, especially for instructional materials.
  • Plan for equity: provide alternatives for learners without devices or connectivity; support local languages; consider low-bandwidth or offline options.
  • Build capacity: run short, practical trainings on prompt writing, evaluation of AI outputs, privacy, and classroom routines for safe use. See the AI Learning Path for Teachers for structured upskilling.
  • Set governance basics: appoint an AI focal person; create a simple incident reporting flow; log use cases and outcomes; add AI criteria to procurement.

Making risk-based use practical

The order's risk lens is consistent with global frameworks. As a rule of thumb, keep tighter controls on tools that influence high-stakes decisions or process sensitive data, and lighter controls on classroom aids that support learning without profiling students.

  • Lower risk examples: writing assistance, translation, formative feedback, content outlining, provided outputs are reviewed by teachers and used without student identifiers.
  • Higher risk examples (require stronger safeguards): remote proctoring, biometric or face analysis, predictive analytics on student performance, or tools that automate high-stakes decisions.

Document decisions, justify the risk level, and record mitigations (privacy steps, human oversight, accuracy checks). Review regularly.
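For schools that keep their tool inventory in a spreadsheet or script, the triage rule above can be sketched in a few lines of code. This is a minimal illustrative sketch, not anything prescribed by the order: the flag names, tier labels, and `AITool` structure below are assumptions chosen for the example, and real risk decisions should still go through human review.

```python
from dataclasses import dataclass, field

# Hypothetical high-risk indicators, loosely mirroring the "higher risk"
# examples in this article; the exact flag names are illustrative.
HIGH_RISK_FLAGS = {
    "remote_proctoring",
    "biometrics",
    "predictive_analytics",
    "automates_high_stakes_decision",
    "processes_sensitive_data",
}

@dataclass
class AITool:
    name: str
    uses: set                                  # declared capabilities/use cases
    mitigations: list = field(default_factory=list)  # e.g. "teacher review"

def risk_level(tool: AITool) -> str:
    """Rule of thumb: any high-stakes or sensitive-data use triggers 'high'."""
    return "high" if tool.uses & HIGH_RISK_FLAGS else "minimal/limited"

# A tiny example register
register = [
    AITool("Essay feedback assistant", {"formative_feedback"}, ["teacher review"]),
    AITool("Exam proctoring camera", {"remote_proctoring", "biometrics"}),
]

for tool in register:
    print(f"{tool.name}: {risk_level(tool)} risk; "
          f"mitigations: {tool.mitigations or 'none logged'}")
```

A register like this also doubles as the documentation trail the order implies: each entry records the use case, the assigned risk level, and the mitigations that justify it, ready for periodic review.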

Documentation you should prepare

  • Parent and student communication templates explaining AI use
  • Student AI use guidelines and honor code
  • Lesson plan checklist for AI-supported activities
  • Simple risk register of AI tools and use cases
  • Vendor evaluation checklist (privacy, accuracy, accessibility, cost)
  • Data retention and incident response procedures


For policy and systems leaders

Division, regional, and central offices can standardize templates, centralize vendor vetting, and support schools with training and diagnostics. For structured governance resources, see the AI Learning Path for Policy Makers.

Bottom line: keep people first, think in terms of risk, and make your safeguards visible. That's how AI becomes a practical ally in Philippine classrooms without compromising learning, equity, or trust.

