AI in Year 13? Educators Urge Earlier Teaching as Government Unveils New Secondary Subjects

AI enters senior curriculum in 2028 with a Year 13 subject and more STEM courses. Experts call for AI literacy from Year 1, plus teacher training, clear rules, and equity.

Published on: Sep 12, 2025

AI enters the senior curriculum in 2028; experts say start much earlier

The government will roll out a set of new secondary school subjects for Years 11-13 from 2028. A specialised Year 13 subject on Generative AI is flagged for later development, alongside a stronger emphasis on machine learning, cybersecurity, digital systems, and digital ethics across senior years.

Education Minister Erica Stanford says students should learn how generative AI works and how to use it responsibly. The move signals a broader pivot toward science, technology, engineering and maths, with new subjects also including mechanical engineering, infrastructure engineering, building and construction, civics, politics and philosophy, Pacific studies, and primary industry.

The case for earlier AI literacy

Associate Professor Kathryn MacCallum welcomes the plan but argues Year 13 is too late to begin. Many students already use AI tools well before senior years, often without guidance on safe and effective use.

Her position is clear: introduce explicit AI literacy from Year 1 and integrate it with digital citizenship. Students need to learn where AI fits, where it doesn't, and how it can influence their decisions. This requires more than tool tutorials; it's about concepts, judgment, and ethics threaded through all subjects.

Design principles for an AI curriculum

  • Start early and build progressively from Years 1-13 (concepts first, tools second).
  • Teach how AI works at a high level (data, models, patterns, limits), not just how to use apps.
  • Integrate digital ethics, privacy, bias, and safety into every level.
  • Blend AI with core subjects (English, sciences, social sciences, arts) to show real use cases.
  • Be explicit about when not to use AI (original thought, assessment integrity, sensitive contexts).
  • Ensure equitable access to devices, connectivity, and assistive tools.

Teacher capacity and delivery risks

Former secondary teacher Dr Nina Hood warns there are not enough specialist teachers to staff the new subjects. Capacity building will be required across the system.

  • Recruitment: attract specialists from industry and universities.
  • Partnerships: co-teach or mentor with industry experts where appropriate.
  • Professional learning: short courses and micro-credentials focused on pedagogy plus AI fundamentals.
  • Resource banks: centrally developed lesson plans, exemplars, and safe-use policies to reduce teacher workload.

Consultation and timelines

The secondary teachers' union (PPTA) says teachers have limited detail on subject descriptors and haven't been adequately consulted. They argue two years is not enough to develop a full curriculum and recruit specialist staff for new subjects.

For durable reform, the process needs transparent development, early drafts for feedback, and clear assessment guidance. School leaders need timelines, funding signals, and professional learning pathways now, not months before rollout.

University Entrance and the status of outdoor education

Mount Aspiring College principal Nicola Jacobson says reclassifying outdoor education as vocational (removing University Entrance/Academic status) will discourage students who plan to attend university. She argues this creates a false divide between "academic" and "vocational" and undervalues broader skill sets that NCEA has traditionally recognised.

What education and government leaders should do next

1) Build a coherent AI pathway from Year 1 to Year 13

  • Years 1-6: Basic data concepts, safe use, media literacy, simple AI examples.
  • Years 7-10: Bias, privacy, prompt quality, evaluating AI outputs, simple models in context.
  • Years 11-13: Applied projects, ethics cases, sector use (health, law, engineering, creative), and a specialised Generative AI subject in Year 13.

2) Fund teacher upskilling and recruitment

  • Offer paid release time and micro-credentials aligned to curriculum outcomes.
  • Establish school-industry partnerships for mentoring and project briefs.
  • Provide centrally authored units and assessments to ensure consistency and reduce duplication.

3) Set clear assessment rules and integrity controls

  • Define acceptable AI use by task type; require disclosure when AI assists.
  • Use assessment designs that check process: drafts, oral defences, and practical demonstrations.
  • Update UE recognition to avoid devaluing practical subjects that build real capability.

4) Prioritise equity and infrastructure

  • Guarantee device and internet access for all students in AI-related courses.
  • Adopt privacy-first policies and safe default tools for schools.
  • Ensure accessibility features benefit neurodiverse and disabled learners.

5) Pilot, learn, scale

  • Run pilots in diverse schools in 2026-2027 with independent evaluation.
  • Publish exemplars, what worked, and what didn't, early and openly.
  • Scale with phased training and targeted funding in 2028.

Optional upskilling for school leaders and teachers

If you're planning professional learning in AI for staff, curated course lists can accelerate selection and rollout.

Bottom line

Introducing a Year 13 Generative AI subject is a step forward, but the real win is a coherent AI and digital literacy pathway that starts in primary school. Get teacher capability, assessment rules, and equity right now, and the 2028 rollout can land cleanly, with broad benefits across the curriculum.