Trust cannot be automated: education unions set the course for AI in schools
AI is being pushed into classrooms with big promises: efficiency, "personalised learning", and data-driven decisions. Educators see a different picture: questions about who controls teaching, what happens to autonomy, and how equity and labour rights hold up when algorithms sit between teachers and learners.
At a global conference in Brussels on 4-5 December, more than 200 union leaders, educators, and experts met to build a human-centred path. The message was clear: technology is a tool; people decide how it's used.
Technology is a tool. Teachers are the authors.
Belgian unions opened the event with a united call across languages: protect data, fund infrastructure, and defend teacher autonomy. As one leader put it, "Technology may be the tool, but humanity is the author. A chatbot can answer a question, but only a teacher's voice can say: 'I believe in you.' AI must amplify our voices, not replace them."
AI in practice: what it does to work in schools
Researchers challenged the sales pitch. Efficiency claims often move work around rather than reducing it: into crafting prompts, checking and rewriting outputs, guarding privacy, managing ethical risks, and handling student misuse.
Some systems pilot teacher-less or teacher-reduced models, raising alarms about professional judgment, accountability, and quality. Add rising energy use, costs, and a widening AI divide, and the picture gets tougher for public education.
AI literacy needs more than tools and tricks. It has to include the human dimension: rights, democracy, and the health of the planet.
Evidence from South Korea's AI Digital Textbook reforms showed what top-down digitalisation looks like in practice. Teachers, parents, and civil society pushed back over costs, data protection, commercial influence, screen overuse, inequities, and weak consultation.
Inside classrooms, AI changes relationships. It reshapes student identities and peer dynamics, shifts teacher authority and visibility, and adds hidden labour as educators make brittle systems work. Too often, teacher voices are sidelined. They belong at the centre.
Union perspectives: nothing about us without us
- Tools should serve teachers, not control them. Opt-in/out rights matter, including the right to refuse harmful or unethical systems.
- Critical AI literacy starts with strong teacher training grounded in human rights and sound pedagogy.
- Equity issues are real: data built in corporate and geopolitical centres can misrepresent communities and fuel bias and digital colonialism.
- Assistive tech can help diverse learners, but evaluation must focus on equity, rights, and participation.
- Digital divides are growing through weak infrastructure, paywalls, and restrictive licences. Public investment and independence from private platforms are essential.
As one expert noted, AI is probabilistic: good at plausible text, weak on meaning. "AI can create content, but not context." Education is drifting into a global experiment in data colonialism that can create echo chambers. Blind faith in vendor promises is not a strategy.
From analysis to action
Day two was about power and practice. Unions mapped how collective bargaining can guard against deskilling, job loss, and surveillance, while placing firm limits on data use and automated decisions.
They also explored how global standards, including the ILO/UNESCO Recommendations on the Status of Teachers, need updates for platform work, data governance, and algorithmic management. Digital tools that unions use to organise must respect privacy, democracy, and workers' rights.
International organisations warned of a policy vacuum: AI is outpacing regulation and evidence. Too many classroom tools were built for business needs, trained mostly on data from the Global North, leaving many languages and school realities out of scope. The takeaway: teachers must retain control of pedagogy, or public education will weaken.
Regional commitments that move the needle
Member organisations committed to national campaigns for strong data and AI laws in schools, regional taskforces, shared research, and joint training for activists. A united front will help set common bargaining principles and a shared voice.
Universities are key allies for ethical frameworks. Diversity at the decision table is non-negotiable. Any AI project involving Indigenous communities requires free, prior, and informed consent, protection of intellectual and spiritual property, and clear safeguards.
Unions pledged to inform members, offer training, and equip negotiators with the language needed to address AI at the table.
Five pillars for a human-centred path
- Connect unions through a global network.
- Share independent, timely research.
- Drive advocacy at national and international levels.
- Build union capacity through training and organising.
- Provide thought leadership that keeps teachers and human relationships at the core.
The closing call was simple and firm: "We are the light, the human wisdom that illuminates the path forward. Human First. Always. Teachers Lead. Technology Serves. Education Unites."
What to do next (this term)
- Adopt a school or district AI policy with teacher-led governance. Protect the right to opt in or out and ban high-stakes automated decisions about students or staff.
- Negotiate AI clauses: no increase in workload without time and pay, data minimisation and retention limits, right to disconnect, no surveillance, and no opaque vendor models in evaluation.
- Set up an AI review committee with a teacher majority to vet tools, assess bias, and monitor effects on learning, workload, and wellbeing.
- Audit every tool. Demand data protection agreements, local relevance, accessibility, and clear classroom value. Build student AI literacy grounded in ethics and civic goals.
- Track costs and energy use. Prefer public-interest and open solutions where feasible to avoid vendor lock-in.
- Document harms and escalate through your union. Press regulators for clear rules and real enforcement.
For reference, see UNESCO's guidance on generative AI in education and research. If you're building staff capacity, explore role-based AI training for educators.
Trust can't be automated. With organised educators, professional ethics, and democratic accountability, AI will serve public education-not the other way around.