Human first, then AI: Lessons from WISE 12 on responsible classroom technology
At WISE 12 in Doha, a workshop on "Responsible AI in Education: The Evidence we need to build, scale and trust it" made one thing clear: put human emotions and relationships at the center, then build the tech around them. The summit's theme - "Humanity.io: Human Values at the Heart of Education" - matched the tone in the room. AI can speed up learning, but it should serve teachers and students, not replace the bond that moves learning forward.
Across the discussion, experts agreed: AI works best as a knowledge accelerator and a partnership builder. It can personalize content, free teachers from repetitive tasks, and make measurement more consistent. But without a human-first frame, it risks dulling curiosity and outsourcing thinking.
Define "learning" before you deploy AI
Jawad Ashgar, EdTech and AI lead at the Gates Foundation, called out the scale of the challenge: too many students are behind, and progress is slow. "We want to make sure that these technologies are contributing towards learning, and we also need to have very clear definitions about what contributes to improvement in learning."
The funding is already moving. Investment in AI for education rose from about $2.5bn in 2022 to $7bn in 2025, with estimates of $30-50bn by the end of the decade. If schools don't anchor purchases to evidence and clear outcomes, money will run faster than learning.
Assistive tech can help - but don't outsource thinking
Modupe Olateju, Founder of The Education Partnership Centre and fellow at the Brookings Institution's Center for Universal Education, highlighted AI's role in assistive technologies for learners with learning difficulties. The caution was just as strong: "Technology is not the problem, it's usually how humanity chooses to use the technology that is the problem."
Her concern: simple thinking is being handed to machines by default. "There's this assumption that AI will give you a sharper answer... which isn't necessarily the truth because we must all remain human." Build AI supports, yes - while protecting core cognitive work.
Coaching at scale: feedback loops, not shortcuts
Haroon Yasin, CEO and Co-founder of Taleemabad, shared an experiment grounded in real classrooms. "We began an experiment and had a data set of over 20,000 observations. We used this data potentially to train AI to listen into classrooms and give feedback to the teacher."
The takeaway for schools: AI can strengthen coaching - if it's tied to an instructional model, observation rubrics, and supportive feedback. No surveillance theater. Just clearer signals to help teachers improve.
Craft matters: better teaching, better coaching
Merlia Shaukath, Founder and CEO of the Madhi Foundation, kept the focus on practice: "Our primary focus right now is how we want our teachers to teach better and how to get our teacher coaches to coach better." Tools should reduce friction so teachers can spend more time teaching, not clicking.
Personalized learning in familiar channels
AbdulHamid Haidar, Founder and President of Darsel, explained how the team delivers curriculum-aligned, personalized learning through WhatsApp in partnership with schools. Meeting students where they already spend time can improve engagement and lower barriers to access, especially in bandwidth-constrained contexts.
What education leaders can do next
- Define success upfront: Pick 2-3 learning outcomes (e.g., reading fluency, problem-solving) and the measures you trust. Tie every AI pilot to those metrics.
- Start small: Run 8-12 week pilots in a few classrooms. Compare against a matched control. Keep sample sizes honest and the documentation simple (a minimal analysis sketch follows this list).
- Protect teacher time: Use AI to cut planning, grading, and admin. Set a target (e.g., reclaim 3-5 hours/week) and verify it with time audits.
- Strengthen coaching: Combine short, structured observations with AI-generated prompts or exemplars. Feedback must be specific, kind, and actionable.
- Guardrails first: Establish data privacy, consent, and transparency practices before rollout. See UNESCO's guidance on AI in education for policy baselines.
- Avoid cognitive offloading: Build "think time" into lessons. If AI drafts an answer, require students to edit, explain, or critique it.
- Support inclusion: Prioritize assistive features (speech, dyslexia-friendly outputs, translation) and fund devices/connectivity where needed.
- Ask vendors for evidence: Require clear claims, independent evaluations, interoperability, and a plan for educator training and support.
- Measure, then scale: Track learning gains, teacher time saved, and student engagement. Kill what doesn't work. Scale what does.
- Communicate with families: Explain where AI is used, why, and how student data is protected. Keep language plain and specific.
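To make "keep sample sizes honest" concrete, here is a minimal sketch in Python of how a pilot team might compare learning gains against a matched control. Everything in it is illustrative: the scores, group sizes, and thresholds are hypothetical, not data from WISE or any program named above.

```python
# Minimal pilot-vs-control comparison. All numbers are hypothetical.
from statistics import mean, stdev

# Per-student gains (e.g., words correct per minute, post minus pre).
pilot_gains = [8, 12, 5, 14, 9, 7, 11, 6, 10, 13]  # AI-supported classrooms
control_gains = [6, 9, 4, 8, 7, 5, 10, 6, 7, 8]    # matched control classrooms

def cohens_d(a: list[float], b: list[float]) -> float:
    """Standardized mean difference between two independent groups."""
    na, nb = len(a), len(b)
    # Pooled standard deviation for two groups of possibly unequal size.
    pooled_sd = (((na - 1) * stdev(a) ** 2 + (nb - 1) * stdev(b) ** 2)
                 / (na + nb - 2)) ** 0.5
    return (mean(a) - mean(b)) / pooled_sd

print(f"Mean gain difference: {mean(pilot_gains) - mean(control_gains):.1f}")
print(f"Effect size (Cohen's d): {cohens_d(pilot_gains, control_gains):.2f}")

# Small groups make results fragile: treat them as a reason to keep
# piloting, not a mandate to scale.
if min(len(pilot_gains), len(control_gains)) < 30:
    print("Warning: small samples - extend the pilot before scaling.")
```

A real evaluation would add more than this (pre-registration, attrition checks, significance testing), but even a back-of-the-envelope effect size keeps vendor claims and pilot write-ups grounded in the same numbers.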
Human-first checklist for classrooms
- Relationships before recommendations: Does the tool strengthen teacher-student connection?
- Clarity over complexity: Can teachers use it without a manual and see impact within two weeks?
- Feedback loops: Does it generate useful, bite-sized feedback for students and teachers?
- Evidence on learning: Is there credible data on outcomes, not just usage?
Context and resources
This conversation took place at WISE, a Qatar Foundation initiative focused on practical solutions for education systems. To explore the summit and its ongoing work, visit WISE.
If you're planning professional development around AI literacy for staff, you can browse role-based options here: AI courses by job.
Bottom line
AI can speed up planning, personalize practice, and sharpen feedback. But the heart of learning is still human: relationships, curiosity, judgment. Build the tech around that - and make the evidence prove it.