Beyond hype and fear: Irmak Atabek's KidsAI builds kid-safe AI in Dubai

Irmak Atabek wants kids to meet AI without the hype or fear: no pretending to be human, clear limits, plain talk. From the UAE, KidsAI's Olii makes learning safe, local, and fun.

Published on: Nov 05, 2025

Irmak Atabek: "We're living in a moment of extreme hype and fear around AI"

Irmak Atabek, co-founder and CEO of KidsAI, has a clear goal: help kids make sense of AI and thrive with it. After being selected for the AI Campus at the Dubai International Financial Centre (DIFC), she moved the company to the UAE and doubled down on building child-first technology and media.

Her stance is simple: AI should never pretend to be human. Kids deserve clarity about what they're talking to, how it works, and where its limits are.

Ethical AI for kids starts with honesty: no pretending to be human

KidsAI builds age-appropriate AI behavior that is transparent, non-deceptive, and cognitively safe. The team avoids anthropomorphizing, sets clear boundaries, and designs for learning and emotional well-being.

Through an Innovation Hub, KidsAI works with experts in AI ethics, developmental psychology, and education to define a healthy relationship between children and machines. Their assistant, Olii, encourages questions, sparks curiosity, and consistently explains what it can and can't do.

The Gulf context: high ambition, real need for accessible tools

Across the Gulf, governments are investing heavily in AI literacy and infrastructure. In the UAE, AI literacy is part of the school curriculum, clear evidence of long-term thinking about education and ethics.

Families are curious and optimistic, yet many feel underprepared to guide their kids. KidsAI steps in with safe, localized, values-consistent learning experiences that help parents and educators keep pace.

Why the UAE matters for building globally relevant child-tech

Operating from the UAE has pushed KidsAI to think global while being precise with local culture and identity. The country is positioning itself as a bridge between East and West: open to collaboration, serious about testing and scale, and thoughtful about regulation.

That mix gives teams room to build AI products for kids that feel inclusive, responsible, and built to last.

Personalization and localization that teach "how AI works"

Hype and fear are loud. KidsAI rejects the "AI is magic" narrative and breaks concepts down into kid logic: What is bias? What is a dataset? How does a machine "learn" from examples?

If a child grasps fairness on a playground, they can grasp fairness in an algorithm. KidsAI personalizes across languages, cultural references, and local symbols, so the AI doesn't just speak a child's language; it reflects their world and uses each interaction to teach how it works.
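To ground that in developer terms, here is a minimal, hypothetical sketch of the idea Olii teaches; it is not KidsAI's code, and the example data is invented. A machine "learns" by counting labeled examples, and a skewed dataset skews its guesses.

    from collections import Counter

    # A "dataset" is just a list of labeled examples.
    # This one is deliberately skewed: "scientist" is almost always labeled "man".
    examples = [
        ("scientist", "man"), ("scientist", "man"), ("scientist", "man"),
        ("scientist", "woman"),
        ("teacher", "woman"), ("teacher", "woman"), ("teacher", "man"),
    ]

    def learn(examples):
        """'Learning' here is only counting how often each label follows each word."""
        counts = {}
        for word, label in examples:
            counts.setdefault(word, Counter())[label] += 1
        return counts

    def predict(model, word):
        """Guess the label seen most often: majority vote, nothing magic."""
        if word not in model:
            return "I don't know"   # honest about its limits, no pretending
        return model[word].most_common(1)[0][0]

    model = learn(examples)
    print(predict(model, "scientist"))   # "man": bias inherited from the skewed data
    print(predict(model, "astronaut"))   # "I don't know": outside what it has seen

Fixing the unfair guess means fixing the examples, which is the same fairness lesson kids already know from the playground.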

Beyond software: "Project Olii" makes AI literacy fun

KidsAI is also building media. "Project Olii" is a fast, funny animated series for ages 6-9 that follows Zoe (9), a curious coder, and Minjun (10), a hands-on builder. Together, they create Olii, a playful AI-powered robot that turns everyday kid life into lessons about automation, hallucination, and algorithmic decision-making.

Zoe's character is intentional: she inspires girls to step into tech by being bold, witty, and effortlessly smart. The series will launch on YouTube to make AI literacy engaging and accessible worldwide.

Why three women founders matter here

Data reflects society. If teams lack diversity, kid-facing AI will too. KidsAI treats development as both a technical and cultural responsibility: clean, inclusive inputs lead to cleaner, more inclusive outcomes.

As women and as mothers, the founders bring lived experience to products that will literally talk to the next generation. Ethics, behavior, and governance aren't afterthoughts; they're part of the design brief.

How schools and partners can work with KidsAI

KidsAI collaborates with schools, educational organizations, and public institutions on AI transformation, digital safety, and practical AI literacy. They offer seminars and workshops for parents, teachers, and students.

An "AI & Children" Certification Program-co-created with experts in education, psychology, and AI ethics-launches in January. From March 2026, KidsAI will pilot the Olii assistant in 20 selected schools and is already seeing strong demand from both public and private institutions.

Practical takeaways for educators, IT leaders, and developers

  • Disclose the system: make it obvious the assistant is an AI, not a person; avoid human-like personas and voices.
  • Design by age band: adapt tone, content, and capabilities to cognitive stages; set clear guardrails and default to safety (a minimal sketch of this and the disclosure item follows the list).
  • Protect data: minimize collection, use clear parental consent flows, and document retention policies.
  • Reduce bias: audit datasets, test for representation gaps, and include child-development experts in red-teaming.
  • Make it explainable: show "why this answer," use simple analogies, and let kids ask "how did you get that?"
  • Localize deeply: support dialects, cultural examples, and symbols; avoid one-size-fits-all content.
  • Plan for safety ops: incident reporting, educator review tools, and defenses against adversarial prompts.
  • Give educators control: dashboards, curriculum links, and activity logs parents can review.
  • Measure outcomes: track learning gains, engagement quality (not just time-on-task), and well-being signals.
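As a concrete starting point for the disclosure and age-band items above, here is a minimal, hypothetical sketch in Python; the age bands, wording, and limits are illustrative assumptions, not KidsAI's or Olii's actual policy.

    from dataclasses import dataclass

    # Hypothetical age-band policy; the bands and limits are illustrative assumptions.
    AGE_BANDS = {
        "6-8":  {"max_reply_words": 60},
        "9-12": {"max_reply_words": 120},
    }

    DISCLOSURE = (
        "I'm a computer program, not a person. I can make mistakes, "
        "and a grown-up can help you check my answers."
    )

    @dataclass
    class Session:
        age_band: str
        disclosed: bool = False

    def guardrail(session: Session, draft_reply: str) -> str:
        """Apply disclosure-first and age-band length limits before a reply reaches a child."""
        policy = AGE_BANDS[session.age_band]
        reply = draft_reply

        # Default to safety: cut replies that exceed the band's length limit.
        words = reply.split()
        if len(words) > policy["max_reply_words"]:
            reply = " ".join(words[: policy["max_reply_words"]]) + " ..."

        # Disclose the system once per session, before the first answer.
        if not session.disclosed:
            session.disclosed = True
            reply = DISCLOSURE + "\n\n" + reply

        return reply

    # Usage: wrap whatever model or service produced the draft answer.
    session = Session(age_band="6-8")
    print(guardrail(session, "A dataset is a big list of examples that a computer learns from."))

The design point is that disclosure and age limits live in one enforced layer in front of the model, so no individual feature or prompt can skip them.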

Resources

If you're a school, district, or public institution interested in pilots or the certification, KidsAI is open to collaboration. This is where ethics, pedagogy, and engineering meet, and kids deserve our best work.

