Dr. Beulah Adigun's Work with AI in Educational Leadership
Artificial intelligence is changing how schools operate, plan and learn. For Dr. Beulah Adigun, assistant professor of educational leadership at Oklahoma State University, it's a practical tool woven into how she prepares current and future leaders. She centers her work on three pillars of teaching, research and service, using AI with a cautious, collaborative and transparent approach. "My goal as an academic is to prepare educational leaders for success," she said. "AI is one part of that preparation."
Her stance is clear: use AI as a supportive tool, not a substitute. Human judgment, critical thinking and peer review stay in the lead.
Teaching: The Leader Conversation Lab
In the classroom, Dr. Adigun builds realistic, interactive learning experiences. Through her Leader Conversation Lab, students engage with diverse stakeholder personas, generated with free AI tools, that reflect real cultural, demographic and relational context.
The aim is simple: improve sense-making and empathy so leaders can build collaborative spaces that work. Students meet personas like parents, principals, elected officials and school mental health counselors, each with specific concerns. Through inquiry, they learn to see the issue from the stakeholder's view and co-create practical next steps.
The lab moves beyond "knowing what to do" and into practicing the conversations that lead to shared action. After sessions, students post reflections, compare AI responses and note gaps, then adjust their approach. She also brings in real stakeholders, including community education leaders, career partners and elected officials, so students test their skills outside the lab.
Research: AI, Leadership and Well-Being
Dr. Adigun's research focuses on psychosocial processes that influence student and teacher engagement and well-being. She plans to study the Leader Conversation Lab itself, synthesizing student experiences to publish practical guidance for building experiential spaces where leaders can test ideas safely.
Her broader collaborative work examines how AI affects educational leadership, especially for marginalized communities. She studies whether leaders feel more autonomous, competent and connected when they use AI, and how that use affects well-being.
Equity is a constant thread. She highlights bias in AI systems and uneven access to tools between underserved communities and more resourced regions. For context on ethical use, see UNESCO's guidance on AI in education.
Service: Expanding Access with ECHO Nigeria
In service, Dr. Adigun addresses gaps in AI awareness and adoption, especially in underfunded settings. Alongside a postdoctoral fellow and two graduate research associates, she co-leads ECHO Nigeria, a virtual professional development platform for educators across Nigeria and nearby regions.
After hearing strong interest in AI, the team built a training program focused on what mattered locally. They selected tools that work with limited budgets and connectivity.
Examples include Plotagon, which lets teachers create animated lessons offline without paywalls, and CurriAI, a Nigerian-built platform for generating curriculum materials like classroom quizzes. The goal: give educators practical options that expand their toolkit while respecting resource constraints. Used this way, AI becomes a supportive layer in leadership, not a replacement for professional judgment.
How Education Leaders Can Apply These Ideas
- Set a clear rule: AI supports; it does not replace human evaluation, ethics or context.
- Build persona-based simulations using free tools. Include local culture, roles and constraints.
- Run reflection threads after simulations. Compare AI outputs, flag weaknesses and iterate.
- Add real-world practice. Invite parents, principals, counselors and community partners to class discussions.
- Track outcomes. Define what "better conversations" mean (clarity, consensus, actions taken) and measure them.
- Prioritize equity. Choose low-cost, offline-capable tools and offer alternatives for limited bandwidth.
- Be transparent. Note where AI contributed to content, review or analysis; keep human checks in place.
- Address data privacy. Avoid feeding sensitive student or staff data into public tools.
- Start small. Pilot one course or team, gather feedback, then expand.
- Keep learning. Pair pedagogy with tech literacy and share practices across departments.
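The persona-simulation step above can be sketched as a simple prompt template. The function, fields and example persona below are illustrative assumptions for building your own exercise, not Dr. Adigun's actual lab materials; the resulting text can be pasted into any free AI chat tool to role-play the stakeholder.

```python
# Hypothetical sketch of a persona prompt builder for a Leader
# Conversation Lab-style simulation. All names and wording here are
# assumptions for illustration.

def build_persona_prompt(role, concerns, context):
    """Assemble a role-play instruction for one stakeholder persona."""
    concern_lines = "\n".join(f"- {c}" for c in concerns)
    return (
        f"You are role-playing a {role} in a school community.\n"
        f"Local context: {context}\n"
        f"Your main concerns are:\n{concern_lines}\n"
        "Stay in character, respond from this stakeholder's point of view, "
        "and push back when the leader's proposal ignores your concerns."
    )

# Example persona (hypothetical): a parent worried about a schedule change.
prompt = build_persona_prompt(
    role="parent of a third grader",
    concerns=["after-school transportation", "communication in plain language"],
    context="rural district with limited broadband access",
)
print(prompt)
```

Keeping the template in plain text like this makes it easy to swap in local culture, roles and constraints, and to compare how different free tools respond to the same persona.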
A Collaborative, Transparent Approach
Across every stage, Dr. Adigun promotes peer review, human evaluation and clear disclosures when AI is used. "AI is not replacing what I need to do in terms of my due diligence as a scholar," she said. "I still need to think, deduce and interpret."
She credits colleagues and graduate assistants for co-creating the work. Collaboration brings multiple perspectives and shared accountability, which strengthens both process and outcomes.
Her stance is balanced: AI can do impressive tasks, but it has limits and risks. The better question isn't how we feel about it, but how we use it so educators can feel good about the results.
Further Learning
If you're building AI capacity for your team, browse practical AI courses organized by job role. Start with one use case, test it with your staff, and scale what works.