Executives think AI adoption is high. It isn't.
Leaders believe AI is embedded across the business. The data says otherwise. Multiverse reports 59% of leaders think employees collaborate with AI daily, but only 42% of employees say they actually do.
The gap is wider where it matters most. While 23% of CEOs believe staff are delegating entire tasks to AI, just 8% of employees agree. For data analysis that feeds decisions, 58% of leaders assume adoption but only 36% of workers confirm it, a 22-percentage-point gap.
There's also a significant mismatch on automating repetitive work, an 18-percentage-point gap on optimizing multi-stage processes, and a 15-percentage-point gap on simple admin tasks. Translation: leaders are planning for a future that their teams haven't reached yet.
Why the perception gap exists
Executives see a few strong AI users and assume scale. Without instrumentation, anecdotes become strategy. Meanwhile, employees worry about accuracy, compliance, and job security, so they experiment quietly or not at all.
Tool sprawl and generic training make it worse. Teams don't know what's approved, what "good" looks like, or how quality is measured. So adoption stalls, or stays in the shadows.
AI usage skews senior
AI use rises with seniority: 52% of mid-level workers say they collaborate with AI daily, versus 21% of junior employees. Nearly half of middle managers (48%) use AI day to day, compared with 20% of individual contributors.
"AI is not a monolithic tool, and its application varies wildly between a junior developer, a middle manager, and a CEO. The 30% gap in adoption we see between seniority levels is a clear signal that the one-size-fits-all approach to AI is failing," said Gary Eimerman, chief learning officer at Multiverse. "To bridge this divide, businesses must move beyond generic training and implement custom AI upskilling paths tailored to the unique daily workflows of every individual."
Leaders aren't trained enough to lead
More than half of leaders (55%) report fewer than five hours of formal AI training, and 58% say they're self-teaching with tools like ChatGPT just to cover the basics.
The appetite to fix it is there: 85% of leaders and 78% of employees want more frequent training to keep pace. Yet SAP research indicates only four in ten UK businesses have provided comprehensive AI training, and just 7% have an enterprise-wide AI strategy.
What to do this quarter
- Instrument reality: Track AI usage by team, task, and tool. Separate "assisted" (drafting, summarizing) from "automated" (end-to-end with review) to avoid inflated numbers.
- Create role-based playbooks: Map top 10 tasks per role. Label each task A) Do not use AI, B) AI-assisted with human review, C) AI-automated with defined checks.
- Standards before scale: Define acceptance criteria, audit trails, and human-in-the-loop checkpoints for any automated step. No standard, no automation.
- Applied learning, not lectures: Build 2-4 week sprints where teams solve live workflows and submit artifacts for review. Measure time saved and error rates.
- Appoint AI owners: One accountable lead per function. Their job: curate prompts, templates, SOPs, and monthly improvements.
- Run controlled pilots: Pick 2-3 workflows with high volume and clear success criteria. Publish results and template the process for others.
- Tighten governance: Approved tools list, data classification rules, red-teaming for sensitive use cases, and vendor guardrails in procurement.
- Tie to P&L: Report wins in hours saved, cycle time reduced, and error reduction, not just "adoption." Fund what proves ROI, cut the rest.
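The first item above, instrumenting reality, can be sketched as a minimal event log that keeps "assisted" and "automated" usage separate so the numbers aren't inflated. The event fields, team names, and tools below are hypothetical placeholders, not a standard schema.

```python
from collections import defaultdict

# Hypothetical usage events: team, task, tool, and mode.
# "assisted" = AI drafts, human finishes; "automated" = end-to-end with human review.
events = [
    {"team": "finance", "task": "reporting", "tool": "copilot", "mode": "assisted"},
    {"team": "finance", "task": "reconciliation", "tool": "copilot", "mode": "automated"},
    {"team": "sales", "task": "email drafts", "tool": "chatgpt", "mode": "assisted"},
]

def adoption_by_team(events):
    """Count assisted vs. automated events per team, kept separate
    so drafting help never counts toward 'automated' adoption."""
    counts = defaultdict(lambda: {"assisted": 0, "automated": 0})
    for event in events:
        counts[event["team"]][event["mode"]] += 1
    return dict(counts)

print(adoption_by_team(events))
```

In practice the events would come from tool telemetry or a lightweight self-report form; the point is that the assisted/automated split is recorded at capture time, not guessed later.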
A simple framework: the 3x3 AI adoption grid
Use a quick grid to focus effort and training. Rows = role (Individual Contributor, Manager, Executive). Columns = task type (Admin, Analysis, Creation/Build).
- Individual Contributors: Admin = automate documentation; Analysis = first-pass data summaries; Creation = draft code/content with review.
- Managers: Admin = status updates and meetings prep; Analysis = scenario modeling and report generation; Creation = action plans from team inputs.
- Executives: Admin = briefing packs; Analysis = decision memos with risk/assumption callouts; Creation = board narratives and strategic options.
Assign one owner and one KPI per cell. If a cell has no owner, it won't move.
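The grid above can be sketched as a simple data structure, which makes the "no owner, no movement" rule checkable. The owners, KPIs, and cell assignments below are illustrative assumptions, not part of the source framework.

```python
# A minimal sketch of the 3x3 grid: each (role, task_type) cell gets
# one owner and one KPI. Entries here are hypothetical examples.
ROLES = ["Individual Contributor", "Manager", "Executive"]
TASK_TYPES = ["Admin", "Analysis", "Creation"]

grid = {
    ("Individual Contributor", "Admin"): {"owner": "ops_lead", "kpi": "docs automated per week"},
    ("Manager", "Analysis"): {"owner": "analytics_lead", "kpi": "report cycle time"},
    # ... remaining cells to be filled in by each function
}

def unowned_cells(grid):
    """Return every cell with no assigned owner; per the framework,
    these are the cells that won't move."""
    return [
        (role, task)
        for role in ROLES
        for task in TASK_TYPES
        if not grid.get((role, task), {}).get("owner")
    ]

print(unowned_cells(grid))  # the gap list to review each month
```

A spreadsheet works just as well; the value is forcing an explicit owner and KPI into every one of the nine cells.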
Training that actually sticks
- Job-to-be-done first: Teach prompts and tools in the context of the exact workflow, not generic features.
- Artifacts over attendance: Every session outputs templates, prompts, and SOPs the team can reuse tomorrow.
- Tool-agnostic principles: Focus on task design, data quality, review loops, and risk controls so skills transfer as tools change.
- Cadence: Monthly refreshers and office hours. AI changes weekly-your system should, too.
The risk of overestimating adoption
If you think AI is everywhere but it isn't, you make calls on shaky ground. Strategies overcommit, budgets misfire, and "shadow AI" grows without guardrails.
Close the perception gap with data, standards, and applied learning. Lead by example, but prove it with measured outcomes across real tasks.
Further reading and resources
- Multiverse - the source of the adoption data cited.
- SAP News Center - research and updates on enterprise AI strategy and training.