AI amplifies team strengths and weaknesses, Google's 2025 DORA report finds
AI is everywhere in dev and acts as an amplifier: strong practices compound, weak ones crack. Solid platforms, data, and small, reviewed changes turn speed into reliable value.

AI magnifies your team's strengths - and weaknesses
AI is in your stack whether you planned for it or not. The latest DORA research shows that AI boosts teams that already operate well and exposes the cracks in teams that don't.
Key takeaways
- AI adoption in development is near universal; median usage is about two hours per day.
- AI is an amplifier: it multiplies strong practices and magnifies dysfunction.
- High-quality internal platforms and value stream management are decisive for AI ROI.
What the DORA study covers
Based on survey responses from 5,000+ professionals and 100+ hours of interviews, this 142-page report digs into how AI is changing software delivery at scale. If you want the source, see the DORA research hub at dora.dev.
AI is now built into daily development
Between 90% and 95% of developers use AI for work, up roughly 14% from last year. Most don't use it constantly: 39% "sometimes" use AI, while 60% use it about half the time or more when solving problems. 80% report productivity gains; 59% see better code quality. Trust is split: 70% trust AI output, 30% don't.
Practical read: AI speeds throughput, but first drafts are often off. Teams that review, test, and validate consistently turn speed into shippable software. Those that skip the guardrails ship rework.
AI is an amplifier
AI multiplies whatever system it enters. Clear prompts, tight version control, small changes, and automated tests drive compounding gains. Sloppy prompts, weak reviews, and big-bang changes create faster errors, not faster value.
Treat AI like a new teammate with superhuman speed and zero context. Give it constraints, great inputs, and automatic checks. Without them, you'll just make messes faster.
What DORA measured
- Team performance: collaboration and effectiveness
- Product performance: product quality and outcomes
- Software delivery throughput: speed and efficiency
- Software delivery instability: reliability and quality
- Individual effectiveness: personal impact and flow
- Valuable work: sense that work matters
- Friction: blockers to getting work done
- Burnout: exhaustion and cynicism
Seven team archetypes
- Foundational challenges: survival mode, gaps everywhere
- Legacy bottleneck: constant firefighting, unstable systems
- Constrained by process: stable but slow due to bureaucracy
- High impact, low cadence: strong output, unstable delivery
- Stable and methodical: deliberate pace, consistent quality
- Pragmatic performers: reliable, fast, moderately engaged
- Harmonious high-achievers: sustainable, stable, top performance
Speed vs. stability is a false trade-off. The top performers (pragmatic performers and harmonious high-achievers) ship quickly and with quality. They don't choose between speed and reliability; they build systems that deliver both.
Seven practices that separate high performers
- AI policy: clear rules, approved use cases, and data safeguards
- Data ecosystems: clean, documented, and connected internal data
- Accessible data: safe pipelines from internal sources to AI tools
- Version control: everything in Git, everything reviewed
- Small batches: small PRs, frequent merges, fast feedback
- User focus: decisions anchored to user outcomes, not tech theater
- Quality platforms: shared tooling, paved paths, and automation that remove toil
AI success is a systems problem, not a tools problem. Invest in platforms, data, and engineering discipline, and AI will accelerate what already works.
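The small-batches practice can be enforced mechanically. Below is a minimal sketch of a pre-merge guardrail that parses the output of `git diff --numstat` and flags oversized changes; the 400-line threshold and function names are illustrative assumptions, not from the report.

```python
# Illustrative small-batch guardrail: parse `git diff --numstat` output
# and fail when a change exceeds a line-count threshold.
MAX_CHANGED_LINES = 400  # assumption: tune the limit per team


def changed_lines(numstat: str) -> int:
    """Sum added + deleted lines from git's --numstat output."""
    total = 0
    for line in numstat.splitlines():
        if not line.strip():
            continue
        added, deleted, _path = line.split("\t", 2)
        if added != "-":  # binary files report "-" for both counts
            total += int(added) + int(deleted)
    return total


def is_small_batch(numstat: str, limit: int = MAX_CHANGED_LINES) -> bool:
    """True if the change fits under the small-batch limit."""
    return changed_lines(numstat) <= limit
```

In CI, you would feed this the output of `git diff --numstat origin/main...HEAD` and block merges that exceed the limit, keeping AI-assisted changes small enough to review well.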
Two multipliers for AI ROI
Platform engineering: About 90% of orgs report adopting it. Strong internal platforms reduce friction and make AI useful by default. Weak platforms stall AI benefits because developers spend their time wrestling with the environment.
Value stream management (VSM): Map work from idea to production and measure flow. Strong VSM tells you where AI should be applied (reviews, testing, release steps), turning isolated gains into organization-wide improvement.
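A minimal VSM view can start as simple timing of each handoff per work item. The sketch below assumes hypothetical stage names and per-item timestamps (in hours) to show where time accumulates, which is where AI or automation would pay off most.

```python
# Hypothetical value-stream sketch: stage names and timestamps are
# illustrative assumptions, not from the DORA report.
STAGES = ["idea", "in_review", "in_test", "released"]


def stage_durations(timestamps: dict) -> dict:
    """Hours spent between consecutive stages for one work item."""
    return {
        f"{a}->{b}": timestamps[b] - timestamps[a]
        for a, b in zip(STAGES, STAGES[1:])
    }


def bottleneck(timestamps: dict) -> str:
    """The stage transition where the most time accumulates."""
    durations = stage_durations(timestamps)
    return max(durations, key=durations.get)
```

Aggregating `bottleneck` across many work items tells you whether to point AI at reviews, testing, or release steps first.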
90-day plan for IT, engineering, and product
- Days 0-30: Publish a lightweight AI policy; pick three high-value, low-risk AI use cases; baseline lead time, deployment frequency, change failure rate, and MTTR.
- Days 31-60: Strengthen your platform's golden paths; enforce small PRs and code reviews; add automated tests and guardrails for AI-assisted changes.
- Days 61-90: Connect AI to vetted internal knowledge; stand up a simple VSM view of your pipeline; create feedback loops with teams and track outcome metrics.
What this means for your org
AI is mainstream. Tools matter less than the system you drop them into. If your platform, data, and delivery practices are strong, AI compounds your advantage. If not, fix the foundation first.
Want the research source? Start here: DORA research. If your teams need structured upskilling on AI use in coding and product workflows, see Complete AI Training: courses by job.
Which archetype fits your team today, and which one do you want to be in six months? Answer that, then use the seven practices to close the gap.