More in-house legal teams use AI despite trust concerns: ACC survey
A new survey of 650+ in-house legal professionals across 30 countries shows a clear shift: 53% already use generative AI in their day-to-day work. Another 14% are experimenting, and 17% are planning deployments. AI is moving from pilot to practice.
As ACC president and CEO Veta Richardson put it, adoption "has accelerated across in-house legal teams" and "reflects a growing willingness among in-house teams to challenge traditional models and explore new approaches." The center of gravity is moving inside the department.
Where AI is paying off today
Current users say the biggest gains are in drafting (advice, contracts, and policies), followed by legal research and team collaboration. These are high-volume, pattern-heavy tasks where small time savings compound fast.
Looking ahead, 64% expect to lean less on third parties for routine work. Half expect lower spend on external services, hinting at shifts in staffing mixes, panel strategies, and how matters are scoped with outside counsel.
Why some teams still hesitate
Among those not using generative AI, 82% cite low trust in its quality and reliability. Another 55% worry about how their data could be used. For many, the perceived risk still outweighs the upside.
There are operational blockers, too: 45% say AI isn't a workplace priority, 36% lack budget, and 9% face company-wide bans. Governance and resourcing now matter as much as the technology itself.
What in-house leaders can do now
- Set policy: Define permitted use, data handling, and privilege. Update outside counsel guidelines to require disclosure of AI use and safeguards.
- Pick enterprise-grade tools: Look for data segregation (no training on your inputs), audit logs, access controls, and third-party security attestations (e.g., SOC 2, ISO 27001).
- Start narrow pilots: NDA review, clause drafting, issue spotting, summaries of discovery or investigations. Time-box and compare to a control group.
- Build in human review: Require citations, mandate verification of sensitive outputs, and maintain a review checklist to catch hallucinations.
- Train your team: Teach prompt patterns for legal tasks, confidentiality protocols, and how to review AI output like a junior's draft.
- Measure outcomes: Track cycle time, accuracy, rework, and external spend. Use the savings to fund licenses and upskilling.
- Strengthen governance: Name an AI steward, maintain a risk register, and define incident response for any data exposure.
- Revisit sourcing: Shift routine work inside, refocus panels on specialized matters, and align fee structures with the new mix.
Bottom line for legal ops and GCs
AI is becoming a normal operating factor in corporate legal. Early movers are converting routine hours into throughput and reallocating spend. The teams that win will pair clear guardrails with focused pilots, then scale what works.
If you need structured upskilling for legal-specific AI skills, see Complete AI Training: Courses by Job.
For more on the organizations behind the survey, visit Association of Corporate Counsel (ACC) and Everlaw.