VA puts AI to work: sharper care, lighter workloads
As chief technology officer and chief artificial intelligence officer at the Department of Veterans Affairs, Charles Worthington is clear about where this is going: "I have never been more excited about the potential for a technology to improve the way VA works for veterans."
From clinical settings to back-office tasks, the VA is scaling practical AI. "We have over 227 use cases in our inventory. Over 100 of them are in an operational phase, and the number of ways we're using AI is growing even more," he says. Mary Russell, senior director of clinical services at CliniComp, expects momentum to build: "As these tools get more and more ingrained in the health care systems, we'll see more and more truly wonderful accomplishments."
Early wins in care delivery
VA clinicians are using AI for screening and risk identification. One example: a device that flags potentially concerning polyps during colonoscopies, improving detection rates by 20 percent. The pattern is consistent: AI surfaces what's easy to miss, while clinicians stay in control.
Operational efficiency at scale
On the administrative side, a secure, VA-wide generative AI assistant is already in use. Over 100,000 employees have tried it; 80 percent report higher job efficiency. Surveys show an average of two hours saved per week through document summarization, research assistance, and help with routine writing, freeing teams to focus on work that directly benefits veterans.
Training responders without the strain
Through the VA's Mission Daybreak initiative, ReflexAI built simulations that help Veterans Crisis Line responders practice complex crisis scenarios. Trainees can rehearse difficult conversations in a safe, repeatable environment with instant feedback, strengthening communication, empathy, and intervention skills without the heavy lift of traditional role-plays.
Expanding potential: safety, access, and proactive care
AI can spot weak signals at scale. Dr. Ari S. Lakritz at OSF HealthCare notes that systems can flag statements implying self-harm in social media or free-text surveys, surfacing risk faster and more consistently for clinical review.
Access is another pressure point. U.S. Army veteran Marcus Forman points out how complex and stressful benefits applications can be, especially for older veterans with longer service histories. He sees AI helping evaluate and process claims to speed up connections to care.
Russell highlights a challenge, and an opportunity, unique to the VA. Veterans have medical records plus detailed service records (deployments, locations, addresses). AI could map that history to likely exposures and risks, then prompt early interventions before issues escalate.
What's next: documentation and front-door access
The VA is piloting "Ambient Scribe" to listen during clinician-patient conversations and draft medical notes. The goal: reduce time spent transcribing, increase accuracy, and let clinicians stay present with patients.
On the access side, the team is testing AI to improve contact center interactions and website virtual agents. Faster answers, clearer guidance, fewer handoffs.
Why this matters for healthcare and operations leaders
Real outcomes are showing up: earlier identification of serious health risks, higher cancer detection, support for suicide prevention efforts, and smoother workflows. Clinicians spend more time at the top of their expertise; staff spend less time on busywork.
Veterans notice. Less paperwork. Faster service. More attention where it counts.
How to put this into practice
- Define the job to be done. Start with use cases tied to clinical or operational outcomes (e.g., reduce note time, improve detection rates, shorten call handle time).
- Stand up a governed sandbox. Pilot with clear metrics (time saved per FTE, detection deltas, error rates), then scale what works.
- Equip the workforce. Provide a secure genAI assistant, simple do/don't guidelines, prompt libraries, and a feedback loop for continuous improvement.
- Get data ready. Integrate clinical and service records with strict privacy controls, audit trails, and role-based access. Treat PHI security as non-negotiable.
- Bake in safety. Use bias and performance checks, especially for risk classification and triage. Keep a human in the loop for clinical decisions.
- Procure for outcomes. Require transparency, auditability, and clear service-level targets from vendors.
- Measure what veterans feel. Track turnaround times, satisfaction, and resolution rates, not just internal efficiency.
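The "pilot with clear metrics, then scale what works" step above can be sketched in code. This is a minimal illustration, not a VA system: the metric names, baseline values, and the two-of-three decision rule are all hypothetical assumptions chosen for the example.

```python
# Minimal sketch of governed-sandbox pilot evaluation.
# All metric names, values, and thresholds below are hypothetical.
from dataclasses import dataclass

@dataclass
class PilotMetric:
    name: str
    baseline: float
    pilot: float
    higher_is_better: bool = True

    def delta_pct(self) -> float:
        """Percent change from baseline to pilot."""
        return (self.pilot - self.baseline) / self.baseline * 100

    def improved(self) -> bool:
        d = self.delta_pct()
        return d > 0 if self.higher_is_better else d < 0

def scale_decision(metrics: list[PilotMetric], min_improved: int = 2) -> str:
    """Scale the use case only if enough metrics moved the right way."""
    improved = sum(m.improved() for m in metrics)
    return "scale" if improved >= min_improved else "iterate"

metrics = [
    PilotMetric("note_minutes_per_visit", baseline=16.0, pilot=9.0,
                higher_is_better=False),          # less documentation time
    PilotMetric("detection_rate", baseline=0.40, pilot=0.48),  # better screening
    PilotMetric("call_handle_seconds", baseline=420.0, pilot=455.0,
                higher_is_better=False),          # got slightly worse
]
print(scale_decision(metrics))  # two of three improved -> "scale"
```

The point of the sketch is the discipline, not the arithmetic: define each metric's baseline and direction before the pilot starts, so the scale-or-iterate call is mechanical rather than anecdotal.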
For risk management and governance, the NIST AI Risk Management Framework is a solid reference for policy and controls.
Bottom line
AI is already improving care and easing workloads across the VA. The playbook is simple: start with clear problems, protect data, keep people in control, and scale the use cases that show measurable value.