Sutter Health embeds AI decision support in Epic to put up-to-date evidence at the point of care

Sutter Health is weaving OpenEvidence's AI into Epic so doctors get guideline-backed answers with plain-language search at the bedside. Faster decisions, fewer clicks, safer care.

Published on: Feb 17, 2026

Sutter Health brings AI decision support into Epic EHR

Sutter Health is embedding an artificial intelligence-powered evidence platform from OpenEvidence directly into its Epic workflows. Physicians will be able to use natural language to find current guidelines, studies and summaries at the point of care, with quality and safety standards built in.

The goal is straightforward: reduce time to trusted answers without pulling clinicians out of their flow. The platform surfaces the latest clinical evidence where decisions happen.

Why it matters

"We share a vision for reimagining healthcare for the better," said Laura Wilt, Sutter Health's chief digital officer. "It's how we're transforming the way we serve patients, support care teams and improve outcomes."

Dr. Ashley Beecy, Sutter Health's chief AI officer, added, "Patients benefit when providers have the most current and relevant evidence incorporated into clinical decision-making."

How clinicians will use it

  • Ask questions in plain language and quickly surface guideline-backed answers.
  • Stay inside Epic; no copy-paste across systems or separate logins.
  • Pull in recent studies and care pathways while maintaining safety and quality guardrails.
  • Use it at the bedside or in clinic to support decisions without slowing visits.

The larger trend

Sutter began applying generative AI two years ago to cut burnout and improve sustainability. At the time, Dr. Albert Chan, chief health information officer, said the platform helped providers "recharge." OpenEvidence leaders say the collaboration focuses on healthcare sustainability and medical AI safety.

Recent research points to a blended model for clinical decision support. Mass General Brigham compared GPT-4 and Google's Gemini 1.5 with its long-used diagnostic system, DXplain. While DXplain was more accurate on case diagnosis, researchers found value in pairing the systems, noting: "A hybrid approach that combines the parsing and expository linguistic capabilities of LLMs with the deterministic and explanatory capabilities of traditional DDSSs may produce synergistic benefits." For more on training and methods used in AI health research, see AI Research Courses.

What to watch

Success will hinge on governance, trust and measurable outcomes. Health leaders will want to see faster information retrieval, fewer unnecessary tests, and consistent alignment with current guidelines, all without adding clicks.

  • Governance: set clinical review, safety, and update processes.
  • Use cases: start with high-impact queries (diagnostic differentials, drug info, care pathways).
  • Training: brief, workflow-first education for clinicians and support staff; consider AI for Management for governance and safety-focused training.
  • Monitoring: track utilization, accuracy feedback and downstream outcomes.
  • Equity and safety: watch for gaps in evidence coverage and address bias.

On the record

"Digital innovation plays a central role in our work to build a more connected, proactive and sustainable healthcare system," said Wilt. OpenEvidence's clinical leadership also emphasized advancing healthcare sustainability and medical AI safety through the partnership.

Upskilling your team

If you're planning clinician education on AI and clinical decision support, you can browse structured AI courses by job role here: Complete AI Training - Courses by Job.

