Better vendor partnerships can stop clinicians from using consumer AI tools at work, says Swiss Medical Network CIO

Clinicians are using ChatGPT and other consumer apps for clinical work when hospitals don't provide better options, creating serious data and compliance risks. The fix isn't bans; it's building secure AI tools staff actually want to use.

Categorized in: AI News Healthcare
Published on: Mar 26, 2026

Health Systems Should Partner With Vendors on Secure AI Tools to Stop Clinicians Using Consumer Apps

Health systems need to work with vendors to develop secure, user-friendly AI tools designed for clinical work. Without them, clinicians will turn to consumer AI applications, creating data security and compliance risks.

Patrick Bizeau, CIO of Swiss Medical Network, made the case at HIMSS26 in March. The problem is straightforward: if clinicians lack approved AI tools that are easy to use, they'll use whatever's available, including ChatGPT and other public platforms.

The Shadow AI Problem

Consumer AI tools weren't built for healthcare. They don't meet regulatory requirements, they don't protect patient data, and they create liability for organizations. When clinicians adopt them without IT oversight, health systems lose visibility and control.

Bizeau's argument cuts through the usual vendor pitch. The solution isn't to ban tools or lecture staff. It's to make official AI tools so practical and secure that clinicians prefer them.

What This Means for Healthcare IT

Health systems should prioritize healthcare AI solutions that clinicians actually want to use. That means speed, accuracy, and integration with existing workflows, not just compliance checkboxes.

Understanding generative AI and LLM capabilities helps IT leaders and clinical teams collaborate on the right specifications. Clinicians need tools that solve real problems in their daily work.

The alternative, trying to prevent clinicians from using consumer AI through policy alone, doesn't work. People find workarounds when official tools don't meet their needs.

Health systems that build or procure purpose-built clinical AI tools first will reduce shadow IT adoption and keep patient data secure.

