HIMSS26 Preview: Dr. Guido Giunti on AI, digital literacy, and change that sticks
AI won't fix healthcare by itself. People will. That's the core of what Dr. Guido Giunti plans to cover at HIMSS26 - helping clinicians and administrators ask sharper questions, reduce risk, and lead change from any seat in the hospital.
Dr. Giunti is chief data officer at St. James' Hospital and an adjunct professor at Trinity College Dublin and the University of Oulu. His message is simple: know how AI works, know how it fails, and you'll make better calls with less stress on staff.
Why digital literacy comes first
You wouldn't hand a scalpel to someone who doesn't know how sharp it is. AI is no different. As Dr. Giunti puts it, "AI is not magic; it's math and data with a really good PR team."
Through work with the European Union's SUSA Consortium, he frames digital literacy as a core clinical competence - a safety, quality, and sustainability issue. This isn't about turning staff into programmers. It's about building informed skeptics who can question outputs, spot failure modes, and push back when a system drifts.
Skip the training and the tech becomes another burden. Get the basics right and AI can actually lighten the load.
How he approaches transformation
Dr. Giunti likes the tango analogy: coordination, trust, and a few missteps at the start. He co-creates with the people who will use the tool - including the nurse on a 2 a.m. shift. Too many pilots fail because that voice wasn't in the room.
His toolkit blends design thinking, organizational awareness, and strategic foresight. And yes, a little discomfort is healthy. If everyone feels safe, you're probably just optimizing the status quo while real disruption happens outside the room.
What he wants attendees to leave with
AI isn't a shiny object to "implement." It's a prompt to rethink how care is delivered and how teams work. The goal: ask better questions and make smaller, smarter moves that add up.
Here are the kinds of questions he wants more teams to ask:
- Who is this for, and what problem does it actually solve?
- What are the known failure modes, and what's our fallback when it fails?
- Which data trains it, and who checks for bias and drift?
- Does it cut clicks, steps, or cognitive load - or just make dashboards prettier?
- What's the audit trail, and who is accountable for decisions?
- Does it help patients, or are we just digitizing bureaucracy?
You don't need a C-level title to influence change. Sometimes it's a new question, a new ally, or finally retiring that 1997 tool that crashes when you sneeze near it.
Practical steps you can start this quarter
- Require a one-page model card for every AI tool: purpose, data sources, limits, failure modes, monitoring plan.
- Run "AI failure drills" on high-risk workflows and define clear rollback protocols.
- Offer 30-minute micro-sessions for staff: how to read outputs, spot bias, and escalate issues.
- Co-design with frontline staff (include night and weekend shifts) before any pilot.
- Pilot in one workflow with 2-3 concrete metrics (time saved, clicks reduced, error rate) and a stop/go rule.
- Stand up basic data governance: access control, quality checks, bias reviews, and audit logs.
- Name a clinical AI steward per unit to collect feedback and own issue triage.
Session details
Dr. Guido Giunti will speak at HIMSS26 on Tuesday, March 12, from 8:30-9:30 a.m. in Lido 5, Level 5, at the Venetian in Las Vegas. Expect candid insights, practical takeaways, and a focus on literacy-first digital transformation.
Want structured upskilling for your team?
If you're building AI literacy across roles, explore curated programs by job category at Complete AI Training. You can also browse the latest AI courses to plug into existing staff education plans: Latest AI Courses.