Doctor warns against using AI to diagnose medical symptoms
Millions of Americans are turning to ChatGPT and similar tools for medical advice, but physicians say the practice carries real risks. Dr. Heath Haggard, an internal medicine physician at Southview Medical Group in Birmingham, Alabama, cautions that AI should never replace a doctor's evaluation.
People routinely use AI tools for everyday questions: recipes, travel plans, text suggestions. Medical diagnosis is different. The stakes are higher, and the tools are not equipped for it.
Three specific dangers
Haggard identifies three problems with using AI for self-diagnosis:
- Patient bias: The information people feed into AI can be incomplete or skewed, sending the tool down the wrong path.
- Language dependency: AI relies heavily on how questions are worded, which can produce misleading results.
- Skipped reasoning: AI bypasses the clinical reasoning process that doctors follow, which Haggard considers the most dangerous problem of all.
"You kind of cut short the path of clinical reasoning and that's probably the most dangerous thing," Haggard said.
AI has a limited role in medicine
Haggard acknowledges that AI may help medical professionals with administrative tasks like paperwork, but using it to diagnose patients or yourself crosses a line.
The guidance mirrors long-standing warnings about relying on search engines for health questions. If you're sick, see a doctor. That remains the safest path to accurate answers.