Generative AI Streamlines Radiology Reporting and Boosts Efficiency Without Sacrificing Accuracy

Northwestern Medicine’s AI drafts radiology reports from X-rays, improving documentation efficiency by 15.5% on average without losing accuracy. It also flags critical findings faster than traditional notification workflows.

Published on: Jun 11, 2025

AI Boosts Radiology Report Efficiency at Northwestern Medicine

Artificial intelligence (AI) is proving its value in healthcare by improving efficiency in data-intensive tasks. At Northwestern Medicine in Illinois, a generative AI tool has been integrated into the live clinical workflow to draft radiology reports from X-ray images. This integration led to a 15.5% average increase in documentation efficiency without sacrificing diagnostic accuracy.

How the AI Model Works

X-rays are essential for diagnosing and staging diseases. Typically, a patient’s imaging data is uploaded to the hospital’s PACS (picture archiving and communication system) and sent to radiology reporting software. Radiologists review the images and clinical data, then write a report to guide treatment.

To speed up this process, a team led by Mozziyar Etemadi developed a generative AI model trained on historical data from Northwestern’s 12-hospital network. The AI drafts a report from the imaging data, which radiologists then review and edit. This draft appears within seconds after the image is acquired, significantly reducing the time spent starting reports from scratch.

The model is tailored specifically for radiology, proving more accurate than general-purpose tools like ChatGPT and far less costly to operate.

Clinical Impact and Efficiency Gains

The AI model was tested on 23,960 radiographs over five months, covering multiple anatomical sites: not just chest X-rays but also abdomen, pelvis, spine, and extremity studies. AI assistance reduced average documentation time from 189.2 seconds to 159.8 seconds. Some radiologists saw efficiency improvements of up to 40%, translating to more than 63 hours saved and a reduction of roughly 12 radiologist shifts over the study period.
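The headline 15.5% figure follows directly from the two per-report times given above; a minimal arithmetic check:

```python
# Mean documentation time per report, as reported in the study (seconds)
time_without_ai = 189.2
time_with_ai = 159.8

# Relative reduction in documentation time
reduction = (time_without_ai - time_with_ai) / time_without_ai
print(f"{reduction:.1%}")  # → 15.5%
```

The per-report saving of about 29 seconds is small in isolation; it is the volume (nearly 24,000 radiographs) that turns it into dozens of hours of radiologist time.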

To ensure quality remained high, the team compared reports with and without AI assistance. The rate of addenda—used to correct errors—was nearly identical for both groups. Peer reviews by second radiologists also found no differences in clinical accuracy or report quality between AI-assisted and non-assisted reports.

Automated Alerts for Life-Threatening Conditions

The AI system also flags critical findings like pneumothorax (collapsed lung) using an automated prioritization system. This system showed 72.7% sensitivity and 99.9% specificity for detecting unexpected pneumothorax. Alerts were generated within 21 to 45 seconds after study completion, compared to a median of 24.5 minutes for traditional radiologist notifications.
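Sensitivity and specificity are the standard screening metrics behind those figures: sensitivity is the fraction of true positives the system catches, and specificity is the fraction of true negatives it correctly leaves unflagged. A short sketch (the counts below are illustrative only, not the study's data):

```python
def sensitivity(true_pos: int, false_neg: int) -> float:
    """Fraction of actual positive cases the system flags."""
    return true_pos / (true_pos + false_neg)

def specificity(true_neg: int, false_pos: int) -> float:
    """Fraction of actual negative cases correctly left unflagged."""
    return true_neg / (true_neg + false_pos)

# Illustrative counts only -- e.g. 8 of 11 unexpected
# pneumothoraces flagged gives 72.7% sensitivity.
print(f"{sensitivity(8, 3):.1%}")       # → 72.7%

# A 99.9% specificity means roughly 1 false alarm
# per 1,000 negative studies.
print(f"{specificity(9990, 10):.1%}")   # → 99.9%
```

High specificity matters most for an alerting system at this scale: with tens of thousands of studies, even a 1% false-positive rate would bury radiologists in spurious alerts.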

Unlike prior AI tools that simply output a yes/no result, this model generates detailed text reports. This allows for precise, context-aware alerts. For example, the system can distinguish between a new pneumothorax and one that is known and improving, reducing unnecessary alerts.

Future Directions

The research team is working to improve the AI’s accuracy and expand its capabilities to detect subtler and rarer findings. They are also developing versions for other imaging modalities, including CT, MRI, ultrasound, mammography, and PET, as well as for specialties like ophthalmology and dermatology.

This AI tool aims to support radiologists rather than replace them. As co-author Samir Abboud states, “You still need a radiologist as the gold standard. Our role becomes ensuring every interpretation is right for the patient.”

For healthcare professionals interested in integrating AI tools into clinical workflows or expanding their AI knowledge, exploring specialized training can be valuable. Courses focusing on AI in healthcare can be found at Complete AI Training.