AI model predicts intimate partner violence up to four years before patients seek treatment

A Mass General Brigham AI model predicted intimate partner violence up to four years before patients disclosed abuse, reaching 88% accuracy. Researchers flagged chest pain, painkiller use, and arm scan frequency as risk indicators.

Published on: Mar 15, 2026

AI Models Predict Intimate Partner Violence Up to Four Years Before Disclosure

Researchers at Mass General Brigham trained machine learning models to identify patients at risk of domestic abuse by analyzing electronic medical records, vital signs, and imaging data. One model achieved 88 percent accuracy in predicting intimate partner violence before patients sought treatment, according to a study published Friday in npj Women's Health.

The models detected patterns that human clinicians typically miss. Chest pain, painkiller use, and increased radiology scans of the arms correlated with higher abuse risk. The AI reviewed years of medical history at a scale radiologists, who typically spend only minutes on each set of scan results, cannot match.

Dr. Bharti Khurana, an emergency radiologist at Mass General Brigham and study author, said the goal is early intervention. "The idea is to share resources sooner rather than later," Khurana said. "This is something I call proactive screening, instead of waiting for them to disclose and then offering services."

Why Detection Matters

The Centers for Disease Control and Prevention estimates that 1 in 3 women and 1 in 6 men experience intimate partner violence in their lifetimes. Yet most victims don't tell medical professionals, citing fear of judgment, concern their partner will find out, or financial and psychological dependence.

Khurana noticed patterns in the scan results of abuse survivors: injuries to the face, neck, and upper body; specific visit types; and the timing of emergency department arrivals. Machine learning made these patterns visible across large patient populations.

How the Models Work

The researchers developed three models using data from nearly 850 women at the Brigham's domestic abuse intervention center (2017-2019 and 2021-2022) and about 5,200 control patients without abuse histories. One model evaluated medications, vital signs, and demographics. Another analyzed clinical and radiology notes. The third combined both approaches and achieved the highest accuracy.
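The paper does not describe the models' internals, but the three-model setup above can be sketched in outline: one classifier on structured data, one on free text, and a third that combines their risk estimates. Everything below is illustrative, not the study's method: the logistic regression models, TF-IDF features, synthetic data, and probability averaging are all stand-in assumptions.

```python
# Illustrative sketch only: the study's actual features and algorithms
# are not public. Logistic regression, TF-IDF, and the tiny synthetic
# dataset here are stand-ins for the three-model design described above.
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression

# Synthetic structured features, e.g. [painkiller use, abnormal vitals, arm-scan count].
structured = np.array([[1, 0, 3], [0, 1, 0], [1, 1, 4], [0, 0, 1]])
notes = [
    "chest pain, repeat arm imaging",
    "routine visit",
    "chest pain and analgesic refill",
    "annual physical",
]
labels = np.array([1, 0, 1, 0])  # 1 = abuse later disclosed (synthetic)

# Model 1: structured data (medications, vital signs, demographics).
clf_struct = LogisticRegression().fit(structured, labels)

# Model 2: clinical and radiology notes.
vec = TfidfVectorizer()
clf_notes = LogisticRegression().fit(vec.fit_transform(notes), labels)

# Model 3: combine both by averaging predicted risk.
def combined_risk(x_struct, note):
    p_struct = clf_struct.predict_proba([x_struct])[0, 1]
    p_notes = clf_notes.predict_proba(vec.transform([note]))[0, 1]
    return (p_struct + p_notes) / 2

risk = combined_risk([1, 0, 3], "chest pain, repeat arm imaging")
print(f"combined risk: {risk:.2f}")
```

Averaging probabilities is one simple way to fuse two models; the study reports only that the combined approach scored highest, not how the combination was done.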

The models were tested on patients from Mass General Hospital, a separate group not used in training.

Implementation and Caution

Researchers plan to develop a decision support tool embedded in electronic medical record systems. Details on real-world implementation and privacy safeguards remain unclear.

Dr. Brigid McCaw, former medical director of the Kaiser Permanente Family Violence Prevention Program, warned against over-reliance on algorithmic predictions. "We need to be very, very cautious about how this information is used for clinicians so that they don't become over-reliant on algorithms without understanding what the data are that drive the algorithms," McCaw said.

McCaw stressed that any screening tool requires rigorous testing and input from abuse survivors themselves. "This is very early, and there's so much excitement about AI," she said. "I'm putting the caution out there that there's a lot that needs to be learned and that we really need the voices of survivors."

Khurana adjusted the models to maximize detection while minimizing false positives. Too many incorrect flags erode clinician trust. "If there are too many false positives, then you lose trust and nobody's using it," she said.
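The trade-off Khurana describes is the standard one of choosing a decision threshold: lowering it catches more true cases but raises false alarms. The study's actual tuning procedure is not described; the sketch below, with synthetic scores, only shows the generic idea.

```python
# Illustrative sketch of threshold tuning: balancing sensitivity
# (catching true cases) against the false-positive rate. Scores and
# labels are synthetic; this is not the study's procedure.
import numpy as np

scores = np.array([0.9, 0.8, 0.65, 0.4, 0.3, 0.2, 0.15, 0.05])  # predicted risk
truth  = np.array([1,   1,   1,    0,   1,   0,   0,    0])     # 1 = true case

def rates(threshold):
    flagged = scores >= threshold
    sensitivity = (flagged & (truth == 1)).sum() / (truth == 1).sum()
    false_pos_rate = (flagged & (truth == 0)).sum() / (truth == 0).sum()
    return sensitivity, false_pos_rate

# A low threshold flags almost everyone; raising it trims false alarms
# at the cost of missing some true cases.
for t in (0.1, 0.5, 0.7):
    sens, fpr = rates(t)
    print(f"threshold={t}: sensitivity={sens:.2f}, false-positive rate={fpr:.2f}")
```

On this toy data, moving the threshold from 0.1 to 0.5 drops the false-positive rate from 0.75 to 0.00 while sensitivity falls from 1.00 to 0.75, the kind of trade-off a deployed screening tool has to make explicitly.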

The team continues training models on data through 2025 and is discussing approaches with international researchers. Khurana aims to expand the work across different regions and demographics. "My hope is to bring more institutions in so that we can learn from different ZIP codes, different areas, not only in the US," she said.

This work is part of a broader effort to apply machine learning in health care, where models help clinicians identify conditions that might otherwise go undetected.
