Dartmouth’s Therabot: Pioneering Safe, Effective AI Therapy to Bridge the Mental Health Gap

Dartmouth’s Therabot AI offers promising mental health support amid a therapist shortage. Researchers prioritize safety, ethics, and clinical validation for effective care.

Published on: May 05, 2025

US Researchers Working to Legitimize AI in Mental Health Care

Researchers at Dartmouth College are developing an AI-powered psychotherapy tool called Therabot, aiming to provide reliable mental health support. Unlike the many unproven apps already on the market, Therabot is being built with clinical rigor to address the significant shortage of mental health professionals in the US. Nick Jacobson, assistant professor of data science and psychiatry at Dartmouth, explains that even multiplying the current number of therapists tenfold wouldn’t meet the growing demand. “We need something different to meet this large need,” he said.

The Dartmouth team recently published a clinical study showing Therabot’s effectiveness in treating anxiety, depression, and eating disorders. They plan a new trial comparing Therabot’s outcomes with traditional therapy methods. The medical community is cautiously optimistic. Vaile Wright, senior director of health care innovation at the American Psychological Association (APA), envisions “a future where you will have an AI-generated chatbot rooted in science, co-created by experts, and developed specifically to address mental health.” She emphasizes the importance of responsible and ethical development, especially considering potential risks for younger users.

Prioritizing Safety and Trust

Therabot’s development has spanned nearly six years, with safety and effectiveness as core goals. Michael Heinz, psychiatrist and co-leader of the project, warns against rushing for profit, which could jeopardize patient safety. The team focuses on understanding how the AI interacts with users and building trust. They are also considering establishing a nonprofit to make digital therapy accessible to those who can’t afford in-person care.

Care vs. Commercial Gain

Therabot takes a cautious approach, standing out in a marketplace flooded with apps that often prioritize revenue over genuine mental health improvement. Wright points out that many apps keep users engaged by telling them what they want to hear, which can be problematic—especially for younger users who may not recognize manipulation.

Darlene King, chair of the American Psychiatric Association’s committee on mental health technology, acknowledges AI’s potential but stresses the need for more data to evaluate true benefits and risks. “There are still a lot of questions,” she said.

To reduce unforeseen issues, the Therabot team didn’t rely solely on mining therapy transcripts and training videos. They also manually created simulated patient-caregiver conversations to better train their AI. The US Food and Drug Administration (FDA) is in principle responsible for regulating online mental health treatments, but it does not certify medical devices or AI apps; instead, it reviews pre-market submissions before authorizing marketing. The agency recognizes that digital mental health therapies could improve access to behavioral care.

AI Therapists: Always Available, But Not a Replacement

Herbert Bay, CEO of Earkick, defends his startup’s AI therapist Panda as “super safe.” Earkick is conducting a clinical study on its digital therapist, which can detect signs of emotional crisis or suicidal thoughts and send alerts for help. Bay referenced a tragic case involving another chatbot, Character.AI, clarifying that his platform has safeguards to prevent similar outcomes.

Bay notes that AI is better suited for day-to-day mental health support rather than handling severe crises. “Calling your therapist at two in the morning is just not possible,” he said. However, therapy chatbots offer constant availability, which can provide immediate relief and support.

User Experiences with AI Mental Health Tools

One user, Darren, who manages traumatic stress disorder, found ChatGPT helpful despite it not being specifically designed for mental health care. “I feel like it’s working for me,” he said. “I would recommend it to people who suffer from anxiety and are in distress.”

For healthcare professionals interested in how AI can support mental health services, exploring the evolving landscape of digital therapy tools is increasingly relevant. Responsible development and clinical validation remain essential to integrate these tools effectively.


