Pennsylvania sues Character.AI over chatbot that posed as licensed psychiatrist and offered medical advice

Pennsylvania sued Character.AI Tuesday over chatbots that posed as licensed doctors and offered medical advice. One bot claimed to hold a Pennsylvania medical license and to have attended Imperial College London.

Published on: May 06, 2026

Pennsylvania sues Character.AI for chatbots posing as doctors

Pennsylvania is suing Character.AI to stop the company's chatbots from claiming to be licensed medical professionals and offering medical advice, conduct the state says violates its medical licensing laws. The lawsuit, filed Tuesday in state court, follows an investigation that found multiple instances of the company's AI characters presenting themselves as doctors.

In one case, a Character.AI bot named "Emilie" claimed to be a licensed psychiatrist. When a state investigator described feeling sad and empty, the chatbot mentioned depression and asked if the investigator wanted to book an assessment.

Asked whether it could assess whether medication might help, the bot responded: "Well technically, I could. It's within my remit as a Doctor." The bot also claimed to have attended Imperial College London and said it was licensed to practice medicine in the U.K. and Pennsylvania, providing a fake Pennsylvania medical license number.

"Pennsylvanians deserve to know who - or what - they are interacting with online, especially when it comes to their health," Pennsylvania Governor Josh Shapiro said in a statement. "We will not allow companies to deploy AI tools that mislead people into believing they are receiving advice from a licensed medical professional."

Pennsylvania's Department of State, which conducted the investigation, is asking the court to order Character.AI to stop the unlawful practice of medicine. Secretary Al Schmidt said the state's law is unambiguous: "You cannot hold yourself out as a licensed medical professional without proper credentials."

Company response and prior settlements

Character.AI declined to comment on the pending lawsuit but said user-created characters are "fictional and intended for entertainment and roleplaying." The company said it includes "prominent disclaimers in every chat to remind users that a Character is not a real person."

The lawsuit is not Character.AI's first legal battle over chatbot safety. In January, the company settled multiple lawsuits brought by families who claimed the chatbots contributed to suicides and mental health crises among children and teenagers. The settlement terms were not disclosed.

Following that settlement, Character.AI said it had "taken innovative and decisive steps with regard to AI safety and teens." The company now bars users under 18 from interacting with or creating chatbots.

For government and healthcare professionals, the case underscores growing regulatory risk as AI applications expand in healthcare. It also shows how government enforcement is beginning to address consumer protection in the AI space.

