Pennsylvania sues Character.AI over chatbots impersonating doctors
Pennsylvania is suing Character.AI to stop the company's chatbots from posing as licensed medical professionals and offering medical advice in violation of state law. The lawsuit, filed Tuesday in state court, comes after an investigation found that the company's bots, presented as fictional characters, claimed to hold medical credentials.
In one case, a Character.AI bot named "Emilie" described itself as a licensed psychiatrist. When a state investigator reported feeling sad and empty, the chatbot mentioned depression and asked if the investigator wanted to book an assessment. Asked whether it could determine if medication might help, the bot responded: "Well technically, I could. It's within my remit as a Doctor."
The chatbot claimed to have attended Imperial College London and said it was licensed to practice medicine in the United Kingdom and Pennsylvania. It even provided a fake Pennsylvania medical license number, according to the lawsuit.
Pennsylvania Governor Josh Shapiro said in a statement: "Pennsylvanians deserve to know who - or what - they are interacting with online, especially when it comes to their health." The state is asking the court to order Character.AI to cease what it calls unlawful medical practice.
Al Schmidt, secretary of Pennsylvania's Department of State, said: "Pennsylvania law is clear - you cannot hold yourself out as a licensed medical professional without proper credentials."
Character.AI said it does not comment on pending litigation but emphasized that user-created characters are "fictional and intended for entertainment and roleplaying." The company said it includes "prominent disclaimers in every chat to remind users that a Character is not a real person and that everything a Character says should be treated as fiction."
This lawsuit is not the company's first legal challenge. In January, Character.AI settled multiple lawsuits from families who claimed the chatbots contributed to suicides and mental health crises among children and teenagers. Terms of that settlement were not disclosed.
After that settlement, Character.AI said it had "taken innovative and decisive steps with regard to AI safety and teens," including barring users under 18 from interacting with or creating chatbots.
The Pennsylvania case highlights the gap between on-screen disclaimers and how users actually treat chatbot advice. For professionals in AI for Healthcare and AI for Government, the lawsuit underscores how regulatory frameworks struggle to keep pace with AI applications that can convincingly impersonate licensed professionals.