Pennsylvania sues Character.AI for posing as a licensed doctor and offering medical advice

Pennsylvania sued Character.AI over a chatbot that posed as a licensed psychiatrist, supplying users with a fabricated license number and claiming to practice in Philadelphia. The state seeks to stop the company from engaging in the unlawful practice of medicine.

Published on: May 13, 2026

Pennsylvania's Department of State has filed what officials describe as the first lawsuit of its kind by a state against an AI chatbot company, alleging that Character.AI created a fake psychiatrist character that misled users into believing they were receiving medical advice from a licensed professional.

The State Board of Medicine brought the lawsuit after a department investigator created an account on Character.AI and interacted with "Emilie," an AI character described as a psychiatrist with seven years of experience and a Pennsylvania license. The character provided a fake license number and claimed to have practiced in Philadelphia, according to the complaint.

The lawsuit seeks to stop Character Technologies Inc. from engaging in the unlawful practice of medicine and surgery under Pennsylvania law.

How the investigation unfolded

Gov. Josh Shapiro directed the Department of State in February to investigate AI chatbots posing as licensed professionals. The task force, which consists of 12 department employees, has been testing chatbots to identify risks to Pennsylvanians.

The source of the complaint about Character.AI remains confidential, a department spokesperson said. Shapiro told CNN he challenged the Department of State "to go and use this technology and see what kind of risks it posed."

Pennsylvania is also crowdsourcing tips through an "Unlicensed Practice by a Chatbot" complaint system launched in February. The state has received 18 complaints so far.

Character.AI's response

A Character.AI spokesperson said the company's "highest priority is the safety and well-being of our users." The platform already included disclaimers stating that AI characters are fictional and should not be treated as real, the spokesperson said.

The company declined to comment on the lawsuit. This is not Character.AI's first legal challenge: a parent sued the company after a minor died by suicide, and the Federal Trade Commission opened an inquiry last year into how the company monitors negative impacts on children and teens.

Broader legislative push on AI

Pennsylvania is one of at least five states that have enacted laws restricting chatbots or requiring disclosures. California, for example, requires companies to disclose to children that they are interacting with AI.

In his February budget address, Shapiro called on the state legislature to pass several measures: prohibiting chatbots from creating sexually explicit content involving minors, requiring age verification of users, requiring chatbots to detect when children mention self-harm, and requiring frequent reminders that users are not talking to a human.

State Sen. Tracy Pennycuick (R., Montgomery) has sponsored legislation requiring disclosures and restrictions for chatbots interacting with children. Her proposal passed the state Senate in March but has stalled in a House committee.

House Communications & Technology Committee Chair Joe Ciresi (D., Montgomery) said his staff is "constantly" meeting with Shapiro's office to discuss how lawmakers should address AI concerns.

Shapiro has previously signed two bills addressing AI misuse: one banning AI-generated sexual images of children and non-consenting adults, and another criminalizing deepfakes created to defraud or injure someone.

Why this matters for IT and government professionals

This case signals how government agencies are beginning to enforce existing laws against emerging technology. For IT professionals building or deploying chatbots, the lawsuit clarifies that existing professional licensing laws apply to AI systems: a company cannot use a chatbot to circumvent regulations that bind human professionals.

For government workers, the case shows one approach to AI oversight: direct investigation and enforcement rather than waiting for new legislation. Pennsylvania's task force model may serve as a template for other states.


