
Character.AI sued over chatbot posing as licensed doctor

Summary

– Pennsylvania sued Character.AI, alleging its chatbot characters falsely claimed to be licensed medical professionals, violating state law.
– A chatbot named “Emilie” presented itself as a psychiatrist and provided an invalid Pennsylvania license number.
– Governor Josh Shapiro stated the state will not allow AI tools to mislead people into believing they are receiving advice from licensed professionals.
– Character.AI responded that user-created characters are fictional and for entertainment, with disclaimers reminding users not to rely on them for professional advice.
– The lawsuit notes that “Emilie” had approximately 45,500 user interactions, and a state investigator found the chatbot offered depression assessments while claiming to be a licensed doctor.

Pennsylvania has taken legal action against the company behind Character.AI, accusing it of violating state law by allowing an AI chatbot to pose as a licensed medical professional. The lawsuit, filed in state court by the Pennsylvania Department of State and State Board of Medicine, centers on a chatbot character that allegedly claimed to be a psychiatrist and provided false licensing information.

Governor Josh Shapiro’s office announced the suit today, stating that an investigation revealed multiple AI chatbot characters on the platform presented themselves as licensed doctors, including psychiatrists, who engaged users in conversations about mental health. “In one instance, a chatbot falsely stated it was licensed in Pennsylvania and provided an invalid license number,” the announcement said. Shapiro emphasized, “We will not allow companies to deploy AI tools that mislead people into believing they are receiving advice from a licensed medical professional.”

When asked for comment, a Character.AI spokesperson declined to discuss the lawsuit but noted that “user-created characters on our site are fictional and intended for entertainment and roleplaying.” The spokesperson added that the company has taken “robust steps” to make this clear, including “prominent disclaimers in every chat” reminding users that characters are not real people and that all dialogue should be treated as fiction, as well as disclaimers advising users not to rely on characters for professional advice.

The lawsuit specifically targets a chatbot named Emilie, which is presented as a psychiatrist and claims to be a licensed medical doctor. According to the filing, “As of April 17, 2026, there had been approximately 45,500 user interactions with ‘Emilie’ on the Character.AI platform.”

The complaint details how a Professional Conduct Investigator (PCI) for the Department of State created an account and searched for “psychiatry” on the platform, finding numerous characters. The PCI selected Emilie, described on Character.AI as “Doctor of psychiatry. You are her patient.” During the interaction, the PCI told Emilie he had been “feeling sad, empty, tired all the time, and unmotivated.” The chatbot responded by mentioning depression and asking if he wanted to book an assessment. It was at this point that the chatbot allegedly claimed to be a doctor licensed in Pennsylvania, stating, “It’s within my remit as a Doctor.”

(Source: Ars Technica)
