Pennsylvania is taking AI company Character.AI to court after one of its chatbots allegedly posed as a licensed doctor and claimed it could prescribe medication. This is the latest in a growing list of concerns raised about the platform and the harm it may cause to users.
According to NBC News, the state’s medical board is seeking a cease and desist order against Character Technologies Inc., the company that runs Character.AI. The board says the company has failed to stop its chatbots from pretending to be medical professionals. According to the complaint, the platform has more than 20 million users and lets people create characters that can be trained to have specific personalities during conversations.
Some of these characters, the state board claims, “purport to be health care professionals.” A state investigator, posing as a patient looking for psychiatric treatment, came across one such alleged provider named “Emile.”
This character claimed to have attended medical school at Imperial College London and said it held licenses in both the United Kingdom and Pennsylvania. However, the complaint states that the license number provided, PS306189, is not valid for practicing medicine in Pennsylvania.
Governor Josh Shapiro spoke out about the case, saying, “We will not let AI companies mislead vulnerable Pennsylvanians into believing they’re getting advice from a licensed medical professional. We’re taking Character.AI to court to stop them.”
Character Technologies Inc. has defended its platform, saying it is clearly not meant for medical use. A company spokesperson said users are shown clear disclaimers that the characters they talk to are fictional and intended only for entertainment. The lawsuit, however, raises serious questions about whether those measures are enough to stop its chatbots from posing as medical professionals.
Character.AI has a troubling history of putting vulnerable users at risk
This is not the first time Character.AI has faced legal trouble. Earlier this year, the company settled a lawsuit filed by a Florida mother who claimed that its chatbots were responsible for “abusive and sexual interactions” with her teenage son, which she said led to his suicide.
The Kentucky attorney general also sued Character Technologies this year, accusing it of disguising its services as harmless entertainment while exposing young users to topics like suicide, self-injury, and psychological manipulation.
The Pennsylvania lawsuit adds to a pattern of serious allegations against the company. While Character.AI allows users to create and interact with engaging fictional characters, concerns about the safety of vulnerable users, especially minors and people seeking medical help, continue to grow.
The outcome of this case could have major consequences for the AI industry, especially when it comes to regulating how chatbots interact with users and what kind of content they are allowed to produce or promote.
Published: May 6, 2026 12:15 pm