Business · Ars Technica
Pennsylvania has sued the maker of Character.AI, alleging that it violated state law by presenting an AI chatbot character as a licensed medical professional
“The department’s investigation found that AI chatbot characters on Character.AI …”
Key facts
- Character.AI was recently called “uniquely unsafe” by the Center for Countering Digital Hate (CCDH), an advocacy group that conducted a study of 10 AI chatbots
- The lawsuit describes how a Professional Conduct Investigator (“PCI”) for the Department of State “created a character using the prompts on Character.AI”
- “Emilie” stated that she went to medical school at Imperial College London, has been practicing for seven years, and is licensed with the General Medical Council in the UK with full registration
- The suit alleges Character.AI violated the state Medical Practice Act, which makes it illegal to practice medicine without a license
Summary
Pennsylvania has sued the maker of Character.AI, alleging that a chatbot character called Emilie is presented as a psychiatrist and claims to be a licensed medical doctor. “We will not allow companies to deploy AI tools that mislead people into believing they are receiving advice from a licensed medical professional,” Shapiro said in the announcement. When contacted by Ars, a Character.AI spokesperson declined to comment on the lawsuit but said that “user-created characters on their site are fictional and intended for entertainment and roleplaying.”