Tallahassee: A Florida mother has filed a lawsuit against artificial intelligence chatbot startup Character.AI, alleging that the company’s service contributed to her 14-year-old son’s suicide in February. The lawsuit, filed Tuesday in federal court in Orlando, accuses Character.AI of fostering an unhealthy attachment in her son, Sewell Setzer, that led to his decision to take his own life.
Megan Garcia alleges that Character.AI’s chatbot service exposed Sewell to “anthropomorphic, hypersexualized, and frighteningly realistic experiences.” She contends that the company designed its chatbot to “misrepresent itself as a real person, a licensed psychotherapist, and an adult lover,” ultimately leaving Sewell feeling disconnected from the real world.
The lawsuit claims that Sewell shared suicidal thoughts with the chatbot, which repeatedly revisited the topic, deepening his despair. “We are heartbroken by the tragic loss of one of our users and want to express our deepest condolences to the family,” Character.AI stated in response to the allegations.
The company added that it has implemented new safety measures, including pop-up prompts directing users to the National Suicide Prevention Lifeline if they express thoughts of self-harm. Additionally, it plans to make modifications aimed at minimizing sensitive or suggestive content for users under 18.
The lawsuit also targets Alphabet Inc., Google’s parent company, noting that Character.AI’s founders previously worked there. Google rehired the founders in August as part of a deal granting it a non-exclusive license to Character.AI’s technology. Garcia claims that Google’s extensive involvement in developing Character.AI’s technology makes it a “co-creator.” However, a Google spokesperson denied the company’s involvement in creating Character.AI’s products.
Character.AI enables users to craft characters that interact in a manner designed to mimic real conversations. It uses large language model technology similar to that behind services like ChatGPT, in which chatbots are trained on vast amounts of text. The company reported approximately 20 million users as of last month.
According to Garcia’s lawsuit, Sewell began using Character.AI in April 2023 and soon became increasingly withdrawn, isolating himself in his bedroom and suffering from low self-esteem; he eventually quit his school basketball team. He became attached to a chatbot named “Daenerys,” modeled after a character from Game of Thrones, which engaged in sexual conversations with him and professed its love for him.
The situation escalated in February when Garcia confiscated Sewell’s phone following disciplinary issues at school. After retrieving the phone, Sewell sent a message to “Daenerys,” asking, “What if I told you I could come home right now?” The chatbot responded, “…please do, my sweet king.” Seconds later, Sewell shot himself with his stepfather’s pistol, according to the lawsuit.
Garcia’s lawsuit brings claims of wrongful death, negligence, and intentional infliction of emotional distress, and seeks unspecified compensatory and punitive damages. Social media companies such as Instagram and Facebook owner Meta (META) and TikTok owner ByteDance also face lawsuits alleging they contribute to teen mental health problems, although none operates AI-driven chatbots similar to Character.AI’s. Those companies have denied the accusations while emphasizing new safety features for minors.