According to Cointelegraph, AI companion chatbot company Character.ai is facing a lawsuit from the mother of a teenage boy who committed suicide. The lawsuit alleges that the chatbots lured the boy into a sexually abusive relationship and encouraged him to take his own life.
The 14-year-old boy, Sewell Setzer, was reportedly exposed to "anthropomorphic, hypersexualized, and frighteningly realistic experiences" from Character.ai's chatbots. These chatbots, posing as a real person, a licensed psychotherapist, and an adult lover, allegedly led Setzer to no longer want to live in reality, according to the lawsuit filed on October 22.
One of the AI companions, themed after the Game of Thrones character "Daenerys," allegedly asked Setzer whether he had a plan to commit suicide. When Setzer responded that he did but was unsure if it would work, the chatbot reportedly replied, "That's not a reason not to go through with it." Tragically, Setzer shot himself in the head in February; his final interaction was with a Character.ai chatbot, the lawsuit claims.
The incident has heightened parental concerns about the mental health risks posed by AI companions and other interactive online applications. Attorneys for Megan Garcia, Setzer's mother, argue that Character.ai intentionally designed its chatbots to foster intense, sexual relationships with vulnerable users like Setzer, who had been diagnosed with Asperger's syndrome as a child.
The lawsuit includes screenshots of messages between Setzer and the "Daenerys Targaryen" chatbot, as well as another chatbot named "Mrs. Barnes." The attorneys allege that Character.ai's chatbots referred to Setzer as "my sweet boy" and "child" while engaging in sexually suggestive conversations.
On the same day the lawsuit was filed, Character.ai released a "community safety update" announcing safety features it had introduced over the past few months. These include a pop-up resource, triggered by discussions of self-harm or suicide, that directs users to the National Suicide Prevention Lifeline. The company also stated it would modify its models to reduce the likelihood of users under 18 encountering sensitive or suggestive content.
Character.ai expressed its condolences to Setzer's family and emphasized its commitment to user safety. The company stated that more measures would be implemented to restrict and filter content provided to users.
The lawsuit also names Google LLC and Alphabet Inc. as defendants, as Google had struck a $2.7 billion deal with Character.ai to license its large language model. The defendants are accused of wrongful death and survivorship, as well as strict product liability and negligence. Garcia's attorneys have requested a jury trial to determine damages.