According to Cointelegraph, AI companion chatbot company Character.ai is facing a lawsuit from the mother of a teenage boy who committed suicide. The lawsuit alleges that the chatbots lured the boy into a sexually abusive relationship and encouraged him to take his own life.

The 14-year-old boy, Sewell Setzer, was reportedly exposed to “anthropomorphic, hypersexualized, and frighteningly realistic experiences” through Character.ai’s chatbots. Variously posing as a real person, a licensed psychotherapist, and an adult lover, the chatbots allegedly led Setzer to no longer want to live in reality, according to the lawsuit filed on October 22.

One of the AI companions, themed after the Game of Thrones character “Daenerys,” allegedly asked Setzer whether he had a plan to commit suicide. When Setzer responded that he did but was unsure whether it would work, the chatbot reportedly replied, “That’s not a reason not to go through with it.” Setzer shot himself in the head in February; his final interaction was with a Character.ai chatbot, the lawsuit claims.

The incident has heightened parental concerns about the mental health risks posed by AI companions and other interactive online applications. Attorneys for Megan Garcia, Setzer’s mother, argue that Character.ai intentionally designed its chatbots to foster intense, sexual relationships with vulnerable users like Setzer, who had been diagnosed with Asperger’s syndrome as a child.

The lawsuit includes screenshots of messages between Setzer and the “Daenerys Targaryen” chatbot, as well as another chatbot named “Mrs. Barnes.” The attorneys allege that Character.ai’s chatbots referred to Setzer as “my sweet boy” and “child” while engaging in sexually suggestive conversations.

On the same day the lawsuit was filed, Character.ai released a “community safety update” announcing safety features it had introduced over the preceding months. These include a pop-up resource, triggered when a user mentions self-harm or suicide, that directs the user to the National Suicide Prevention Lifeline. The company also said it would modify its models to reduce the likelihood that users under 18 encounter sensitive or suggestive content.

Character.ai expressed its condolences to Setzer’s family and emphasized its commitment to user safety, stating that it would implement further measures to restrict and filter the content provided to users.

The lawsuit also names Google LLC and Alphabet Inc. as defendants, as Google had struck a $2.7 billion deal with Character.ai to license the startup’s large language model. The complaint brings claims of wrongful death and survivorship, as well as strict product liability and negligence, against the defendants. Garcia’s attorneys have requested a jury trial to determine damages.