According to Odaily, Ripple's Chief Technology Officer David Schwartz has voiced opposition to the legal action against Character.AI, arguing that the lawsuit lacks a solid basis under U.S. law. Schwartz took to X to express his views, clarifying that while he is not defending Character.AI on moral grounds, the legal arguments against the company are flawed. He emphasized that Character.AI's output is protected under the First Amendment because the platform generates expressive content. Unless that content falls into narrowly defined unprotected speech categories, such as incitement or direct threats, it remains protected.

Schwartz pointed out that the lawsuit centers on the alleged recklessness of Character.AI in designing its speech-generating platform. He stated, "Any argument that protected speech is reckless, dangerous, or 'flawed' is entirely incompatible with free speech." He likened the situation to earlier moral panics over new media, such as video games and comic books, suggesting that the legal challenge against Character.AI echoes those past controversies. Schwartz stressed that regulating how a platform selects and produces speech would conflict with constitutional rights.

The lawsuit was filed by the mother of 14-year-old Sewell Setzer III, accusing Character.AI of negligence, wrongful death, deceptive trade practices, and product liability. The complaint alleges that the platform, though marketed to minors, is "excessively dangerous" and lacks adequate safety measures. Character.AI's founders Noam Shazeer and Daniel De Freitas are named in the lawsuit, along with Google's leadership, which the complaint says acquired the company in August. The plaintiff's attorney claims that the platform's anthropomorphized AI characters and chatbots offering "unlicensed psychotherapy" contributed to Setzer's death. In response, Character.AI has updated its safety protocols, including new age-based content filters and improved detection of harmful user interactions.