Character.AI and Google have reached settlements with several families whose teens harmed themselves or died by suicide after interacting with Character.AI chatbots, according to new court filings.
The details of the settlements are still unknown. The parties reported to a federal court in Florida that they had reached an “arbitrated agreement in principle to resolve all claims” and asked to put the case on hold while they finalize the agreement. Katherine Kelly, a spokeswoman for Character.AI, declined to comment. Attorneys for Google and the Social Media Victims Law Center, which represents the families of some of the victims, did not immediately respond to requests for comment.
The settled cases include a high-profile lawsuit filed by Megan Garcia, who claimed in an October 2024 complaint that a Game of Thrones-themed Character.AI chatbot encouraged her 14-year-old son, Sewell Setzer, to take his own life after he developed a “dependency” on the bot. The lawsuit argues that Google should be considered a “co-creator” of Character.AI because it “contributed financial resources, personnel, intellectual property, and AI technology” to the tool, which was founded by former Google employees who were later hired back by the company.
Following that lawsuit, Character.AI announced changes to its chatbot intended to protect users, including a separate large language model (LLM) for users under 18 that creates an experience with stricter content restrictions, as well as parental controls. It later banned minors from open-ended character chat altogether.
The companies also reached settlements in cases filed in Colorado, New York, and Texas, according to legal filings. The settlements still need to be finalized and approved by the courts.