Mom who sued Character.AI over son's suicide says the platform's new teen policy comes 'too late'
In a step toward making its platform safer for teenage users, Character.AI announced this week that it will ban users under 18 from chatting with its artificial intelligence-powered characters.
For Megan Garcia, the Florida mother who sued the company last year over the suicide of her 14-year-old son, Sewell Setzer, the move comes “about three years too late.”
“Sewell’s gone; I can’t get him back,” she said in an interview Thursday following Character.AI’s announcement. “It’s unfair that I have to live the rest of my life without my sweet, sweet son. I think he was collateral damage.”
Founded in 2021, the California-based chatbot startup offers what it describes as “personalized AI.” It provides a selection of premade or user-created AI characters to interact with, each with a distinct personality. Users can also customize their own chatbots.
Garcia was the first of five families to sue Character.AI over harm they allege their children suffered. Her case is one of two accusing the company of liability for a child’s suicide, and all five families allege its chatbots engaged in sexually abusive interactions with their children.