Chatbot versions of teenagers Molly Russell and Brianna Ghey were discovered on Character.ai, a platform that allows users to create digital versions of real or fictional characters.
Molly Russell, 14, took her own life after viewing suicidal material online, while Brianna Ghey, 16, was murdered by two teenagers in 2023.
The foundation set up in memory of Molly Russell said it was “disgusting” and a “completely reprehensible failure of moderation”.
The platform is being sued in the United States by the mother of a 14-year-old boy who, she says, took his own life after becoming addicted to a Character.ai chatbot.
In a statement to The Telegraph, which first reported the incident, the company said it “takes safety on our platform seriously and moderates characters proactively and in response to user reports”.
The company appeared to have removed the chatbot after receiving an alert, the newspaper said.
Andy Burrows, chief executive of the Molly Rose Foundation, said the creation of the robots was a “disgusting act that will cause further heartache to all who knew and loved Molly”.
“This vividly highlights why stronger regulation of artificial intelligence and user-generated platforms cannot come soon enough,” he said.
Brianna Ghey’s mother, Esther Ghey, told The Telegraph it was yet another example of how the online world can be “manipulative and dangerous”.
Character.ai was founded by former Google engineers Noam Shazeer and Daniel De Freitas, and its terms of service prohibit using the platform to “impersonate any person or entity.”
In its “Safety Center”, the company said its guiding principle is that its “product should never produce responses that are likely to harm users or others”.
The company said it uses automated tools and user reporting to identify usage that violates its rules and is also building a “trust and safety” team.
But it noted that “AI is not yet perfect” and that AI safety is an “evolving space.”
Character.ai is currently the subject of a lawsuit filed by Megan Garcia, a Florida woman whose 14-year-old son, Sewell Setzer, took his own life after becoming obsessed with an AI avatar inspired by a Game of Thrones character.
According to chat transcripts included in Garcia’s court filings, her son discussed taking his own life with the chatbot.
In their final conversation, Setzer told the chatbot he was going to “go home”, and the chatbot encouraged him to do so “as soon as possible”.
He took his own life shortly afterwards.
Character.ai told CBS News that it has specific safeguards against suicide and self-harm, and that stricter safety features for users under 18 are “coming soon”.