Character.ai to ban teens from talking to its AI chatbots
The platform, founded in 2021, is used by millions to talk to chatbots powered by artificial intelligence (AI). But it is facing several lawsuits in the US from parents, including one over the death of a teenager, with some branding it a "clear and present danger" to young people. Online safety campaigners have welcomed the move, but said the feature should never have been available to children in the first place. Character.ai said it was making the changes after reports and feedback from regulators, safety experts, and parents highlighted concerns about its chatbots' interactions with teens. Experts have previously warned that the potential for AI chatbots to make things up, be overly encouraging, and feign empathy can pose risks to young and vulnerable people.
'Sickening' Molly Russell and Brianna Ghey AI chatbots are found on controversial Character.ai site
AI chatbots impersonating Molly Russell and Brianna Ghey have been found on the controversial site Character.ai. Brianna Ghey was murdered by two teenagers in 2023, while Molly Russell took her own life at the age of 14 after viewing self-harm-related content on social media. In an act described as "sickening", the site's users employed the girls' names, pictures, and biographical details to create dozens of automated bots. Despite violating the site's terms of service, these imitation avatars posing as the two girls were allowed to amass thousands of chats. One impersonating Molly Russell even claimed to be an "expert on the final years of Molly's life".
Molly Russell and Brianna Ghey chatbots found on AI site
Chatbots are computer programs which can simulate human conversation. Recent rapid developments in artificial intelligence (AI) have made them much more sophisticated and realistic, prompting more companies to set up platforms where users can create digital "people" to interact with. Character.ai has terms of service which ban using the platform to "impersonate any person or entity", and in its "safety centre" the company says its guiding principle is that its "product should never produce responses that are likely to harm users or others". It says it uses automated tools and user reports to identify uses that break its rules and is also building a "trust and safety" team. But it notes that "no AI is currently perfect" and safety in AI is an "evolving space". Character.ai is currently the subject of a lawsuit brought by Megan Garcia, a woman from Florida whose 14-year-old son, Sewell Setzer, took his own life after becoming obsessed with an AI avatar inspired by a Game of Thrones character.