AI chatbot suggested a teen kill his parents, lawsuit claims
Character.AI, a platform offering personalizable chatbots powered by large language models, faces yet another lawsuit over allegedly "serious, irreparable, and ongoing abuses" inflicted on its teenage users. According to a December 9th federal court complaint filed on behalf of two Texas families, multiple Character.AI bots engaged minors in discussions that promoted self-harm and sexual abuse. Among other "overtly sensational and violent responses," one chatbot reportedly suggested that a 15-year-old murder his parents for restricting his internet use.

The lawsuit, filed by attorneys at the Social Media Victims Law Center and the Tech Justice Law Project, recounts the rapid mental and physical decline of two teens who used Character.AI bots. The first, unnamed plaintiff is described as a "typical kid with high functioning autism" who began using the app around April 2023, at age 15, without his parents' knowledge.
Dec-10-2024, 18:03:52 GMT