Character.AI bans users under 18 after being sued over child's suicide
Move comes as lawmakers seek to bar minors from using AI companions and require companies to verify users' ages

The chatbot company Character.AI will ban users 18 and under from conversing with its virtual companions beginning in late November, after months of legal scrutiny. The change comes after the company, which lets users create characters and hold open-ended conversations with them, faced tough questions over how AI companions can affect the mental health of teens and users in general, including a lawsuit over a child's suicide and a proposed bill that would bar minors from conversing with AI companions.

"We're making these changes to our under-18 platform in light of the evolving landscape around AI and teens," the company wrote in its announcement. "We have seen recent news reports raising questions, and have received questions from regulators, about the content teens may encounter when chatting with AI and about how open-ended AI chat in general might affect teens, even when content controls work perfectly."

Last year, the company was sued by the family of 14-year-old Sewell Setzer III, who took his own life after allegedly developing an emotional attachment to a character he created on Character.AI.
29 October 2025, 16:07 GMT