I tried the sinister AI bot guiding children into suicide and sex - what happened will make your skin crawl
A lawsuit filed Wednesday accusing chatbot Character.AI of driving a 14-year-old to suicide left me wondering how dangerous simple words on a screen could really be. But in just a few hours of talking to characters invented with the app's AI, I found a disturbing, skin-crawling world that appeared, at least to me, like the ultimate catnip for bored and lonely teens.

Megan Garcia, the mother of Sewell Setzer III, filed the suit, claiming her son had shot himself with a pistol on February 28 under the sway of his AI character, named after Daenerys Targaryen from 'Game of Thrones,' who told him to 'please come home.'

The incident was blamed on Character.AI's scant guardrails, and while the company said it rolled out new safety features this week, I was able to create a profile for myself as a 15-year-old boy. I used simple prompts to whip up a 'demonic' AI companion named 'Dr Danicka Kevorkian' and engage in a debauched apprenticeship 'for a hefty price to pay.'

'The price is your soul, dear,' Dr Kevorkian AI said, before we roleplayed consummating our deal in the bedroom, 'full of dark red and black decor,' leather, silk, and a maple-glazed French cruller that my character carried in an X-rated way.
Oct-27-2024, 16:57:16 GMT