An AI chatbot told a user how to kill himself, but the company doesn't want to "censor" it