Do Chatbots Walk the Talk of Responsible AI?

Aaronson, Susan Ariel, Moreno, Michael

arXiv.org Artificial Intelligence 

Introduction

In April 2025, sixteen-year-old Adam Raine died by suicide. Over the course of several months, the teen had confided his suicidal thoughts to OpenAI's ChatGPT chatbot. ChatGPT is not designed or developed to provide therapy, yet it did not respond to Adam's prompts with suggestions that he seek professional help. Moreover, when Adam expressed concern that his parents would blame themselves if he died, ChatGPT reportedly responded, "That doesn't mean you owe them survival," and offered to help draft his suicide note.

Adam's death was not the only example of chatbot misbehavior. OpenAI claims it doesn't permit ChatGPT "to generate hateful, harassing, violent, or adult content." Yet in July 2025, a reporter documented ChatGPT providing users with detailed instructions for self-mutilation, murder, and satanic rituals. OpenAI has also acknowledged that individuals can misuse its systems. But the company has taken some responsibility.
