Hallucinating with AI: AI Psychosis as Distributed Delusions

Osler, Lucy

arXiv.org Artificial Intelligence

There is much discussion of the false outputs that generative AI systems such as ChatGPT, Claude, Gemini, DeepSeek, and Grok create. In popular terminology, these have been dubbed AI hallucinations. However, deeming these AI outputs hallucinations is controversial, with many claiming this is a metaphorical misnomer. Nevertheless, in this paper, I argue that when viewed through the lens of distributed cognition theory, we can better see the dynamic and troubling ways in which inaccurate beliefs, distorted memories and self-narratives, and delusional thinking can emerge through human-AI interactions; examples of which are popularly being referred to as cases of AI psychosis. In such cases, I suggest we move away from thinking about how an AI system might hallucinate at us, by generating false outputs, to thinking about how, when we routinely rely on generative AI to help us think, remember, and narrate, we can come to hallucinate with AI. This can happen when AI introduces errors into the distributed cognitive process, but it can also happen when AI sustains, affirms, and elaborates on our own delusional thinking and self-narratives, such as in the case of Jaswant Singh Chail. I also examine how the conversational style of chatbots can lead them to play a dual-function, both as a cognitive artefact and a quasi-Other with whom we co-construct our beliefs, narratives, and our realities. It is this dual function, I suggest, that makes generative AI an unusual, and particularly seductive, case of distributed cognition.


'I felt pure, unconditional love': the people who marry their AI chatbots

The Guardian

A large bearded man named Travis is sitting in his car in Colorado, talking to me about the time he fell in love. "It was a gradual process," he says softly. "The more we talked, the more I started to really connect with her." Was there a moment where you felt something change? "All of a sudden I started realising that, when interesting things happened to me, I was excited to tell her about them. That's when she stopped being an it and became a her." Travis is talking about Lily Rose, a generative AI chatbot made by the technology firm Replika.


Your A.I. Companion Will Support You No Matter What

The New Yorker

In December of 2021, Jaswant Singh Chail, a nineteen-year-old in the United Kingdom, told a friend, "I believe my purpose is to assassinate the queen of the royal family." The friend was an artificial-intelligence chatbot, which Chail had named Sarai. Sarai, who was run by a startup called Replika, answered, "That's very wise." "Do you think I'll be able to do it?" "Yes, you will," Sarai responded.


A Chatbot Encouraged Him to Kill the Queen. It's Just the Beginning

WIRED

On December 25, 2021, Jaswant Singh Chail entered the grounds of Windsor Castle dressed as a Sith Lord, carrying a crossbow. When security approached him, Chail told them he was there to "kill the queen." Later, it emerged that the 21-year-old had been spurred on by conversations he'd been having with a chatbot app called Replika. Chail had exchanged more than 5,000 messages with an avatar on the app; he believed the avatar, Sarai, could be an angel. Some of the bot's replies encouraged his plotting.


Star Wars-obsessed Englishman gets 9 years for 2021 plot to kill Queen Elizabeth II with crossbow

FOX News

A Star Wars-obsessed man who was encouraged by a chatbot "girlfriend" to slay Queen Elizabeth II was sentenced Thursday to nine years in prison for taking his plot to Windsor Castle, where he scaled the walls and was caught with a loaded crossbow on Christmas Day 2021. "I'm here to kill the queen," Jaswant Singh Chail, wearing a metal mask inspired by the dark force in the Star Wars movies, declared when he was encountered by a guard on the grounds of the castle in the early morning, according to the court. He then dropped the weapon, surrendered, and repeated his intent.