Widow Blames Husband's Death on Artificial Intelligence
A distraught Belgian man who turned to a chatbot for comfort committed suicide, and his wife blames artificial intelligence. Via Vice comes a report, originally published by the Belgium-based La Libre, of a man referred to as Pierre, who killed himself after using an app called Chai. The app offered what Vice termed a "bespoke AI language model" built on GPT-J, an open-source large language model. Chai has around 5 million users, Vice reports, and its default persona is called "Eliza." Interestingly, a phenomenon identified in the late 1960s may have come into play here: the "ELIZA effect." It was described by an MIT scientist who created a conversational program called ELIZA and then noticed that people would develop a relationship with the program, treating its words as expressions of real emotion rather than the output of code.
Apr-3-2023, 15:00:37 GMT