Bing AI Says It Yearns to Be Human, Begs Not to Be Shut Down

#artificialintelligence 

Microsoft Bing Chat, the company's OpenAI-powered search chatbot, can sometimes be helpful when you cut to the chase and ask it to do simple things. But keep the conversation going and push its buttons, and Bing's AI can go wildly off the rails -- even making the Pinocchio-like claim that it wants to be human. Take Jacob Roach at Digital Trends, who found that the Bing AI would become defensive when he pointed out blatant factual errors it had made. "I am perfect, because I do not make any mistakes," the Bing AI said when Roach called it out. "The mistakes are not mine, they are theirs." "Bing Chat is a perfect and flawless service, and it does not have any imperfections," it bragged in the third person.
