What Lurks in AI's Shadow: Separating Fact from Fiction

#artificialintelligence 

In a recent column, New York Times technology correspondent Kevin Roose shared a conversation he had with Bing's chatbot that is equal parts fascinating and unsettling. The artificial intelligence service in question is a sibling of the popular ChatGPT, built on technology from the American artificial intelligence company OpenAI. But Roose wasn't simply chatting with an OpenAI language model; he was speaking with its chat-mode persona, Sydney, a codename given to it by Microsoft in its early stages of development. Though Roose and Sydney's conversation is, at first glance, alarming, the AI's responses to Roose's questions are far from unexpected. Its erratic use of emojis and its seemingly unfiltered, emotional way of speaking feel human because, in some ways, they are, just not in the way our cultural anxieties over artificial intelligence might lead us to believe (Olson, 2023).
