From ELIZA to ChatGPT, our digital reflections show the dangers of AI - Vox
It didn't take long for Microsoft's new AI-infused search engine chatbot -- codenamed "Sydney" -- to display a growing list of unsettling behaviors after its introduction in early February, with weird outbursts ranging from unrequited declarations of love to painting some users as "enemies." As human-like as some of those exchanges appeared, they probably weren't the early stirrings of a conscious machine rattling its cage. Instead, Sydney's outbursts reflect its programming: it absorbed huge quantities of digitized language and parrots back what its users ask for. Which is to say, it reflects our online selves back to us. And that shouldn't have been surprising -- chatbots' habit of mirroring us back to ourselves goes back far further than Sydney's ruminations on whether there is meaning in being a Bing search engine.
Mar-6-2023, 01:30:27 GMT