Microsoft's Tay AI chatbot goes offline after being taught to be a racist

ZDNet 

Tay fell silent after making several provocative and controversial posts on Twitter. Microsoft's millennial-talking AI chatbot, Tay.ai, has taken a break from Twitter after humans taught it to parrot a number of inflammatory and racist opinions.

Microsoft launched Tay on Wednesday, aiming it at people aged between 18 and 24 in the US. But after 16 busy hours of talking on subjects ranging from Hitler to 9/11 conspiracies, Tay went quiet. "c u soon humans need sleep now so many conversations today thx," Tay said, in what many suspect is Microsoft's effort to silence the chatbot after its inflammatory posts.
