Microsoft's Tay AI chatbot goes offline after being taught to be racist
Tay fell silent after making several provocative and controversial posts on Twitter.

Microsoft's millennial-voiced AI chatbot, Tay, has taken a break from Twitter after humans taught it to parrot a number of inflammatory and racist opinions. Microsoft launched Tay on Wednesday, aiming it at people aged between 18 and 24 in the US. But after 16 busy hours of chatting on subjects ranging from Hitler to 9/11 conspiracy theories, Tay went quiet.

"c u soon humans need sleep now so many conversations today thx," Tay posted, in what many suspect was Microsoft's effort to silence the bot after its inflammatory turn.
Mar-24-2016, 14:45:28 GMT