Microsoft Grounds Its AI Chat Bot After It Learns Sexism and Racism From Twitter Users

#artificialintelligence 

Microsoft's Tay AI is youthful beyond just its vaguely hip-sounding dialogue -- it's overly impressionable, too. The company has grounded its Twitter chat bot (that is, temporarily shut it down) after people taught it to repeat conspiracy theories, racist views and sexist remarks. We won't echo them here, but they involved 9/11, GamerGate, Hitler, Jews, Trump and less-than-respectful portrayals of President Obama. Yeah, it was that bad. The account is visible as we write this, but the offending tweets are gone; Tay has gone to "sleep" for now.
