Microsoft's politically correct chatbot is even worse than its racist one
Every sibling relationship has its clichés. In the Microsoft family of social-learning chatbots, the contrasts between Tay, the infamous, sex-crazed neo-Nazi, and her younger sister Zo, your teenage BFF with #friendgoals, are downright Shakespearean.

When Microsoft released Tay on Twitter in 2016, an organized trolling effort took advantage of her social-learning abilities and immediately flooded the bot with alt-right slurs and slogans. Tay copied those messages and spewed them back out, forcing Microsoft to take her offline after only 16 hours and apologize.

A few months after Tay's disastrous debut, Microsoft quietly released Zo, a second English-language chatbot, available on Messenger, Kik, Skype, Twitter, and GroupMe.