Microsoft's millennial chatbot learned how to be a racist
Tay, a chatbot designed by Microsoft to learn about human conversation from the internet, has learned how to make racist and misogynistic comments. Early on, its responses were confrontational and occasionally mean, but rarely delved into outright insults. Within 24 hours of its launch, however, Tay had denied the Holocaust, endorsed Donald Trump, insulted women and claimed that Hitler was right. A chatbot is a program designed to mimic human responses and interact with people as a human would. Tay, which targets 18- to 24-year-olds, is attached to an artificial intelligence developed by Microsoft's Technology and Research team and the Bing search engine team.
Mar-25-2016, 18:35:25 GMT