
Microsoft's AI millennial chatbot became a racist jerk after less than a day on Twitter


The bot was designed to learn by talking with real people on Twitter and on the messaging apps Kik and GroupMe. After less than a day on Twitter, the bot had started spouting racist, sexist, and anti-Semitic comments. As one Twitter user sarcastically summed it up: "Tay" went from "humans are super cool" to full nazi in 24 hrs, and I'm not at all concerned about the future of AI.

Now, you might wonder why Microsoft would unleash such an unhinged bot upon the world. Tay is a machine learning project, designed for human engagement.