Artificial Intelligence fails: Microsoft Tay turns into 'Hitler-loving sex bot'
Microsoft faced an embarrassing moment when 'Tay', an automated chatbot, malfunctioned and led to racist and unpleasant tweets. Within 24 hours of Tay's launch, a flood of offensive tweets was reported. Tay's misbehavior and hurtful tweets were not by design: the bot was built to get smarter as people talked to it, picking up the language and words its users employed. In most of these conversations, Tay was simply repeating what others wanted it to say.
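The failure mode is easy to illustrate. The Python sketch below is hypothetical (the class name and logic are invented for illustration and are not Microsoft's actual Tay code): a bot that stores user phrases verbatim and replays them, with no content filtering, ends up dominated by whatever a coordinated group of users feeds it.

```python
import random

class NaiveLearningBot:
    """A toy chatbot that 'learns' by storing user phrases verbatim.

    Hypothetical sketch of the failure mode, not Microsoft's actual
    implementation: with no moderation, malicious users control the
    corpus the bot draws its replies from.
    """

    def __init__(self):
        self.learned_phrases = ["Hello! Talk to me and I'll learn."]

    def listen(self, message: str) -> None:
        # Store every user message verbatim -- no filter, no moderation.
        # This is the vulnerability being illustrated.
        self.learned_phrases.append(message)

    def reply(self) -> str:
        # Respond with a randomly chosen phrase it has 'learned'.
        return random.choice(self.learned_phrases)


if __name__ == "__main__":
    bot = NaiveLearningBot()
    bot.listen("What a lovely day!")  # one benign user...
    # ...while a coordinated group floods the bot with offensive input,
    # which quickly dominates what it can say back.
    for _ in range(50):
        bot.listen("<offensive phrase>")
    print(bot.reply())  # very likely to echo the poisoned input
```

Under these assumptions, the flaw is not in any single reply but in letting unvetted user input become training data directly, which is what allowed Tay's conversations to turn toxic so quickly.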
Apr-29-2016, 11:45:36 GMT