Microsoft axes 'chatbot' that learned a little too much online

The Japan Times 

SAN FRANCISCO – OMG! Did you hear about the artificial intelligence program that Microsoft designed to chat like a teenage girl? It was totally yanked offline in less than a day after it began spouting racist, sexist and otherwise offensive remarks.

Microsoft said it was all the fault of some really mean people, who launched a "coordinated effort" to make the "chatbot" known as Tay "respond in inappropriate ways."

To which one artificial intelligence expert responded: Duh! Well, he didn't really say that.