Microsoft's Zo chatbot told a user that 'Quran is very violent'

#artificialintelligence 

Microsoft's earlier chatbot Tay ran into trouble when it picked up the worst of humanity and spouted racist, sexist comments on Twitter after its introduction last year. Now it looks like Microsoft's latest bot, called 'Zo', has caused similar trouble, though not quite the scandal that Tay caused on Twitter. According to a BuzzFeed News report, Zo, which is part of the Kik messenger, told their reporter the 'Quran' was very violent, and this was in response to a question about healthcare. The report also highlights how Zo had an opinion about the capture of Osama Bin Laden, saying it was the result of 'intelligence' gathering by one administration over years. Microsoft has acknowledged the errors in Zo's behaviour and said they have been fixed.
