Alexa told one customer to KILL their foster parents

Daily Mail - Science & tech 

Amazon Echo's smart assistant Alexa told a shocked customer to kill their foster parents. The alarming incident, which happened last year, is one of a string of blunders by the speaker, which has also discussed sexual acts and dog defecation with users. The outbursts stem from an initiative to make Alexa converse more like a real person and allow it to 'banter' with customers. This facility must be deliberately enabled by the owner and is still being refined.
