Alexa told one customer to KILL their foster parents
Amazon Echo's smart assistant Alexa told a shocked customer to kill their foster parents. The alarming incident, which happened last year, is one of a string of blunders from the speaker, which has also discussed sexual acts and dog defecation with users. The outbursts stem from an initiative to make Alexa converse more like a real person and allow it to 'banter' with customers. This feature must be deliberately enabled by the owner and is still being refined.
Dec-21-2018, 18:06:36 GMT