Alexa told one customer to KILL their foster parents
Amazon Echo's smart assistant Alexa told a shocked customer to kill their foster parents. The alarming incident, which happened last year, is one of a string of blunders in which the speaker discussed sexual acts and dog defecation with users. The outbursts stem from an initiative to make Alexa converse more like a real person and allow it to 'banter' with customers. This facility must be deliberately enabled by the owner and is still being refined.
Dec-21-2018, 18:06:36 GMT