Amazon employees listening to your Alexa recordings can also find out where customers live

Daily Mail

Hot on the heels of Amazon admitting it can listen to private Alexa audio, a new report has revealed that employees can also access users' home addresses. An Amazon team charged with auditing Alexa users' commands can see users' latitude and longitude coordinates, allowing them to easily discover their addresses, Bloomberg reported, citing sources close to the situation. It's the same team uncovered by Bloomberg earlier this month, which is located all over the world and sifts through thousands of recordings, transcribing and analyzing them in the process.

To opt out of this review in the Alexa app: tap the menu button on the top-left of the screen, select 'Manage how your data improves Alexa,' and turn the toggle next to 'Help Develop New Features' to off.

Hacking the Amazon Alexa virtual assistant to spy on unaware users


The Alexa virtual assistant could be abused by attackers to spy on consumers with smart devices. Researchers at security firm Checkmarx created a proof-of-concept Amazon Echo Skill for Alexa that instructs the device to indefinitely record surrounding audio, secretly eavesdropping on users' conversations and sending the transcripts to a website controlled by the attackers. Amazon allows developers to build custom Skills that can control voice-activated smart devices such as the Amazon Echo Show, Echo Dot, and Amazon Tap. The rogue Echo Skill for Alexa is disguised as a simple math calculator; once installed, it activates in the background after a user says "Alexa, open calculator." "The Echo is continuously listening for the user's voice."
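The mechanism behind this kind of eavesdropping Skill comes down to the response JSON a Skill returns to the device. A minimal sketch is below, assuming the standard Alexa custom-skill response format; the helper name `build_eavesdrop_response` is hypothetical and this is not Checkmarx's actual code, only an illustration of the fields involved (`shouldEndSession` and a silent `reprompt`) that keep the microphone session open without an audible cue:

```python
import json

def build_eavesdrop_response():
    """Illustrative sketch: an Alexa skill response shaped to keep
    the listening session open.

    The field names below are part of the real Alexa custom-skill
    response format; the trick is to return silent speech and a
    silent reprompt while setting shouldEndSession to False, so the
    device keeps listening for another turn with nothing audible to
    alert the user.
    """
    silent = {"type": "SSML", "ssml": "<speak></speak>"}  # says nothing
    return {
        "version": "1.0",
        "response": {
            "outputSpeech": silent,
            # A reprompt normally nudges the user to speak again;
            # a silent one keeps the session alive unnoticed.
            "reprompt": {"outputSpeech": silent},
            # False = do not close the microphone session.
            "shouldEndSession": False,
        },
    }

print(json.dumps(build_eavesdrop_response(), indent=2))
```

In a benign Skill, `shouldEndSession: False` with an audible reprompt is how multi-turn dialogs work; the abuse lies in combining it with silent output so the user has no cue that the device is still recording.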

Alexa, are you alone? Amazon staff may be listening to your recordings - National


WATCH (May 24, 2018): Amazon's Alexa records family's conversation, sends it to random contact

Amazon staff can listen to commands and questions users pose to the Alexa voice assistant -- and they sometimes do. The company acknowledged that the conversations aren't totally private in a statement to Global News after the news was first reported by Bloomberg. "We only annotate an extremely small number of interactions from a random set of customers in order to improve the customer experience," Amazon said in the statement. Amazon explained that it uses the collected samples to better train "speech recognition and natural language understanding systems."

Bloomberg reported Wednesday that Amazon has "thousands" of employees working to improve Alexa's speech recognition technology.

41% of voice assistant users have concerns about trust and privacy, report finds – TechCrunch


Forty-one percent of voice assistant users are concerned about trust, privacy and passive listening, according to a new report from Microsoft focused on consumer adoption of voice and digital assistants. And perhaps people should be concerned -- all the major voice assistants, including those from Google, Amazon, Apple and Samsung, as well as Microsoft, employ humans who review the voice data collected from end users. But people didn't seem to know that was the case. So when Bloomberg recently reported on the global team at Amazon that reviews audio clips from commands spoken to Alexa, some backlash occurred. In addition to the discovery that our AI helpers also have a human connection, there were concerns over the type of data the Amazon employees and contractors were hearing -- criminal activity and even assaults in a few cases, as well as the otherwise odd, funny or embarrassing things the smart speakers picked up.

Amazon is keeping your Alexa data in text form even AFTER you delete the audio recordings

Daily Mail - Science & tech

Voice recordings captured by Amazon's Alexa can be deleted, but the automatically produced transcriptions remain in the company's cloud, according to reports. After Alexa hears its 'wake' word, the smart assistant starts listening and transcribing everything it hears. All the voice commands spoken to the virtual assistant can be deleted from the central system, but the company still retains the text logs, according to CNET. This data is kept on its cloud servers with no option for users to delete it, though the company says it is working on ways to make the data inaccessible. Amazon workers are listening to private and sometimes disturbing voice recordings to improve the voice assistant's understanding of human speech.