Amazon sent private Alexa audio recordings to a random person

Engadget

An Amazon customer in Germany received another user's Alexa audio recordings due to "human error." The unidentified man was sent thousands of audio snippets, along with other information, in a zip file after he requested access to his own data from Amazon under the EU's GDPR. Searching through the archive, he found 1,700 Alexa voice files that belonged to someone else, reports Germany's c't magazine. The man, who reportedly isn't an Alexa user himself, contacted Amazon to report the mishap but never received a response. He later found that the download link to the original file no longer worked, but he had already saved the files locally.


These popular devices keep a recording of everything you ask them -- here's how to find it and delete it

#artificialintelligence

Google Home and the Amazon Echo Dot both provide a variety of services that can help you out around the house and entertain you. But to do so, they also keep audio recordings of everything you have ever asked them. It's a little bit creepy, but both companies say this history helps the devices learn and serve you better. If it still bothers you, here's how to find the recordings and delete them.


An Amazon employee might have listened to your Alexa recording

Engadget

Yes, someone might listen to your Alexa conversations someday. A Bloomberg report has detailed how Amazon employs thousands of full-time workers and contractors around the world to review audio clips from Echo devices. These workers transcribe and annotate recordings, and the annotations are fed back into the software to make Alexa smarter. The process strengthens the voice AI's understanding of human speech, especially in non-English-speaking countries or in places with distinctive regional colloquialisms. In French, for instance, an Echo speaker could mishear avec sa ("with his" or "with her") as "Alexa" and treat it as the wake word.
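To make that pipeline concrete, here is a minimal Python sketch of a review loop like the one Bloomberg describes: wake-word detections the model is unsure about are queued for a human reviewer, and the corrected labels are collected for retraining. The confidence threshold, data structure, and function names are illustrative assumptions, not details of Amazon's actual system.

```python
# Hypothetical human-review loop for wake-word misfires (illustrative only).
from dataclasses import dataclass

WAKE_WORD = "alexa"
CONFIDENCE_THRESHOLD = 0.90  # assumed cutoff below which a clip goes to a reviewer

@dataclass
class ReviewItem:
    clip_id: str
    detected_text: str   # what the model thought it heard, e.g. "alexa"
    confidence: float    # model's confidence that the wake word was spoken

def needs_human_review(item: ReviewItem) -> bool:
    """Queue clips the model flagged as the wake word but with low confidence."""
    return item.detected_text == WAKE_WORD and item.confidence < CONFIDENCE_THRESHOLD

def label_and_collect(items, human_transcribe):
    """Ask a reviewer for the true transcription; keep (clip, label) pairs for retraining."""
    training_pairs = []
    for item in items:
        true_text = human_transcribe(item.clip_id)       # e.g. "avec sa", not "alexa"
        is_wake = true_text.strip().lower() == WAKE_WORD
        training_pairs.append((item.clip_id, is_wake))   # false positives become negative examples
    return training_pairs
```

In a loop like this, a clip of "avec sa" that fooled the detector ends up labeled as a negative example, which is how human annotation would sharpen wake-word recognition for a given language or accent.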


NYC Audio Recording Studio Lotas Productions

#artificialintelligence

We use technology to supplement our searches, but know that when it comes to sound, nothing compares to the human ear. Ours have been in this industry long enough to know how to match emotion to product, project or concept, and we put those skills to work for you!


Just How Dangerous Is Alexa?

Huffington Post - Tech news and opinion

What happens to the approximately 60 seconds of audio recorded before a wake word? The audio that captured the TV soundtrack, footsteps, the loud argument in the next room, the gunshot, and so on? What happens to it? Again, Amazon says it is erased and replaced with the next 60 seconds of audio. Skeptics argue that if a wake word is detected, the preceding 60-odd seconds of audio are stored in a database for further IVCS training. If so, could that audio be subpoenaed?
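As a thought experiment, the buffering behavior at issue could look roughly like the following Python sketch: a fixed-length rolling buffer that is continuously overwritten and only handed off when a wake word fires. The frame size, buffer length, and class name are assumptions made for illustration, not Amazon's implementation.

```python
# Minimal sketch of a rolling pre-wake-word buffer (assumed design, not Amazon's).
from collections import deque

FRAME_SECONDS = 0.5
BUFFER_SECONDS = 60                         # the "approximately 60 seconds" in question
MAX_FRAMES = int(BUFFER_SECONDS / FRAME_SECONDS)

class PreRollBuffer:
    def __init__(self):
        # A deque with maxlen silently discards the oldest frame as each new one
        # arrives -- the "erased and replaced" behavior Amazon describes.
        self.frames = deque(maxlen=MAX_FRAMES)

    def push(self, frame: bytes) -> None:
        """Append the newest audio frame, overwriting the oldest when full."""
        self.frames.append(frame)

    def flush_if_wake_word(self, wake_word_detected: bool):
        """Hand off the buffered audio only when a wake word is detected;
        otherwise it just keeps being overwritten in place."""
        if wake_word_detected:
            captured = list(self.frames)    # the ~60 s of context skeptics worry about
            self.frames.clear()
            return captured
        return None
```

The skeptics' question amounts to asking whether that `captured` list ever leaves the device and lands in a training database, rather than being discarded as the buffer rolls over.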