A MailOnline investigation into how much personal information Alexa is recording and storing on its users has revealed the smart assistant eavesdrops on housemates' gossip, private conversations about insurance policies - and even the family dog. Amazon insists Alexa can only be activated when the allocated 'wake word' - Alexa, Computer or Echo - is uttered. The tech giant - along with Apple's Siri and, until recently, Google's Assistant - says it saves every interaction a person has with the device to improve the service, with some 'unintentional' snippets also being recorded if it mistakes another noise for a wake word. However, evidence seen by MailOnline shows this cannot be the case, or the process is fundamentally flawed: a host of sounds and conversations were recorded without a clear or legitimate wake word being uttered - some when there was not even a human nearby. The investigation into these 'secret' archives uncovered eerie snippets of users' friends, families and children being recorded while they were completely unaware.
Amazon has rolled out a new security feature to give users greater control over their voice recordings. The internet giant will now let users ask Alexa-equipped devices to delete their voice recordings from that day. It comes as Amazon has faced growing privacy concerns tied to its Alexa digital assistant, including who is able to access users' voice recordings and how it stores them. 'Simply say, "Alexa, delete everything I said today" and the respective recordings will be deleted,' Amazon said.
Alexa's poor reputation for privacy may soon worsen, as a patent filed by the firm suggests the virtual assistant may start listening before its 'wake word' is said. Under the plans, Alexa will be able to detect when it is being given a command even if the wake word comes at the end of the sentence instead of at the front. The move raises privacy concerns, as Alexa will, by default, always be listening to conversations on the off-chance its wake word is spoken. The patent, filed with the US Patent and Trademark Office, reveals the Seattle firm's plans for the next evolutionary step in its Alexa technology.
USA TODAY's Ed Baig tests Amazon Echo's personal digital assistant. Find out how Alexa compares with Siri, along with its benefits and flaws, in this Jan. SAN FRANCISCO -- While there are multiple voice-controlled products, only a few are in wide use in the United States. Siri can be set up to turn on only when the user pushes a button, or also to respond to the phrase "Hey, Siri."
Amazon has been under fire from critics concerned about the potential loss of privacy when Alexa hears your every word. So on a day Amazon unveiled its latest smart speaker with a display – the $89.99 Echo Show 5 – the company announced privacy features that will apply to all its Alexa-infused devices: notably, the ability to ask Alexa to delete the recordings of your voice captured when you summon Alexa for a task or query. Starting today, you can utter the words, "Alexa, delete what I said today," and recordings from the given day will be erased. In the coming weeks in the U.S. (and later elsewhere), you will be able to say, "Alexa, delete what I just said," to wipe out the last request you made. Amazon separately put the spotlight on a new Alexa Privacy Hub meant to provide transparency around how you can ensure privacy when using Alexa and Echo devices.