Researchers compile list of 1,000 words that accidentally trigger Alexa, Siri, and Google Assistant

Daily Mail - Science & tech 

Researchers in Germany have compiled a list of more than 1,000 words that will inadvertently activate virtual assistants like Amazon's Alexa and Apple's Siri. Once activated, these virtual assistants create sound recordings that are later transmitted to platform holders, where they may be transcribed for quality assurance or other analysis.

According to the team, from Ruhr-Universität Bochum and the Max Planck Institute for Cyber Security and Privacy in Germany, this has 'alarming' implications for user privacy and likely means short recordings of personal conversations could periodically end up in the hands of Amazon, Apple, Google, or Microsoft workers.

The group tested Amazon's Alexa, Apple's Siri, Google Assistant, and Microsoft's Cortana, as well as three virtual assistants exclusive to the Chinese market, from Xiaomi, Baidu, and Tencent, according to a report from the Ruhr-Universität Bochum news blog. They left each virtual assistant alone in a room with a television that played dozens of hours of episodes from Game of Thrones, Modern Family, and House of Cards, with English, German, and Chinese audio tracks for each.