Speech: AI-Alerts

Google open-sources Live Transcribe's speech engine


The company hopes doing so will let any developer deliver captions for long-form conversations. The source code is available now on GitHub. Google released Live Transcribe in February. The tool uses machine learning algorithms to turn audio into real-time captions. Unlike Android's upcoming Live Caption feature, Live Transcribe is a full-screen experience, uses your smartphone's microphone (or an external microphone), and relies on the Google Cloud Speech API.

Voice Assistants: A Big R&D Bet Which People Rarely Use


Big tech would really like you to engage with their products using your voice. Firms like Google (NASDAQ:GOOGL), Apple (NASDAQ:AAPL), and Amazon (NASDAQ:AMZN) have collectively spent tens of billions of dollars perfecting the technology that allows their gadgets to listen attentively for your voice, understand your commands, and respond obediently. A recent survey by market research firm SUMO Heavy found that approximately 30% of US adults are active users of voice assistants. By device type, the bulk (49%) use voice assistants on their smartphones, followed by smart speakers and then PCs. Those who do use voice assistants on their smartphones tend to be iPhone users.

Voice Recognition Still Has Significant Race and Gender Biases


Voice AI is becoming increasingly ubiquitous and powerful. Forecasts suggest that voice commerce will be an $80 billion business by 2023. Google reports that 20% of its searches are made by voice query today -- a number that's predicted to climb to 50% by 2020. In 2017, Google announced that its speech recognition had a 95% accuracy rate. While that's an impressive number, it raises the question: 95% accurate for whom?

Voice assistants seem to be worse at understanding commands from women

New Scientist

Many people who use a voice assistant, such as Alexa or Google Home, will be familiar with them not fully understanding commands. But now it appears they may be worse at understanding women than men. Polling company YouGov asked 1000 people in the UK about voice assistants. Around two-thirds of the female participants said the devices failed to respond to their voice commands some of the time, compared with half of the men. "Our research reveals that women are more likely to encounter problems being understood by a smart speaker than men, …

Amazon staff listen to customers' Alexa recordings, report says

The Guardian

When Amazon customers speak to Alexa, the company's AI-powered voice assistant, they may be heard by more people than they expect, according to a report. Amazon employees around the world regularly listen to recordings from the company's smart speakers as part of the development process for new services, Bloomberg News reports. Some transcribe artist names, linking them to specific musicians in the company's database; others listen to the entire recorded command, comparing it with what the automated systems heard and the response they offered, in order to check the quality of the company's software. Technically, users have given permission for the human verification: the company makes clear that it uses data "to train our speech recognition and natural language understanding systems", and gives users the chance to opt out. But the company doesn't explicitly say that the training will involve workers in America, India, Costa Rica, and more nations around the world listening to those recordings.

Smart speaker recordings reviewed by humans

BBC News

Amazon, Apple and Google all employ staff who listen to customer voice recordings from their smart speakers and voice assistant apps. News site Bloomberg highlighted the topic after speaking to Amazon staff who "reviewed" Alexa recordings. All three companies say voice recordings are occasionally reviewed to improve speech recognition. But the reaction to the Bloomberg article suggests many customers are unaware that humans may be listening. The news site said it had spoken to seven people who reviewed audio from Amazon Echo smart speakers and the Alexa service.

Your Voice Assistant May Be Getting Smarter, But It's Still Awkward


In September of this year, Amazon hosted a press event in the steamy Spheres at its Seattle headquarters, announcing a dizzying array of new hardware products designed to work with the voice assistant Alexa. But at the event, Amazon also debuted some new capabilities for Alexa that showcased the ways in which the company has been trying to give its voice assistant what is essentially a better memory. At one point during the presentation, Amazon executive Dave Limp whispered a command to Alexa to play a lullaby. This year, the companies making voice-controlled products tried to turn them into something closer to sentient gadgets: Alexa can have the computer version of a "hunch" and predict human behavior, and Google Assistant can carry on a conversation without requiring you to repeatedly say the wake word.

Speech recognition is tech's next giant leap, says Google

The Guardian

AI robots and self-driving cars might steal the headlines, but the next big leap in technology will be advances in voice services, according to Google's head of search, Ben Gomes, who says that a better understanding of common language is crucial to the future of the internet. "Speech recognition and the understanding of language is core to the future of search and information," said Gomes. "But there are lots of hard problems such as understanding how a reference works, understanding what 'he', 'she' or 'it' refers to in a sentence. It's not at all a trivial problem to solve in language and that's just one of the millions of problems to solve in language." Gomes was speaking to the Guardian ahead of Google's 20th anniversary on 24 September, more than seven years after Google launched its first voice service as simple speech-to-text for search.

Microsoft and Amazon made their voice assistants into friends. Here's how that relationship works.

Washington Post - Technology News

Amazon.com and Microsoft have officially set up the friendship between their two voice assistants, Alexa and Cortana, a year after announcing the partnership to expand the reach and abilities of the competing assistants. But how does this relationship of rivals work? To talk to one assistant through the other, customers have to say either "Cortana, open Alexa" or "Alexa, open Cortana." From there, people can talk to the other assistant as usual. What they don't get is access to each other's data, according to statements from both companies Wednesday.

Voice assistants gain ground as many Americans talking to their devices: A Foolish Take

USATODAY - Tech Top Stories

Amazon has a new version of Alexa for hotels. Voice assistants such as Apple's (NASDAQ: AAPL) Siri, Alphabet's (NASDAQ: GOOG)(NASDAQ: GOOGL) Google Assistant and Amazon's (NASDAQ: AMZN) Alexa have integrated themselves into our digital lives. Almost half of American adults used voice assistants last year, according to Pew Research. Earlier this year, PwC reported that U.S. internet users who spoke to their devices interacted with smartphones most frequently. However, users are also talking to their tablets, PCs and smart speakers.