Boot up the options for your digital voice assistant of choice and you're likely to find two choices for the gender you prefer interacting with: male or female. The problem is that this binary choice isn't an accurate representation of the complexities of gender. Some people don't identify as either male or female, and they may want their voice assistant to mirror that identity. A group of linguists, technologists, and sound designers, led by Copenhagen Pride and Vice's creative agency Virtue, is on a quest to change that with a new, genderless digital voice, made from real voices, called Q. Q isn't going to show up in your smartphone tomorrow, but the idea is to pressure the tech industry into acknowledging that gender isn't necessarily binary, a matter of man or woman, masculine or feminine. The project is confronting a new digital universe fraught with problems.
Talk to Apple's Siri or Amazon's Alexa and you'll notice a common trait: They both have female voices. While this can help make robotic assistants more relatable and natural to converse with, it has assigned a gender to a technology that's otherwise genderless. Now, researchers are hoping to offer a new alternative by launching what they're calling the world's first "genderless voice." To create "Q," researchers recorded voices from participants who identify as non-binary, or neither exclusively female nor male. Researchers then tested the voice on 4,600 people across Europe.
Technology that can understand regional accents and gender-neutral voice assistants are among the developments expected in the voice technology field in 2020. Products such as Alexa and Siri have faced mounting criticism that the technology behind them disproportionately misunderstands women, ethnic minorities and those with accents not represented in datasets that have historically favoured white and Chinese male voices. In response, a wave of new projects aims to redress the balance and make the growing voice tech industry more inclusive. "Voice tech has failed by not being programmed to respond adequately to abuse," one critic said. "The example of Siri stating 'I'd blush if I could' when told it was a bitch is a well-known example, as is Alexa replying 'Well, thanks for the feedback' when told 'You're a slut'."
In a world of female chatbots, one program has dared to refer to itself as "binary." Named Eno, the gender-neutral virtual assistant was created by Capital One Financial Corp to help the bank's customers "manage their money by texting in a conversational way." The robot is powered by artificial intelligence, allowing it to understand natural language, and when asked whether it is male or female, it responds "binary."
Artificial intelligence voice assistants with female voices reinforce existing gender biases, according to a new United Nations report. The report from UNESCO, entitled "I'd Blush If I Could," looks at the impact of having female voice assistants, from Amazon's Alexa to Apple's Siri, projected in a way that suggests that women are "subservient and tolerant of poor treatment." The report takes its title from the response Siri used to give when a human told her, "Hey Siri, you're a b-tch." Further, researchers argue that tech companies have failed to take protective measures against abusive or gendered language from users. "Because the speech of most voice assistants is female, it sends a signal that women are obliging, docile and eager-to-please helpers, available at the touch of a button or with a blunt voice command like 'hey' or 'OK,'" the researchers write.