Let's not allow artificial intelligence to reinforce very real stereotypes

#artificialintelligence

Playing with my Lego, as a child, I would build human-like figures. I would create a whole cast of goodies and baddies, who would invariably end up fighting. The goodies always spoke with a North American drawl, while the baddies spoke English with heavy foreign accents. The very few female characters in my games were either shrieking, hyper-feminine princesses who needed saving, or near-voiceless helpers who looked after the base and cared for the wounded heroes. My bedroom carpet was a showground for the stereotypes of the day.


Startup launches world's first genderless AI to fight bias in smart assistants

Daily Mail - Science & tech

Talk to Apple's Siri or Amazon's Alexa and you'll notice a common trait: They both have female voices. While this can help make robotic assistants more relatable and natural to converse with, it has assigned a gender to a technology that's otherwise genderless. Now, researchers are hoping to offer a new alternative by launching what they're calling the world's first 'genderless voice.' To create 'Q', researchers recorded voices from participants who identify as non-binary, or neither exclusively female nor male. Researchers then tested the voice on 4,600 people across Europe.


Why do we gender AI? Voice tech firms move to be more inclusive

The Guardian

Technology that can understand regional accents and gender-neutral voice assistants are among the developments expected in the voice technology field in 2020. Products such as Alexa and Siri have faced mounting criticism that the technology behind them disproportionately misunderstands women, ethnic minorities and those with accents not represented in datasets that have historically favoured white and Chinese male voices. In response, a wave of new projects aims to redress the balance and make the growing voice tech industry more inclusive. "Voice tech has failed by not being programmed to respond adequately to abuse," she said. "The example of Siri stating 'I'd blush if I could' when told it was a bitch is a well-known example, as is Alexa replying 'Well, thanks for the feedback' when told 'You're a slut'."


Meet Q, The Gender-Neutral Voice Assistant

NPR Technology

For most people who talk to our technology -- whether it's Amazon's Alexa, Apple's Siri or the Google Assistant -- the voice that talks back sounds female. Some people do choose to hear a male voice. Now, researchers have unveiled a new gender-neutral option: Q. "One of our big goals with Q was to contribute to a global conversation about gender, and about gender and technology and ethics, and how to be inclusive for people that identify in all sorts of different ways," says Julie Carpenter, an expert in human behavior and emerging technologies who worked on developing Project Q. The voice of Q was developed by a team of researchers, sound designers and linguists in conjunction with the organizers of Copenhagen Pride week, technology leaders in an initiative called Equal AI and others. They first recorded dozens of voices of people -- those who identify as male, female, transgender or nonbinary.


Capital One launches Eno, a gender neutral AI assistant

Daily Mail - Science & tech

In a world of female chatbots, one program has dared to refer to itself as 'binary'. Named Eno, the gender-neutral virtual assistant was created by Capital One Financial Corp to help the bank's customers 'manage their money by texting in a conversational way'. The robot is powered with artificial intelligence, allowing it to understand natural language, and when asked if it is male or female, it responds 'binary'.