The Genderless Digital Voice the World Needs Right Now

WIRED

Boot up your digital voice assistant of choice and you're likely to find two options for the gender you prefer interacting with: male or female. The problem is, that binary choice isn't an accurate representation of the complexities of gender. Some folks don't identify as either male or female, and they may want their voice assistant to mirror that identity. But a group of linguists, technologists, and sound designers--led by Copenhagen Pride and Vice's creative agency Virtue--are on a quest to change that with a new, genderless digital voice, made from real voices, called Q. Q isn't going to show up in your smartphone tomorrow, but the idea is to pressure the tech industry into acknowledging that gender isn't necessarily binary, a matter of man or woman, masculine or feminine. The project is confronting a new digital universe fraught with problems.


Startup launches world's first genderless AI to fight bias in smart assistants

Daily Mail

Talk to Apple's Siri or Amazon's Alexa and you'll notice a common trait: They both have female voices. While this can help make robotic assistants more relatable and natural to converse with, it has assigned a gender to a technology that's otherwise genderless. Now, researchers are hoping to offer a new alternative by launching what they're calling the world's first 'genderless voice.' To create 'Q', researchers recorded voices from participants who identify as non-binary, or neither exclusively female nor male. Researchers then tested the voice on 4,600 people across Europe.


Capital One launches Eno, a gender neutral AI assistant

Daily Mail

In a world of female chatbots, one program has dared to refer to itself as 'binary'. Named Eno, the gender-neutral virtual assistant was created by Capital One Financial Corp to help the bank's customers 'manage their money by texting in a conversational way'. The robot is powered by artificial intelligence, allowing it to understand natural language, and when asked if it is male or female, it responds 'binary'.


Google Assistant is gender-neutral(ish), but it's not feminist

#artificialintelligence

In a world occupied by Siri, Cortana and Alexa, Google Assistant is a bit of an anomaly. It's the first widely used voice assistant to eschew a female name, which the company reportedly did to avoid giving it a personality. The company would rather you imagined yourself talking directly to "Google the search engine" than a go-between. Avoiding a gendered name just happened to be a happy coincidence, it seems. Despite Google (perhaps unintentionally) shunning obvious sexism in its AI, it still fell into the gender bias trap by giving Assistant a female voice.


"Siri, Cortana, Alexa, Marcus. Do bots really need a gender?"

#artificialintelligence

They are in fact such a persistent feature in society that it is not surprising to find these stereotypes -- gender, in particular -- still perpetuated in the artificial intelligence bots that have been produced over the years. These gender stereotypes are played out in the roles that these bots are assigned to perform in an industry and in their overall personalities. Female bots typically perform more administrative and secretarial roles, such as assisting in the completion of routine tasks, scheduling meetings, and customer service. Male bots, on the other hand, often perform more analytical roles, like providing financial advice and paralegal services. Recently, a growing group of companies has started to buck this trend by choosing to create gender-neutral bots instead, sparking discussions in the tech industry on the necessity and consequences of assigning gender (and with that, stereotypical traits) to bots in the first place.