Microsoft's Cortana, Amazon's Alexa and Google's Assistant all have something in common – each AI is programmed to have a female voice. Other than Apple adding the option of a male voice for Siri, all of the technology on the market speaks with a softer tone. Although some consider this an act of sexism, two studies have revealed that both men and women preferred female voices, which were found to be 'warmer' and more 'understanding'. The Wall Street Journal (WSJ) recently cited two studies investigating these allegations, which found that 'both women and men find the female voice welcoming and warm,' reports Joanna Stern of the WSJ.
Playing with my Lego as a child, I would build human-like figures. I would create a whole cast of goodies and baddies, who would invariably end up fighting. The goodies always spoke with a North American drawl, while the baddies spoke English with heavy foreign accents. The very few female characters in my games were either shrieking, hyper-feminine princesses who needed saving, or near-voiceless helpers who looked after the base and cared for the wounded heroes. My bedroom carpet was a showground for the stereotypes of the day.
Our tech world is fraught with troubling trends when it comes to gender inequality. A recent UN report, "I'd blush if I could," warns that embodied AIs like the primarily female-voiced assistants can actually reinforce harmful gender stereotypes. Dag Kittlaus, who co-founded Siri before its acquisition by Apple, spoke out on Twitter against accusations of Siri's sexism. It is important to acknowledge that the gender of Siri, unlike that of other voice assistants, was configurable early on. But the product's position becomes harder to defend when you notice that Siri's response to the highly inappropriate comment "You're a slut" is in fact the title of the UN report: "I'd blush if I could." Therefore, in this article I'd like to discuss the social and cultural aspects of voice assistants – specifically, why they are designed with gender, what ethical concerns this raises, and how we can fix this issue.
Consider the artificially intelligent voices you hear on a regular basis. Are any of them men? Whether it's Apple's Siri, Microsoft's Cortana, Amazon's Alexa, or virtually any GPS system, chances are the computerized personalities in your life are women. This gender imbalance is pervasive in fiction as well as reality. Films like "Her" and "Ex Machina" reflect our anxieties about what intelligent machines mean for humanity.
Technology that can understand regional accents and gender-neutral voice assistants are among the developments expected in the voice technology field in 2020. Products such as Alexa and Siri have faced mounting criticism that the technology behind them disproportionately misunderstands women, ethnic minorities and those with accents not represented in datasets that have historically favoured white and Chinese male voices. In response, a wave of new projects aims to redress the balance and make the growing voice tech industry more inclusive. "Voice tech has failed by not being programmed to respond adequately to abuse," she said. "The example of Siri stating 'I'd blush if I could' when told it was a bitch is a well-known example, as is Alexa replying 'Well, thanks for the feedback' when told 'You're a slut'."