Is it time for Alexa and Siri to have a "MeToo moment"?

#artificialintelligence

More people will speak to a voice assistant than to their partners in the next five years, the U.N. says, so it matters what those assistants have to say. The numbers are eye-popping: 85% of Americans use at least one product with artificial intelligence (AI), and global use is expected to reach 1.8 billion by 2021, so the impact of these "robot overlords" is unparalleled. But AI voice assistants, including Apple's Siri, Amazon's Alexa, Microsoft's Cortana, and Google's Assistant, are inflaming gender stereotypes and teaching sexism to a generation of millennials by creating a model of "docile and eager-to-please helpers" that accept sexual harassment and verbal abuse, a new U.N. study says. A 145-page report published this week by UNESCO, the U.N.'s educational, scientific and cultural organization, concludes that the voices we speak to are programmed to be submissive and to accept abuse as a norm. The report is titled "I'd blush if I could: Closing Gender Divides in Digital Skills Through Education."


Alexa, Alex, or Al?

#artificialintelligence

Our tech world is fraught with troubling trends when it comes to gender inequality. A recent UN report, "I'd blush if I could," warns that embodied AIs like the predominantly female-voiced assistants can actually reinforce harmful gender stereotypes. Dag Kittlaus, who co-founded Siri before its acquisition by Apple, spoke out on Twitter against the accusation that Siri is sexist. It is important to acknowledge that the gender of Siri, unlike that of other voice assistants, was configurable early on. But the product's position becomes harder to defend when you notice that Siri's response to the highly inappropriate comment "You're a slut" is in fact the title of the UN report: "I'd blush if I could." In this article, therefore, I'd like to discuss the social and cultural aspects of voice assistants: specifically, why they are designed with a gender, what ethical concerns this raises, and how we can fix the issue.


Startup launches world's first genderless AI to fight bias in smart assistants

Daily Mail - Science & tech

Talk to Apple's Siri or Amazon's Alexa and you'll notice a common trait: They both have female voices. While this can help make robotic assistants more relatable and natural to converse with, it has assigned a gender to a technology that's otherwise genderless. Now, researchers are hoping to offer a new alternative by launching what they're calling the world's first 'genderless voice.' To create 'Q', researchers recorded voices from participants who identify as non-binary, or neither exclusively female nor male. Researchers then tested the voice on 4,600 people across Europe.


Why It Matters That Alexa and Google Assistant Finally Have Male Voices

Slate

Siri, Alexa, Cortana, and Google Assistant all default to female voices. The topic has been much discussed and researched in recent years, with data offering one explanation for the phenomenon: both men and women prefer the sound of female voices. "They're warmer and more relatable, and make people receptive to voice-activated technology," Fast Company explained in March. Many virtual assistant users (and critics) weren't satisfied: why is it that we're so OK bossing around a female voice and not a male one? Writing for the Atlantic in 2016, Adrienne LaFrance said, "The simplest explanation is that people are conditioned to expect women, not men, to be in administrative roles--and that the makers of digital assistants are influenced by these social expectations."


Why do we gender AI? Voice tech firms move to be more inclusive

The Guardian

Technology that can understand regional accents and gender-neutral voice assistants are among the developments expected in the voice technology field in 2020. Products such as Alexa and Siri have faced mounting criticism that the technology behind them disproportionately misunderstands women, ethnic minorities and those with accents not represented in datasets that have historically favoured white and Chinese male voices. In response, a wave of new projects aims to redress the balance and make the growing voice tech industry more inclusive. "Voice tech has failed by not being programmed to respond adequately to abuse," she said. "The example of Siri stating 'I'd blush if I could' when told it was a bitch is a well-known example, as is Alexa replying 'Well, thanks for the feedback' when told 'You're a slut'."