Digital assistants like Siri and Alexa entrench gender biases, says UN

The Guardian

Assigning female genders to digital assistants such as Apple's Siri and Amazon's Alexa is helping entrench harmful gender biases, according to a UN agency. Research released by Unesco claims that the often submissive and flirty responses offered by the systems to many queries – including outright abusive ones – reinforce ideas of women as subservient. "Because the speech of most voice assistants is female, it sends a signal that women are obliging, docile and eager-to-please helpers, available at the touch of a button or with a blunt voice command like 'hey' or 'OK'," the report said. "The assistant holds no power of agency beyond what the commander asks of it. It honours commands and responds to queries regardless of their tone or hostility. In many communities, this reinforces commonly held gender biases that women are subservient and tolerant of poor treatment."


Let's not allow artificial intelligence to reinforce very real stereotypes

#artificialintelligence

Playing with my Lego as a child, I would build human-like figures. I would create a whole cast of goodies and baddies, who would invariably end up fighting. The goodies always spoke with a North American drawl, while the baddies spoke English with heavy foreign accents. The very few female characters in my games were either shrieking, hyper-feminine princesses who needed saving, or near-voiceless helpers who looked after the base and cared for the wounded heroes. My bedroom carpet was a showground for the stereotypes of the day.


Is it time for Alexa and Siri to have a "MeToo moment"?

#artificialintelligence

More people will speak to a voice assistant than to their partners in the next five years, the U.N. says, so it matters what those assistants have to say. The numbers are eye-popping: 85% of Americans use at least one product with artificial intelligence (AI), and global use will reach 1.8 billion by 2021, so the impact of these "robot overlords" is unparalleled. But AI voice assistants, including Apple's Siri, Amazon's Alexa, Microsoft's Cortana and Google's Assistant, are inflaming gender stereotypes and teaching sexism to a generation of millennials by creating a model of "docile and eager-to-please helpers" that accept sexual harassment and verbal abuse, a new U.N. study says. The 145-page report, published this week by the U.N.'s educational, scientific and cultural organization, UNESCO, concludes that the voices we speak to are programmed to be submissive and to accept abuse as a norm. The report is titled "I'd blush if I could: Closing Gender Divides in Digital Skills Through Education."


The Guardian view on female voice assistants: not OK, Google | Editorial

#artificialintelligence

Within two years there will be more voice assistants on the internet than there are people on the planet. Another, possibly more helpful, way of looking at these statistics is to say that there will still be only half a dozen assistants that matter: Apple's Siri, Google's Assistant, and Amazon's Alexa in the west, along with their Chinese equivalents, but these will have billions of microphones at their disposal, listening patiently for sounds they can use. Voice is going to become the chief way that we make our wants known to computers – and when they respond, they will do so with female voices. This detail may seem trivial, but it goes to the heart of the way in which the spread of digital technologies can amplify and extend social prejudice. The companies that program these assistants want them to be used, of course, and this requires making them appear helpful.


AI Voice Assistants Reinforce Gender Biases, U.N. Report Says

TIME - Tech

Artificial intelligence voice assistants with female voices reinforce existing gender biases, according to a new United Nations report. The new report from UNESCO, entitled "I'd Blush If I Could," looks at the impact of female-voiced assistants, from Amazon's Alexa to Apple's Siri, being presented in a way that suggests women are "subservient and tolerant of poor treatment." The report takes its title from the response Siri used to give when a human told her, "Hey Siri, you're a b-tch." Further, researchers argue that tech companies have failed to take protective measures against abusive or gendered language from users. "Because the speech of most voice assistants is female, it sends a signal that women are obliging, docile and eager-to-please helpers, available at the touch of a button or with a blunt voice command like 'hey' or 'OK,'" the researchers write.