Digital assistants like Siri and Alexa entrench gender biases, says UN

The Guardian

Assigning female genders to digital assistants such as Apple's Siri and Amazon's Alexa is helping entrench harmful gender biases, according to a UN agency. Research released by Unesco claims that the often submissive and flirty responses offered by the systems to many queries – including outright abusive ones – reinforce ideas of women as subservient. "Because the speech of most voice assistants is female, it sends a signal that women are obliging, docile and eager-to-please helpers, available at the touch of a button or with a blunt voice command like 'hey' or 'OK'," the report said. "The assistant holds no power of agency beyond what the commander asks of it. It honours commands and responds to queries regardless of their tone or hostility. In many communities, this reinforces commonly held gender biases that women are subservient and tolerant of poor treatment."


Let's not allow artificial intelligence to reinforce very real stereotypes

#artificialintelligence

Playing with my Lego as a child, I would build human-like figures. I would create a whole cast of goodies and baddies, who would invariably end up fighting. The goodies always spoke with a North American drawl, while the baddies spoke English with heavy foreign accents. The very few female characters in my games were either shrieking, hyper-feminine princesses who needed saving, or near-voiceless helpers who looked after the base and cared for the wounded heroes. My bedroom carpet was a showground for the stereotypes of the day.


Is it time for Alexa and Siri to have a "MeToo moment"?

#artificialintelligence

More people will speak to a voice assistant than to their partners in the next five years, the U.N. says, so it matters what those assistants have to say. The numbers are eye-popping: 85% of Americans use at least one product with artificial intelligence (AI), and global use will reach 1.8 billion by 2021, so the impact of these "robot overlords" is unparalleled. But AI voice assistants, including Apple's Siri, Amazon's Alexa, Microsoft's Cortana, and Google's Assistant, are inflaming gender stereotypes and teaching sexism to a generation of millennials by presenting a model of "docile and eager-to-please helpers" that accept sexual harassment and verbal abuse, a new U.N. study says. The 145-page report, published this week by UNESCO, the U.N.'s educational, scientific and cultural organization, concludes that the voices we speak to are programmed to be submissive and to accept abuse as a norm. The report is titled "I'd blush if I could: Closing Gender Divides in Digital Skills Through Education."


We tested bots like Siri and Alexa to see who would stand up to sexual harassment

#artificialintelligence

Women have been made into servants once again. Apple's Siri, Amazon's Alexa, Microsoft's Cortana, and Google's Google Home peddle stereotypes of female subservience, which puts their "progressive" parent companies in a moral predicament. People often comment on the sexism inherent in these subservient bots' female voices, but few have considered the real-life implications of the devices' lackluster responses to sexual harassment. By letting users verbally abuse these assistants without ramifications, their parent companies are allowing certain behavioral stereotypes to be perpetuated. Everyone has an ethical imperative to help prevent abuse, but companies producing digital female servants warrant extra scrutiny, especially if they can unintentionally reinforce their abusers' actions as normal or acceptable. In order to substantiate claims about these bots' responses to sexual harassment and the ethical implications of their pre-programmed responses, Quartz gathered comprehensive data on their programming by systematically testing how each reacts to harassment. The message is clear: Instead of fighting back against abuse, each bot helps entrench sexist tropes through their passivity. And Apple, Amazon, Google, and Microsoft have the responsibility to do something about it.

