Alexa, are you making me sexist?

#artificialintelligence

The other day I spent 10 minutes hurling verbal abuse at Siri. Cringing as I spoke, I said into my phone: "Siri, you're ugly." I said, "Siri, you're fat." She replied, "It must be all the chocolate." I felt mortified for both of us.


AI Voice Assistants Reinforce Gender Biases, U.N. Report Says

TIME - Tech

Artificial intelligence voice assistants with female voices reinforce existing gender biases, according to a new United Nations' report. The new report from UNESCO, entitled "I'd Blush If I Could," looks at the impact of having female voice assistants, from Amazon's Alexa to Apple's Siri, projected in a way that suggests that women are "subservient and tolerant of poor treatment." The report takes its title from the response Siri used to give when a human told her, "Hey Siri, you're a b-tch." Further, researchers argue that tech companies have failed to take protective measures against abusive or gendered language from users. "Because the speech of most voice assistants is female, it sends a signal that women are obliging, docile and eager-to-please helpers, available at the touch of a button or with a blunt voice command like'hey' or'OK,'" the researchers write.


Is it time for Alexa and Siri to have a "MeToo moment"?

#artificialintelligence

More people will speak to a voice assistance machine than to their partners in the next five years, the U.N. says, so it matters what they have to say. The numbers are eye-popping: 85% of Americans use at least one product with artificial intelligence (AI), and global use will reach 1.8 billion by 2021, so the impact of these "robot overlords" is unparalleled. But (AI) voice assistants, including Apple's Siri, Amazon's Alexa, Microsoft's Cortana, and Google's Assistant are inflaming gender stereotypes and teaching sexism to a generation of millennials by creating a model of "docile and eager-to-please helpers," with acceptance of sexual harassment and verbal abuse, a new U.N. study says. A 145-page U.N. report published this week by the educational, scientific and cultural organization UNESCO concludes that the voices we speak to are programmed to be submissive and accept abuse as a norm. The report is titled, "I'd blush if I could: Closing Gender Divides in Digital Skills Through Education."


Let's not allow artificial intelligence to reinforce very real stereotypes

#artificialintelligence

Playing with my Lego, as a child, I would build human-like figures. I would create a whole cast of goodies and baddies, who would invariably end up fighting. The goodies always spoke with a North American drawl, while the baddies spoke English with heavy foreign accents. The very few female characters in my games were either shrieking, hyper-feminine princesses who needed saving, or near-voiceless helpers who looked after the base and cared for the wounded heroes. My bedroom carpet was a showground for the stereotypes of the day.


We tested bots like Siri and Alexa to see who would stand up to sexual harassment

#artificialintelligence

Women have been made into servants once again. Apple's Siri, Amazon's Alexa, Microsoft's Cortana, and Google's Google Home peddle stereotypes of female subservience--which puts their "progressive" parent companies in a moral predicament. People often comment on the sexism inherent in these subservient bots' female voices, but few have considered the real-life implications of the devices' lackluster responses to sexual harassment. By letting users verbally abuse these assistants without ramifications, their parent companies are allowing certain behavioral stereotypes to be perpetuated. Everyone has an ethical imperative to help prevent abuse, but companies producing digital female servants warrant extra scrutiny, especially if they can unintentionally reinforce their abusers' actions as normal or acceptable. In order to substantiate claims about these bots' responses to sexual harassment and the ethical implications of their pre-programmed responses, Quartz gathered comprehensive data on their programming by systematically testing how each reacts to harassment. The message is clear: Instead of fighting back against abuse, each bot helps entrench sexist tropes through their passivity. And Apple, Amazon, Google, and Microsoft have the responsibility to do something about it.