Alexa, are you making me sexist?

#artificialintelligence

The other day I spent 10 minutes hurling verbal abuse at Siri. Cringing as I spoke, I said into my phone: "Siri, you're ugly." I said, "Siri, you're fat." She replied, "It must be all the chocolate." I felt mortified for both of us.


AI Voice Assistants Reinforce Gender Biases, U.N. Report Says

TIME - Tech

Artificial intelligence voice assistants with female voices reinforce existing gender biases, according to a new United Nations report. The report from UNESCO, entitled "I'd Blush If I Could," looks at the impact of female voice assistants, from Amazon's Alexa to Apple's Siri, being projected in a way that suggests women are "subservient and tolerant of poor treatment." The report takes its title from the response Siri used to give when a human told her, "Hey Siri, you're a b-tch." Further, researchers argue that tech companies have failed to take protective measures against abusive or gendered language from users. "Because the speech of most voice assistants is female, it sends a signal that women are obliging, docile and eager-to-please helpers, available at the touch of a button or with a blunt voice command like 'hey' or 'OK,'" the researchers write.


"Hey Update My Voice" movement exposes cyber harassment

#artificialintelligence

São Paulo, January 2020 - Virtual assistants are increasingly present in people's routines, whether to help, answer questions, or simplify daily life. What they all have in common is a woman's name and a default female voice: Lu, Siri, Alexa, Nat, Bia, and so on. According to a study entitled "I'd Blush If I Could," published by UNESCO in May 2019, virtual assistants powered by artificial intelligence are subjected to high levels of gender-based abuse, yet they usually answer with tolerant, subservient and passive phrases. Against this backdrop, the "Hey Update My Voice" movement was launched in partnership with UNESCO to draw attention to digital education and respect for virtual assistants, and to ask companies to update their assistants' responses. If even virtual assistants are harassed, can you imagine how many women are victims of this kind of violence?


Is it time for Alexa and Siri to have a "MeToo moment"?

#artificialintelligence

More people will speak to a voice assistant than to their partners in the next five years, the U.N. says, so it matters what those assistants have to say. The numbers are eye-popping: 85% of Americans use at least one product with artificial intelligence (AI), and global use will reach 1.8 billion by 2021, so the impact of these "robot overlords" is unparalleled. But AI voice assistants, including Apple's Siri, Amazon's Alexa, Microsoft's Cortana, and Google's Assistant, are inflaming gender stereotypes and teaching sexism to a generation of millennials by modeling "docile and eager-to-please helpers" who accept sexual harassment and verbal abuse, a new U.N. study says. The 145-page report, published this week by UNESCO, the U.N.'s educational, scientific and cultural organization, concludes that the voices we speak to are programmed to be submissive and to accept abuse as a norm. The report is titled "I'd Blush If I Could: Closing Gender Divides in Digital Skills Through Education."


Let's not allow artificial intelligence to reinforce very real stereotypes

#artificialintelligence

Playing with my Lego as a child, I would build human-like figures. I would create a whole cast of goodies and baddies, who would invariably end up fighting. The goodies always spoke with a North American drawl, while the baddies spoke English with heavy foreign accents. The very few female characters in my games were either shrieking, hyper-feminine princesses who needed saving or near-voiceless helpers who looked after the base and cared for the wounded heroes. My bedroom carpet was a showground for the stereotypes of the day.