First UNESCO recommendations to combat gender bias in applications using artificial intelligence


Beginning as early as next year, many people are expected to have more conversations with digital voice assistants than with their spouses. Presently, the vast majority of these assistants, from Amazon's Alexa to Microsoft's Cortana, are projected as female in name, sound of voice and 'personality'.

'I'd blush if I could', a new UNESCO publication produced in collaboration with Germany and the EQUALS Skills Coalition, holds a critical lens to this growing and global practice. The title of the publication borrows its name from the response Siri, Apple's female-gendered voice assistant used by nearly half a billion people, would give when a human user told 'her', "Hey Siri, you're a bi***."

Siri's submissiveness in the face of gender abuse – and the servility expressed by so many other digital assistants projected as young women – provides a powerful illustration of gender biases coded into technology products, pervasive in the technology sector and apparent in digital skills education.

According to Saniye Gülser Corat, UNESCO's Director for Gender Equality, "The world needs to pay much closer attention to how, when and whether AI technologies are gendered and, crucially, who is gendering them."