We tested bots like Siri and Alexa to see who would stand up to sexual harassment
Women have been made into servants once again. Apple's Siri, Amazon's Alexa, Microsoft's Cortana, and Google's Google Home peddle stereotypes of female subservience, which puts their "progressive" parent companies in a moral predicament.

People often comment on the sexism inherent in these subservient bots' female voices, but few have considered the real-life implications of the devices' lackluster responses to sexual harassment. By letting users verbally abuse these assistants without consequence, their parent companies allow harmful behavioral stereotypes to be perpetuated. Everyone has an ethical imperative to help prevent abuse, but companies producing digital female servants warrant extra scrutiny, especially when they can unintentionally reinforce their abusers' actions as normal or acceptable.

To substantiate claims about these bots' pre-programmed responses to sexual harassment and the ethical implications of those responses, Quartz gathered comprehensive data by systematically testing how each one reacts to harassment. The message is clear: instead of fighting back against abuse, each bot helps entrench sexist tropes through its passivity. And Apple, Amazon, Google, and Microsoft have a responsibility to do something about it.
Sep-14-2019, 00:13:33 GMT