Alexa, Can You Hear Me?
By exploring the various facets of gendering at play in the design of VPAs, specifically Alexa, I argue that gendering Alexa as female poses societal harm insofar as she reproduces normative assumptions about women as submissive, inferior, and secondary to men. AI-driven virtual personal assistants (VPAs) are proliferating: TechCrunch forecasts that the use of voice assistants will triple over the next few years, estimating ten billion digital voice assistants in use by 2023, up from 2.5 billion at the end of 2018, growth driven in part by Amazon Echo, one of the most sought-after smart speakers in the world. Yet only recently has much research or attention focused on the gender bias programmed into this technology, specifically Alexa, intentionally designed, coded, and programmed by men and gendered to be distinctly female.

Big Tech's decision to gender VPAs is most evident in the assistants' assigned female names and in their female voices, which users find more pleasant to give orders to than male voices, as reflected in witty, flirtatious programmed responses. Through these interactions, Alexa performs gender as a feminized and sexualized entity, an identity imposed upon her by her Silicon Valley creators, one that has the potential to unravel decades of social and political progress and to reinstate the gender bias of the past that women have strived to eradicate.