AI-rays: Exploring Bias in the Gaze of AI Through a Multimodal Interactive Installation
Ziyao Gao, Yiwen Zhang, Ling Li, Theodoros Papatheodorou, Wei Zeng
–arXiv.org Artificial Intelligence
Numerous cases have demonstrated that specific appearance signals can implicitly correlate with biased social sorting, causing injustice. For example, AI predictive policing overestimates recidivism risk for black people [David Robinson 2016], recruitment engines prefer male candidates for tech jobs [Reuters 2018], and AI beauty contests favor white winners [Levin 2016]. Nowadays, machine scrutiny is pervasive and constant. How do machines interpret our appearance cues? Who is putting that speculation to use? Does the meaning of appearance signals change when machines, not humans, observe us?

Data surveillance has become more covert and pervasive with AI algorithms, which can result in biased social classifications. Appearance offers intuitive identity signals, but what does it mean to let AI observe and speculate on them? We introduce AI-rays, an interactive installation where AI generates speculative identities from participants' appearance, which are expressed through synthesized personal items placed in participants' bags. It uses speculative X-ray visions to contrast reality with AI-generated assumptions, metaphorically highlighting AI's scrutiny and biases. AI-rays promotes discussions on modern surveillance and the future of human-machine reality through a playful, immersive experience exploring AI biases.
Oct-3-2024