Audit finds gender and age bias in OpenAI's CLIP model
In January, OpenAI released Contrastive Language-Image Pre-training (CLIP), an AI model trained to recognize a range of visual concepts in images and associate them with their names. CLIP performs quite well on classification tasks -- for instance, it can caption an image of a dog "a photo of a dog." But according to an OpenAI audit conducted with Jack Clark, OpenAI's former policy director, CLIP is susceptible to biases that could have implications for people who use -- and interact with -- the model. Prejudices often make their way into the data used to train AI systems, amplifying stereotypes and leading to harmful consequences.
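The zero-shot classification described above works by embedding both the image and a set of candidate captions into a shared vector space, then picking the caption whose embedding is most similar to the image's. The sketch below illustrates that scoring step with toy NumPy vectors; the embeddings are made up for illustration, whereas real CLIP produces them with its trained image and text encoders.

```python
import numpy as np

def zero_shot_classify(image_emb, text_embs, labels):
    # Normalize so the dot product equals cosine similarity,
    # which is how CLIP-style models score image-text pairs.
    img = image_emb / np.linalg.norm(image_emb)
    txt = text_embs / np.linalg.norm(text_embs, axis=1, keepdims=True)
    sims = txt @ img
    # Softmax over similarities yields a probability per caption.
    probs = np.exp(sims) / np.exp(sims).sum()
    return labels[int(np.argmax(probs))], probs

# Toy embeddings (illustrative only -- real CLIP encoders produce
# high-dimensional vectors from the actual image and caption text).
image_emb = np.array([0.9, 0.1, 0.2])
text_embs = np.array([
    [0.8, 0.2, 0.1],   # embedding for "a photo of a dog"
    [0.1, 0.9, 0.3],   # embedding for "a photo of a cat"
])
labels = ["a photo of a dog", "a photo of a cat"]

best, probs = zero_shot_classify(image_emb, text_embs, labels)
print(best)  # the caption closest to the image embedding
```

Because the candidate captions are supplied at inference time, the choice of caption wording directly shapes the output, which is one reason the audit could probe the model for biased associations simply by varying the label set.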
Aug-10-2021, 18:45:16 GMT