Audit finds gender and age bias in OpenAI's CLIP model


In January, OpenAI released Contrastive Language-Image Pre-training (CLIP), an AI model trained to recognize a range of visual concepts in images and associate them with their names. CLIP performs well on classification tasks -- for instance, it can caption an image of a dog as "a photo of a dog." But according to an audit OpenAI conducted with Jack Clark, the company's former policy director, CLIP is susceptible to biases that could have implications for people who use -- and interact with -- the model. Prejudices often make their way into the data used to train AI systems, amplifying stereotypes and leading to harmful consequences.
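To make the captioning behavior concrete: CLIP scores an image against a set of candidate text captions and picks the best match. The sketch below illustrates that matching step with toy embedding vectors and cosine similarity; the vectors, function names, and captions are invented for illustration and are not real CLIP outputs or API calls.

```python
import numpy as np

# Toy illustration of CLIP-style zero-shot classification: an image
# embedding is compared against text embeddings for candidate captions,
# and the caption with the highest cosine similarity is chosen.
# All embedding values here are made up, not produced by CLIP.

def cosine_similarity(a, b):
    """Cosine similarity between two vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def classify(image_embedding, caption_embeddings):
    """Return the best-matching caption and all similarity scores."""
    scores = {
        caption: cosine_similarity(image_embedding, emb)
        for caption, emb in caption_embeddings.items()
    }
    return max(scores, key=scores.get), scores

# Stand-ins for the outputs of CLIP's image and text encoders.
image = np.array([0.9, 0.1, 0.2])
captions = {
    "a photo of a dog": np.array([0.8, 0.2, 0.1]),
    "a photo of a cat": np.array([0.1, 0.9, 0.3]),
}

best, scores = classify(image, captions)
print(best)  # the dog caption scores highest for this toy image vector
```

Because the candidate captions are supplied as free text, the same mechanism that enables flexible labeling also lets biased label sets -- the concern raised in the audit -- surface in the model's choices.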
