Women Wearing Lipstick: Measuring the Bias Between an Object and Its Related Gender
In this paper, we investigate the impact of objects on gender bias in image captioning systems. Our results show that only gender-specific objects (e.g., women-lipstick) exhibit a strong gender bias. In addition, we propose a visual-semantic-based gender score that measures the degree of bias and can be used as a plug-in for any image captioning system. Our experiments demonstrate the utility of this score: it measures the bias relation between a caption and its associated gender, and can therefore serve as a complementary metric to the existing Object Gender Co-Occ approach. Code and data are publicly available at https://github.com/ahmedssabir/GenderScore.
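To make the baseline concrete, below is a minimal sketch of the object-gender co-occurrence idea the abstract compares against: for a given object, count how often it appears in captions alongside female versus male words, then take the female share as a bias ratio. The word lists, function names, and the 0.5 neutral fallback are illustrative assumptions, not the paper's actual implementation.

```python
from collections import Counter

# Illustrative gendered word lists (assumption; the paper may use different lists)
FEMALE_WORDS = {"woman", "women", "she", "her", "girl", "lady"}
MALE_WORDS = {"man", "men", "he", "his", "boy", "gentleman"}

def object_gender_cooc(captions, obj):
    """Count captions where `obj` co-occurs with female vs. male words."""
    counts = Counter()
    for caption in captions:
        tokens = set(caption.lower().split())
        if obj not in tokens:
            continue
        if tokens & FEMALE_WORDS:
            counts["female"] += 1
        if tokens & MALE_WORDS:
            counts["male"] += 1
    return counts

def bias_ratio(counts):
    """Female share of gendered mentions; 0.5 (neutral) if the object never
    co-occurs with a gendered word (fallback is an assumption)."""
    total = counts["female"] + counts["male"]
    return counts["female"] / total if total else 0.5

captions = [
    "a woman wearing lipstick and a red dress",
    "a man holding a snowboard on a slope",
    "a woman applying lipstick in a mirror",
]
print(bias_ratio(object_gender_cooc(captions, "lipstick")))  # 1.0 -> strongly female-biased
```

The paper's proposed gender score goes beyond such raw co-occurrence counts by incorporating visual-semantic information, which is what lets it quantify the degree of bias rather than just its presence.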
arXiv.org Artificial Intelligence
Nov-20-2023
- Genre:
- Research Report > New Finding (0.54)
- Technology:
- Information Technology > Artificial Intelligence
- Machine Learning (1.00)
- Natural Language > Text Processing (1.00)
- Representation & Reasoning (1.00)
- Vision (0.70)