Twitter's Photo Crop Algorithm Favors White Faces and Women

WIRED

Last fall, Canadian student Colin Madland noticed that Twitter's automatic cropping algorithm continually selected his face--not his darker-skinned colleague's--from photos of the pair to display in tweets. The episode ignited accusations of bias as a flurry of Twitter users published elongated photos to see whether the AI would choose the face of a white person over a Black person or if it focused on women's chests over their faces. At the time, a Twitter spokesperson said assessments of the algorithm before it went live in 2018 found no evidence of race or gender bias. Now, the largest analysis of the AI to date has found the opposite: that Twitter's algorithm favors white people over Black people. That assessment also found that the AI for predicting the most interesting part of a photo does not focus on women's bodies over women's faces.


TechCrunch

#artificialintelligence

In an interesting development in the wake of a bias controversy over its cropping algorithm, Twitter has said it's considering giving users decision-making power over how tweet previews look, saying it wants to decrease its reliance on machine learning-based image cropping. Yes, you read that right. A tech company is affirming that automating certain decisions may not, in fact, be the smart thing to do -- tacitly acknowledging that removing human agency can generate harm. As we reported last month, the microblogging platform found its image-cropping algorithm garnering critical attention after Ph.D. student Colin Madland noticed the algorithm only showed his own (white male) image in preview -- repeatedly cropping out the image of a Black faculty member. Ironically enough, he'd been discussing a similar bias issue with Zoom's virtual backgrounds.


Twitter data confirms image cropping tool favoured white people

ZDNet

Twitter has quantified the extent to which its image cropping algorithm was racially biased, admitting that white individuals were prioritised over Black individuals when images were algorithmically cropped on the platform. In research [PDF] conducted by Twitter, the company tested its image cropping algorithm for race-based and gender biases and considered whether its model aligned with the goal of enabling people to make their own choices on the platform. Looking at its image cropping algorithm, which uses a saliency approach, it found that in comparisons of Black and white individuals, there was a 4% difference from demographic parity in favour of white individuals. In comparisons of Black and white women, there was a 7% difference in favour of white women; for their male counterparts, there was a 2% difference from demographic parity favouring white men. Twitter started using a saliency approach to crop images in 2018.
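The "difference from demographic parity" figures above compare how often each group's face was selected as the crop focal point in paired comparisons. A minimal sketch of the arithmetic behind such a figure (the function name and the counts are illustrative assumptions, not Twitter's actual data or code):

```python
def demographic_parity_difference(selected_a, total_a, selected_b, total_b):
    """Difference between the selection rates of two groups.

    A positive result means group A's faces were chosen as the crop
    focal point more often than group B's, relative to parity.
    """
    rate_a = selected_a / total_a
    rate_b = selected_b / total_b
    return rate_a - rate_b

# Illustrative numbers: group A chosen in 520 of 1000 paired images,
# group B in 480 of 1000.
diff = demographic_parity_difference(520, 1000, 480, 1000)
print(f"{diff:+.0%}")  # prints +4%
```

Under demographic parity, both groups would be selected at the same rate, so the difference would be 0%; the 4%, 7%, and 2% figures reported by Twitter measure how far the algorithm's choices deviated from that baseline.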


This Is Why Twitter's Algorithm Appears To Have A Race Problem

#artificialintelligence

As we've learned (or apparently not) time and time again, AI and machine learning technology have a racism problem. From soap dispensers that don't register dark-skinned hands to self-driving cars that are 5 percent more likely to run you over if you are Black because they don't recognize darker skin tones, there are numerous examples of algorithms that don't function as they should because they weren't tested enough with non-white people in mind. Over the weekend, one such algorithm with apparent bias drew attention after cryptographer and infrastructure engineer Tony Arcieri tried a simple experiment on Twitter. Arcieri took two photos: one of Barack Obama and one of Mitch McConnell. He then combined them into a single elongated image to see which face Twitter's preview crop would favor.