gender


In the AI revolution, bias is the new breach. How CIOs must manage risk

#artificialintelligence

We have seen it too many times before. A major security or privacy breach creates a crisis for an enterprise. Headlines, lawsuits and, sometimes, the CEO testifying before Congress. The CIO works around the clock only to be rewarded with a pink slip and an uncertain career. Researchers from MIT and...


The Algorithm That Helped Google Translate Become Sexist

#artificialintelligence

StitchFix CEO Katrina Lake posted this on Twitter on the day of her company's IPO in 2017. Automated translation relies on the same kinds of word-suggestion models, which are sometimes laced with bias. Parents know one particular challenge of raising kids all too well: teaching them to do what we ...
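The word-association bias the excerpt alludes to is easy to probe directly. Below is a minimal sketch, assuming gensim and its downloadable GloVe vectors — a small stand-in, not the models Google Translate actually uses — of the analogy arithmetic that surfaces gendered associations learned from training text.

```python
# A minimal sketch of how gendered associations surface in word embeddings.
# Uses gensim's pretrained GloVe vectors as a stand-in; production
# translation models are far larger, but the underlying idea is similar.
import gensim.downloader as api

vectors = api.load("glove-wiki-gigaword-50")  # downloads ~65 MB on first run

# "man is to doctor as woman is to ?" -- the classic analogy probe.
# Embeddings trained on biased text tend to rank "nurse" highly here.
for word, score in vectors.most_similar(
    positive=["woman", "doctor"], negative=["man"], topn=3
):
    print(f"{word}: {score:.3f}")
```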


Here are the best gay dating apps, since meeting people IRL is hell

Mashable

Dating. Whether you hate it a little or hate it a lot, it's a rite of passage for most of us. It's also particularly challenging for members of the LGBTQ community, who've traditionally only had access to hetero-based sites and apps. When I was on the apps in the late aughts, queer women could bare...


Can Artificial Intelligence Weed Out Unconscious Bias?

#artificialintelligence

Let me preface this by saying that, in my experience, barring the obvious bad apples, most people are basically good and want to do the right thing. So in 2018, here in our comfortable Western (and litigious) society, let me submit that a hiring manager is unlikely to look at a resume and say...


How to Treat Missing Values in Your Data

@machinelearnbot

One of the most excruciating pain points during the Data Exploration and Preparation stage of an analytics project is missing values. How do you deal with missing values: ignore them or treat them? The answer depends on the percentage of missing values in the dataset, the variables affected by miss...
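As a concrete illustration of the drop-or-impute decision the excerpt describes, here is a short pandas sketch. The DataFrame and the 5%/30% thresholds are hypothetical illustrations, not the article's recommendations.

```python
# A short pandas sketch of the drop-vs-impute decision driven by the
# percentage of missing values per column. Data and thresholds are made up.
import numpy as np
import pandas as pd

df = pd.DataFrame({
    "age":    [25, np.nan, 47, 51, np.nan, 38],
    "income": [40e3, 52e3, np.nan, 61e3, 45e3, 58e3],
})

missing_pct = df.isna().mean()  # fraction of missing values per column
print(missing_pct)

for col in df.columns:
    if missing_pct[col] < 0.05:
        df = df.dropna(subset=[col])                 # few gaps: drop the rows
    elif missing_pct[col] < 0.30:
        df[col] = df[col].fillna(df[col].median())   # moderate: impute median
    else:
        df = df.drop(columns=[col])                  # mostly empty: drop column
```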


Facial Recognition Is Accurate, if You're a White Guy

#artificialintelligence

Facial recognition technology is improving by leaps and bounds. Some commercial software can now tell the gender of a person in a photograph. When the person in the photo is a white man, the software is right 99 percent of the time. But the darker the skin, the more errors arise -- up to nearly 35...


Face-recognition software is perfect – if you're a white man

New Scientist

Face-recognition software can guess your gender with amazing accuracy – if you are a white man. Joy Buolamwini at the Massachusetts Institute of Technology tested three commercially available face-recognition systems, created by Microsoft, IBM and the Chinese company Megvii. The systems correctly identified the gender of white men 99 per cent of the time. But the error rate rose for people with darker skin, reaching nearly 35 per cent for women. The results will be presented at the Conference on Fairness, Accountability, and Transparency in New York later this month. Face-recognition software is already being used in many situations, including by police to identify suspects in a crowd and to automatically tag photos. This means inaccuracies could have consequences, such as systematically ingraining biases in police stops and searches. Biases in artificial intelligence systems tend to come from biases in the data they are trained on. According to one study, a widely used data set is around 75 per cent male and more than 80 per cent white.


Study finds popular face ID systems may have racial bias

Daily Mail

Tech giants have made some major strides in advancing facial recognition technology. It's now popping up in smartphones, laptops and tablets, all with the goal of making our lives easier. But a new study, called 'Gender Shades,' has found that it may not be working for all users, especially those ...


AI facial analysis demonstrates both racial and gender bias

Engadget

Researchers from MIT and Stanford University found that three different facial analysis programs demonstrate both gender and skin color biases. The full article will be presented at the Conference on Fairness, Accountability, and Transparency later this month. Specifically, the team looked at the accuracy rates of facial recognition broken down by gender and race. "Researchers at a major U.S. technology company claimed an accuracy rate of more than 97 percent for a face-recognition system they'd designed. But the data set used to assess its performance was more than 77 percent male and more than 83 percent white." This narrow test base results in a higher error rate for anyone who isn't white or male. To test these systems, MIT researcher Joy Buolamwini collected over 1,200 images that contained a greater proportion of women and people of color, and coded skin color based on the Fitzpatrick scale of skin tones, in consultation with a dermatologic surgeon. Buolamwini then tested the facial recognition systems with her new data set. The results were stark in terms of gender classification: "For darker-skinned women ..." There have certainly been accusations of bias in tech algorithms before, and it is well known that facial recognition systems often do not work as well on darker skin tones. Even with that knowledge, these figures are staggering, and it is important that companies that work on this kind of software account for the breadth of diversity in their user base, rather than limiting themselves to the white men who often dominate their workforces.
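The core of the methodology described above is disaggregated evaluation: computing accuracy per gender and skin-type subgroup rather than a single overall number. A minimal pandas sketch of that breakdown, using made-up placeholder records rather than the study's data:

```python
# A minimal sketch of the disaggregated evaluation Gender Shades performs:
# overall accuracy can look high while one subgroup's error rate is severe.
# The records below are hypothetical placeholders, not the study's data.
import pandas as pd

results = pd.DataFrame({
    "gender":    ["male", "male", "female", "female", "female", "male"],
    "skin_type": ["lighter", "lighter", "darker", "darker", "lighter", "darker"],
    "correct":   [True, True, False, False, True, True],
})

print("overall accuracy:", results["correct"].mean())

# Break the same predictions down by subgroup, as the paper does using
# the Fitzpatrick skin-type scale rather than self-reported race.
by_group = results.groupby(["gender", "skin_type"])["correct"].agg(["mean", "count"])
by_group["error_rate"] = 1 - by_group["mean"]
print(by_group)
```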