Using artificial intelligence to detect discrimination

#artificialintelligence

A new artificial intelligence (AI) tool for detecting unfair discrimination--such as discrimination on the basis of race or gender--has been created by researchers at Penn State and Columbia University. Preventing the unfair treatment of individuals on the basis of race, gender or ethnicity, for example, has been a long-standing concern of civilized societies. However, detecting such discrimination in decisions, whether made by human decision makers or by automated AI systems, can be extremely challenging. This challenge is further exacerbated by the wide adoption of AI systems to automate decisions in many domains--including policing, consumer finance, higher education and business. "Artificial intelligence systems--such as those involved in selecting candidates for a job or for admission to a university--are trained on large amounts of data," said Vasant Honavar, Professor and Edward Frymoyer Chair of Information Sciences and Technology, Penn State. "But if these data are biased, they can affect the recommendations of AI systems."


Detecting discrimination with the help of artificial intelligence

#artificialintelligence

"We analysed an adult income data set containing salary, demographic and employment-related information for close to 50,000 individuals. We found evidence of gender-based discrimination in salary. Specifically, we found that the odds of a woman having a salary greater than $50,000 per year are only one-third those for a man. This would suggest that employers should look for and correct, when appropriate, gender bias in salaries," said Honavar.
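The odds comparison in the quote comes from a standard 2x2 contingency-table calculation: for each group, the odds of a high salary are the count above $50,000 divided by the count at or below it, and the odds ratio compares the two groups. The sketch below shows the arithmetic; the counts are illustrative placeholders, not the study's actual figures.

```python
# Hypothetical counts (placeholders, NOT the real Adult data set figures):
# of 10,000 women, 1,000 earn > $50K; of 10,000 men, 3,000 earn > $50K.
women_high, women_total = 1_000, 10_000
men_high, men_total = 3_000, 10_000

# Odds of earning > $50K = (count above) / (count at or below).
odds_women = women_high / (women_total - women_high)   # 1000 / 9000
odds_men = men_high / (men_total - men_high)           # 3000 / 7000

# Odds ratio: how the odds for women compare to the odds for men.
odds_ratio = odds_women / odds_men
```

With these placeholder counts the odds ratio works out to roughly 0.26, i.e. the odds for women would be about a quarter of those for men; the study's reported one-third figure corresponds to an odds ratio near 0.33 on the real data.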


Fairness in Algorithmic Decision Making: An Excursion Through the Lens of Causality

arXiv.org Machine Learning

As virtually all aspects of our lives are increasingly impacted by algorithmic decision making systems, it is incumbent upon us as a society to ensure such systems do not become instruments of unfair discrimination on the basis of gender, race, ethnicity, religion, etc. We consider the problem of determining whether the decisions made by such systems are discriminatory, through the lens of causal models. We introduce two definitions of group fairness grounded in causality: fair on average causal effect (FACE), and fair on average causal effect on the treated (FACT). We use the Rubin-Neyman potential outcomes framework for the analysis of cause-effect relationships to robustly estimate FACE and FACT. We demonstrate the effectiveness of our proposed approach on synthetic data. Our analyses of two real-world data sets, the Adult income data set from the UCI repository (with gender as the protected attribute), and the NYC Stop and Frisk data set (with race as the protected attribute), show that the evidence of discrimination obtained by FACE and FACT, or lack thereof, is often in agreement with the findings from other studies. We further show that FACT, being somewhat more nuanced compared to FACE, can yield findings of discrimination that differ from those obtained using FACE.
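In the potential-outcomes framing the abstract describes, FACE is the average causal effect of the protected attribute on the outcome over the whole population, while FACT restricts the average to the "treated" group (those with the protected attribute set to 1). A minimal sketch of both estimators on synthetic data, using inverse propensity weighting as one standard Rubin-Neyman-style estimation strategy (the paper's exact estimators may differ; variable names and the data-generating model here are assumptions for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 20_000

# Synthetic data: A is the protected attribute (0/1), X a confounder
# that influences both A and the outcome Y, and the true causal
# effect of A on Y is 0.5.
X = rng.normal(size=n)
e = 1.0 / (1.0 + np.exp(-X))          # propensity P(A = 1 | X)
A = rng.binomial(1, e)
Y = 0.5 * A + 1.0 * X + rng.normal(scale=0.5, size=n)

# Naive group difference: biased because X confounds A and Y.
naive = Y[A == 1].mean() - Y[A == 0].mean()

# FACE via inverse propensity weighting:
# E[Y(1)] - E[Y(0)] averaged over the whole population.
face = np.mean(A * Y / e) - np.mean((1 - A) * Y / (1 - e))

# FACT: the same contrast averaged over the treated (A == 1) only,
# reweighting untreated units by e / (1 - e).
fact = Y[A == 1].mean() - ((1 - A) * (e / (1 - e)) * Y).sum() / A.sum()
```

Here `face` and `fact` both recover a value near the true effect of 0.5, while `naive` is inflated well above it by the confounder; a fairness analysis would read a FACE or FACT estimate far from zero (after adjustment) as evidence of discrimination with respect to the protected attribute.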


Gender pay equity could be expanded under this measure

Los Angeles Times

A bill sent to the governor Monday would prevent California employers from paying women less than male colleagues based on prior salary. The state strengthened its protections against gender-based wage discrimination last year. The bill the state Assembly sent the governor Monday, AB 1676, would add prior salary to the list of reasons women can't be paid less than men. Nationally, a woman on average makes roughly 79 cents for every dollar a man makes, according to U.S. Census Bureau data from 2014. AB 1676, authored by Assemblywoman Nora Campos (D-San Jose), joins another pay equality bill sent to the governor last week that would strengthen protections against wage discrimination based on race or ethnicity.