Why are Artificial Intelligence systems biased?

#artificialintelligence

A machine-learned AI system used to assess recidivism risks in Broward County, Fla., often gave higher risk scores to African Americans than to whites, even when the latter had more serious criminal records. The popular sentence-completion facility in Google Mail was caught assuming that an "investor" must be male. A celebrated natural language generator called GPT, with an uncanny ability to write polished-looking essays for any prompt, produced seemingly racist and sexist completions when given prompts about minorities. Amazon found, to its consternation, that an automated AI-based hiring system it built didn't seem to like female candidates. Commercial gender-recognition systems put out by industry heavyweights, including Amazon, IBM and Microsoft, have been shown to suffer from high misrecognition rates for people of color.
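Disparities like the recidivism example are usually surfaced by comparing error rates across groups rather than looking at overall accuracy. As a minimal sketch of that kind of audit, the Python snippet below computes false-positive rates (people who did not reoffend but were still scored high risk) per group; the variable names and toy numbers are purely illustrative and do not come from the article.

```python
# Hypothetical audit sketch: compare false-positive rates across groups.
# All names and the toy numbers below are illustrative only.
import numpy as np

def false_positive_rate(predicted_high_risk, reoffended):
    """Share of people who did NOT reoffend but were still scored high risk."""
    did_not_reoffend = ~reoffended
    if did_not_reoffend.sum() == 0:
        return float("nan")
    return (predicted_high_risk & did_not_reoffend).sum() / did_not_reoffend.sum()

# Toy data standing in for a real risk-assessment dataset.
group = np.array(["A", "A", "A", "B", "B", "B", "A", "B"])
predicted_high_risk = np.array([1, 1, 0, 0, 1, 0, 1, 0], dtype=bool)
reoffended = np.array([0, 1, 0, 0, 1, 0, 0, 0], dtype=bool)

for g in np.unique(group):
    mask = group == g
    fpr = false_positive_rate(predicted_high_risk[mask], reoffended[mask])
    print(f"group {g}: false-positive rate = {fpr:.2f}")
```

A large gap between the per-group rates, as in this toy output, is the kind of signal that a system treats otherwise similar people differently depending on group membership.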


Crime Prediction Algorithms Aren't Very Good At Predicting Crimes

International Business Times

Some courts in the U.S., in states ranging from California to New Jersey, use crime-predicting algorithms to determine whether a defendant is likely to commit another crime. While the software helps judges decide who gets bail, who goes to jail and who walks free, the technology appears to be unreliable and opens the door to a less fair justice system.