Accenture Unveils Tool to Help Companies Ensure Their AI Is Fair

Consulting firm Accenture has a new tool to help businesses detect and eliminate gender, racial and ethnic bias in artificial intelligence software. Companies and governments are increasingly turning to machine-learning algorithms to help make critical decisions, including who to hire, who gets insurance or a mortgage, who receives government benefits and even whether to grant a prisoner parole. One of the arguments for using such software is that, if correctly designed and trained, it can potentially make decisions free from the prejudices that often affect human choices. But, in a number of well-publicized examples, algorithms have been found to discriminate against minorities and women. For instance, an algorithm many U.S. cities and states used to help make bail decisions was twice as likely to falsely label black prisoners as being at high risk of re-offending as white prisoners, according to a 2016 investigation by ProPublica.
