UK data watchdog investigates whether AI systems show racial bias


The UK data watchdog is to investigate whether artificial intelligence systems are showing racial bias when dealing with job applications.

The Information Commissioner's Office said AI-driven discrimination could have "damaging consequences for people's lives" and lead to someone being rejected for a job or being wrongfully denied a bank loan or a welfare benefit. It will investigate the use of algorithms to sift through job applications, amid concerns that they are affecting employment opportunities for people from ethnic minorities.

"We will be investigating concerns over the use of algorithms to sift recruitment applications, which could be negatively impacting employment opportunities of those from diverse backgrounds," said the ICO.

The investigation is being announced as part of a three-year plan for the ICO under the UK's new information commissioner, John Edwards, who joined the ICO in January after running its New Zealand counterpart.
