Blackballed by machine learning: how algorithms can destroy your chances of getting a job

#artificialintelligence 

The Guardian has published a long excerpt from Cathy O'Neil's essential new book, Weapons of Math Destruction, in which O'Neil describes how shoddy machine-learning companies have come to dominate hiring for waged employment, selling their dubious products to giant companies that use them to decide who can and can't work. Because so many of America's biggest employers use these systems, it can be nearly impossible to find work if their secret, unaudited models decide that you're a bad hire. What's more, many of the models' litmus tests are just proxies for race, poverty, and disability: factors that companies are not legally allowed to consider when hiring (unless they're being considered by unaccountable software supplied by a third party). This hurts everyone, not just the people who get blackballed. Because the machine-learning companies that supply this HR-ware don't refine their models against the actual outcomes of their predictions, they end up excluding many people who would have been excellent hires, and approving people who turn out to be no good for their customers.
