Bias isn't the only problem with credit scores--and no, AI can't help

MIT Technology Review 

But in the biggest-ever study of real-world mortgage data, economists Laura Blattner at Stanford University and Scott Nelson at the University of Chicago show that differences in mortgage approval between minority and majority groups are down not just to bias, but to the fact that minority and low-income groups have less data in their credit histories. When that data is used to calculate a credit score, and that score is used to predict loan default, the prediction is less precise. It is this lack of precision that leads to inequality, not bias alone. The implications are stark: fairer algorithms won't fix the problem. "It's a really striking result," says Ashesh Rambachan, who studies machine learning and economics at Harvard University but was not involved in the study.
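The mechanism can be seen in a toy simulation (not from the study; all names, rates, and thresholds here are illustrative assumptions): two groups of equally creditworthy borrowers, one with long repayment histories and one with thin files. The score estimator is unbiased for both, yet the thin-file group's noisier scores produce more wrongful rejections.

```python
import random

random.seed(0)

def simulate(group_size, history_len, true_p=0.1, threshold=0.15):
    """Estimate each borrower's default rate from history_len past
    repayment events; approve if the estimate is below threshold.
    Every borrower has the same true default probability, so any
    rejection is an error. Returns the fraction wrongly rejected."""
    rejected = 0
    for _ in range(group_size):
        defaults = sum(random.random() < true_p for _ in range(history_len))
        estimate = defaults / history_len  # unbiased: E[estimate] = true_p
        if estimate >= threshold:
            rejected += 1
    return rejected / group_size

# Thick-file group: 100 repayment events per borrower.
thick = simulate(10_000, history_len=100)
# Thin-file group: only 5 events per borrower -- identical true risk.
thin = simulate(10_000, history_len=5)

print(f"wrongful rejections, thick files: {thick:.1%}")
print(f"wrongful rejections, thin files:  {thin:.1%}")
```

With only five data points, a single past default pushes the estimated rate to 20%, over the threshold, so far more thin-file borrowers are rejected despite identical underlying risk; no bias in the scoring rule is needed to produce the disparity.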
