The U.K. Wants to Become the World Leader in Ethical A.I.

Slate 

In 2013, an algorithm helped determine Eric Loomis' six-year prison sentence in Wisconsin for attempting to flee a traffic officer and operating a motor vehicle without the owner's consent. No one knew how the software, Correctional Offender Management Profiling for Alternative Sanctions, or COMPAS, worked, not even the judge who delivered the sentence. Analyses conducted by ProPublica later found the predictive artificial intelligence used in this case, which attempts to gauge the likelihood of an offender committing another crime, to be racially biased: A two-year study involving 10,000 defendants found that the A.I. routinely overestimated the likelihood of recidivism among black defendants and underestimated it among white defendants. The U.S. Supreme Court declined to review Loomis' case, so the sentence stands. Increasingly, A.I. has the power to alter the course of people's lives.
