What if Big Data Helped Judges Decide Exactly What Words Mean?

Slate

The precision and promise of a data-driven society have stumbled these past years, serving up some disturbing, even damning, results: facial recognition software that can't recognize Black faces, human resources software that rejects women's job applications, talking computers that spew racist vitriol. "Those who don't learn history are doomed to repeat it," George Santayana said. But most artificial intelligence applications and data-driven tools learn history aplenty; they just don't avoid its pitfalls. Instead, though touted as a step toward the future, these systems generally learn the past in order to replicate it in the present, repeating historical failures with ruthless, mindless efficiency. As Joy Buolamwini says, when it comes to algorithmic decision-making, "data is destiny."