Bias detectives: the researchers striving to make algorithms fair
In 2015, a worried father asked Rhema Vaithianathan a question that still weighs on her mind. A small crowd had gathered in a basement room in Pittsburgh, Pennsylvania, to hear her explain how software might tackle child abuse. Each day, the area's hotline receives dozens of calls from people who suspect that a child is in danger; some of these are then flagged by call-centre staff for investigation. But the system does not catch all cases of abuse. Vaithianathan and her colleagues had just won a half-million-dollar contract to build an algorithm to help.

Vaithianathan, a health economist who co-directs the Centre for Social Data Analytics at the Auckland University of Technology in New Zealand, told the crowd how the algorithm might work. For example, a tool trained on reams of data -- including family backgrounds and criminal records -- could generate risk scores when calls come in.
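To make the idea concrete, here is a minimal, purely illustrative sketch of that kind of risk-scoring tool: a classifier trained on synthetic referral data that turns a predicted probability into a screening score. The feature names, the 1-20 score range, and the model choice are assumptions for illustration, not details of the Allegheny County tool the article describes.

```python
# Toy risk-scoring sketch on synthetic data (illustrative only).
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 5_000

# Hypothetical features a hotline referral might carry.
X = np.column_stack([
    rng.integers(0, 2, n),   # prior_referral: any earlier hotline call
    rng.integers(0, 2, n),   # caregiver_criminal_record
    rng.poisson(1.5, n),     # number of children in the household
    rng.uniform(18, 60, n),  # caregiver_age
])

# Synthetic outcome loosely tied to the first two features (toy ground truth).
logits = 1.5 * X[:, 0] + 1.0 * X[:, 1] - 2.0
y = rng.random(n) < 1 / (1 + np.exp(-logits))

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = GradientBoostingClassifier().fit(X_train, y_train)

# The model outputs a probability; a screening tool might rescale it to a
# coarse 1-20 score for call-centre staff (assumed range, for illustration).
prob = model.predict_proba(X_test[:1])[0, 1]
risk_score = max(1, int(np.ceil(prob * 20)))
print(f"predicted probability: {prob:.2f}, risk score (1-20): {risk_score}")
```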