What We Learned Auditing Sophisticated AI for Bias
A recently passed New York City law requires bias audits of AI-based hiring systems. AI systems fail frequently, and bias is often to blame. A recent sampling of headlines features sociological bias in generated images, a chatbot, and a virtual rapper. These examples of denigration and stereotyping are troubling and harmful, but what happens when the same types of systems are used in more sensitive applications? Leading scientific publications assert that algorithms used in U.S. healthcare diverted care away from millions of Black people.
Oct-20-2022, 11:26:21 GMT