Can Auditing Eliminate Bias from Algorithms? – The Markup


For more than a decade, journalists and researchers have been writing about the dangers of relying on algorithms to make weighty decisions: who gets locked up, who gets a job, who gets a loan--even who has priority for COVID-19 vaccines. Rather than remove bias, one algorithm after another has codified and perpetuated it, even as companies have continued to largely shield their algorithms from public scrutiny. The big question ever since: How do we solve this problem? Lawmakers and researchers have advocated for algorithmic audits, which would dissect and stress-test algorithms to see how they work and whether they're achieving their stated goals or producing biased outcomes. And there is a growing field of private auditing firms that purport to do just that.

The Algorithmic Auditing Trap


This op-ed was written by Mona Sloane, a sociologist and senior research scientist at the NYU Center for Responsible A.I. and a fellow at the NYU Institute for Public Knowledge. Her work focuses on design and inequality in the context of algorithms and artificial intelligence. We have a new A.I. race on our hands: the race to define and steer what it means to audit algorithms. Governing bodies know that they must come up with solutions to the disproportionate harm algorithms can inflict. This technology has disproportionate impacts on racial minorities, the economically disadvantaged, womxn, and people with disabilities, with applications ranging from health care to welfare, hiring, and education.

New York City bill could ban AI-powered hiring tools that discriminate against applicants

Daily Mail - Science & tech

A bill passed by the New York City council early this month aims to ban companies from using artificial intelligence-powered hiring tools that discriminate based on an applicant's gender or race. If signed into law, the legislation would require providers of the technology to have their systems evaluated each year by an audit service and to provide the results to companies using those systems. Employers using systems that do not meet the requirements could be fined up to $1,500 per violation, but the law leaves it to the vendors to conduct the audits and show employers that their tools meet the city's requirements. If enacted, the bill would go into effect in January 2023 and make New York City the first place in the US to rein in AI hiring tools. However, Alexandra Givens, president of the Center for Democracy & Technology, notes that the legislation does not protect against discrimination based on disability or age.

AI Hiring Tools Can Discriminate Based on Race and Gender. A New NYC Bill Would Fight That

TIME - Tech

Job candidates rarely know when hidden artificial intelligence tools are rejecting their resumes or analyzing their video interviews. But New York City residents could soon get more say over the computers making behind-the-scenes decisions about their careers. A bill passed by the city council in early November would ban employers from using automated hiring tools unless a yearly bias audit can show they won't discriminate based on an applicant's race or gender. It would also force makers of those AI tools to disclose more about their opaque workings and give candidates the option of choosing an alternative process -- such as a human -- to review their application. Proponents liken it to another pioneering New York City rule that became a national standard-bearer earlier this century -- one that required chain restaurants to slap a calorie count on their menu items.

Want to Prove Your Business Is Fair? Audit Your Algorithm


Yale Fox's business doesn't work unless everyone thinks it's fair. His startup, Rentlogic, relies on an algorithm to score New York City landlords on how well they maintain their properties. It's an easy way for tenants to avoid bedbugs and mold, and for landlords to signal that they take good care of their buildings. But it isn't enough for Rentlogic's score to just exist; Fox needs landlords and tenants to believe in it. This was on his mind last fall when he heard Cathy O'Neil speak.