Women more negative toward artificial intelligence

#artificialintelligence

When it comes to police use of facial recognition, 31% of women are not sure whether it is a good or bad idea, compared with 22% of men. Women are more likely to support including a wider variety of groups in AI design, and more likely to say it is important that different racial and ethnic groups are included in the AI design process (71% vs. 63%). Women are also more doubtful than men that it is possible to design AI programs that can consistently make fair decisions in complex situations: only around two-in-ten women (22%) think this is possible, compared with a larger share of men (38%).


Artificial Intelligence: Can We Trust Machines to Make Fair Decisions?

#artificialintelligence

Artificial Intelligence touches almost every aspect of our lives, from mobile banking and online shopping to social media and real-time traffic maps. But what happens when artificial intelligence is biased? What if it makes mistakes on important decisions — from who gets a job interview or a mortgage to who gets arrested and how much time they ultimately serve for a crime?


"These everyday decisions can greatly affect the trajectories of our lives and increasingly, they're being made not by people, but by machines," said UC Davis computer science professor Ian Davidson. A growing body of research, including Davidson's, indicates that bias in artificial intelligence can lead to biased outcomes, especially for minority populations and women. Facial recognition technologies, for example, have come under increasing scrutiny because they have been shown to detect white faces more reliably than the faces of people with darker skin.