Artificial Intelligence: Can We Trust Machines to Make Fair Decisions?
But what happens when artificial intelligence is biased? What if it makes mistakes on important decisions, from who gets a job interview or a mortgage to who gets arrested and how much time they ultimately serve for a crime? "These everyday decisions can greatly affect the trajectories of our lives and increasingly, they're being made not by people, but by machines," said UC Davis computer science professor Ian Davidson. A growing body of research, including Davidson's, indicates that bias in artificial intelligence can lead to biased outcomes, especially for minority populations and women. Facial recognition technologies, for example, have come under increasing scrutiny because they've been shown to detect white faces more accurately than the faces of people with darker skin.
Apr-18-2021, 15:05:30 GMT