Self-driving car dilemmas reveal that moral choices are not universal
Self-driving cars are being developed by several major technology companies and carmakers. When a driver slams on the brakes to avoid hitting a pedestrian crossing the road illegally, she is making a moral decision that shifts risk from the pedestrian to the people in the car. Self-driving cars might soon have to make such ethical judgments on their own, but settling on a universal moral code for the vehicles could be a thorny task, suggests a survey of 2.3 million people from around the world.

The largest-ever survey of machine ethics, published today in Nature, finds that many of the moral principles that guide a driver's decisions vary by country. For example, in a scenario in which some combination of pedestrians and passengers will die in a collision, people from relatively prosperous countries with strong institutions were less likely to spare a pedestrian who stepped into traffic illegally.
Oct-26-2018, 04:11:02 GMT
- AI-Alerts:
- 2018 > 2018-10 > AAAI AI-Alert for Oct 30, 2018 (1.00)
- Country:
- Africa > Nigeria (0.05)
- Asia
- Europe
- Finland (0.06)
- France (0.15)
- Germany > Bavaria
- Upper Bavaria > Ingolstadt (0.05)
- North America
- Canada > British Columbia (0.05)
- United States
- Connecticut > New Haven County
- New Haven (0.05)
- Massachusetts (0.05)
- South Carolina (0.05)
- South America > Colombia (0.05)