Self-driving car dilemmas reveal that moral choices are not universal
Self-driving cars are being developed by several major technology companies and carmakers. When a driver slams on the brakes to avoid hitting a pedestrian crossing the road illegally, she is making a moral decision that shifts risk from the pedestrian to the people in the car. Self-driving cars might soon have to make such ethical judgments on their own, but settling on a universal moral code for the vehicles could be a thorny task, suggests a survey of 2.3 million people from around the world.

The largest-ever survey of machine ethics, published today in Nature, finds that many of the moral principles that guide a driver's decisions vary by country. For example, in a scenario in which some combination of pedestrians and passengers will die in a collision, people from relatively prosperous countries with strong institutions were less likely to spare a pedestrian who stepped into traffic illegally.
26 October 2018