Who Should You or a Self-Driving Car Hit in a Moral Bind?

#artificialintelligence 

I don't know where self-driving car technology ranks on a difficulty scale. Perhaps it isn't as hard as rocket science, but it must still be very hard. Add to that the challenge of programming a self-driving car to make moral decisions. Take, for example, the MIT Media Lab experiment called "The Moral Machine," which was "designed to test how we view…moral problems in light of the emergence of self-driving cars." If a self-driving car were in a 'moral bind' in which it had to hit either an elderly person, a child, or a pet to avoid the others, what should it do?
