-
Are You Gonna Pay Forty Grand For A Car That'll Sacrifice You?
-
I heard Leo Laporte and his panel talking about this the other day, and it's a hell of a moral dilemma: an old ethics question applied to the newest technology, and one that HAS to be worked out, like, right now. Should self-driving cars protect their passengers or protect others when a crash is unavoidable? In other words, a fatal collision is coming and the car has to decide whether to protect, say, the two people inside it or the dozen people in the crosswalk. What should its priority be: its cargo, the passengers, or the largest number of people? The car would, in effect, have to decide who to kill. A study found that most people want the algorithm to kill the fewest people possible. But most of those same people assume the car would be programmed to protect its passengers no matter what. Okay, Solomon, what would YOU have them do? (Washington Post)