God Mode ON

Should a driverless car be programmed with ethics? It will have to be. In the hypothetical scenario in which a car must choose between risking the lives of pedestrians and risking the lives of its passengers, an autonomous vehicle has to make that decision in a split second. Dr. Iyad Rahwan of MIT’s Media Lab is studying the problem.

To explore that question, Dr. Rahwan and his team conducted an online survey to see how people feel about cars handling this social dilemma.

Overall, they found that people were largely utilitarian, favouring the protection of the greater number of lives. For instance, they favoured programming the car to drive into a wall rather than into a crowd of people.

That is hardly surprising. However, the utilitarianism didn’t hold when people were asked whether they would buy a car with such programming. Instead, self-interest prevailed: hypothetical buyers preferred a car that would save them, even if it meant killing many others.

Yes, it’s always easier to answer hypothetical ethics questions than to contemplate buying a vehicle that may someday decide to kill you in order to save others. That’s the gist of the latest comic from CommitStrip.


Comments

As the owner of a car with collision avoidance, let me assure you that this sort of scenario will never occur. It slams on the brakes far earlier and more often than needed.
The question is only a hypothetical thought experiment, NOT a real issue. Today’s sensors aren’t remotely reliable enough to rule out false positives, so a car programmed this way could sacrifice its occupants over a radar blip.