The ethical problems surrounding self-driving cars have plagued automobile companies. All of them have been reluctant to discuss the moral dilemma at the heart of questions such as whether to put the driver's safety first or the pedestrian's.
Mercedes-Benz has now become the first company to openly announce its policy: its self-driving cars will put the driver's safety first, even if that means killing a pedestrian.
Christoph von Hugo, head of driver assistance systems at Mercedes, explained the decision:
“You could sacrifice the car. You could, but then the people you’ve saved initially, you don’t know what happens to them after that in situations that are often very complex, so you save the ones you know you can save.”
This moral dilemma has been around for decades, beginning with the trolley problem, which asks you to decide the fate of railway workers. In that scenario, a runaway trolley is heading towards five workers. Do you pull the switch to divert it onto a track where only one person is working, or do you let it continue towards the five?
For AI and self-driving cars, the riddle becomes a choice between the driver and, say, a group of children running into the road, with no third option: one side or the other suffers a fatality.
But Hugo seems to be crystal clear about the dilemma. He said in an interview at the Paris auto show:
“If you know you can save at least one person, at least save that one. Save the one in the car,” he said. “If all you know for sure is that one death can be prevented, then that’s your first priority.”
Below are the three principles for self-driving cars published by Germany this September, which effectively ban AI systems from choosing to harm one group of people over another. They state:
“1) It is clear that property damage always takes precedence over personal injury.”
“2) There must be no classification of people, for example on size, age and the like.”
“3) If something happens, the manufacturer is liable.”
The third rule seems to be controversial, since it blames the automakers and assumes that the human driver would not have taken back control of the vehicle. The major autonomous-car companies have so far remained silent on these rules.
Hugo, for his part, believes the dilemma will rarely arise in practice: “We believe this ethical question won’t be as relevant as people believe today. It will occur much less often. There are situations that today’s driver can’t handle, that—from the physical standpoint—we can’t prevent today and automated vehicles can’t prevent, either. [The self-driving car] will just be far better than the average [human] driver.”
What are your thoughts on this ethical dilemma, and on the decision made by Mercedes? Let us know in the comments section below!