Self-Driving Mercedes Will Prefer Killing A Pedestrian Over A Driver

Mercedes-Benz F015 - Luxury in Motion

The ethical problems surrounding self-driving cars have plagued automobile companies. All of them have been reluctant to address the moral dilemma raised by questions such as whether to put the driver’s safety first or the pedestrian’s.

Mercedes-Benz has now become the first company to openly announce its policy: its self-driving cars will put the driver’s safety first, even if that means killing a pedestrian.

[Image Courtesy of Mercedes-Benz]
Christoph von Hugo, head of driver assistance systems at Mercedes, spoke on the decision:

“You could sacrifice the car. You could, but then the people you’ve saved initially, you don’t know what happens to them after that in situations that are often very complex, so you save the ones you know you can save.”

This moral dilemma has been around for decades, beginning with the trolley problem, which asks you to decide the fate of railway workers. In that scenario, a runaway trolley is heading towards five workers. Do you pull the switch to divert it onto a track where only one person is working, or do you let it continue towards the five?

For AI and self-driving cars, the riddle has been recast as a choice between the car’s occupant and, say, a group of kids running out into the road, with no third option: someone dies either way.

But Hugo seems crystal clear about the dilemma. He said in an interview at the Paris auto show:

“If you know you can save at least one person, at least save that one. Save the one in the car,” he said. “If all you know for sure is that one death can be prevented, then that’s your first priority.”
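To make that reported priority concrete, here is a minimal, purely illustrative sketch in Python of one way such a rule could be encoded: rank possible maneuvers by the outcome the car is most certain of, with the occupant first. The names (Maneuver, choose_maneuver) and the survival-probability figures are hypothetical; Mercedes has not published its actual decision logic.

```python
from dataclasses import dataclass
from typing import List


@dataclass
class Maneuver:
    """One possible evasive action and its estimated consequences."""
    name: str
    occupant_survival: float    # estimated probability the occupants survive
    pedestrian_survival: float  # estimated probability the pedestrians survive


def choose_maneuver(options: List[Maneuver]) -> Maneuver:
    """Pick the option whose most certain save is highest.

    Ranks candidates by occupant survival first (Hugo's framing: the
    occupant is "the one you know you can save"), then by pedestrian
    survival as a tie-breaker.
    """
    return max(options, key=lambda m: (m.occupant_survival, m.pedestrian_survival))


if __name__ == "__main__":
    candidates = [
        Maneuver("brake_in_lane", occupant_survival=0.95, pedestrian_survival=0.20),
        Maneuver("swerve_off_road", occupant_survival=0.40, pedestrian_survival=0.90),
    ]
    print(choose_maneuver(candidates).name)  # prints "brake_in_lane"
```

Note how a single change to the sort key, putting pedestrian_survival first, would flip the policy. That is exactly the kind of choice the German rules below try to take out of the manufacturers’ hands.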

[Image Courtesy of Mercedes-Benz]

Below are the three foundational rules for self-driving cars published by Germany this September, which ultimately ban AI systems from choosing to harm one group of people over another:

“1) It is clear that property damage always takes precedence over personal injury.”

“2) There must be no classification of people, for example on size, age and the like.”

“3) If something happens, the manufacturer is liable.”

The third rule seems controversial, since it puts the blame on automakers and assumes that the human driver could not have taken back control of the vehicle. The major autonomous-car companies have remained silent on these rules to date.


[Image Courtesy of Mercedes-Benz]

Hugo added:

“We believe this ethical question won’t be as relevant as people believe today. It will occur much less often. There are situations that today’s driver can’t handle, that—from the physical standpoint—we can’t prevent today and automated vehicles can’t prevent, either. [The self-driving car] will just be far better than the average [human] driver.”

What are your thoughts on this ethical dilemma, and the decision made by Mercedes? Let us know in the comments section below!

1 Comment

  1. Stuart

    Wow, amazing. So a pedestrian, someone who chooses to make a journey not in a car, can be killed for doing so. That’s great, Mercedes.

    I firmly believe the risk should be greater for the person who chooses to travel by car, as they knowingly take that risk.
