Do you run the child over or swerve into a tree, which could kill you and your passenger?
It sounds like a test to determine someone’s level of psychopathy, but it’s actually a choice the developers of autonomous vehicles have been faced with since the idea first materialised.
In an interview with Car and Driver published last week, Christoph von Hugo, the manager of driver-assistance systems at Mercedes-Benz, revealed that the company's future autonomous vehicles would always put the driver first. In other words, in the above dilemma, they will be programmed to run over the child.
While either choice by the car's developers may feel uncomfortable, it would be more dangerous if they didn't address such a situation at all: unless a self-driving vehicle is told what to do when a child runs into the road, it won't do anything.
Until this month, however, manufacturers had been quiet about what would happen under these circumstances. Speaking at the Paris Auto Show, von Hugo told Car and Driver that all of the company's future Level 4 and Level 5 self-driving cars would be programmed to save the people they carry over anything else.
“If you know you can save at least one person, at least save that one. Save the one in the car,” von Hugo said, according to Car and Driver. “If all you know for sure is that one death can be prevented, then that’s your first priority.”
But does this viewpoint line up with that of potential buyers? A study published in the journal Science this year highlighted the ethical dilemmas facing those manufacturing self-driving cars, and surveyed what respondents thought would be the correct course of action: kill or be killed.
Just under 2,000 people were polled, and most believed that autonomous cars should always make the decision that causes the fewest fatalities. On the other hand, most also said they would buy one only if it meant their own safety was a priority. That suggests that, when it comes to selling cars at least, Mercedes has probably made the right call.