When they crash, self-driving Mercedes cars will be programmed to save the driver, not the person or people they hit. That’s the design decision behind Mercedes-Benz’s future Level 4 and Level 5 autonomous cars, according to the company’s manager of driverless car safety, Christoph von Hugo. Instead of wrestling with troublesome details like ethics, Mercedes will simply program its cars to save the driver and the car’s occupants in every situation.
One of the biggest debates about driverless cars concerns the moral choices baked into a car’s algorithms. Say the car is spinning out of control, on course to hit a crowd queuing at a bus stop. It can correct its course, but in doing so it will certainly kill a cyclist. What does it do? Mercedes’s answer to this version of the classic Trolley Problem is to hit whichever target is least likely to hurt the people inside its cars. If that means taking out a crowd of kids waiting for the bus, then so be it.
That’s a callous example, but it shows how we think, as opposed to how cars, and by extension their engineers, think. “If you know you can save at least one person, at least save that one. Save the one in the car,” von Hugo told Car and Driver in an interview. “If all you know for sure is that one death can be prevented, then that’s your first priority.”
So far, the highest-profile death in a self-driving car came when a Tesla crashed on May 7, 2016, while in Autopilot mode. The car didn’t see a semi truck pull out across the road ahead and drove under it, killing the driver, Joshua Brown. That was a straight-up error, but future crashes will involve the car deliberately pointing itself at humans, in all likelihood killing them.
CONTINUE @ FAST COMPANY