[Ed. – It’s always something. In any case, for multiple reasons, self-driving cars are a bad idea.]
The newest tech dream seems to be self-driving cars, vehicles programmed to take us back and forth without the need for a human driver. But what will these machines be programmed to do during an impending catastrophic accident? At least one ethicist is warning that your self-driving car might be programmed to save the most lives during such a situation — killing you in the process. …
What if, for instance, your driverless car decides that your life is expendable in order to save a school bus full of children?
Are you comfortable with a soulless computer making a life-or-death decision when it is your life on the line? Should the computer in your driverless car be able to apply a “utilitarian” philosophy and save the lives of others even if yours is sacrificed to do so?
These are the questions with which ethicists are now wrestling.