It has been widely written that politicians, insurance companies, and lawyers will be among the many obstacles to quickly releasing self-driving cars into the mass market. However, I recently read an article which discusses certain moral issues that need to be resolved first as well.
In the article entitled “Why Self-Driving Cars Must Be Programmed To Kill”, the issue of ethical programming is discussed. Specifically, should the software be programmed to minimize loss of life, even if it means sacrificing the driver and occupants, or should it protect the occupants at all costs?
The article posits: “Imagine that in the not-too-distant future, you own a self-driving car. One day, while you are driving along, an unfortunate set of events causes the car to head toward a crowd of 10 people crossing the road. It cannot stop in time but it can avoid killing 10 people by steering into a wall. However, this collision would kill you, the owner and occupant. What should it do?”
Of course, many people will argue in favor of a utilitarian approach to minimize loss of life (i.e., that the car should kill as few people as possible). But would you buy a car that is programmed to kill you over others?
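To make the contrast concrete, here is a toy sketch of the two programming philosophies as decision rules. This is purely illustrative, not any real vehicle's logic; every name and number below is a hypothetical stand-in for the article's scenario.

```python
# Illustrative only: contrasts a "utilitarian" policy with an
# "occupant-first" policy on the article's wall-vs-crowd scenario.

def choose_maneuver(options, policy="utilitarian"):
    """Pick a maneuver from (name, expected_deaths, occupant_dies) tuples.

    policy="utilitarian"    -> minimize total expected deaths
    policy="occupant_first" -> discard any option that kills the occupant,
                               then minimize deaths among what's left
    """
    if policy == "occupant_first":
        safe = [o for o in options if not o[2]]
        if safe:
            options = safe
    return min(options, key=lambda o: o[1])

# The article's scenario: swerve into a wall (killing the occupant)
# or continue into a crowd of ten pedestrians.
options = [
    ("swerve_into_wall", 1, True),
    ("hit_crowd", 10, False),
]

print(choose_maneuver(options, "utilitarian")[0])     # swerve_into_wall
print(choose_maneuver(options, "occupant_first")[0])  # hit_crowd
```

The same inputs produce opposite choices under the two policies, which is exactly the dilemma: the rule has to be picked, in software, before the crash ever happens.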
Another moral dilemma involves motorcyclists. Should a self-driving car be programmed to hit a wall rather than a motorcyclist, knowing that the risk of injury is greater for the motorcyclist than for the driver of the car?
While the technology is great and will reduce accidents, there are obviously serious ethical issues (among other things) that may take years to work out.
That’s not a particularly good example as the survivability of hitting a wall in a car with modern safety features is quite high, but I understand the moral dilemma being presented.
The technology being developed now for autonomous vehicles uses multiple technologies to read the road, the surroundings, and other cars. Thermal sensors that perceive humans or other animals alert autonomous systems well before human eyes could (especially at night), making the example given unlikely once true autonomy arrives.
The systems being developed now will evaluate the speed, location, and priority of objects around the car and respond to avoid collisions at a level even humans are unable to anticipate. Will they be infallible? No, but nothing is.
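As a rough illustration of the speed/location/priority evaluation described above, here is a minimal sketch of ranking detected objects by collision threat. This is not any manufacturer's algorithm; the labels, numbers, and vulnerability weights are all hypothetical assumptions.

```python
import math

def time_to_collision(distance_m, closing_speed_ms):
    """Seconds until impact at the current closing speed (inf if separating)."""
    if closing_speed_ms <= 0:
        return math.inf
    return distance_m / closing_speed_ms

def threat_rank(objects):
    """Sort detected objects so the most urgent threat comes first.

    Each object: (label, distance_m, closing_speed_ms, vulnerability),
    where vulnerability weights pedestrians/motorcyclists above cars.
    """
    def urgency(obj):
        _, dist, speed, vuln = obj
        # Lower score = more urgent: close in time AND vulnerable.
        return time_to_collision(dist, speed) / vuln
    return sorted(objects, key=urgency)

detected = [
    ("parked_car", 40.0, 0.0, 1.0),   # not closing: no threat
    ("pedestrian", 30.0, 15.0, 3.0),  # 2 s to impact, highly vulnerable
    ("car_ahead", 20.0, 5.0, 1.0),    # 4 s to impact
]
print(threat_rank(detected)[0][0])  # pedestrian
```

A real system would fuse radar, lidar, camera, and thermal inputs and update this ranking many times per second, but the basic idea (time-to-collision weighted by vulnerability) is the kind of evaluation being described.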
What will likely happen is that the number of people severely injured or killed in traffic accidents each year will drop drastically, and people commuting to and from work will nap, read, or work productively while their car finds the path of least resistance.
I’m a car guy who loves to drive and races cars. I shouldn’t want any part of this, but I’ve been to the University of Michigan’s M-City, an $11 million test facility built to test and develop autonomous cars in every kind of weather condition. Audi, BMW, Google, Apple and others have autonomous vehicles in various stages of development in different parts of the world, so you’d have to cover your eyes and ears while making loud noises not to see autonomy coming.
When autonomous cars start reducing lost life and injury while improving the quality of life for those in those cars, it’s hard to imagine they won’t be fully embraced.
People objected when seat belts, ABS, and air bags were first mandated as too much government interference, but the declining numbers of deaths and injuries only confirmed their impact.
Once they get the kinks worked out, autonomy will likely be as common and integrated as ABS, GPS, and air bags. Our relationship to cars is already changing, as good handling and reliability have relegated them to transportation appliances rather than social objects of personal expression.
Autonomy will also drastically reduce the number of traffic tickets issued if it means cars don’t speed, but that kind of change is years away, making it likely our grandkids will someday be asking, “What’s a manual transmission?” and “Why did people get speeding tickets?” 🙂
Thanks for sharing your insightful comments on my post. The post certainly raises nettlesome philosophical issues.
That gives one something to think about. As a motorcyclist, I am glad you brought that up. Does the driver have ultimate control of the car? Years of driving have shown that you cannot prevent all accidents, and sometimes you can only minimize the damage. It is a tough problem you give us. I don’t think I would want the car to kill us if we were the lesser evil.
Thanks for sharing your thoughts on these tricky issues. It will be interesting to see how they play out.