Self-Driving Cars: A Moral Dilemma?



Self-driving cars are accelerating into our lives with no signs of stopping. After almost a decade of work, Google's self-driving car project, Waymo, is already giving rides in Phoenix. Its cars can interpret hand signals from oncoming cyclists and can drive thousands of miles without human intervention. Now the challenge is getting cars like this onto the mass market while keeping them both affordable and safe.

Many believe there is a moral dilemma inherent in smart cars. Researchers have run online studies on how consumers feel about sharing the road with driverless cars. Participants agreed that, from a moral standpoint, automated cars should be designed to protect the greatest number of people, “even if they must be programmed to kill their passengers to do so.”

However, the interesting twist in this survey is that respondents said they would ultimately want to purchase a vehicle programmed to protect its passengers, even at the cost of any number of pedestrians. This is the moral dilemma facing the manufacturers of smart cars: how much do the people inside the car matter compared to those outside it?

Example of decisions a smart car would have to make (Image by Bonnefon, Shariff and Rahwan)


These smart cars would have to learn how to avoid accidents, but they cannot fully weigh the costs of every outcome. The central dilemma discussed in the survey was how a car should respond when pedestrians are in its path. Should the car hold course into the pedestrians? Or should it sacrifice its own passenger to save a pedestrian's life? Ethical questions like these are among the many problems facing the makers of self-driving cars.
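To make the two philosophies from the survey concrete, here is a deliberately simplified sketch of how they differ as decision rules. The function names, casualty counts, and scenario are illustrative assumptions for this article, not any manufacturer's actual logic:

```python
# Hypothetical sketch of the two programming philosophies the survey
# contrasts: a "utilitarian" car that minimizes total deaths versus a
# "self-protective" car that weights its passengers above all else.
# All names and numbers here are illustrative, not real vehicle code.

def utilitarian_choice(stay_casualties, swerve_casualties):
    """Pick whichever maneuver causes fewer deaths in total."""
    if sum(stay_casualties.values()) <= sum(swerve_casualties.values()):
        return "stay"
    return "swerve"

def self_protective_choice(stay_casualties, swerve_casualties):
    """Pick whichever maneuver puts the passengers at least risk,
    breaking ties by total deaths."""
    def key(c):
        return (c["passengers"], sum(c.values()))
    return "stay" if key(stay_casualties) <= key(swerve_casualties) else "swerve"

# Scenario: staying on course hits three pedestrians; swerving into a
# barrier kills the car's one passenger.
stay = {"passengers": 0, "pedestrians": 3}
swerve = {"passengers": 1, "pedestrians": 0}

print(utilitarian_choice(stay, swerve))      # swerve: one death instead of three
print(self_protective_choice(stay, swerve))  # stay: protects the passenger
```

The two rules disagree on exactly the scenario the survey asked about, which is why respondents could endorse the first rule in the abstract while wanting to buy a car running the second.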

Another potential issue is the question of blame. If your self-driving car causes an accident, who is at fault? The company that programmed the vehicle? The answer will certainly affect insurance companies and legal liability.

According to the National Safety Council, as many as 40,000 people died in traffic accidents in 2016, making it the deadliest year in almost a decade. In 90 percent of those accidents, human error was to blame. By removing the human element, self-driving cars could actually help prevent accidents. The technology has not yet been perfected, but how safe is safe enough when it comes to automated vehicles?

If we wait for perfect, we’ll be waiting for a very, very long time.
— Mark Rosekind, National Highway Traffic Safety Administration

Enjoy cutting-edge technology? Check out:

Virtual Reality: Seeing Into the Future