Advances toward autonomous vehicles are happening quickly. As with any new technology, they raise a number of concerns. What role should the government play in regulating self-driving vehicles? How is negligence to be determined in the case of an accident? Autonomous driving will arrive soon, bringing with it a variety of ethical and moral questions that must be settled.
A self-driving vehicle could provide passengers with a very positive driving experience. According to an article by CBS News, autonomous vehicles could dramatically improve traffic efficiency, may decrease air pollution, and could eliminate over 89 percent of traffic accidents. A self-driving vehicle can do things a human may not be able to accomplish. It won't get distracted by things around it on the road, won't fall asleep at the wheel, won't text and drive, and certainly won't drink and drive as a human might. The central issue with autonomous vehicles is how they can make the kinds of life-or-death decisions that are instinctive to people.
According to Scientific American, a study published in Science magazine examined the dilemmas facing people who purchase autonomous vehicles and the companies that manufacture them. The study surveyed over 1,900 participants. Most believed that a vehicle should be programmed to hit anything other than a pedestrian, and the majority believed this should happen even if it results in the driver and passengers of the vehicle being killed. Researchers believe this means that moral principles will need to be built into a self-driving vehicle's algorithm, with the goal of having guiding moral principles to handle situations in which harm is unavoidable.
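To make the idea of "guiding moral principles in an algorithm" concrete, here is a minimal, purely hypothetical sketch of one way such a rule could be expressed: as a cost function that weighs expected harm to pedestrians against expected harm to the car's occupants. Every name, number, and weight below is an illustrative assumption, not any manufacturer's actual system.

```python
# Hypothetical sketch: a harm-minimizing decision rule for situations
# where no maneuver is completely safe. Illustrative only.

from dataclasses import dataclass


@dataclass
class Maneuver:
    name: str
    pedestrian_harm: float  # assumed expected harm to people outside the car
    occupant_harm: float    # assumed expected harm to people inside the car


def choose_maneuver(options, occupant_weight=1.0):
    """Pick the maneuver with the lowest total expected harm.

    occupant_weight encodes the ethical dilemma from the survey:
    a value of 1.0 treats everyone equally; a value above 1.0
    favors protecting the car's occupants.
    """
    return min(
        options,
        key=lambda m: m.pedestrian_harm + occupant_weight * m.occupant_harm,
    )


options = [
    Maneuver("brake in lane", pedestrian_harm=0.8, occupant_harm=0.1),
    Maneuver("swerve off road", pedestrian_harm=0.0, occupant_harm=0.6),
]

# Weighing everyone equally, the rule accepts some occupant risk
# to spare the pedestrian entirely.
print(choose_maneuver(options).name)                      # swerve off road

# Heavily favoring the occupants flips the decision.
print(choose_maneuver(options, occupant_weight=3.0).name)  # brake in lane
```

The point of the sketch is not the numbers but the structure: someone must choose `occupant_weight`, and as the survey suggests, society's answer changes depending on whether people imagine themselves inside or outside the car.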
It is believed vehicle manufacturers will have the responsibility of programming ethics that match the ethics of society, but what exactly are those ethics? Who would support a situation in which, to avoid hitting a mother pushing a stroller across a street, a mother riding in an autonomous car with her children swerves and risks going off a bridge? In the Science article, over 70 percent of survey participants believed it would be better to sacrifice a driver and passenger than to kill a dozen pedestrians. Yet these same participants also said they would want to ride in an autonomous vehicle that protects its occupants at all costs.
It is still not certain how autonomous vehicles will affect insurance costs. Technology is not perfect; it is as flawed as the humans who created it. Even with self-driving cars, there will be accidents that cause property damage, injuries, and even death. Things can go wrong with any system as sophisticated as a computer. According to Wired magazine, even automated cars will have accidents caused by software bugs, misaligned sensors, and a variety of unanticipated technical issues. There will also be human-centric problems, such as improper servicing, people altering the software, modifying the vehicles, and more.
The National Highway Traffic Safety Administration (NHTSA) is analyzing what regulations will be needed for autonomous vehicles. The agency is expected to release a series of guidelines covering self-driving vehicles in the near future, though it has not yet held any public discussions on the ethical concerns raised by this technology. Some regulations are already in place. California has enacted rules focused on the prototype self-driving vehicles being tested in the state: a safety driver must always be present and prepared to take over the vehicle if necessary, and a company must file a report for every occasion on which a human had to take control of a self-driving vehicle.
Should the certification standard for an autonomous vehicle be higher than the one for human drivers? Such vehicles may need to do more than simply follow traffic laws and vehicle codes. Some observers are concerned about what happens when a vehicle must break the law to avoid an emergency, such as crossing into the wrong lane to avoid a collision.