New York drivers who are concerned about the safety of autonomous vehicles may be interested to know that the programming these vehicles receive may actually make them unsafe. This is because the programs they are equipped with direct them to drive as humans would.

A computer science professor who researches the design of cyber-physical systems, with funding from the National Science Foundation and the National Institute of Standards and Technology, states that autonomous vehicle industry players are employing humans to guide the driving of autonomous vehicles. Because the vehicles copy humans, who are prone to making mistakes, they also copy their dangerous driving behaviors.

According to the professor, the industry is trying to sell the idea that autonomous vehicles can provide a human-like driving experience that is completely safe. However, it is precisely because human driving is the standard being used that the vehicles pose safety risks. To be safe, an autonomous vehicle should only be permitted to travel at speeds at which it can come to a complete stop within its range of vision, so that it can stop in time if an obstruction abruptly appears.
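The stopping-within-range-of-vision rule described above can be sketched as a simple calculation. This is only an illustration of the idea, not any standard the industry actually uses; the braking rate, reaction delay, and sensor range below are assumed figures chosen for the example.

```python
import math

def max_safe_speed(sensor_range_m, decel_mps2, reaction_s):
    """Highest speed (m/s) at which the vehicle can still come to a
    complete stop within its sensor range, allowing for a short
    reaction delay before braking begins.

    Total stopping distance = v * reaction_s + v**2 / (2 * decel_mps2).
    Setting that equal to sensor_range_m and solving the quadratic
    for v gives the expression below.
    """
    return decel_mps2 * (
        math.sqrt(reaction_s**2 + 2 * sensor_range_m / decel_mps2) - reaction_s
    )

# Assumed figures: 60 m of clear sensor range, 6 m/s^2 braking, 0.5 s delay.
v = max_safe_speed(60, 6, 0.5)
print(round(v, 1), "m/s =", round(v * 3.6, 1), "km/h")  # 24.0 m/s = 86.4 km/h
```

Under these assumptions the vehicle would have to stay at or below roughly 86 km/h; a shorter range of vision (fog, a curve, darkness) would force a proportionally lower speed.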

The safety standards applied to human drivers and autonomous cars are significantly different. Accidents caused by humans are perceived as unfortunate but expected, as humans are known to make mistakes. Autonomous vehicles are perceived as completely safe, which is why an accident caused by one puts the entire industry in jeopardy.

At present, most car accidents are caused by human error. A person who has been injured in a collision caused by another motorist's negligence may benefit from the help of an attorney in seeking appropriate compensation.