While they may seem years away from reality, self-driving cars are indeed the future. Not only are they the next step in improving road safety by preventing human error, but the world of tomorrow is also going to be built around artificial intelligence (AI).
However, like all pieces of technology, self-driving cars are bound to fail at some point. The more complicated the design, the more points of failure it contains, and this is especially true of any first-generation product. Self-driving cars are no exception, and they may even be more prone to failure, as demonstrated by the first fatal accident involving a self-driving car.
This begs the question of who should be held liable in such an event.
Who Should Be Held Liable?
The realm of law is complex, which is why cases are pursued under the legal principle of stare decisis, which holds that courts should follow precedents: prior court decisions in cases with circumstances similar to the one at hand.
Because there aren’t any prior cases with the same scenario, there are no precedents upon which to base a legal argument. Thus, in order to properly determine liability, we need to define the obligations that each party had. This way, we can determine whether there was a failure to deliver what was promised. As with all things legal, the best course is to consult with lawyers, especially lawyers like these Seattle car accident attorneys who specialize in handling car accident cases.
What Happened in the Accident?
The investigation into the Elaine Herzberg accident mentioned above found that the self-driving car failed to detect a jaywalking pedestrian. Uber also had a safety driver inside the test car as a precaution. Unfortunately, the safety driver was distracted at the wheel and reacted only a second before impact.
Uber argued that the accident was the result of human error, but also acknowledged that its software had failed to detect a human being on the road. This prompted Uber to suspend self-driving car testing for nine months and to update its software to better account for pedestrians.
As for the case itself, it was never tried, as Herzberg’s family decided to settle the matter out of court. For the sake of discussion, however, based on the duties each party was held to, Uber was not necessarily liable, considering that it was merely testing its self-driving car.
Uber did not yet have any obligation to the riding public. While its software was indeed faulty, the testing phase exists precisely to detect such problems. The person at fault here was the safety driver, who was obliged to intervene whenever the car failed to respond properly. She failed to perform that duty, and because of her negligence, a person was killed.