We’ve briefly discussed some of the safety issues associated with self-driving cars: what to do after an accident caused by one, what to do if your own car was involved, and who is responsible for these accidents in general. However, there have been a number of recent developments with respect to legal liability in these accidents that need to be addressed.
Specifically, a state prosecutor recently determined that Uber is not criminally liable for the death of a pedestrian who was fatally struck by one of its self-driving cars last year. At the time of the crash, the car was in computer control mode, although a human operator was behind the wheel.
Determining Responsibility & Liability
According to the National Transportation Safety Board, right before the crash, the car classified the pedestrian as an “unknown object,” then a “vehicle,” then a “bicycle,” but never activated the emergency braking system because the car was in computer control mode. In those circumstances, the cars are designed to rely on the human operator to intervene and avoid accidents, because the technology essentially ignores any object in the road that it does not register as an obstacle to driving.
Still, this does not mean that Uber and the driver cannot be held liable in a civil action, and, in the driver’s case, criminal charges may still be appropriate. The driver, reportedly an Uber employee, was looking down and streaming a television show at the time of the accident and may now face manslaughter charges. Not only could Uber still face a lawsuit in civil court, but Uber’s managers and employees could also be sued, since test drivers are supposed to intervene in exactly these situations. While the victim’s family has reached a settlement with Uber, the family has also sued the city, alleging that the crosswalk was poorly designed.
Experts have also weighed in on the prosecutor’s decision not to pursue criminal charges. It is difficult, in general, to hold companies criminally liable unless they purposely overlooked a major defect that they knew would place lives at risk. Still, the decision leaves a number of unresolved issues for regulators to work through as they try to navigate this new technology and make it safe for everyone on the roads.
What About Lithium-Ion Batteries & Autopilot Features in Other Self-Driving Vehicles?
While this may be the first fatal accident for Uber, Tesla has, unfortunately, had its fair share of accidents involving its Model S vehicles and the Autopilot feature. Not only have there been a number of fatal accidents involving the car, including here in Georgia, but several of these crashes have also led to massive fires when the lithium-ion batteries reignite, over and over, after the collision. These accidents pose a serious danger not only to those in the car, but arguably to anyone near it as well.

The Autopilot feature in the Tesla Model S is advertised as being able to “automatically change lanes” and “brake for you”; however, the feature has also been implicated when Tesla drivers have struck a number of tractor-trailers. The National Transportation Safety Board has already determined that Tesla gives drivers too much leeway and still lacks proper safeguards. This is especially concerning given that these cars do not appear to know how to respond to police sirens and pull over when instructed. There is also no apparent way for police to commandeer the cars, which leaves law enforcement in a difficult position when something goes wrong on the road.
The Model X version has also started receiving safety complaints, with reports that Autopilot becomes confused by diagonal white lines on the road. In these cases, the drivers were not necessarily distracted, nor were they driving negligently in any other way; instead, they reported being unable to course-correct to avoid the accident because the steering wheel locked up. According to Tesla, there was one car accident recorded for every 2.91 million miles driven with Autopilot during the fall of 2018, and there have been at least two deaths since 2016.
When Will Self-Driving Cars Be Safe Enough?
It is generally accepted that, one day, self-driving cars will be safer than human drivers, and that safety issues arising with these cars should therefore be judged against one another rather than against human-driven vehicles. Still, even though in this case the test driver made a huge oversight that arguably cost someone their life, test drivers will not be present at all once these companies eventually deploy the vehicles, an issue that must be addressed before more are allowed on the road.
Some have also pointed out that it may be impossible to eliminate every risk posed by these vehicles, an assessment that was certainly never performed for regular human-driven vehicles. Still, the accident has clearly had an effect: although companies continue to test the vehicles, efforts to pass legislation putting these cars on the road have slowed significantly since it occurred.
Contact Our Top Kennesaw Car Accident Lawyers
The Roger Ghai Law Offices is the premier law firm serving car accident victims who suffer injuries and other damages in Kennesaw, Georgia. If you have been hurt in an accident that was not your fault, whether it involved a self-driving or a human-driven vehicle, you have the right to recover the damages you need to get back on the road to recovery, and you should not, under any circumstances, have to negotiate and hassle with insurance companies to obtain that recovery. Contact our attorneys today to find out more about our services.