It may be many years before fully autonomous vehicles become operational and rules on liability become clearer
By Kiran N. Kumar
As self-driving cars appear on roads in increasing numbers, claims arising from accidents and questions of liability have been unnerving regulators and insurers alike.
Waymo, Google's self-driving car project and a leader among the companies testing autonomous vehicles on American roads, reported 18 accidents involving its cars over a period of 20 months, from January 2019 to September 2020, in Phoenix, Arizona.
In California alone, 187 accidents involving self-driving cars occurred between 2019 and 2021, though only two have been attributed to faulty autonomous systems, as per a research report by IDTechEx.
Manufacturers claim that these accidents tend to be less severe, yet self-driving cars currently have a higher accident rate than human-driven cars.
On average, there are 9.1 self-driving car accidents per million miles driven, while the same rate is 4.1 crashes per million miles for regular human-driven vehicles.
As of April 2022, 38 states had laws or executive orders in place to prepare for the changes that self-driving cars may usher in, but no state has outright banned the technology either.
The Federal Autonomous Vehicle Policy includes no new rules but guidance for states. It will likely be many years before fully autonomous vehicles become operational and rules on liability become clearer.
In fact, these accidents bring into the limelight an ever-looming gray area in our AI-aided future: who should be blamed when self-driving cars are involved in accidents?
With vehicles of varying levels of autonomy on the roads now, the blame cannot be assigned solely to the manufacturer, the service center, or the vehicle owner.
In Britain, a Law Commission directive issued in January this year framed the question as one of the driver's "duty of care," distinguishing two scenarios: one where the driver has handed control over to the automated driving system, and another where taking back control would have been possible but the driver failed to do so.
Current self-driving cars in the US, though small in number, typically include manual controls for backup safety drivers to meet federal safety standards.
More than 30 companies or organizations are permitted to test highly automated or self-driving vehicles on US roadways, according to National Highway Traffic Safety Administration (NHTSA).
According to some legal studies on liability, where a fully autonomous vehicle is involved, the responsibility for avoiding an accident shifts entirely from the driver to the vehicle and its software.
It could be the vehicle manufacturer, or some other party involved in the design, manufacture, or operation of the autonomous vehicle.
Since a computerized driver replaces a human one, it is the companies behind the software and hardware that sit in the legal liability chain, not the car owner or the owner's insurance company, some experts argue. They believe that eventually, and inevitably, the carmakers will have to take the blame.
The US National Highway Traffic Safety Administration in March 2022 issued its final rules eliminating the requirement for manual driving controls in highly automated and self-driving vehicles.
However, it underscored that these cars “must continue to provide the same high levels of occupant protection as current passenger vehicles” and children should not occupy what is traditionally known as the “driver’s position”.
Now that manufacturers have the option to build and deploy self-driving vehicles without human controls such as steering wheels or brake pedals, fully autonomous vehicles may begin to appear on roads.
Tesla, GM, Google and other auto manufacturers have invested billions in developing driverless vehicle technology. But no vehicles on the roads are truly driverless yet; those available today are capable of little more than keeping to marked lanes and automatically braking when a hazard is detected.
Since the technology currently available in the US falls short of full autonomous driving, legal liability for violations by vehicles using automated driving technology rests with the driver.
According to MIT researcher Ashley Nunes, even if driverless technology does arrive, human intervention will still be necessary to service the vehicles and prevent illegal activity.
As things stand now, if you're using autopilot or self-parking mode and get into a collision, the fault still lies with the driver, not the technology.