Self-driving car liability
Increases in the use of autonomous car technologies (e.g. advanced driver-assistance systems) are causing incremental shifts in the responsibility for driving, with the primary motivation of reducing the frequency of road accidents. Liability for incidents involving self-driving cars is a developing area of law and policy that will determine who is liable when a car causes physical damage to persons or property. As autonomous cars shift the responsibility for driving from humans to autonomous car technology, existing liability laws will need to evolve in order to fairly identify the appropriate remedies for damage and injury. As higher levels of autonomy are commercially introduced (SAE automation levels 3 and 4), the insurance industry stands to see commercial and product liability lines grow as a proportion of the market, while personal automobile insurance shrinks.
Current liability frameworks
There are three basic theories of tort liability: traditional negligence, no-fault liability and strict liability.
| Theory | Description |
| --- | --- |
| Traditional negligence | Driver is held liable for harms caused when reasonable care was not taken while operating the vehicle |
| No-fault | Crash victims may not sue the driver of the vehicle unless the injuries resulting from the crash are of a certain severity; victims are compensated through their own insurance |
| Strict liability | Applies to abnormally dangerous or "ultrahazardous" activities; the actors involved bear the associated costs regardless of whether they are legally at fault |
According to the National Motor Vehicle Crash Causation Survey, the driver was the critical reason in over 90% of crashes (an estimated 2 million crashes nationwide). Meanwhile, research from the Insurance Institute for Highway Safety (IIHS) shows that advanced driver-assistance systems, which are seen as stepping stones to Level 3 and 4 autonomy, have helped reduce accidents through forward collision warnings and automatic braking. Given these trends, increased use of autonomous vehicle technology could reduce the number of accidents and prevent crash-related deaths. Consequently, cases of traditional negligence are likely to fall, which would in turn reduce automobile-insurance costs.
With the onset of fully autonomous cars, the need for specialized automobile insurance may disappear, with health insurance and homeowner's liability insurance instead covering automobile crashes, much as they cover bicycle accidents today. Moreover, as cases of traditional negligence decrease, no-fault insurance systems become attractive: they would compensate victims relatively quickly, and compensation would not depend on identifying a party at fault. Such systems would protect individual drivers well and would encourage the adoption of autonomous cars for their safety and cost-related benefits.
Product liability governs the liability of manufacturers in terms of negligence and strict liability.
| Theory | Description |
| --- | --- |
| Negligence | Manufacturers must exercise reasonable care in designing their products to be safe under potential use cases |
| Strict liability | Manufacturer is held strictly liable for damages even when it has exercised all possible care to remove defects |
The prospect of product liability lawsuits gives autonomous car manufacturers an incentive to reduce the danger of their products as much as they can within a reasonable cost structure. Strict liability, however, covers an expansive range of potential harms that manufacturers may find difficult to protect against; instead of reducing less cost-effective risks, manufacturers may to some degree pass the potential costs of liability on to consumers through higher prices.
Furthermore, product liability cases distinguish among various types of defects.
| Defect type | Description |
| --- | --- |
| Manufacturing defects | The product does not meet the manufacturer's specifications and standards |
| Design defects | Foreseeable risks of harm could have been reduced by use of an alternative design |
| Failure to warn | The manufacturer fails in its duty to provide instruction about how the product can be safely used and does not provide adequate warning of its risks |
Under a manufacturing-defect theory, a plaintiff needs to show that the autonomous car failed to work as specified by the manufacturer. In the case of autonomous cars, however, this presents a major hurdle because no court has applied manufacturing-defect doctrine to software, which is not a tangible manufactured object. Incorrect performance of the technology system is called a "malfunction", meaning a coding error within the system caused the accident; when there is a coding error, the controlling software may not have functioned as its authors originally intended. If a crash stems from a software error, then traditional product liability law on manufacturing defects may not suffice. A greater understanding of how software will be treated under this liability law, particularly when a software error causes physical parts to malfunction, needs to be developed.
Historically, courts have used two tests for defectiveness of design: consumer-expectations and cost-benefit.
Consumer-expectations: "A product is defective in design or formulation when it is more dangerous than an ordinary consumer would expect when used in an intended or reasonably foreseeable manner. Moreover, the question of what an ordinary consumer expects in terms of the risks posed by the product is generally one for the trier of fact."
On the other hand, the cost-benefit test weighs the benefits against the costs of a product in determining whether a design is defective. With autonomous cars, the plaintiff could make the argument that a different design, whether in the physical features of the vehicle or in the software that controls the movements of the vehicle, could have made the vehicle safer. For plaintiffs, this creates a high burden of proof and also makes it difficult to find qualified experts.
In asking "who do I sue," a plaintiff in a traditional car crash would assign blame to the driver or the car manufacturer, depending on the cause of the crash. In a crash involving an autonomous car, a plaintiff may have four options to pursue.
- Operator of the vehicle: in Florida and Nevada, an operator is defined as a person who causes the autonomous technology to engage, regardless of whether the person is physically in the vehicle. California, on the other hand, defines an operator as “the person who is seated in the driver’s seat, or, if there is no person in the driver’s seat, causes the autonomous technology to engage.” The viability of a claim against the operator will depend on the level of autonomy. For instance, if the autonomous technology allows the passenger to cede full control to the vehicle, then the passenger will likely not be found at fault for a crash caused by the technology.
- Car manufacturer: with this option, a plaintiff will need to determine whether the manufacturer had a part in installing autonomous technology into the vehicle. States such as Florida, however, are providing protection by limiting product liability for manufacturers.
- Company that created the finished autonomous car: Volvo is an example of a manufacturer who has pledged to take full responsibility for accidents caused by its self-driving technology.
- Company that created the autonomous car technology: Companies under this option could include those developing the software behind the autonomous car and those manufacturing the sensor systems that allow a vehicle to detect its surroundings.
In defense against such liabilities, autonomous vehicle manufacturers could raise arguments of comparative negligence, product misuse, and state of the art. With comparative negligence, driver or passenger interference is seen as a contributing cause of the harm and injury. With product misuse, the driver or passenger may be at fault for disregarding directions or altering the vehicle in a way that affects its proper performance. With state of the art, manufacturers could argue that no safer alternative design existed at the time of manufacture.
As cars become more interconnected and autonomous, the potential for hacking a car system to acquire data and cause harm poses a serious risk. For manufacturers and developers of autonomous technology, liability exposures arise from the collection and storage of data and personal information in the vehicle and in the cloud. Currently, manufacturers require indemnification from vendors and subcontractors (dealerships, repair/installation facilities, etc.) and this practice will likely be extended to autonomous technology developers.
Transportation systems are vital to autonomous vehicles because they serve as the coordinating authority, and as multiple autonomous vehicle systems are used together to increase efficiency, the risk of exposure to malicious attacks will increase dramatically. To protect these systems, cyber-physical security must be implemented across the autonomous dynamical subsystems that govern decision-making, interaction, and control.
Policy considerations (US)
Manufacturers bearing excessive costs
As argued in the article “The Coming Collision Between Autonomous Vehicles and the Liability System” by Gary Marchant and Rachel Lindor, it is impossible for a manufacturer to anticipate all possible scenarios that an autonomous car will encounter. While the manufacturer will design the system to minimize risks in the situations that it does anticipate, the most damaging and costly accidents will be those that the manufacturer fails to anticipate. This leaves the manufacturer highly vulnerable to design-defect claims, particularly under the cost-benefit test.
In light of this, Marchant and Lindor argue that “the technology is potentially doomed...because the liability burden on the manufacturer may be prohibitive of further development. Thus, even though an autonomous vehicle may be safer overall than a conventional vehicle, it will shift the responsibility for accidents, and hence liability, from drivers to manufacturers. The shift will push the manufacturer away from the socially-optimal outcome—to develop the autonomous vehicle.”
Consequently, policymakers need to be mindful of manufacturers bearing excessive liability costs and the potential consequences that may result, such as higher consumer prices and delays in introducing autonomous car technology. In the report “Autonomous Vehicle Technology” by the RAND Corporation, the authors recommend that policymakers consider approaches such as tort preemption, a federal insurance backstop, and long-term cost-benefit analysis of the legal standard for reasonableness. These approaches attempt to align the private and public costs of autonomous car technology such that adoption is not unnecessarily delayed and no one party bears a disproportionate share of the costs.
In September 2016, the National Highway Traffic Safety Administration released a policy report to accelerate the adoption of autonomous car technology (or HAVs, highly automated vehicles) and provide guidelines for an initial regulatory framework. The key points are:
- States are responsible for determining liability rules for HAVs. States should consider how to allocate liability among HAV owners, operators, passengers, manufacturers, and others when a crash occurs.
- Determination of who or what is the “driver” of an HAV in a given circumstance does not necessarily determine liability for crashes involving that HAV.
- Rules and laws allocating tort liability could have a significant effect on both consumer acceptance of HAVs and their rate of deployment. Such rules also could have a substantial effect on the level and incidence of automobile liability insurance costs in jurisdictions in which HAVs operate.
- In the future, the States may identify additional liability issues and seek to develop consistent solutions. It may be desirable to create a commission to study liability and insurance issues and make recommendations to the States.
H.R. 3388, the SELF DRIVE Act of 2017
- Advance safety by prioritizing the protection of consumers.
- Reaffirm the role and responsibilities of federal and state governments.
- Update the Federal Motor Vehicle Safety Standards to account for advances in technology and the evolution of highly automated vehicles.
The Federal Government, with the passing of the SELF DRIVE Act, is limiting the role of States and this could signal a change in the future of liability laws. With the Federal Government also asserting that consumers will be protected, manufacturers may be at a liability disadvantage and stand to lose surplus. Updating the Federal Motor Vehicle Safety Standards will affect liability law. These laws will continue to protect the consumer while placing stricter standards on producers. The Federal Government has yet to announce any specific autonomous vehicular manslaughter liability laws.
Artificial intelligence and liability
More broadly, any software with access to the real world, including autonomous vehicles and robots, can cause property damage, injury, and death. This raises questions about civil liability or criminal responsibility.
- Perpetrator via another - the programmer (software designer) or the user could be held liable for directly instructing the AI entity to commit the crime. This is used in conventional law when a person instructs or directly causes an animal or person incapable of criminal responsibility (such as a young child or a person with a severe mental disability) to commit a crime.
- Natural and probable consequence - the programmer or the user could be held liable for causing the AI entity to commit a crime as a consequence of its natural operation. For example, if a human obstructs the work of a factory robot and the AI decides to squash the human as the easiest way to clear the obstruction to continue working, if this outcome was likely and the programmer knew or should have known that, the programmer could be held criminally liable.
- Direct liability - the AI system has demonstrated the elements of a recognized theory of criminal liability. Strict liability offenses (like speeding) simply require an action (actus reus), but "conventional" offenses (like murder) require an intention (a type of mens rea). Criminal negligence involves non-performance of a duty in the face of evidence of possible harm. Legally, courts may be capable under existing laws of assigning criminal liability to the AI system of an existing self-driving car for speeding; however, it is not clear that this would be a useful thing for a court to do.
Kingston identifies two areas of law, depending on the type of entity:
- For products, product liability laws apply, including enforcement of warranties.
- For services, the tort of negligence may apply if the system failed to perform up to its duty of care.
The NHTSA investigation of a fatal 2016 crash involving Tesla Autopilot proceeded as an automobile product safety inquiry, and determined that despite the crash there were no defects that required a recall (though Tesla is working to improve the software to avoid similar crashes). Autopilot only gives cars limited autonomy, and human drivers are expected to maintain situational awareness and take over as needed.
With fully autonomous vehicles, the software and vehicle manufacturers are expected to be liable for any at-fault accidents (under existing automobile product liability laws), rather than the human occupants, the owner, or the owner's insurance company. Volvo has already announced that it will pay for any injuries or damage caused by its fully autonomous software, which it expects to start selling in 2020. Starting in 2012, some U.S. states have passed laws or regulations specifically regarding autonomous car testing, certification, and sales, with some issuing special driver's licenses; this remains an active area of lawmaking. Human occupants would still be liable for actions they directed, such as choosing where to park (and thus for parking tickets).
University of South Carolina law professor Bryant Walker Smith points out that with automated systems, considerably more data will typically be available than with human-driver crashes, allowing more reliable and detailed assessment of liability. He also predicted that comparisons between how an automated system responds and how a human would have or should have responded will be used to help determine fault.
State level legislation
According to the NHTSA, states retain their responsibility for motor vehicle insurance and liability regimes, among other traditional responsibilities such as vehicle licensing and registration and traffic laws and enforcement. Several states, such as Michigan and Nevada, and Washington D.C. have explicitly written provisions for how liability will be treated.
Enacted autonomous vehicle legislation
| State | Bill Number | Relevant Provisions | Effective Date |
| --- | --- | --- | --- |
| Michigan | SB 663 (2013) | Limits liability of vehicle manufacturer or upfitter for damages in a product liability suit resulting from modifications made by a third party to an automated vehicle or automated vehicle technology under certain circumstances; relates to automated mode conversions | Enacted and chaptered on Dec. 26, 2013 |
| Nevada | SB 313 (2013) | Provides that the manufacturer of a vehicle that has been converted to an autonomous vehicle by a third party is immune from liability for certain injuries | Enacted and chaptered on June 2, 2013 |
| Washington, D.C. | 2012 DC B 19-0931 | Restricts conversion to recent vehicles, and addresses liability of the original manufacturer of a converted vehicle | Enacted and effective from April 23, 2013 |
Arizona's Republican Gov. Doug Ducey's new rules, implemented March 1, lay out a specific list of licensing and registration requirements for autonomous car operators. Specifically, Ducey's order specifies that a “person” subject to the laws includes any corporation incorporated in Arizona.
Shift in auto insurance marketplace
In a white paper titled “Marketplace of Change: Automobile Insurance in the Era of Autonomous Vehicles,” KPMG estimated that personal auto insurance accounted for 87% of losses in 2013, while commercial auto accounted for 13%. By 2040, personal auto is projected to fall to 58%, while commercial auto rises to 28% and products liability gains 14%. This reflects the view that personal liability will fall as the responsibility for driving shifts to the vehicle and that mobility on demand will take greater hold. In addition, the overall pool of losses covered by liability policies is expected to shrink as autonomous cars cause fewer accidents.
Although KPMG cautions that this elimination of excess capacity will bring about significant changes to the insurance industry, 32% of insurance firm leaders expect that driverless vehicles will have no material effect on the insurance industry over the next 10 years. Inaction by the large players has opened up opportunities for new entrants. For example, Metromile, an insurance provider start-up founded in 2011, has started to offer usage-based insurance for low-mileage drivers and designed a policy to complement the commercial coverage of Uber drivers.
Public statements from car manufacturers
In 2015, Volvo issued a press release stating that Volvo would accept full liability whenever its cars are in autonomous mode. Håkan Samuelsson, President and Chief Executive of Volvo Cars, went further, urging "regulators to work closely with car makers to solve controversial outstanding issues such as questions over legal liability in the event that a self-driving car is involved in a crash or hacked by a criminal third party."
In an IEEE article, the senior technical leader for safety and driver support technologies at Volvo echoed a similar sentiment saying, “if we made a mistake in designing the brakes or writing the software, it is not reasonable to put the liability on the customer...we say to the customer, you can spend time on something else, we take responsibility.”
References
- Bertoncello, Michele; Wee, Dominik. "Ten ways autonomous driving could redefine the automotive world". www.mckinsey.com. Retrieved 11 December 2016.
- Slone, Sean. "State Laws on Autonomous Vehicles". Retrieved 11 December 2016.
- "Autonomous Vehicle Technology. A Guide for Policymakers". Retrieved 11 December 2016.
- "Marketplace of change: Automobile insurance in the era of autonomous vehicles".
- "Critical Reasons for Crashes Investigated in the National Motor Vehicle Crash Causation Survey". Retrieved 11 December 2016.
- "ADAS technology is reducing crashes". Retrieved 11 December 2016.
- "Nearly 10,000 Deaths Could Be Prevented and More Than $250 Billion Saved with Greater Use of Driver Assistance Technologies". Retrieved 11 December 2016.
- "Autonomous vehicles: The legal landscape in the US".
- Geistfeld, Mark A. (2017). "A Roadmap for Autonomous Vehicles: State Tort Liability, Automobile Insurance, and Federal Safety Regulation". California Law Review. 105 (6). doi:10.15779/z38416sz9r.
- "Donegal Mutual Insurance vs White Consolidated Industries Inc".
- "Autonomous Vehicles - Liability and Policy Issues" (PDF).
- "Autonomous, Self-driving Vehicles Legislation".
- "US urged to establish nationwide Federal guidelines for autonomous driving".
- "Considerations for Personal and Commercial Lines Insurers" (PDF).
- Mikulski, Dariusz (Fall 2015). "Special Issue: Modeling & Simulation for Cyber Security of Autonomous Vehicle Systems". The Journal of Defense Modeling and Simulation: Applications, Methodology, Technology. 12 (4): 359–361. doi:10.1177/1548512915604584.
- "The Coming Collision Between Autonomous Vehicles and the Liability System" (PDF).
- "Federal Automated Vehicles Policy". 2016-09-19.
- "House Passes Bipartisan Legislation Paving the Way for Self-Driving Cars on America's Roads - Energy and Commerce Committee". Energy and Commerce Committee. 2017-09-06. Retrieved 2017-11-30.
- Kang, Cecilia (2017-09-06). "Self-Driving Cars' Prospects Rise with Vote by House". The New York Times.
- "House panel approves legislation to speed deployment of". Reuters. 2017-07-27.
- "When an AI finally kills someone, who will be responsible?". March 12, 2018.
- Kingston, J. K. C. (December 2016). "Artificial Intelligence and Legal Liability". In International Conference on Innovative Techniques and Applications of Artificial Intelligence (pp. 269–279). Springer, Cham.
- Tesla’s Self-Driving System Cleared in Deadly Crash
- Who's Responsible When a Self-Driving Car Crashes?
- Automated Driving: Legislative and Regulatory Action
- After crash, injured motorcyclist accuses robot-driven vehicle of 'negligent driving'
- Felton, Ryan. "Why Uber Could Be Held Criminally Liable In Fatal Crash Involving Autonomous Car (Updated)".
- "Why You Shouldn't Worry About Liability for Self-Driving Car Accidents".