William H. Widen, a professor at the University of Miami School of Law, and Philip Koopman, an associate professor at Carnegie Mellon University, examine potential criminal liability for events that occur while operating automated motor vehicles…
Current events highlight the potential criminal liability of operating an automated motor vehicle. In Arizona, the safety driver in an Uber test vehicle pleaded guilty to a felony in response to a negligent homicide charge for a death that occurred while the Automated Driving System (ADS) was in operation. Shortly before that, a Tesla owner was charged with negligent homicide for deaths that occurred while operating Tesla Autopilot, which automates vehicle control under the driver’s supervision.
In both cases, the automation was controlling the vehicle’s braking, speed and steering at the time of the accident. Prosecutors in both cases brought criminal charges against the human operator on the theory that, despite the use of the automation system, each driver retained ultimate responsibility for the safe operation of the vehicle. Attributing liability to the human operator in these cases is consistent with the limited existing case law. However, the plea resolution left unaddressed the very real question of reliance on automation as an excuse, although it may have been a mitigating factor in a sentence that included no prison time.
Under the SAE J3016 terminology standard, the level of automation assigned to a driving feature depends on the manufacturer’s design intent. Tesla states that the automation involved in the California deaths was a Level 2 feature. Level 2 requires the human driver to remain alert at all times, ready to take immediate control of the vehicle to avoid an accident or dangerous situation. The Tesla owner’s manual likewise requires the driver’s constant attention. Uber says its robotaxi operates at Society of Automotive Engineers (SAE) Level 4, which requires no human intervention in a series-production vehicle. But during the test drive that led to the death, Uber assigned its safety driver responsibility for intervening to prevent an accident, just as with a Level 2 Tesla.
Enter the Level 3 Mercedes-Benz, approved for deployment in Nevada and California. This series-production Level 3 driving feature does not require the human driver to remain alert at all times. It even envisions that a human driver may focus on other tasks while the automated driving system is engaged (for example, reading a book or watching a video). However, as part of the vehicle’s safety concept, the human driver must still be able to respond to the system’s request that the human driver take control of the vehicle. UNECE Regulation No. 157 for Automated Lane Keeping Systems (ALKS), used as the basis for operating approval in Europe, sets a grace period of 10 seconds after which a human driver is expected to take control.
The law should be reformed to clarify several points of responsibility. First, the manufacturer’s declared design intent for a vehicle feature does not necessarily control the legal determination of criminal liability. A court currently remains free to decide that the operator of a Level 3 vehicle is liable for a failure that occurs while the automation is engaged, just as in the Arizona and California cases. Mercedes does not get to dictate the standard of care through a paragraph in an owner’s manual or a press release. The law may hold the operator liable while using any type of automation. Manufacturers would be wise to seek clarity from state legislatures, because certainty provides assurance to their customers. Selling a Level 3 product in which the human operator faces potential criminal liability at all times would present a significant marketing problem.
Second, there is the thorny issue of liability in the aftermath of a takeover request. Is the operator of a Level 3 vehicle potentially liable for any accident immediately after the takeover request, or only for accidents that occur after a grace period such as the 10 seconds specified for ALKS (a European standard that has not been adopted anywhere in the United States)? Will there be real-world scenarios in which a reasonable driver cannot take control within the 10-second grace period? This may occur if the automated vehicle places the driver in an untenable or unrecoverable position at the time of the takeover request.
Third, there is the issue of potential liability for manufacturers. In those cases where the operator of a Level 3 vehicle does not face potential criminal liability, does the manufacturer face potential liability instead? This might arise, for example, if an automated vehicle exceeds the speed limit by enough to constitute a felony under state law without issuing a takeover request, or runs a red light and causes a fatal accident.
Primarily with regard to civil tort liability, the motor vehicle industry has repeatedly asserted that existing laws and legal frameworks are adequate to address liability issues. The criminal law concerns identified above demonstrate that this is not the case. Automated vehicle technology is new enough that society needs new legal approaches to account for the differences between old and new ways of driving.
William H. Widen is a professor at the University of Miami School of Law, Coral Gables, Florida, where he researches the regulatory implications of autonomous vehicles. Philip Koopman is an associate professor at Carnegie Mellon University, Pittsburgh, Pennsylvania, specializing in the safety of autonomous vehicles.
Suggested citation: William H. Widen and Philip Koopman, Level 3 Automated Vehicles and Criminal Law, JURIST – Academic Commentary, 8 Aug. 2023, https://www.jurist.org/commentary/2023/08/widen-koopman-automated-vehicles-criminal-law/.
The opinions expressed in the JURIST Commentary are the sole responsibility of the author and do not necessarily reflect the views of the JURIST editors, staff, donors, or the University of Pittsburgh.