
Who’s Responsible for an Autonomous Vehicle Accident?


Driver error causes 94% of conventional (non-autonomous) car crashes. Both fully and partially autonomous vehicles (AVs) could improve that number substantially once the technology matures. In the meantime, there have been over 30 accidents involving AVs in the state of California since 2014. Clearly, crashes, injuries, and fatalities will not simply disappear if and when AVs become more prevalent. AVs will crash into one another, into conventional cars, and into pedestrians and bicyclists, too.

So, the question is: When it comes to accidents involving AVs, who is responsible?

No clear-cut answer

Experts say that when a computerized driver replaces a human one, the companies behind the software and hardware sit in the legal liability hot seat, not the car owner or the person’s insurance company. But the line between human and machine liability isn’t always clear.

AVs are being tested in various cities across the United States. Because they are not yet fully autonomous, they require a human driver to pay attention. In several accidents to date, the AV was warning its driver to disengage autopilot mode and take control of the vehicle. Does that mean the human driver is responsible for the crash?

After the AV test phase, when individuals actually choose to ride or drive in them rather than simply being part of the test field, will there be a shift in legal responsibility and blame?

In the case of the fatal AV accident in Tempe, Arizona, on a Sunday night last February, the victim appears to have stepped off a median into a dark roadway while jaywalking, and the initial police investigation indicated that the pedestrian may have been at fault.

Eventually, though, AVs may not have a human at the wheel, or even a steering wheel, so how could a human passenger intervene? At that point, carmakers will inevitably have to take responsibility. Carmakers like Volvo, Mercedes, and Google are confident, however, that by the time fully autonomous driving becomes a reality, their technologies will be so buttoned up that the driver can be taken out of the operation and liability picture almost entirely.

Get AV-savvy

While there’s still much uncertainty about the efficacy and usability of human-AV interaction, the AV era is forging ahead and continues to promise improved driving safety. Get a handle on the foundational and practical applications of autonomous, connected and intelligent vehicle technologies with Fundamentals of Autonomous Vehicle Technology, coming soon to IEEE. This six-course program was developed by some of the leading experts in AV technologies. To learn more about course titles and content and to pre-order the program, connect with an IEEE Content Specialist.

Resources

Iozzio, Corinne. (1 May 2016). Who’s Responsible When a Self-Driving Car Crashes? Scientific American.

Galeon, Dom. (29 Jan 2018). Who Is Responsible When a Self-Driving Car Has an Accident? Futurism.

Bogost, Ian. (20 Mar 2018). Can You Sue a Robocar? The Atlantic.


