
Will You Trust an AV to Make All the Driving Decisions?


While the technology behind autonomous vehicles (AVs) is progressing, some say public AV availability for general use is at least 15 years away. This is mainly due to concerns about AV reliability, especially the nearly impossible moral and ethical decisions that must be made by the car’s artificial intelligence (AI) system.

There’s a distinct lack of trust in automated decision making versus human judgment when it comes to traffic and pedestrians. New AVs are approaching the point where they can handle complex but everyday traffic situations that humans navigate with relative ease but that machines still find difficult to comprehend and react to quickly and safely.

Predicting the actions of unpredictable drivers and pedestrians is one of the more important issues AV technology must address. Torc Robotics’ Asimov Level 4 self-driving system does so using Light Detection and Ranging (LIDAR), cameras and radar. A video from the company demonstrates its system in real-world Las Vegas traffic.
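
To make the idea of multi-sensor perception concrete, here is a minimal Python sketch of how detections from LIDAR, radar and camera feeds might be fused into a single confidence score before the vehicle decides to brake. The class names, sensor weights and thresholds are invented for illustration only and do not describe Torc’s actual implementation.

```python
from dataclasses import dataclass

# Hypothetical per-sensor detection of the same object (e.g., a pedestrian).
# Field names and weights are illustrative assumptions, not any vendor's design.
@dataclass
class Detection:
    sensor: str        # "lidar", "radar", or "camera"
    distance_m: float  # estimated range to the object
    confidence: float  # the sensor's own confidence, 0.0-1.0

# Rough, assumed weights reflecting how much each sensor is trusted here.
SENSOR_WEIGHTS = {"lidar": 0.5, "radar": 0.3, "camera": 0.2}

def fuse_confidence(detections: list[Detection]) -> float:
    """Combine per-sensor confidences into one weighted score."""
    total_weight = sum(SENSOR_WEIGHTS[d.sensor] for d in detections)
    if total_weight == 0:
        return 0.0
    return sum(SENSOR_WEIGHTS[d.sensor] * d.confidence for d in detections) / total_weight

def should_brake(detections: list[Detection],
                 range_threshold_m: float = 30.0,
                 confidence_threshold: float = 0.6) -> bool:
    """Brake if a nearby object is detected with high fused confidence."""
    if not detections:
        return False
    closest = min(d.distance_m for d in detections)
    return closest < range_threshold_m and fuse_confidence(detections) > confidence_threshold

# Example: all three sensors agree a pedestrian is about 20 meters ahead.
pedestrian = [
    Detection("lidar", 19.8, 0.9),
    Detection("radar", 20.4, 0.7),
    Detection("camera", 20.1, 0.8),
]
print(should_brake(pedestrian))  # True
```

The point of the sketch is simply that no single sensor is decisive; production systems layer object tracking, trajectory prediction and far more sophisticated fusion on top of this basic idea.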

Navigating “What If” Situations

Still, there are plenty of situations in which LIDAR, radar and cameras may not be enough to detect signals or unusual hazards. Developers are still working on systems that can detect the approach of emergency vehicles based on sound, so that AVs will pull over and let them by.

But what about when a school bus pulls over in the opposite lane or a couple of lanes over to the right in the same direction of traffic? There’s also the situation where a car traveling in the opposite direction flashes its high beams, which can have various meanings depending on the context – a signal to proceed, to move over, to beware of a road hazard or a police officer up ahead measuring speed.

And what about a situation where the car in front of yours stops at a red light but overshoots the stop line, so the driver backs up a little? You notice the car has been left in reverse. You would probably tap the horn a couple of times, hoping the driver gets the point before the light turns green and he or she steps on the gas, smashing right into you. But will an AV react appropriately?

There are also innocent pedestrians to consider. If the choice is to mow down a line of children or self-sacrifice into a wall, what would you do?

Aligning with Human Values

In a 2016 paper, “The Social Dilemma of Autonomous Vehicles,” three scientists examined public trust in that decision-making process. In a survey of about 2,000 people, most respondents liked the idea of an AV sacrificing itself to save others, but as passengers, they said they would want the car to preserve their own safety, no matter what. The researchers advised, “To align moral algorithms with human values, we must start a collective discussion about the ethics of AVs – that is, the moral algorithms that we are willing to accept as citizens and to be subjected to as car owners.”

The study’s authors worried that mandating utilitarian AVs — those that would swerve to avoid a crowd — through federal regulation would present a confounding problem: passengers would never agree to be rationally self-sacrificing. Things get even more complicated in what are called “edge cases,” in which an AV may face a variety of thorny weather, traffic, and other conditions at once, forcing a series of complex rapid-fire decisions. The report concludes, “There seems to be no easy way to design algorithms that would reconcile moral values and personal self-interest.”
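
To see why the authors call this a dilemma, consider a toy “moral algorithm” that scores outcomes by expected harm. Everything below is hypothetical and is not drawn from the cited paper; the single passenger_weight parameter is an invented knob whose value flips the car between utilitarian and self-protective behavior, which is precisely the tension the survey exposed.

```python
# Toy illustration of the utilitarian vs. self-protective trade-off.
# All weights and scenarios are hypothetical, not taken from the 2016 paper.

def expected_harm(pedestrians_hit: int, passengers_hit: int,
                  passenger_weight: float) -> float:
    """Score an outcome: higher is worse. passenger_weight > 1 favors the occupants."""
    return pedestrians_hit + passenger_weight * passengers_hit

def choose_action(passenger_weight: float) -> str:
    """Pick between staying course (hitting pedestrians) or swerving into a wall."""
    stay = expected_harm(pedestrians_hit=3, passengers_hit=0,
                         passenger_weight=passenger_weight)
    swerve = expected_harm(pedestrians_hit=0, passengers_hit=1,
                           passenger_weight=passenger_weight)
    return "swerve into wall" if swerve < stay else "stay course"

print(choose_action(passenger_weight=1.0))   # utilitarian: "swerve into wall"
print(choose_action(passenger_weight=10.0))  # self-protective: "stay course"
```

The algorithm itself is trivial; what the survey shows is that people disagree about the weight, and that choosing it is a social and regulatory question rather than a purely engineering one.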

Azim Shariff—one of the paper’s authors and a professor at the University of California, Irvine—has called for “a new social contract” for AVs. Riding in one will mean giving yourself over to a machine whose “mind” most humans don’t understand — and which, in a moment of crisis, may be programmed to prioritize the lives of others over your own. “I’ve kind of wracked my brain to think of another consumer product which would purposefully put their owners at risk against their wishes,” he said. “It’s really a radically new situation.”

Coming Soon: Fundamentals of AV Technology

However uncertain the efficacy and usability of AV technologies are right now, there are still many potential benefits for drivers. The Fundamentals of Autonomous Vehicle Technology is a six-course program coming soon from IEEE. It covers the foundations and practical applications of autonomous, connected and intelligent vehicle technologies. For more information or to pre-order this course program, connect with an IEEE Content Specialist today!

Resources

Ramey, Jay. (9 Jan 2018). Here’s how the latest autonomous driving systems handle real-world scenarios. AutoWeek.

Silverman, Jacob. (June 2018). The Menace and the Promise of Autonomous Vehicles. Longreads.

Winton, Neil. (2 Jan 2018). Autonomous Car Hype Is Way Ahead of Reality. Forbes.
