
Self-Driving Cars Need To Choose Who To Save In A Crash

It's not necessarily the driver.
An algorithm to decide who to save and who to risk. (Denver Post via Getty Images)

One of the greatest ethical questions humanity faces -- whether to kill a few to protect the many -- will soon be decided not by people, but by our self-driving cars.

As autonomous vehicle prototyping and testing continues, ethicists are sparking the debate about how these cars will be programmed to respond in an accident.

A new series of surveys shows that while most people agree autonomous vehicles should strive to save the greatest number of people -- including passengers -- in an accident, when it comes to purchasing one, they would rather buy a car that prioritises the safety of its own driver and passengers.

The online surveys of more than 1900 Americans examined this "formidable challenge": creating an algorithm that must "choose between two evils, such as running over pedestrians or sacrificing themselves and their passenger to save the pedestrians".
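To make the dilemma concrete, here is a minimal, purely hypothetical sketch (in Python) of how the two programming approaches the surveys contrast might differ: a "utilitarian" car that minimises total expected deaths versus a "self-protective" car that favours its own occupants. The outcome names and casualty figures below are illustrative assumptions, not anything taken from the study or from a real vehicle's software.

```python
# Illustrative sketch only -- not any manufacturer's actual system.
# All outcome names and numbers are hypothetical assumptions.

from dataclasses import dataclass


@dataclass
class Outcome:
    description: str
    expected_occupant_deaths: float
    expected_pedestrian_deaths: float

    @property
    def total_deaths(self) -> float:
        return self.expected_occupant_deaths + self.expected_pedestrian_deaths


def utilitarian_choice(outcomes: list[Outcome]) -> Outcome:
    """Pick the outcome with the fewest expected deaths overall."""
    return min(outcomes, key=lambda o: o.total_deaths)


def self_protective_choice(outcomes: list[Outcome]) -> Outcome:
    """Pick the outcome safest for the car's own occupants,
    breaking ties by total expected deaths."""
    return min(outcomes, key=lambda o: (o.expected_occupant_deaths, o.total_deaths))


if __name__ == "__main__":
    # The dilemma described in the surveys: swerve (sacrificing the
    # passenger) or stay on course (hitting five pedestrians).
    dilemma = [
        Outcome("swerve into barrier",
                expected_occupant_deaths=1.0, expected_pedestrian_deaths=0.0),
        Outcome("stay on course",
                expected_occupant_deaths=0.0, expected_pedestrian_deaths=5.0),
    ]
    print("Utilitarian car:", utilitarian_choice(dilemma).description)
    print("Self-protective car:", self_protective_choice(dilemma).description)
```

Run on this one-passenger-versus-five-pedestrians scenario, the utilitarian policy swerves while the self-protective policy stays on course -- exactly the divergence the surveys probed.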

Google's prototype autonomous vehicle is one of a handful being tested currently. (Elijah Nouvelage / Reuters)

The results, published in the U.S. journal Science, showed 95 percent agreed a self-driving car should sacrifice its passengers if it meant saving five or more pedestrians. But when asked to rate, on a scale of 0 to 100, how likely they would be to buy a car programmed to save the most people, the average response was 20, versus 50 for a car programmed to always protect its occupants.

In Australia, these decisions will have to be made within the decade, according to National ICT Australia Optimisation Research Group professor Toby Walsh.

"Given that driverless cars are less than a decade away, we need to work out, as a society, how we program such systems," Walsh said.

"Unlike the past, where if you survived an accident, you could be brought in front of the courts if you drove irresponsibly, we will have to program computers with behaviours in advance that determine how they react in such situations."

Australia's roads will change dramatically in the coming decade. (Shutterstock)

Swinburne University of Technology transport engineering associate professor Hussein Dia said driving was already inherently risky.

"Some people have even gone further to suggest that given the number of fatal traffic accidents involving human error today, it could be considered unethical to introduce self-driving technology too slowly," Dia said.

"The biggest ethical question then becomes: How quickly should we move towards full automation given that we have a technology that potentially could save a lot of people, but is going to be imperfect and is going to kill a few?

"Should they be allowed on our roads, even if they make such mistakes?"

The study's conclusion posed yet more questions that need investigation:

Is it acceptable for an AV to avoid a motorcycle by swerving into a wall, considering that the probability of survival is greater for the passenger of the AV than for the rider of the motorcycle?

Should AVs account for the ages of passengers and pedestrians?

If a manufacturer offers different versions of its moral algorithm, and a buyer knowingly chose one of them, is the buyer to blame for the harmful consequences of the algorithm's decisions?

Such liability considerations will need to accompany existing discussions of regulation, and we hope that psychological studies inspired by our own will be able to inform this discussion.

Figuring out how to build ethical autonomous machines is one of the thorniest challenges in artificial intelligence today.
