
In A Deadly Crash, Who Should A Driverless Car Kill -- Or Save?

It's one of the thorniest moral dilemmas in tech right now.

People can’t make up their moral minds about driverless cars.

In a series of surveys published Thursday in the journal Science, researchers asked people what they believe a driverless car ought to do in the following scenario: A group of pedestrians is crossing the street, and the only way the car can avoid hitting them is by swerving off the road, which would kill the passengers inside.

The participants generally agreed that the cars should be programmed to sacrifice their passengers if doing so would save many other people.

This, broadly speaking, is a utilitarian kind of answer -- one aimed at preserving the greatest possible number of lives. But there's one problem: The people in the survey also said they wouldn't want to ride in these cars themselves.

It would be OK for others to buy them, the participants said, but they personally would not.

“Figuring out how to build ethical autonomous machines is one of the thorniest challenges in artificial intelligence today,” Jean-François Bonnefon, of France's Institute for Advanced Study in Toulouse, and his colleagues wrote in the study.

The scenario described above is hypothetical, but scenarios like it are bound to happen in real life once driverless cars become a mainstream reality, the researchers said. We need answers and rules now, so that we can include them in the programming of these machines. Even if a driverless car has a manual override option, it's easy to imagine a situation where there simply isn't time for a passenger to react and take control of the vehicle.

It can be unnerving to think about this stuff. But autonomous vehicles actually have the potential to create a safer world. In the United States alone, about 35,000 traffic deaths occur every year, along with millions of injuries.

“Around 90 percent of those accidents are due to human error,” Azim Shariff, an assistant professor of psychology and social behavior at the University of California, Irvine, and a co-author of the study, said at a press conference. “Autonomous vehicles promise to change all that for the better, but there are barriers to their wide adoption. A number of those are technological barriers, but they’re also psychological ones.”

If you've ever taken an ethics class, you might recognize the driverless-car scenario described above. It shares a lot of DNA with a famous thought experiment known as the trolley problem. In that scenario, you are the engineer of a runaway trolley. On the track ahead of you, five workers stand, oblivious to your approach. You can't brake, but you can switch the trolley onto another track, where a single worker stands. Would you kill that one worker to save the five others?

There are a number of variations on the question. What if that one person is pregnant? What if the five others are criminals? You can imagine how many different ways this can go.

For the study published Thursday, the researchers conducted six online surveys of U.S. residents between June and November 2015, asking participants how they would want their autonomous cars to behave in various scenarios. The researchers tweaked some variables, such as the number of pedestrians or passengers, in each scenario. In some examples, participants had to imagine their child was riding in the car. In others, they were told to imagine riding with their coworkers.

People generally preferred decisions that minimized the number of casualties. In one scenario, 76 percent of respondents said it would be more moral for an autonomous vehicle to sacrifice one passenger if it would save 10 pedestrians.

But this allegiance to the greater good went only so far. People balked when asked whether they would purchase these cars themselves. “You can recognize the feeling,” Bonnefon said -- it's “the feeling that I want other people to do something, but it would be great not to do it myself.”

What should the manufacturers of driverless cars do? And what would happen to public safety if every driverless car was programmed to protect its passengers above all else?

"To maximize safety, people want to live in a world in which everybody owns driverless cars that minimize casualties, but they want their own car to protect them at all costs," said Iyad Rahwan, a researcher at MIT and co-author of the study.

Automakers that offer such cars would probably make a lot of money, Rahwan said. But it's the tragedy of the commons: "If everybody thinks this way then we will end up in a world in which every car will look after its own passenger’s safety and society as a whole is worse off."

However, the researchers believe that public attitudes may change over time. If the technology of autonomous cars improves to the point where riding in them is ultimately safer than driving yourself, people might become more receptive to the idea of buying a car that increases their safety -- even if it's explicitly programmed to allow them to die in certain unlikely situations.

If you have opinions of your own, you can contribute to the discussion at a website created by the researchers.
