Should Self-Driving Cars Have Ethics?

New research explores how people think autonomous vehicles should handle moral dilemmas. Here, people walk in front of an autonomous taxi being demonstrated in Frankfurt, Germany, last year. (Andreas Arnold / Bloomberg via Getty Images)

In the not-too-distant future, fully autonomous vehicles will drive our streets. These cars will need to make split-second decisions to avoid endangering human lives — both inside and outside of the vehicles.

To determine attitudes toward these decisions, a group of researchers created a variation on the classic philosophical exercise known as "the trolley problem." They posed a series of moral dilemmas involving a self-driving car whose brakes suddenly give out: Should the car swerve to avoid a group of pedestrians, killing the driver? Or should it kill the people on foot, but spare the driver? Does it matter if the pedestrians are men or women? Children or older people? Doctors or bank robbers?

To pose these questions to a large range of people, the researchers built a website called Moral Machine, where anyone could click through the scenarios and say what the car should do. "Help us learn how to make machines moral," a video implores on the site.
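
For readers who want a concrete picture of how such choices can be recorded, here is a minimal, hypothetical sketch (in Python) of one way a dilemma and a respondent's decision might be encoded and tallied. The field names and example scenario are invented for illustration; this is not the Moral Machine's actual data format.

    from dataclasses import dataclass, field
    from collections import Counter

    @dataclass
    class Outcome:
        """One possible action the car can take, and who dies as a result."""
        action: str                                    # e.g. "swerve" or "stay"
        victims: list = field(default_factory=list)    # e.g. ["passenger"]

    @dataclass
    class Dilemma:
        """A single trolley-style scenario shown to a respondent."""
        left: Outcome
        right: Outcome

    def tally_spared_groups(responses):
        """Count how often each group is spared, given (dilemma, chosen action) pairs."""
        spared = Counter()
        for dilemma, chosen_action in responses:
            # The side NOT chosen is the one whose would-be victims are spared.
            for outcome in (dilemma.left, dilemma.right):
                if outcome.action != chosen_action:
                    spared.update(outcome.victims)
        return spared

    # Example: a respondent chooses to stay in lane, which spares the passenger.
    d = Dilemma(left=Outcome("swerve", ["passenger"]),
                right=Outcome("stay", ["pedestrian (child)"]))
    print(tally_spared_groups([(d, "stay")]))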

The grim game went viral, multiple times over.

"Really beyond our wildest expectations," says Iyad Rahwan, an associate professor of Media Arts and Sciences at the MIT Media Lab, who was one of the researchers. "At some point we were getting 300 decisions per second."

What the researchers found was a series of near-universal preferences, regardless of where someone was taking the quiz. On aggregate, people everywhere believed the moral thing for the car to do was to spare the young over the old, spare humans over animals, and spare the lives of many over the few. Their findings, from a team led by MIT's Edmond Awad, were published Wednesday in the journal Nature.

Using geolocation, the researchers found that the 130 countries with more than 100 respondents could be grouped into three clusters that showed similar moral preferences. Between those clusters, they found some variation.

For instance, the preference for sparing younger people over older ones was much stronger in the Southern cluster (which includes Latin America, as well as France, Hungary, and the Czech Republic) than it was in the Eastern cluster (which includes many Asian and Middle Eastern nations). And the preference for sparing humans over pets was weaker in the Southern cluster than in the Eastern or Western clusters (the latter includes, for instance, the U.S., Canada, Kenya, and much of Europe).

And they found that those variations seemed to correlate with other observed cultural differences. Respondents from collectivistic cultures, which "emphasize the respect that is due to older members of the community," showed a weaker preference for sparing younger people.
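
As a rough sketch of how countries can be grouped this way, the snippet below feeds invented per-country preference scores into standard hierarchical clustering and cuts the result into three groups. The countries, features, and numbers are all hypothetical, and this is an illustration of the general approach rather than the paper's actual analysis.

    import numpy as np
    from scipy.cluster.hierarchy import linkage, fcluster

    # Each country is summarized by a vector of preference strengths: how strongly
    # respondents favored sparing the young, sparing humans over pets, and sparing
    # the many over the few. All values here are invented to show the mechanics.
    countries = ["Country A", "Country B", "Country C", "Country D", "Country E", "Country F"]
    preferences = np.array([
        [0.90, 0.80, 0.70],   # strong preference for sparing the young
        [0.85, 0.75, 0.70],
        [0.40, 0.90, 0.80],   # weaker preference for sparing the young
        [0.45, 0.85, 0.75],
        [0.60, 0.50, 0.90],
        [0.65, 0.55, 0.85],
    ])

    # Ward-linkage hierarchical clustering, cut into three groups, mirroring the
    # idea of three clusters of countries with similar moral preferences.
    Z = linkage(preferences, method="ward")
    labels = fcluster(Z, t=3, criterion="maxclust")

    for country, label in zip(countries, labels):
        print(f"{country}: cluster {label}")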

Rahwan emphasized that the study's results should be used with extreme caution, and that they shouldn't be considered the final word on societal preferences, especially since respondents were not a representative sample. (The researchers did apply a statistical correction for demographic distortions, reweighting the responses to match each country's demographics.)
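
A correction like the one the researchers describe is commonly done by reweighting: responses from demographic groups that are overrepresented in the sample count for less, and responses from underrepresented groups count for more, before preferences are averaged. The sketch below illustrates the idea with a single made-up age variable and invented numbers; it is not the study's procedure.

    # A minimal reweighting sketch. Assume we know, for one country, the true share
    # of each age group in the population and the share observed among respondents;
    # each response then gets the weight (population share / sample share) for its
    # group before preferences are averaged.

    population_share = {"18-34": 0.35, "35-54": 0.35, "55+": 0.30}   # invented
    sample_share     = {"18-34": 0.60, "35-54": 0.30, "55+": 0.10}   # invented

    weights = {g: population_share[g] / sample_share[g] for g in population_share}

    # Each response: (age group, 1 if the respondent chose to spare the younger
    # characters in a dilemma, else 0). Values are made up.
    responses = [("18-34", 1), ("18-34", 1), ("35-54", 0), ("55+", 0), ("55+", 1)]

    weighted_yes = sum(weights[g] * choice for g, choice in responses)
    total_weight = sum(weights[g] for g, _ in responses)

    print(f"Raw preference for sparing the young:        {sum(c for _, c in responses) / len(responses):.2f}")
    print(f"Reweighted preference for sparing the young: {weighted_yes / total_weight:.2f}")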

What does this add up to? The paper's authors argue that if we're going to let these vehicles on our streets, their operating systems should take moral preferences into account. "Before we allow our cars to make ethical decisions, we need to have a global conversation to express our preferences to the companies that will design moral algorithms, and to the policymakers that will regulate them," they write.

But let's just say, for a moment, that a society does have general moral preferences on these scenarios. Should automakers or regulators actually take those into account?

Last year, Germany's Ethics Commission on Automated Driving created initial guidelines for automated vehicles. One of their key dictates? A prohibition against such decision-making by a car's operating system.

"In the event of unavoidable accident situations, any distinction between individuals based on personal features (age, gender, physical or mental constitution) is strictly prohibited," the report says. "General programming to reduce the number of personal injuries may be justifiable. Those parties involved in the generation of mobility risks must not sacrifice non-involved parties."

But to Daniel Sperling, founding director of the Institute of Transportation Studies at the University of California, Davis, and author of a book on autonomous and shared vehicles, these moral dilemmas are far from the most pressing questions about these cars.

"The most important problem is just making them safe," he tells NPR. "They're going to be much safer than human drivers: They don't drink, they don't smoke, they don't sleep, they aren't distracted." So then the question is: How safe do they need to be before we let them on our roads?

Copyright 2021 NPR. To see more, visit https://www.npr.org.

Laurel Wamsley is a reporter for NPR's News Desk. She reports breaking news for NPR's digital coverage, newscasts, and news magazines, as well as occasional features. She was also the lead reporter for NPR's coverage of the 2019 Women's World Cup in France.