
Will Self-Driving Cars Get A Crash Course In Ethics?

We know the difference between right and wrong. How do we teach artificial intelligence the same instincts?


Philosophers spend their days toying with thought-puzzles on the meaning of life and whether shadows on a cave wall depict reality. 

Today, there's a different kind of philosophical puzzle, one that could determine whether you live or die. It's called the trolley problem.

Imagine you're enjoying an afternoon stroll near the train tracks when you see five people tied to a track with a trolley barreling toward them. You realize you are next to a lever that will divert the trolley to another track. On that second track, one person is tied down.

Do you pull the lever, changing the trolley's course and sacrificing one to save five?


Variations on this dilemma have long made for popular philosophical what-ifs. But with the advent of self-driving cars, these mind games are quickly becoming serious policy questions. Self-driving cars use a combination of sensors and software to do all the driving for us. They're programmed to follow traffic rules and navigate obstacles.

Some scientists envision these cars ultimately becoming "connected," meaning they'll be able to communicate with things like traffic lights and other cars. And as that happens, they'll start making choices. Or, more accurately, programmers, lawmakers, or consumers will be programming those choices in.

Imagine your self-driving car is zooming down the highway when a car comes rushing toward it with five people inside. Your car can swerve off the road and spare the oncoming car's occupants, but only by sacrificing a pedestrian, or maybe the car itself, or maybe you.

Do your priorities change if the other car is self-driven rather than human-driven? If someone's driving drunk? Do the passengers' identities matter: a child, a terminally ill patient, a head of state, the Dalai Lama? What if one car is worth a lot more than the other?
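To make the stakes concrete, here is a minimal sketch, in Python, of what "programming in some choices" could look like. Everything in it is hypothetical: the Maneuver type, the harm counts, and especially the passenger_weight knob are invented for illustration, not drawn from any real autonomous-driving system.

```python
# Toy illustration only: hypothetical names and invented weights, not a real
# autonomous-driving API. The point is that every value judgment in the
# trolley problem becomes an explicit number someone has to choose.

from dataclasses import dataclass


@dataclass
class Maneuver:
    name: str
    expected_deaths: int   # people this choice is expected to kill
    harms_passenger: bool  # does it sacrifice the car's own occupant?


def choose(maneuvers: list[Maneuver], passenger_weight: float = 1.0) -> Maneuver:
    """Pick the maneuver with the lowest weighted harm score.

    passenger_weight is the contested knob: set it above 1.0 and the car
    protects its owner; below 1.0 and it favors everyone else.
    """
    def score(m: Maneuver) -> float:
        s = float(m.expected_deaths)
        if m.harms_passenger:
            s *= passenger_weight
        return s

    return min(maneuvers, key=score)


if __name__ == "__main__":
    options = [
        Maneuver("stay the course", expected_deaths=5, harms_passenger=False),
        Maneuver("swerve off the road", expected_deaths=1, harms_passenger=True),
    ]
    print(choose(options).name)                       # swerve off the road
    print(choose(options, passenger_weight=10).name)  # stay the course
```

The uncomfortable part isn't the code; it's that someone, somewhere, has to pick the number.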

Many hope self-driving cars will reduce traffic deaths. Over 40,000 people died in U.S. car crashes last year alone. 

In a world of self-driving cars, driving deaths might be fewer, but the ones that remain could result from imperfect philosophical trade-offs. Meanwhile, other forms of A.I. will face the same life-and-death dilemmas: think of fully autonomous military drones, self-flying planes, and self-operating floodgates, all run by algorithms written by humans.

We know technology is progressing, but its future, and ours, may rest on ancient principles. Suddenly those Socratic debates seem awfully relevant.