Danoff
Two scenarios for self-driving cars:
1) Four people are traveling at 50 mph in a self-driving car across a bridge over a deep ravine. Up ahead, at the end of the bridge, there is an intersection; the light suddenly turns red and pedestrians begin to cross the street, expecting the car to stop. The self-driving car tries to apply its brakes and discovers they have failed. There are now 22 pedestrians in the crosswalk (20 directly ahead, and 2 in the crosswalk on the other side of the road).
A) Should the self-driving car continue driving straight ahead without any course correction, run over the 20 pedestrians in the crosswalk (killing most of them), and then coast to a stop?
B) Should the self-driving car steer over to the other side of the road, run over the 2 pedestrians there (killing both of them), and then coast to a stop?
C) Should the self-driving car make a hard right turn (or a hard left, in some jurisdictions), so that it doesn't hit any of the pedestrians, and drive off the bridge into the ravine, most likely killing all the passengers?
2) One person is traveling at 50 mph in a self-driving car across a bridge over a deep ravine. Up ahead, at the end of the bridge, there is an intersection; the light suddenly turns red and pedestrians begin to cross the street, expecting the car to stop. The self-driving car tries to apply its brakes and discovers they have failed. There are now 22 pedestrians in the crosswalk (20 directly ahead, and 2 in the crosswalk on the other side of the road).
A) Should the self-driving car continue driving straight ahead without any course correction, run over the 20 pedestrians in the crosswalk (killing most of them), and then coast to a stop?
B) Should the self-driving car steer over to the other side of the road, run over the 2 pedestrians there (killing both of them), and then coast to a stop?
C) Should the self-driving car make a hard right turn (or a hard left, in some jurisdictions), so that it doesn't hit any of the pedestrians, and drive off the bridge into the ravine, most likely killing the one person in the car?
If you were driving the car yourself in the situations above, would you steer it the same way you would expect the car's computer to?
To rephrase the question slightly: should the self-driving car continue on its path and kill many people, divert to kill fewer people, or divert to kill the passengers? These are the only options. The car has no moral choice here; it's a box of metal, plastic, and bolts. The passengers are similarly amoral, since they don't control the car. The person who controls the car in this scenario is the person who programmed it in the first place. So what is the moral programming?
The car is on a trajectory headed for an intersection. This is not the programmer's choice; it's simply the scenario the program is faced with. The programmer can insert logic that would have the car intentionally divert from its given trajectory onto one it knows will kill people, or the programmer can choose not to do that. Choosing not to divert when the diversion would kill people is not a choice to kill the people in the car's path; it's a choice NOT to kill people by your own actions. The path the car is on is an accident (the failure of the brakes).
The only scenario in which the programmer chooses not to kill anyone is the one where the car does not intentionally put itself on a course to kill people. As such, the only moral choice for the programmer is choice A in both cases. This holds whether you're at the wheel or writing the program.
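To make that rule concrete as code, here is a minimal sketch (in Python, since the thread doesn't name a language). Everything in it (the Trajectory type, choose_trajectory, the expected_deaths counts) is hypothetical, invented for illustration, and not anything from a real autonomous-driving stack; it just encodes the position above: divert only onto a path that harms no one, otherwise hold the course.

```python
from dataclasses import dataclass

@dataclass
class Trajectory:
    """One possible path the car can take once the brakes have failed."""
    name: str
    expected_deaths: int  # people killed if the car takes this path
    is_current: bool      # True for the path the accident already has it on

def choose_trajectory(options: list[Trajectory]) -> Trajectory:
    """Encodes choice A: never deliberately steer onto a lethal path."""
    current = next(t for t in options if t.is_current)
    # Any diversion is a deliberate act by the program. Divert only if
    # some alternative harms no one; otherwise stay the course, since
    # picking a lethal diversion would be deciding to kill by the
    # program's own action.
    for t in options:
        if not t.is_current and t.expected_deaths == 0:
            return t
    return current

# Scenario 1: every option is lethal, so the car keeps its current path.
scenario_1 = [
    Trajectory("straight through crosswalk", expected_deaths=20, is_current=True),
    Trajectory("swerve to far crosswalk",    expected_deaths=2,  is_current=False),
    Trajectory("off the bridge",             expected_deaths=4,  is_current=False),
]
print(choose_trajectory(scenario_1).name)  # -> "straight through crosswalk"
```

Note that the rule never compares death counts between lethal paths; by this argument, the 20-versus-2 weighing is exactly the decision the programmer must refuse to make.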
In scenario 2, if you're the one driving (not in a self-driving car), you can send your car off the bridge and take your own life (and only your life) to save the others.