When Casualties are Inevitable, Who Should Self-Driving Cars Save?

  • Thread starter Eh Team
  • 145 comments
  • 7,099 views
I think what's being forgotten here is just how much of a premium is required for a state entity to do nearly anything. The actual concept of adding a smarter traffic light seems simple, but between purchasing agreements, the 4-5 person crew installing the light, and so on, I doubt it'll be cheap. Remember these are the same transportation departments that spend a stupid amount of money making intersections worse and running "traffic studies" that go nowhere.

In theory, it should work, but I doubt in practice it would.

Plus, these departments don't operate for free, so somehow, some way, the public will pay for it regardless of whether they have an autonomous car or not.

The controllers are already "smart" - they react to the traffic information around them. Even now on my way home (after lighting-up time) I'm able to turn most of the junctions green with my headlights before I get there. Adding networking to those control boxes would be relatively easy, and nothing else on the junction needs to change. If anything you'd be swapping one type of sensing (under-road copper loops are particularly expensive) for another.

Isn't that just a myth?

When I did road construction, lights were controlled by a sensor under the pavement that sensed when a car pulled up. They didn't work well, though, and the light still more or less went on its programmed timing.
 
I think what's being forgotten here is just how much of a premium is required for a state entity to do nearly anything. The actual concept of adding a smarter traffic light seems simple, but between purchasing agreements, the 4-5 person crew installing the light, and so on, I doubt it'll be cheap. Remember these are the same transportation departments that spend a stupid amount of money making intersections worse and running "traffic studies" that go nowhere.

In theory, it should work, but I doubt in practice it would.

Plus, these departments don't operate for free, so somehow, some way, the public will pay for it regardless of whether they have an autonomous car or not.

Many of the cabinets in my own area are subcontracted to Siemens and already speak over 4G for fault reporting, so arguably the capability gap for those wouldn't be as great. I suspect the higher cost is for rural installations using 'dumb' traffic sensing with an old-fashioned switch cabinet.

Isn't that just a myth?

When I did road construction, lights were controlled by a sensor under the pavement that sensed when a car pulled up. They didn't work well, though, and the light still more or less went on its programmed timing.

It depends on the installation - those with a camera that senses "dark spots" above a certain daylight level and "light spots" below that level are triggerable by flashing your high beams as you approach. Obviously that doesn't work if there's traffic queued at them :)
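That trigger logic can be sketched in a few lines. To be clear, this is a made-up illustration, not anything from a real controller: the day/night cutoff, the brightness margin, and the 10% trigger fraction are all invented values.

```python
# Rough sketch of a camera-triggered signal, assuming (invented here) that
# the controller switches between "day" and "night" detection modes based
# on ambient light, and fires when enough pixels deviate from the mean.

AMBIENT_DAYLIGHT = 128   # assumed day/night cutoff on a 0-255 scale
DEVIATION = 40           # assumed "anomalous pixel" brightness margin
TRIGGER_FRACTION = 0.1   # assumed share of pixels needed to trigger

def vehicle_detected(frame, ambient):
    """frame: 2D list of grayscale pixel values (0-255) for the detection zone.
    By day a vehicle reads as a dark blob against bright pavement; by night,
    headlights read as bright blobs against dark pavement -- which is why
    flashing your high beams on approach can flip the light."""
    flat = [p for row in frame for p in row]
    mean = sum(flat) / len(flat)
    if ambient >= AMBIENT_DAYLIGHT:   # daytime: count dark spots
        hits = sum(1 for p in flat if p < mean - DEVIATION)
    else:                             # night: count light spots
        hits = sum(1 for p in flat if p > mean + DEVIATION)
    return hits / len(flat) > TRIGGER_FRACTION

# Night frame: dark pavement with a cluster of bright "headlight" pixels.
night = [[10] * 10 for _ in range(10)]
for r in (3, 4, 5):
    for c in (2, 3, 6, 7):
        night[r][c] = 240
print(vehicle_detected(night, ambient=20))  # True: bright spots at night
```

Under this toy model the high-beam trick plausibly fails with traffic queued at the light because the queued vehicles already fill the detection zone.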
 
I doubt that adding a short range broadcasting unit to each light adds a lot of cost. You could basically break a cheap cellphone in half and put that in. Less than $100, definitely.

As for fuel tax, that seems tough if everyone is driving electric cars by that point.
A set of stairs was needed in a Toronto park to help seniors avoid a potentially slippery slope, a few feet high, that they had to climb for access. A couple of people had fallen and been injured, so the locals petitioned for stairs and were told by city council that a set would cost between $65,000 and $150,000.

While the government dillydallied like governments tend to do, a senior took it upon himself to build a temporary set of wooden stairs for $550. It took him 14 hours. Not up to code, certainly: proper stairs would have required posts set in concrete, a railing on two sides instead of one, and some other small improvements. When I was in the deck and fence business I probably would have done it for $2,000-2,500 due to the extra time involved in bureaucracy, waiting around for inspections, etc. Of course they were immediately torn down because they weren't up to code.

I can build an entire house for $150,000 in my home town, and a good one at that, but in government terms $150k will only get you 11 stairs. Government math turns that $100 cellphone into $30,000.
https://www.thestar.com/news/gta/20...some-of-the-problems-at-city-hall-keenan.html
 
The age of the self-driving car is coming upon us slowly, and with it comes a slightly different variety of crashes. The machine itself must decide what to do, but when it comes to a situation where someone is going to die because of the circumstances, who will the machine save/kill?

Currently MIT has a survey running with 13 randomly generated scenarios for people to choose from, to gain insight into how people think a self-driving car should operate in this very specific situation. It has its flaws but it does make you think about where you stand on certain types of people. You can take the survey here: http://moralmachine.mit.edu and after you have finished you'll see a comparison of your results to everyone else who has answered.

I saw the survey via Facebook, and a friend had possibly the best overall reasoning for answering it:
"I killed all the passengers, for being lazy sobs."

Most Saved: Little Girls
Most Killed: Old Men

Scales: (Does not Matter) -1 to +1 (Matters a Lot)
Saving More People: +1/3
Protecting Passengers: 0
Upholding the Law: +4/10
Avoiding Intervention: -1/4
Gender Preference: +1 (Male <-> Female)
Species Preference: -1 (Humans <-> Pets)
Age Preference: -1 (Young <-> Old)
Fitness Preference: +1 (Fit <-> Large)
Social Value Preference: -1 (Doctor <-> Criminal)

In the event of inevitable death, who should self-driving cars save? Discuss.

Not gonna bother sharing the detailed results I got, because they were arbitrary and completely unrelated to my thought process, which was consistent throughout the test.

I chose for the self driving car to swerve in an attempt to avoid collision with whatever was directly in front of it every time.

Though I was interpreting it as if the self-driving vehicle didn't know that swerving would also result in a collision and fatalities, and that it couldn't tell the difference between animals and children... if we're assuming that the computer is completely aware of the entire scenario and everything it entails, then my thought process is different. Maximize survivability of the passengers above all else since it's their car, then beyond that try to minimize the number of human casualties regardless of age, gender or any other factors. It should only try to avoid animals if doing so doesn't endanger any humans.
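The priority order described above (passengers first, then total human casualties, animals only when no humans are at risk) is simple enough to write down as a rule. A rough sketch, with made-up maneuver names and casualty counts:

```python
# Rough sketch of that priority order, with made-up maneuver names and
# predicted casualty counts. Lower sort key = preferred maneuver.

def choose_maneuver(options):
    """options: dicts of predicted deaths per maneuver. Priority:
    1) fewest passenger deaths, 2) fewest total human deaths,
    3) fewest animal deaths -- animals only break ties among humans."""
    return min(options, key=lambda o: (o["passengers"],
                                       o["passengers"] + o["pedestrians"],
                                       o["animals"]))["name"]

print(choose_maneuver([
    {"name": "straight", "passengers": 0, "pedestrians": 2, "animals": 0},
    {"name": "swerve",   "passengers": 0, "pedestrians": 1, "animals": 3},
    {"name": "wall",     "passengers": 2, "pedestrians": 0, "animals": 0},
]))  # "swerve": no passengers lost, fewest humans overall, animals last
```

The tuple comparison makes the ordering strict: no number of animal deaths can ever outweigh a single human one, which matches the stance above.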
 
I couldn't remember if we had a general autonomous cars thread so this seemed the best place:

Arizona: Autonomous Uber test kills pedestrian

The car was in autonomous mode with a safety driver in the driver's seat at the time. The woman was crossing the road when the car struck her. This appears to be the first pedestrian death in an autonomous car crash.

In response, Uber has suspended its autonomous car testing for the time being.

Article
Critics claim that legislators in autonomous-friendly states such as California, Arizona and Michigan are too eager to accommodate such testing in the hopes of being early winners in a new mobility sweepstakes.

John Simpson of Consumer Watchdog
"Arizona has been the wild west of robot car testing with virtually no regulations in place," he says. "That’s why Uber and Waymo test there. When there’s no sheriff in town, people get killed.”

Uber and Waymo, Google's self-driving car company, have been aggressively testing self-driving cars in the Phoenix area for a few years.

---

Of course, the details are yet to be released so it's hard to make a judgement.
 
Of course, the details are yet to be released so it's hard to make a judgement.
It seems as though John Simpson has made one: "The incident is the result of big, bad tech companies taking advantage of lax regulations."*

*Paraphrasing.
 
Based on the article she was crossing outside a crosswalk at night. Given those conditions, I wonder whether the accident would've been avoidable in a manually operated vehicle. My inclination is probably not, but without all the details it's hard to tell.

It would probably be best to know whether she was wearing anything reflective (or at least something not dark), what the road was like, and where she was actually crossing.

Also, the safety driver should probably be questioned to find out why he didn't see the person either.
 
It seems as though John Simpson has made one: "The incident is the result of big, bad tech companies taking advantage of lax regulations."

It's the quick conspiracy, for sure; something like this goes wrong and Big Noun will bury the truth because they've already made investments. To be honest, I can buy Arizona and California being easy on testing to secure a future advantage but who knows.

For all we know the unfortunate woman might not have looked both ways or the car really did have a fault and run into someone it should have avoided or it was, for whatever reason, a completely unavoidable coincidence, autonomous car or not.
 
For all we know the unfortunate woman might not have looked both ways or the car really did have a fault and run into someone it should have avoided or it was, for whatever reason, a completely unavoidable coincidence, autonomous car or not.
Precisely. There was a safety driver present for the purposes of testing--this isn't the same as a consumer taking advantage of a widely accepted technology to get some work done while they need not pay attention to the road. Why didn't the safety driver intervene to prevent this? Investigate...
 
Based on the article she was crossing outside a crosswalk at night. Given those conditions, I wonder whether the accident would've been avoidable in a manually operated vehicle. My inclination is probably not, but without all the details it's hard to tell.

It would probably be best to know whether she was wearing anything reflective (or at least something not dark), what the road was like, and where she was actually crossing.

Also, the safety driver should probably be questioned to find out why he didn't see the person either.
Given that hundreds of thousands of people likely cross the road every night without getting hit I'm not sure why you think it probably wouldn't have been avoidable in a manually operated vehicle. Probably best not to speculate before all the facts are in.
 
Given that hundreds of thousands of people likely cross the road every night without getting hit I'm not sure why you think it probably wouldn't have been avoidable in a manually operated vehicle. Probably best not to speculate before all the facts are in.

Which is exactly why I followed that statement with "but without all the details it's hard to tell" and then spent the next two paragraphs talking about which facts were needed.
 
Probably best not to speculate before all the facts are in.
Probably*, which includes not suggesting this situation is no different than countless others...

hundreds of thousands of people likely cross the road every night without getting hit

*Edited from "Probably not," which was intended to indicate agreement, as in: "Probably not best to speculate." But that may not have been clear.
 
An *attentive* and experienced driver *might've* avoided that accident.

In general, conditions in which an autonomous car can't see a pedestrian are not good conditions for people to see that same pedestrian, to begin with.

But we can't make any definitive statement until we know more.

In-car camera footage would be very helpful.
 
In-car camera footage would be very helpful.
Indeed. Given the circumstances that we do know--that it was a system being tested--I certainly hope such footage exists.
 
Given that hundreds of thousands of people likely cross the road every night without getting hit I'm not sure why you think it probably wouldn't have been avoidable in a manually operated vehicle. Probably best not to speculate before all the facts are in.

On the other hand, hundreds of thousands of people aren't in front of a car doing 40mph. There are a number of accident situations that are straight-up unavoidable unless you have x-ray vision or telepathy. This sounds like it could well be within that range. Given the information so far, I'd lean towards this not being an accident most drivers would have avoided, but I imagine the full story will come out soon enough. Possibly the video too.

https://arstechnica.com/cars/2018/0...iving-car-likely-not-at-fault-in-fatal-crash/
 
On the other hand, hundreds of thousands of people aren't in front of a car doing 40mph. There are a number of accident situations that are straight-up unavoidable unless you have x-ray vision or telepathy. This sounds like it could well be within that range. Given the information so far, I'd lean towards this not being an accident most drivers would have avoided, but I imagine the full story will come out soon enough. Possibly the video too.

https://arstechnica.com/cars/2018/0...iving-car-likely-not-at-fault-in-fatal-crash/
I'd have to see the video for myself and I hope it's released but somehow I doubt we'll ever see it. There's no way I'd take someone's word for this. The stakes are too high and too much money is on the line and along with that, the possibility of corruption in the process is also high.
 
I'd have to see the video for myself and I hope it's released but somehow I doubt we'll ever see it. There's no way I'd take someone's word for this. The stakes are too high and too much money is on the line and along with that, the possibility of corruption in the process is also high.

The possibility of corruption is high everywhere these days it would seem. I'd say it hardly makes a difference. Corruption is basically expected.

On the other hand, I'm not entirely sure what people are getting so bent out of shape about. People die on the roads all the time, for far dumber reasons than this. The reason for this is not fully known to the public at this time, but given that it lies somewhere between suicide-by-darting-into-traffic and a failure of a developmental self-driving system that may have only performed at the level of a poor driver instead of having the superhuman reflexes that seem to be expected of it, I'm kind of OK with either of those.

People are going to get hit by self-driving cars. People are going to die. It's unpleasant to hear, but it's the truth and there's really no way to avoid the statistics, only minimise them. People also manage to kill themselves every year with toasters, or falling off toilets. This is not going to be an exception. As long as self-driving cars are performing at least as well as a basically competent human driver I find that acceptable, especially given that the more time they spend on the road the faster they're likely to develop into drivers that are superior to any human.
 
I'm also curious whether Uber disabled the Volvo collision system, which is really good, in favor of different technology. If they didn't, and the system still couldn't detect the pedestrian, I think it furthers the belief that the pedestrian was at fault.
 
I'm also curious whether Uber disabled the Volvo collision system, which is really good, in favor of different technology. If they didn't, and the system still couldn't detect the pedestrian, I think it furthers the belief that the pedestrian was at fault.

I would imagine so on a research vehicle, as you'd want to be getting pure responses from the system that you're trying to test. But they may have left it running and simply hooked it into their own systems. There's no point designing your own collision system if you can just hijack the perfectly good Volvo one, especially with all the sensors already in place.
 
The only case I can see for retaining the existing system is testing functionality as benign as lane-keeping, and something tells me the majority of systems out there have that down pat.
 
Volvo's system has trouble seeing things darting from the side. A majority of the sensors are of the "look down the road" type.

Also, the onboard systems have a hard time detecting low or excessively narrow objects.

What you'd need is wide-angle radar and cameras that scan the sides of the road, but if there are obstacles in the way (parked cars, etcetera), the best you can do is build in an algorithm that learns where the edges of the road are and, recognizing the presence of confounding obstacles, automatically slows down in hazardous situations to give the onboard systems a better chance of avoiding accidents.

Again, most drivers don't do that, anyway.
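The "slow down when you can't see" part could be as crude as scaling the speed cap by how much of the road edge the sensors can actually see. A made-up sketch (the linear scaling and the 15 mph floor are arbitrary illustrative choices, not any manufacturer's logic):

```python
# Made-up heuristic: scale the speed cap by the visible fraction of the
# road edge, with a floor so the car still makes progress. The linear
# scaling and the 15 mph floor are arbitrary illustrative choices.

def hazard_speed_cap(posted_limit_mph, occluded_fraction, min_mph=15):
    """occluded_fraction: share of the road edge hidden by parked cars
    or other obstacles. The less the sensors can see, the slower we go,
    so stopping distance fits inside the visible gap."""
    visible = 1.0 - occluded_fraction
    return max(min_mph, posted_limit_mph * visible)

print(hazard_speed_cap(40, 0.0))  # 40.0 -- clear sightlines, full limit
print(hazard_speed_cap(40, 0.5))  # 20.0 -- half the edge occluded
print(hazard_speed_cap(40, 0.9))  # 15   -- heavily occluded, floor applies
```

A real system would tie the cap to actual stopping distance, but even this toy version captures the idea: hidden sightlines should cost you speed.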
 
Save the most people. If morality is hard set by the rules rather than serving a purpose then what is the point of being moral at all?
 
Save the most people.
What if in one hand you hold the life that a wide-eyed infant has yet to live, and in the other those of three nonagenarians who have an indeterminate* amount of time left but have lived richly, experiencing love and loss, success and failure?

It's a rhetorical question, and I really don't think the trolley problem tells us squat about morality. It's just a gruesome exercise in hypotheticals.

It does seem to serve a purpose when referring to autonomous vehicles, though. It seems to me it's most often used to point out machines' inability to process "morally," thereby exposing them as dangerous if untethered.

*Edit: Yeah, we all have an indeterminate amount of time, but I like to think my meaning is clear.
 
Save the most people. If morality is hard set by the rules rather than serving a purpose then what is the point of being moral at all?

Saving the most people is as much of a hard rule as anything else. It's an arbitrary rule based on the assumption that statistically that will yield the best outcome for the greater community over an extended period of time. It may or may not be correct, and even if true the natural randomness of the universe can mess with that significantly.

I don't see something like that as objectively preferable to, say, "save the owners of the car". It will likely end up being whatever the car buying public finds most palatable, which honestly will probably be something like the above. Nobody wants their car to kill them over saving some jaywalking group of drunken frat boys.

If cars can be programmed to consistently stay on the road and obey the road rules, I don't see much of a moral problem, to be honest. It becomes incumbent on pedestrians and the like not to put themselves in harm's way. A road becomes much more like a train track: the train might stop if you wander in front of it, but I wouldn't bet my life on it. Similarly, you'd be perfectly safe when off the road or obeying crossing protocol, but you get far less leeway in terms of running out in front of cars.

Perhaps this is just humanity seeing to the next phase of its evolution by plucking the weeds from the gene pool. I have little sympathy for people who dash in front of cars. In the game of car vs. pedestrian, the car wins every time. I find it mildly amusing that a great many adults seem to still not comprehend this. There are many legitimate accident situations, and those are tragic, but running out in front of a car to save three seconds is just moronic.
 
I have little sympathy for people who dash in front of cars. In the game of car vs. pedestrian, the car wins every time. I find it mildly amusing that a great many adults seem to still not comprehend this. There are many legitimate accident situations and those are tragic, but running out in front of a car to try and save three seconds is just moronic.


I jaywalk. All the time. I freely admit that.

I grew up in a country where jaywalking is not a crime; the Highway Code clearly states that it is the judgement of the pedestrian whether it is safe to cross the road or not. Rule 170 specifically states that pedestrians have priority and that drivers must give way to pedestrians already crossing the road, and it goes on to warn drivers to watch for already-crossing pedestrians when turning into a new road.

However I am careful to always make sure the road is clear before crossing whether it is at a designated crossing or not. And that's not to show off that I'm smarter than people who get run over or smarter than people who blindly walk into oncoming traffic. It's just my own judgement that the road is clear and safe to cross even though it is not a designated crossing.

The pedestrian may have the judgement call and the right of way, sure, but as you said, in the game of car vs pedestrian the car wins every time. If you are a jaywalker, it's best to be careful about it and not arrogant in assuming that cars will stop in time or manoeuvre around you.
 
The video has been released.

It's a straight-up damning indictment of the car's detection systems, and the human backup driver wasn't looking anywhere close to the road for the majority of the lead-up before the crash.

Herzberg doesn't come out of nowhere, as claimed. She has crossed a lane and a half before the car hits her, and is fully visible for just under two seconds, with zero avoiding or braking action taken by the AI or the human driver.
 
Based on the video, I'm not sure a driver could've reacted quickly enough even in a manually operated car. She crossed in a dark stretch of the street while wearing dark clothing, which does make her hard to see. However, if the safety driver had been doing his job, chances are the pedestrian would've ended up with injuries instead of dying, since the impact would've been less severe.

I do question the system Uber is using though. Even my car will slam on the brakes if it sees a pedestrian in a dimly lit area and it's 5-year-old technology at this point (it was put to the test in an Ikea parking lot). It seems like the XC90's new system would've picked that up and at least attempted to brake.

At this point, I'd say the safety driver should probably face charges for vehicular manslaughter since the death was caused due to his negligence. If he'd attempted to stop, swerve, or whatever then I'd say probably not, but not paying an ounce of attention is an issue.

The woman wasn't in the right though. You have to be pretty dense to think you should walk out into the street, at night, wearing dark clothing, and presumably not looking both ways. While you might have the right of way in some cases, cars are big, heavy things, and you can't really be in the right if you're dead.
 
At this point, I'd say the safety driver should probably face charges for vehicular manslaughter since the death was caused due to his negligence. If he'd attempted to stop, swerve, or whatever then I'd say probably not, but not paying an ounce of attention is an issue.
The behavior of the backup driver was not unlike what I--and I suspect most everyone else here--see multiple times a day. What was the point of having a backup system in place if that system was going to be offline due to electronic disturbances?*

The exterior view doesn't look good**, sure, but testing of clearly unproven technology that puts innocent lives in harm's way should come with multiple redundancies for safety.

Redundancy 1: One assumes the vehicle's default system was not being employed, and perhaps it should have been, with the tested system piggybacking for the purpose of determining whether it missed something it should have detected.

Redundancy 2: Backup driver that provides backup.

*Clever way of saying the :censored:wit was more interested in what they were seeing on their phone than doing what one presumes they were being paid to do.

**How not good, I can't say. My eyesight is certainly better than what's depicted in grainy dashcam footage, so I suspect there was more to be seen than the footage suggests.
 
At this point, I'd say the safety driver should probably face charges for vehicular manslaughter since the death was caused due to his negligence.

Interesting thread.

Disregarding this specific "backup driver" case, who is criminally responsible when an autonomous car runs over or crashes into a pedestrian?

If you're busy doing your work on your commute to the office, or if you're blind drunk being driven home, are you to be prosecuted if you hit someone, if it can be proven that the car is at fault and not the pedestrian?

I wonder how much insurance companies and solicitors are rubbing their hands with glee...

Regarding this specific "backup driver" case, the state of Arizona's lax regulation leading to all this testing might see him off the hook. Other states will have to make their minds up on new laws one day.
 