
Robot cars probably won't happen, sniffs US transport chief

Deltics

Re: I'm not sure I understand

No, it's that if a human driving a vehicle mows down 15 people, that human will find themselves in a court where a jury of their peers will examine the specific circumstances and the capabilities of that human, taking all factors into account and deciding whether the action was justified - or at least excusable. If it wasn't, that human has consequences to face; if it was, the families of any victims can at least be satisfied that justice has been applied (it is not uncommon for families in such cases to feel compassion and sympathy alongside their grief).

But if a vehicle control system makes that decision, then it is a simple question of whether the vehicle followed its programming or there was a defect in that programming.

If the program is shown to have a defect, then the manufacturer is not only liable to the families involved but will likely face instant bankruptcy as its product becomes poison - or, at the very least, a massive recall exercise.

If the vehicle is demonstrated not to have a "defect" in the programming - that is, the program specification was followed precisely - then that programming is nonetheless responsible for having chosen the deaths of 15 people over the 1 life of the passenger (or the death of the 1 person over the 15). The argument then will be that a decision tree formulated years in advance, and in splendid isolation from the circumstances on the day in question, was not sufficiently adaptable to those circumstances and was thus inherently and dangerously flawed.

So even if there was no defect, the program was defective.

Either the families of the 15 people or the family of the 1 person will be lining up to claim massive damages as a result of the decision, or error, that concluded that the life of their loved one was the one - on balance - worth sacrificing.

Aha - comes the cry from the permanently not-pessimistic - but what if the program can be demonstrated not to have performed any such "weighing of the balance" at all!? Eh? Ha! Then the program can't be blamed for making a decision that it didn't actually take.

OK - so there was no "decision" to mow down 15 people; they were simply not a factor in the vehicle's action to save the occupant. In which case the open-and-shut argument is simply that such a system is not safe to permit on roads where such decisions are necessarily required.

It's not that either outcome is "OK".

It's that the legal questions arising from the one where a vehicle is "responsible" are just too complex and intractable, and once this is realised, the car companies will quickly back-pedal from the idea, except as a development vehicle [sic] for technologies that provide driver assistance (as opposed to replacing the driver entirely).
