Reply to post:

Tesla driver charged with vehicular manslaughter after deadly Autopilot crash

Peter2 Silver badge

I'm not familiar with the road in question. I am, however, personally familiar with several roads in the UK where you come off a motorway doing 70mph (our national speed limit) and are faced with traffic lights on the exit ramp before a roundabout. Judging by the distance scale in Google Maps, the run from motorway to lights looks to be just shy of 200 metres, which I suspect was actually laid out as 200 yards, the junction having been there since well before we went metric. I can think of many, many more where that distance is roughly double.

If the vehicle took an exit like this without decelerating, under the mistaken impression that it was still on the motorway (our satnav often doesn't notice when I'm driving until we're going around the roundabout), then at 70mph you're covering 31.3 metres a second, or 34 yards per second, and will cover 200 yards in about 5.88 seconds. The (British) Highway Code has a table in the back which recognises that a driver paying attention will react to a threat and hit the brakes in three quarters of a second, coming to a halt roughly a hundred yards (96 metres) after noticing the threat requiring them to stop. Paying 100% attention, you would therefore have slightly under three seconds to realise that a self-driving car had stupidly left the motorway at a junction and to begin braking in time to stop at the lights, assuming the slip road was completely empty and no cars were parked at the junction to hit.
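
For anyone who wants to check the arithmetic, here's a quick back-of-the-envelope sketch in Python (the 96 metre figure is the Highway Code's typical stopping distance at 70mph; the 200 yard slip road is my estimate from the map):

# Rough timing for a 70mph exit onto a ~200 yard slip road.
MPH_TO_MPS = 1609.344 / 3600          # metres per second per mph

speed = 70 * MPH_TO_MPS               # ~31.3 m/s
slip_road = 200 * 0.9144              # 200 yards in metres, ~183 m
stopping = 96.0                       # Highway Code stopping distance at 70mph, in metres

time_to_lights = slip_road / speed                # ~5.85 s (5.88 s if you round to 34 yards/second)
decision_margin = (slip_road - stopping) / speed  # ~2.8 s before you must begin stopping

print(f"Time to lights: {time_to_lights:.2f}s, decision margin: {decision_margin:.2f}s")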

Any human "supervising" a self-driving car for any length of time will have mentally switched off, to the point that it's likely to take the average person at least five seconds to realise that self-driving is about to become self-crashing, accept that they need to assert control, and go for the brakes. Moving your foot from its rest position to the brake pedal then takes at least another three quarters of a second, which leaves your foot depressing the pedal around 130 milliseconds before impact. That sounds tragically descriptive of this incident.
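
And the switched-off supervisor version, using my assumed five second realisation delay:

# Margin for a driver who has mentally switched off (assumed figures).
time_to_lights = 5.88      # seconds to reach the lights, from the calculation above
notice = 5.0               # assumed: time to realise the car has got it wrong
move_foot = 0.75           # time to move a foot from rest onto the brake pedal

margin = time_to_lights - (notice + move_foot)
print(f"Foot reaches the brake {margin:.2f}s before the lights")   # ~0.13 s, i.e. ~130 ms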

This is why self-driving cars were, are and always will be a stupid idea. While the driver is of course legally responsible for driving their vehicle, the self-driving AI ought to have its licence revoked.

Oh, that's right. Self-driving cars can't drive well enough to pass a driving test. So why should they be allowed to be in control of a vehicle? Tesla's apologists will of course point out that "Autopilot" and "Full Self-Driving" are just driver assistance tools like ABS or cruise control, which ignores that those names explicitly market the systems as fully self-driving to people who don't understand the (serious) limitations of the technology. Tesla are certainly morally liable for the people their Autopilot has killed, and while people are reasonably willing to accept Tesla drivers killing themselves, they are nowhere near as likely to accept this killing other road users who are driving safely and legally.
