Tesla sued over Tokyo biker's death in 'dozing driver' Autopilot crash

Tesla is being sued by the widow and daughter of a man killed when an allegedly dozing driver let his Model X’s Autopilot feature steer it into a group of people. The complaint was filed on Tuesday in a court in the Northern District of California and assigned to Magistrate Judge Susan van Keulen yesterday. Yoshihiro Umeda, a …

  1. nanchatte

    Nah, Yeah... It's totally Tesla's fault that I fell asleep at the wheel, guvn'r...

    1. IGotOut Silver badge

      It's an obvious side effect of a half-arsed "self drive". It's not good enough to actually self-drive, despite its stupid name, yet the driver has so little to do that boredom and lack of attention are almost a given.

      1. jospanner

        It needs to work properly or not at all.

        Although I am a curmudgeon when it comes to this sort of thing. I love my old car without any assists or other nonsense. Technically less safe? Sure. But that is my responsibility. VW didn't lie to me and say that the car would drive itself...

      2. a pressbutton

        I think it is a bit more interesting than that.

        Tesla's 'autopilot' is probably quite a reasonable driver. Possibly average.

        The thing is that it is not perfect.

        So, when it goes wrong and something bad happens, who is at fault?

        The answer on the face of it is pretty clear - the _human_ driver.

        Where it gets interesting is when the victims of the collision claim that the accident happened because the car is 'defective' and but for the defective Autopilot, that man would be alive.

        Tesla's lawyers have a narrow path to follow. They need to show that

        - the software did not fail to stop

        and

        - the _human_ driver was in control

        Popcorn...

        1. Bbuckley

          Absolutely wrong! The car can either drive itself or it cannot. My opinion is that AI will NEVER be able to drive a car - self driving can only be done by trains running on a track physically separated from people like those at airports. End of Story. There will never be safe self-driving cars (unless they are in tubes or on tracks).

      3. vtcodger Silver badge

        License to Kill

        People want autonomous vehicles. And were I a gambler, I'd bet on them happening eventually. But, flawed though human drivers are, the technical problems of "safer than human" autonomous vehicle control are enormous. The record is clear. Tesla's "autopilot" system is not reliable and is a menace to innocent third parties. And Tesla apparently cannot be trusted to deploy driver assistance with proper concern for safety. It's past time to turn their system off. Permanently.

        If ("when" I think) some more responsible company eventually creates a vehicle control system that drives better than humans, Tesla can license the technology from them. In the meantime, Tesla's system should be turned off and left off. The $5000 folks paid for it should be refunded.

        Tesla's license to kill should be revoked.

        1. Anonymous Coward
          Anonymous Coward

          Re: License to Kill

          Yeah of course the machine did it?

          There would be a tiny, if not zero, number of Tesla drivers who actually think their car is "self driving" and doesn't need any driver monitoring. Does this stop drivers falling asleep? Nope. Same as they do in other cars.

          Many other cars also have an automatic braking system that is the same as, or functionally similar to, the one in this model of Tesla. This also does not stop the driver crashing into parked vehicles. My car's AEB decided that the car that had just pulled out in front of me was not a threat and did not activate.

          The Autopilot software on that vehicle is a driver aid that can also be found on many other vehicles. It is not designed to be a full autonomous driving experience. Given that there are millions of Teslas on the road with millions of drivers using them correctly without incident, it can't simply be concluded that the car is dangerous and should be taken off the road. Otherwise, every time a driver operates any car incorrectly and has an accident, the same claim could be made.

          Every time a Ferrari has a crash at high speed, should it be banned? They are marketed for their performance, and they don't stop a driver exploiting it, so should Ferrari be liable for the crash?

          As the driver aids are primarily safety features, it would surely be prudent to ask in how many situations they have saved lives, reduced accidents or avoided an incident. If a machine prevents injury ten times more often than it fails to prevent it (while not causing it), then that would be pretty good? In fact most people's thinking would surely be "even if it can't prevent it in every case, surely any machine or device that can prevent a few injuries or deaths is better than one that can't"?

          The latest software will stop at traffic lights and junctions, which addresses a major problem in certain places (people running through red lights), so it could potentially be a major lifesaver?

          But, you know, when it's Tesla then of course certain people will just think, "ban it", "I don't understand the technology so it can't be good", "The name would confuse me so it must confuse everyone else".

          1. ChrisC Silver badge

            Re: License to Kill

            "Many other cars also have an automatic braking system that is the same or functionally similar to the on in this model of Tesla. This also does not stop the driver crashing into parked vehicles. My car's AEB decided that the car that had just pulled out in front of me was not a threat and did not activate AEB."

            And did you crash into it, or was your level of engagement in the driving process sufficiently high that you were able to brake manually anyway?

            "The Autopilot software on that vehicle is a driver aid that can also be found on many other vehicles. It is not designed to be a full autonomous driving experience."

            That might be what Tesla are saying in public, but that definitely isn't the impression they're giving to a growing number of their customers. And regardless of what their official stance is on Autopilot, human nature being what it is, the more driver aids that are provided to make it "easier" to get your vehicle from A to B, the less engaged the driver is likely to become with the whole process.

            And when you're less engaged in something, your levels of attention drift, your ability to quickly respond to changes in the process diminishes, and BOOM you've just driven into the back of a stationary obstacle that an alert driver would have more likely have been able to avoid...

            1. TonyJ

              Re: License to Kill

              "...human nature being what it is, the more driver aids that are provided to make it "easier" to get your vehicle from A to B, the less engaged the driver is likely to become with the whole process..."

              Cannot agree with this more. I've driven older, high-performance cars such as 911s that give you power steering and ABS only grudgingly, if at all. Stepping out of a modern car into something like that really reminds you as a driver how much you take the modern accoutrements for granted.

              Suddenly finding you are the only thing controlling 300+bhp in a rear-wheel-drive car with no traction control and no ESP tends to make you pay a LOT more attention.

              I had an S5. Four-wheel drive and all the above aids, and it drove like it was on rails while sticking to the road like glue. Even then it was terribly easy to speed, because at 100MPH it was barely ticking over.

              I had a tyre blow on a country road - twisty and turning, so I wasn't going particularly fast - but apart from the tiniest of wobbles at the nose it pretty much carried on as if nothing had happened.

              Driverless cars are, I believe, an inevitability but they do need to get much much better.

            2. Doctor Syntax Silver badge

              Re: License to Kill

              "And did you crash into it, or was your level of engagement in the driving process sufficiently high that you were able to brake manually anyway?"

              And if so how much was this delayed by expecting the AEB to have braked?

        2. jmch Silver badge

          Re: License to Kill

          "The record is clear. Tesla's "autopilot" system is not reliable and is a menace to innocent third parties"

          Tesla's autopilot is clearly not perfect, but that should not be the bar to clear. Is it better than the average human driver? We hear of all the accidents Tesla provokes, but what about the ones it avoids that a human wouldn't have? Is that 1, 100, 10,000? Do even Tesla themselves have any idea?

          In this specific case, yes the driver should have been in control, but clearly his inattention wasn't detected by the car, or it was and the driver wasn't sufficiently warned.

          With respect to the crash itself, this scenario is a known one from previous accidents. Surely Tesla should have tweaked the software to pay extra attention when a vehicle it has been following changes lane. So in this case surely Tesla has a case to answer.

          The single case however says nothing about general viability of self-driving cars

          1. druck Silver badge

            Re: License to Kill

            "Tesla's autopilot is clearly not perfect, but that should not be the bar to clear. Is it better than the average human driver?"

            Better than the average driver is nowhere near good enough. It needs to be two or even three orders of magnitude better, so that a failure is less probable than a mechanical failure.

            Everyone who drives a non-automated vehicle has had minor lapses of attention and knows how easily they could have led to an accident, so we are accepting, to some degree, of the genuine failings of other humans. However, no one will ever accept that a death caused by the failings of an AI was not something that should have been prevented.

    2. Anonymous Coward
      Anonymous Coward

      An adaptive cruise control is supposed to stop itself from driving into the vehicle in front, regardless of speed.

      1. SJA

        The Tesla did not drive into another vehicle.

      2. Francis Boyle Silver badge

        Not if the speed is zero

        Tesla's system is well known to be unable to "see" stationary objects, which is actually fine for cruise control. (A stationary car, fire truck, or barrier is not "the car in front".) But since people can't be relied on to remain attentive when everything is being done for them, the onus is on Tesla to up its game. Maybe Musk should just swallow his pride and accept that lidar has its uses.
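
        Since this keeps coming up, here's roughly what that looks like in code - a minimal Python sketch of a generic radar ACC target picker, purely illustrative (the function name, tuple layout and 2 m/s clutter margin are all invented; this is emphatically not Tesla's code):

            # Illustrative only: why a radar ACC can ignore a stopped car.
            # Radar measures speed relative to us; a return whose ground speed
            # is near zero looks identical to a sign, bridge or barrier, so it
            # is filtered out as roadside clutter.
            def select_lead_target(radar_returns, ego_speed, clutter_margin=2.0):
                """Pick the nearest in-lane target that is actually moving.

                radar_returns: list of (range_m, relative_speed_mps, in_lane) tuples.
                ego_speed: own speed in m/s.
                """
                candidates = []
                for range_m, rel_speed, in_lane in radar_returns:
                    if not in_lane:
                        continue
                    ground_speed = ego_speed + rel_speed  # target's absolute speed
                    if abs(ground_speed) < clutter_margin:
                        # Stationary return: dropped as clutter - and so is a
                        # stopped car sitting in your lane.
                        continue
                    candidates.append((range_m, ground_speed))
                return min(candidates) if candidates else None

        At 27 m/s (about 60mph), a car stopped 80m ahead comes back with a relative speed of -27 m/s, i.e. a ground speed of zero, and gets discarded exactly as a road sign would.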

        1. Anonymous Coward
          Anonymous Coward

          Re: Not if the speed is zero

          If the stationary object is right in front of you and where you are going, any adaptive cruise control bloody well ought to be able to perceive it as such. If it can't do that because of being confused by things to one side, i.e. not where you are going, there is something wrong with it.

      3. Alan Brown Silver badge

        Adaptive cruise control

        The ACC on my 17-year-old Nissan Primera cuts out below 23mph and has a number of limits on operation in the manual - including that it will only ever apply a MAXIMUM of 1/4 of total braking force.

        The ACC on a Tesla is explicitly noted in the manual as NOT being able to stop for stationary objects when travelling in excess of 50mph.

        Different ACCs have different capabilities. More modern ACCs will operate down to 0mph on vehicles with automatic gearboxes, for instance - but they can still be spooked by paper bags and overhanging hedges.

    3. Anonymous Coward
      Anonymous Coward

      Cue - comments by Tesla Fanbois

      who all worship the ground that Elon walks on.

      If you thought that Apple fans were bad, the Tesla versions are far worse.

      In their minds, Tesla and Elon can do no wrong... ever.

      Bring on the downvotes.

    4. Captain Scarlet

    I would say both are at fault. If you're driving and become tired, you need to get off the road asap.

  2. Brian 3

    and it's totally their fault that this group of idiots illegally stopped on the inside lane of a motorway too!

    1. Anonymous Coward
      Anonymous Coward

      Idiots for trying to help people who had an accident... really? So next time you have an accident and get trapped, you want people to just drive past and leave you without helping? After all, they'd be idiots if they stopped to help you, right?

      1. Black Betty

        If the experts aren't safe,

        ...amateurs are at greater risk.

        It is not an uncommon occurrence for police and emergency workers at accident sites, who know what they are doing, to be hit by rubberneckers and otherwise inattentive or distracted drivers. A milling crowd of bikers ended up becoming part of a bigger problem than the one they were solving.

        Motorway laws are strict for a reason. "No stopping for any reason.", not except when it seems like a good idea at the time. You can end up with hefty fines for running out of fuel, or breaking down due to a known problem. If there are already people rendering assistance, you're more likely to make the problem worse than help.

        1. Joe W Silver badge

          Re: If the experts aren't safe,

          Not in civilised countries, no. In civilised countries you are required to stop and help. By law and by moral obligation.

          1. Charles 9

            Re: If the experts aren't safe,

            Oh? NAME the law, then, because what you're describing is a Duty to Rescue, which most civilized countries won't carry because obligation doesn't automatically convey ability, resulting in the common situation of someone trying to help only making the situation worse. Good example: how does a Duty to Rescue apply to someone passing someone drowning in the river when the bystander can't swim?

        2. JakeMS

          Re: If the experts aren't safe,

          That may be so, but sometimes it can take too long for those services to arrive, and someone stopping to help could be the difference between life and death. This will be even more the case as electric cars, with big batteries that can ignite easily in a serious crash, become more commonplace. (Ask Richard Hammond.) You will want someone to help you get out of that car fast.

          Someone simply stopping a wound from bleeding out when you're trapped in the car and unconscious can save your life too. There are many instances where stopping can save a life.

          If you witness an accident in the UK and stop to help, often once the police arrive they are quite happy you did, because someone who wasn't involved in the accident can be used as a witness, which is a far more reliable source of what happened than the two drivers who will blame each other.

          They will take your statement, thank you for helping and send you on your way.

          Most humans with any common decency will stop to help. If you have a heart attack on the street, would you say that a passer-by shouldn't try to resuscitate you? Wait for a qualified doctor?

          I have training to revive people in simple cases (I've revived 3 people so far), as a passer by I would stop to help. But I'm not a doctor, so I should be told to leave you and not help?

          And, in the event that occurred, a human driver would change lanes to go around the accident.

          Not simply plow into someone because "they shouldn't be there".

          A child runs into the road chasing a football, by the logic of this car:

          Speed up and run the kid down

          Human driver:

          Slam on brakes and/or swerve.

          Unexpected events happen on the road, human drivers take action for those events, and if the car is driving, it should too!

          You have to think of someone other than yourself sometimes...

          1. Alan Brown Silver badge

            Re: If the experts aren't safe,

            "A child runs into the road chasing a football, by the logic of this car: Speed up and run the kid down

            Human driver: Slam on brakes and/or swerve."

            You would be gobsmacked by the number of HUMAN DRIVERS who will blindly drive into the child and then blame the victim for running onto the road - an EXPERIENCED driver will be braking as soon as the ball appears.

            Similarly for unobservant drivers NOT seeing moving feet in the visible gap under parked cars as a precursor for an obscured child moving into full view.

            Individual drivers need to be taught these risks and it frequently doesn't stick.

            Once an AI is taught these things, ALL vehicles using that AI know the risk. In addition, AI observation systems don't lose concentration or get distracted by the nice arse on the girl on the footpath.

            That said, Tesla's "autopilot" is an advanced cruise control/driver aid system - NOT an AI - and anyone dozing at the wheel whilst allowing the car to continue travelling should be charged with culpable homicide (choosing to operate in a dangerous manner).

            1. Charles 9

              Re: If the experts aren't safe,

              Many times the gap is too narrow for the viewing angle of the driver (especially if it's a car with a low ride height). Plus what if it's the child that pops out first...three feet from the driver? I don't care if you're going 5 mph, a child that pops out of the blue a mere three feet in front of you isn't going to end well, full stop.

              IOW, there's always Murphy.

          2. werdsmith Silver badge

            Re: If the experts aren't safe,

            A child runs into the road chasing a football, by the logic of this car:

            Speed up and run the kid down

            No, that's not the logic here. In this case the cruise speed was set to a higher speed, but the car was going slower due to a slower car in front. When the car in front got out of the way, the Tesla attempted to accelerate to the set cruise speed, then encountered the accident and didn't have sufficient time to stop.

            My car is not a Tesla, but it will run behind another car in adaptive cruise at the speed of the other car; when that car pulls away, my car will accelerate up to its set cruise speed. Nothing special or Tesla about this - it's common these days across all makes.

            I've not encountered a blocked lane in these circumstances. I once encountered a lady who stepped onto the road in front of me in a High Street. The car was braking itself before I could react.
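
            The rule being described boils down to "drive at the lower of the set speed and a safe following speed". A minimal Python sketch of that logic, purely illustrative (names and the 2-second gap are assumptions, not any manufacturer's real code):

                # Illustrative ACC speed rule: hold the set speed unless a lead
                # vehicle forces something slower.
                def acc_target_speed(set_speed, lead_vehicle=None, time_gap_s=2.0):
                    """lead_vehicle is None if the lane ahead reads as clear, else a
                    (distance_m, lead_speed) tuple for the car being followed.
                    All speeds in m/s, distance in metres."""
                    if lead_vehicle is None:
                        # Lane reads as clear: accelerate back to the set speed.
                        # This is the moment in question - the car in front pulls
                        # out, and the ACC speeds up into whatever lies beyond it
                        # that the sensors failed to flag.
                        return set_speed
                    distance_m, lead_speed = lead_vehicle
                    gap_limited = distance_m / time_gap_s  # keep ~time_gap_s behind
                    return min(set_speed, lead_speed, gap_limited)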

        3. IGotOut Silver badge

          Re: If the experts aren't safe,

          @Black Betty

          Rule 270

          You MUST NOT stop on the carriageway, hard shoulder, slip road, central reservation or verge except in an emergency, or when told to do so by the police, traffic officers in uniform, an emergency sign or by flashing red light signals. Do not stop on the hard shoulder to either make or receive mobile phone calls.

          I'd class helping at the scene of an accident as an emergency, and so would every police officer in the UK.

        4. jmch Silver badge

          Re: If the experts aren't safe,

          For the record, I don't know about Japan, but AFAIK in the EU cars are required to carry an emergency triangle at all times. In case of any accident or incident that leaves a vehicle in a dangerous position for others, this should be set up 100m further back.

          Most people grossly underestimate how far 100m actually is. And even if the bikes didn't have these triangles*, the van involved in the first accident would have. While assistance should always be offered to those in need, those offering assistance especially in an active lane should be extra prudent.

          Ultimately though, it is the drivers' responsibility to be alert and avoid accidents.

          *Not having anywhere to fit a triangle, I always carry an emergency yellow vest on my bike.

        5. Anonymous Coward
          Anonymous Coward

          Re: If the experts aren't safe,

          I am a trained first aid responder who also has a knowledge of electrical and chemical hazards and how to deal with them. If I'm not allowed to stop to help deal with an emergency in the absence of the official services, something is wrong.

          Fortunately perhaps for someone one day, you are wrong.

    2. Anonymous Coward
      Anonymous Coward

      I get the impression this was in Japan. Are you an expert on Japanese traffic law?

      If the event is as described, then if the car in front of you moves out suddenly to avoid a slower vehicle in front, your car may collide with that vehicle. Are you proposing a new law that anyone in front of a Tesla must get out of the way or travel faster?

    3. jospanner

      They were attending an accident, what do you want them to do?

    4. Yet Another Anonymous coward Silver badge

      Worse than that - they weren't even driving Teslas......

    5. Lennart Sorensen

      Since when is it illegal to be stopped because of a collision? And what do you know about Japan's traffic rules? They could be stopped because the road was jammed with other vehicles, or the car broke down. Lots of reasons that a vehicle can end up stopped. Tesla's autopilot isn't what its name suggests and doesn't work well at all.

    6. Anonymous Coward
      Anonymous Coward

      @Brian 3 - Sorry, mate

      you don't just crush people (or any other form of life, for that matter) in front of your car just because they're sitting there illegally. That's not what a responsible driver would do.

      What Tesla does here is push imperfect AI, hoping somebody else will be made responsible for its failures. We all know there are (and there will be) special cases where modest human intelligence (like aunt Lucy's, for instance) will be vastly superior to whatever Silicon Valley programmers will ever come up with. We're not talking about Alexa here; a car driven by algorithms is a deadly thing and I certainly wouldn't want to be in its way or behind its wheel.

      Sure, planes can land automatically, but take a look at the conditions in which this can be attempted and consider the level of training and preparedness of the two pilots. Now compare this with the poor chap eating his doughnut in the driver's seat of a Tesla under the control of the so-called autopilot.

      1. Yet Another Anonymous coward Silver badge

        Re: @Brian 3 - Sorry, mate

        Not just Tesla - though admittedly they made it worse by telling everyone it was an autopilot.

        I got a hire car with "adaptive cruise control" but I didn't know that.

        I pulled out onto the M25 and clicked it to 70mph (cos I don't want a ticket) but was stuck behind a truck doing 60. The truck pulled off and the car accelerated forward up to 70mph.

        That's when I worked out what it was doing - but if I had been half asleep and the truck had pulled over to avoid a crash/debris/etc the car's cruise control would have driven me straight into it.

        1. ChrisC Silver badge

          Re: @Brian 3 - Sorry, mate

          "That's when I worked out what it was doing"

          I would have thought the fact that you hadn't already driven straight into the back of the HGV despite it travelling 10MPH lower than the speed you'd asked the cruise control to attain ought to have been a giant clue that it wasn't just a plain old "maintain this speed unless physically impossible to do so" type of cruise control...

          ...or did you actually think it was a speed *limiter*, and that all the time you were sat behind the HGV it was your right foot rather than the car electronics responsible for maintaining a steady 60MPH?

          1. Intractable Potsherd

            Re: @Brian 3 - Sorry, mate

            @ChrisC - go back and read the comment again, then come back and apologise.

            1. ChrisC Silver badge

              Re: @Brian 3 - Sorry, mate

              I read it a few times before posting my earlier reply, and having read it again now I still don't see any reason to change my PoV here, except to say that suggesting they may have thought it was a speed limiter they were setting was incorrect.

              However, I'm willing to be educated as to why I'm wrong here, so if anyone of the people downvoting me would like to put in a little bit of effort to explain why, rather than just either doing so silently or with a non-educational response like this one, please do - I genuinely am interested to know.

              My interpretation of YAAC's post was that:

              1. YAAC hired a car without realising it had *adaptive* cruise

              2. They did however believe it had standard cruise, which then they set to 70

              3. On encountering a vehicle doing only 60, they maintained a steady speed behind it despite having set what they thought was standard cruise to 70.

              At this point, there are only two ways for their car to have not rammed itself into the backside of the lorry at a closing speed of 10MPH. Either YAAC adjusted the cruise settings (by lowering the setpoint to 60, or by manually braking), or the car reduced speed all by itself.

              Now, had YAAC adjusted the cruise settings themselves, they'd then have had to adjust them *again* in order for the car to then accelerate itself back up to 70 once the lorry moved, so it wouldn't have taken them by surprise.

              Which leaves only one option based on the information provided - the car matched the speed of the lorry all by itself...

              If that was the case, then it's at THIS point in the drive where it should have been obvious the car didn't just have standard cruise control.

              1. Intractable Potsherd

                Re: @Brian 3 - Sorry, mate

                Thanks for getting back, Chris. "Which leaves only one option based on the information provided - the car matched the speed of the lorry all by itself...

                If that was the case, then it's at THIS point in the drive where it should have been obvious the car didn't just have standard cruise control."

                I think that is exactly it - the cruise control was set at a speed it wasn't able to attain, 70mph. For whatever reason (and I'm not the OP), it became obvious that it was adaptive only when the acceleration began automatically. The OP didn't say that s/he had a problem with it, just that an unwary driver could have been caught out and ended up in a situation where the car ran into something.

                For my part, upon re-reading what the OP wrote, I can see that there are some oddities in how it is constructed, but the message still seems clear to me - it accelerated without driver input, which is (to my mind) inherently dangerous, especially if you're not ready for it.

                When I have a hire car, I don't play with the toys until I've got used to the basics (around 30 miles, usually), and definitely not in traffic. If possible, it'll wait until I've read the manual or found information online. I don't usually drive modern cars, and so know that there will be some hideous piece of automation that will work contrary to my expectations of being in control.

          2. Alan Brown Silver badge

            Re: @Brian 3 - Sorry, mate

            "I would have thought the fact that you hadn't already driven straight into the back of the HGV despite it travelling 10MPH lower than the speed you'd asked the cruise control to attain ought to have been a giant clue that it wasn't just a plain old "maintain this speed unless physically impossible to do so" type of cruise control..."

            That and the ACC's display showing that it had locked onto a leading vehicle, plus the setting for adjusting following distances. Even my old Nissan Primera has that as a LCD display along with warning beeps.

            You have to wonder about the competence of the OP

  3. Anonymous Coward
    Anonymous Coward

    Took delivery of a Model 3 just yesterday...

    Took delivery of a new Model 3 just yesterday and on the drive home had a WTF moment. Driving on a (UK) motorway with the 70mph limit temporarily reduced to 50mph, enforced by average-speed cameras; very light traffic; plenty of gap between me and the car in front, so it seemed like a good point to try out the cruise control.

    Engage cruise control and... major WTF... the car starts accelerating hard, way past 50. Hastily cancel. Get home, check the manual, and apparently it's designed to set itself to the current speed limit irrespective of what speed you're doing when you engage it. So it would have happily taken me to 70 and I'd have got a ticket.

    In every other car in the world, engaging (not resuming) cruise maintains your current speed. Even in other Teslas, apparently. The Model 3's behaviour is just utterly insane. No idea which designer dreamed up that idea, but thank God they didn't put him in charge of deciding which pedal is brake and which is accelerator.

    [Tesla will probably claim that because the cruise is adaptive it doesn't matter. Well, it does to me: if I engage cruise in a line of traffic trundling along at 40, I don't want to hurtle towards the car in front and pray that the system then slows down. Likewise I wouldn't want to be the person in the car in front seeing someone hurtle towards me in the mirror.]
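
    The difference is easy to state in code. A purely illustrative Python sketch (function and parameter names invented, not Tesla's):

        # Conventional cruise: engaging locks in the speed you're doing now.
        def engage_conventional_cruise(current_speed):
            return current_speed

        # The Model 3 behaviour described above: the setpoint jumps straight to
        # whatever speed limit the car believes applies - including a temporary
        # 50 limit it never recognised.
        def engage_model3_style_cruise(current_speed, detected_speed_limit):
            return detected_speed_limit

    Engaged at an indicated 48 in a temporary 50 zone the car still reads as 70, the first returns 48; the second returns 70, which is exactly the hard acceleration I got.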

    1. Emir Al Weeq

      Re: Took delivery of a Model 3 just yesterday...

      >because the cruise is adaptive

      But it's not adaptive: the limit had been temporarily reduced and it didn't adapt.

      1. Anonymous Coward
        Anonymous Coward

        Re: Took delivery of a Model 3 just yesterday...

        Excuse me, but isn't the "temporary reduction" one of the fundamental aspects of "adaptive" cruise control? If not, please define your criteria.

    2. Stratman

      Re: Took delivery of a Model 3 just yesterday...

      Interesting that Tesla appears to treat the speed limit as a target.

    3. werdsmith Silver badge

      Re: Took delivery of a Model 3 just yesterday...

      No, every car drives at the speed that it is set to. They read the speed-limit signs so they know when they are in a temporary limit, but if a driver has set a different speed, that's the one the car will try to achieve, unless there is a vehicle in front travelling slower.

  4. jospanner

    Yeah this seems like a reasonable case.

    If the Tesla's "autopilot" makes drivers think that the car can drive itself, when it can't, then this is clearly a Tesla problem.

    Don't sell the car as having an "autopilot" mode. This is like seatbelts coming with a notice that said "protects you whether you wear it or not!" It's not babying, it's actively misleading.

    1. Black Betty

      Tesla's "autopilot" feature also presents multiple warnings before engaging. The real problem is that no amount of product warnings can prevent idiots from being idiots.

      If anything, the problem with Teslas is that under normal driving conditions, Tesla's "autopilot" works too well and drivers get away with eating, reading, playing games and otherwise doing anything but paying attention to their driving again and again. Until one day they don't.

      1. Alan Brown Silver badge

        "Tesla's "autopilot" feature also presents multiple warnings before engaging."

        But it's still called "autopilot" and wilful twattish drivers remain wilful twattish drivers

        IIRC Germany only allows it to be called "advanced driver assist"

  5. Anonymous Coward
    Anonymous Coward

    Mr. Musk, please

    do not ask me to babysit your AI while you're bragging about how great it is. This is cheating.

  6. Andre Carneiro

    Shouldn't they be suing the driver? Surely he's the one who failed to be in control of the vehicle?

    1. Anonymous Coward
      Anonymous Coward

      @"Shouldn't they be suing the driver? "

      The police should be charging the driver with murder, along with Tesla, who seem to have programmed their vehicles to drive dangerously.

      In most countries the driver is held responsible for their vehicle causing accidents. If they feel they were misled by Tesla, they can take it up with them separately, since the driver chose to trust Tesla and there has been enough evidence to raise reasonable doubts about Tesla's safety.

      1. Anonymous Coward
        Anonymous Coward

        Re: @"Shouldn't they be suing the driver? "

        Manslaughter. Murder implies the intention to kill.

        1. This post has been deleted by its author

          1. TRT Silver badge

            Re: @"Shouldn't they be suing the driver? "

            Sorry. It's just when I see that word it always puzzles me how the same letters in the same order can have such disparate meanings just by spacing them.

        2. Alan Brown Silver badge

          Re: @"Shouldn't they be suing the driver? "

          "Murder implies the intention to kill."

          Deliberately making a decision to continue operating a dangerous machine while impaired through fatigue, drink or drugs is grounds for vehicular homicide charges, no differently from operating a dangerous machine such as a gun in that condition.

          If you CHOOSE to operate a machine when in an unfit condition to do so, then you should face the full consequences of what may go wrong, not get a slap on the wrist and told "oh, it was an _accident_"

          The fact that so many people here - and so many jurisdictions worldwide - CHOOSE to defend the indefensible is _WHY_ there is a serious problem with road crashes and driving impaired.

          1. Anonymous Coward
            Anonymous Coward

            Re: @"Shouldn't they be suing the driver? "

            "The fact that so many people here - and so many jurisdictions worldwide - CHOOSE to defend the indefensible is _WHY_ there is a serious problem with road crashes and driving impaired."

            Because there are too many people utterly dependent on their cars just to make a living...and have spouses and kids to feed. The problem behind the problem is unintended consequences. Throw a bunch of breadwinners in prison, who's going to buy the groceries?

  7. Anonymous Coward
    Anonymous Coward

    Why is this civil litigation rather than criminal?

    What exactly are the police doing during all these deaths due to Tesla's faulty technology, which, it must be said, seems to be designed to "self" drive in a dangerous fashion?

    All this company's self-driving vehicles should be removed from the road until they admit that their cars do not self-drive safely, and the company should be charged with corporate murder, given that there has been enough evidence to suggest that the company was aware both that its cars were dangerous and that the danger is due to its software intentionally driving dangerously.

    1. Black Betty

      Re: why it this civil litigation rather than criminal

      Tesla makes no such claims. Tesla repeatedly warns drivers that they must retain control. The problem is that drivers discover that under most circumstances they can get away with not paying the attention that they should, and then one day the world blows up in their faces - but apparently it's not their fault for doing exactly the thing that they were repeatedly told not to do.

      1. Anonymous Coward
        Anonymous Coward

        Re: why it this civil litigation rather than criminal

        In fact, the problem today is that the Internet creates a culture of people who want to have the most exciting thing to report, and people who want to emulate them. This is me sending a text in my Tesla. This is me editing a document in my Tesla. This is me playing guitar in my Tesla. And so on.

        I suspect most of us had that annoying kid in their class who was a bit out of control and constantly doing stupid things. Now imagine a world in which all the people like that were in communication with one another. The most connected country is probably the US, and so you get mobs of people carrying guns outside supermarkets, people trying to do the most stupid things in their cars, and the bigliest conspiracy theories. In the UK you get morons attacking phone masts because they don't understand the difference between computer and people viruses. They would rather believe people like them than annoying experts.

      2. Doctor Syntax Silver badge

        Re: why it this civil litigation rather than criminal

        You should ask yourself whether a car which leads drivers to behave in this fashion is fit for purpose.

        1. Charles 9

          Re: why it this civil litigation rather than criminal

          So put a spike in the steering wheel...and then your spouse gets rammed by a ghost driver.

          Sometimes, you just can't win.

  8. SJA

    Autopilot

    Who has ever wondered why commercial airplanes still have pilots? I mean they've had autopilot for decades....

    1. Anonymous Coward
      Anonymous Coward

      Re: Autopilot

      If you visit the Tesla Fanboi sites, it won't take you long to find posts like...

      "I can't wait to be able to sleep while driving to work"

      "I'll send my car out to get a charge"

      etc

      etc

      There is a particular planet that these people come from. It sure ain't Earth. Mind you the USA seems to have more than its fair share of crazies. Like those who turned up to a [cough][cough] peaceful demo in Michigan yesterday armed with Assault Rifles. (Pictures on BBC News site)

      The first time I saw people openly carrying Rifles while out shopping freaked me out. That was part of my decision not to want to stay there beyond my 2-year secondment.

      1. Doctor Syntax Silver badge

        Re: Autopilot

        "The first time I saw people openly carrying Rifles while out shopping freaked me out. That was part of my decision not to want to stay there beyond my 2-year secondment."

        I don't think I'd want to go there for two minutes and I lived in N Ireland for 19 years, mostly during the troubles.

    2. William Old
      Joke

      Re: Autopilot

      You've probably read this already somewhere else, but... in future, advances in AI will mean that commercial airplanes will be completely autonomous, but there will still be a man and a dog in the cockpit.

      The role of the man is to ensure that the dog has food and fresh water throughout the flight.

      The role of the dog is to bite the man if at any time he tries to fiddle with the controls.

      1. Anonymous Coward
        Anonymous Coward

        Re: Autopilot

        And if the man happens to be wearing an armored glove...?

    3. Anonymous Coward
      Anonymous Coward

      Re: Autopilot

      Because plane manufacturers like being able to point to the pilot when something goes wrong. Designing a plane without a pilot is probably possible at this point, or very close to it (it's easier than for cars), but they sure don't have an incentive to be assigned all the blame by default.

      See what happened for the 737 MAX.

  9. Anonymous Coward
    Anonymous Coward

    Only A Fool Trusts Tech Absolutely

    There is NO safe autopilot software and I doubt there ever will be.

    We have so many crashes around the world, and the vehicles are mostly controlled by the smartest things on earth.

    Only A Fool Would Trust Tech with something as precious as your own life.

    Driver's fault, NOT the tech.

    An ad man will tell you whatever you want to hear to sell you something; let the buyer beware.

    1. SJA

      Re: Only A Fool Trusts Tech Absolutely

      Yeah, it's time humans finally got replaced behind the wheel. There are thousands of car crashes every day because of human drivers.

      1. Doctor Syntax Silver badge

        Re: Only A Fool Trusts Tech Absolutely

        Convert that to crashes per vehicle mile and then see how well AI can compare.

        1. Anonymous Coward
          Anonymous Coward

          Re: Only A Fool Trusts Tech Absolutely

          AIUI, the AI has us there, too, as it's the rarity of the accidents that makes them stand out. "One death is a tragedy, a million is a statistic," and all that rot...

    2. Anonymous Coward
      Anonymous Coward

      Re: Ad man will tell you whatever you want to hear

      The only problem with that statement is that Tesla does not advertise. It relies on its army of disciples to spread the Musk Gospel.

      I was a member of the cult for 3 years before I escaped. It is all-consuming. You dare not criticise anything that the new messiah says or does, or you risk excommunication. I sold my Model S and escaped.

  10. Anonymous Coward Silver badge
    Holmes

    ~~Guns~~ Teslas don't kill people, people kill people.

  11. quattroprorocked

    Basic human factors

    The human brain is simply crap at paying attention to stuff, unless it is DOING the stuff.

    We've known this for decades. Why do car companies have to learn it all over again?

    IMO Tesla is probably good enough for platooning on motorways if the lead vehicle is being actually driven. And that's it.

    1. Charles 9

      Re: Basic human factors

      Because they're being PAID for it.

      "Will you walk away from a fool and his money...sonny?"

      Especially since if you don't, someone else likely will...and then use the money to take you to the cleaners.

  12. Bbuckley

    The obvious lesson is that we will never have self-driving vehicles other than trains. So dump the stupid and dangerous technology and make drivers into drivers.

    1. Charles 9

      1. People are PAYING for the privilege. LOADSAMONEY. SOMEONE's going to fill that demand, guaran-damn-teed.

      2. You can't fix stupid, and there are a lot of stupid breadwinners out there.

  13. j.bourne
    Mushroom

    Driver aids or auto drive?

    Here's my problem with the whole concept behind the Tesla driver aids, and it's summed up neatly in this quote from the article

    " he was able to "doze off" while not triggering any of the car's alarm features. These chimes and visual warnings are supposed to ensure drivers with the enhanced lane-keeping and cruise control features engaged pay attention to the road."

    If, with the 'aids' engaged, the driver is still supposed to be paying full attention to the road (obviously), why does the driver need the aids? If they're paying attention, then the features aren't needed, as the driver is perfectly capable of controlling the vehicle. If the driver needs the aids, then clearly the driver isn't capable of controlling the vehicle and shouldn't be driving in the first place. The aids are superfluous.

  14. Greg 38

    Darwin at work

    "Tragically, an Apple engineer was killed in the same year when his Tesla Model X accelerated into a roadside crash barrier."

    No, not tragically. This is Darwin thinning the herd of those who blindly bet their lives that every possible scenario has been preconceived, properly programmed, and correctly implemented with fully functioning sensors. Honestly, assuming ALL possible scenarios can be programmed into the software to allow autonomous driving is the definition of hubris. Of all people, this Apple engineer should have known better. And as an engineer myself, sorry, I'm not trusting my life or my family's to the rest of the lot.

  15. Under the Looking Glass

    Under the Looking Glass

    I believe this is a very interesting case study for several reasons:

    1) This is perhaps the only company that gets every single accident published and subjected to scrutiny (just a statement of fact, not a criticism of the news outlets)

    2) Being the only company that releases an intermediary step towards FSD as a "beta version", even as it warns on every Autopilot engagement that the driver has to be alert at all times and is accountable, it is confronted with human nature as it is; so yes, it is operating in an undefined grey area

    3) Tesla publishes accident statistics every quarter, showing rates with Autopilot on; with Autopilot off but active safety features on; and with Autopilot off and active safety features off ( https://insideevs.com/news/419460/q1-2020-tesla-autopilot-lowest-accident-ratio/ )

    4) It is the very same situation as when every battery fire in a Tesla car was published without the number of reported ICE car fires for context. Even though Tesla's fire statistics proved to be much lower than those of ICE cars, Tesla was forced to take measures to make the cars safer in case of accidents, also improving battery chemistry and efficiency as a side effect. So when a company takes such scrutiny in a positive way, it will undoubtedly lead to a better product over time, with or without advertisement.

    If you ask any large insurance company about the prospects of FSD, they will tell you what a dramatic impact it will have on yearly accident deaths, as the great majority are due to human factors. So the efforts to achieve it are very commendable, and the probability of success is high, due to processors having a much higher reaction speed than any human can ever dream of having. It is no longer a matter of if but of when, and the driving factor is the number of programming hours invested, feedback from real-life events while using it, improvement of neural-network data processing, and the dependability of the sensors/cameras used.

    I remember when the Distributed Control Systems for chemical processes became more sophisticated and capable of running complete recipes: control-room operators mistrusted the systems and were prone to switching to "manual" because they believed they were better, but the statistics showed that, after a period of adjustment and big-data analysis, even the best operator wasn't as reliable. After removing the human factor, all the attention moved to maintenance and improvement of measuring instruments and sensors. Mobility automation is the next big step for Earth-bound humans and will come sooner than many think, whether it is wanted or not.

    1. Anonymous Coward
      Anonymous Coward

      Re: Under the Looking Glass

      Welcome to The Register.

      "I remember in the times when the Distributed Control Systems for Chemical Processes became more sophisticated and capable of running complete recipes, control room operators mistrusted the systems and were prone to take to "manual" because they were better, but the statistics showed that after a period of adjustment and big data analysis, even the best operator wasn't as reliable."

      I'd accept that, when a properly designed and maintained system was working within its predicted operating envelope (including foreseen failure modes).

      Anyone still remember Buncefield, and the analyses that followed?

      Where in the lifecycle of "continuous product and service improvement" are Tesla's vehicles, and where are they headed?

  16. Man inna barrel

    Driver has nothing to do. Dozes off. Bang!

    If the car AI takes control of tasks that the driver would normally do, then I do not think it surprising that the driver fails to pay attention when they need to. I get the impression that train drivers need special training to cope with long periods of doing nothing, while paying attention all the time.

    This reminds me of a description of one of the worst jobs ever (in my opinion). This was in the 80's. In a canning factory, there is a labeling machine. Labelled cans stream out of the machine many times a second. An operator observes the cans zooming past. It is just a blur of colour. The operator's job is to stop the line if there is a visible change. If the machines are working well, the operator might never see a fault all day. I do not know if operators got special training for this, or were only put on inspection for limited periods, to preserve their sanity.

  17. Alan Brown Silver badge

    Naming of devices

    This is _WHY_ Germany forced Tesla to drop the use of the name Autopilot in that country.

    It's an advanced cruise control, that's it - and unlike aviation (where autopilots range from keeping a directional heading and nothing else through to fully automated flight from takeoff to landing), drivers are not _required_ to be certified as knowing all the limitations of their devices before being allowed to use them.

    1. Charles 9

      Re: Naming of devices

      "...and unlike aviation (where autopilots range from "keeping directional heading" and nothing else - through to full automated flights from takeoff/landing), drivers are not _required_ to be certified to know all the limitations of their devices before being allowed to use it."

      Next question: WHY aren't they required?

      1. a pressbutton

        Re: Naming of devices

        Easy

        no-one would buy the car if you needed to spend £many_thousands and study for many_months before driving it -

        especially if there was another, similar car you could just drive away in.

        1. Charles 9

          Re: Naming of devices

          The point I was trying to make is why isn't it such a steep requirement to drive ANY car? Take Germany. As I recall, the driving license requirements there either are or used to be considerably stricter than most English-speaking countries.
