Oddly enough, when a Tesla accelerates at a barrier, someone dies: Autopilot report lands

A Tesla with Autopilot engaged accelerated toward a barrier in the final seconds before a deadly crash, an official report into the crash has revealed. Apple engineer Walter Huang was driving his Model X P100D on a Silicon Valley freeway on the morning of March 23 when the car, under computer control, moved into the triangular …

  1. JWLong

    Garbage In, Garbage Out..........

    Enough said.

  2. EveryTime

    One of those questions has an obvious answer: the cruise control was set to 70 MPH, and once it wasn't following another car it accelerated to the setpoint speed.
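
    A minimal sketch of that setpoint logic, assuming a generic adaptive cruise controller (names, units and thresholds invented for illustration, not Tesla's actual code):

    FOLLOW_TIME_S = 2.0  # desired time gap to the car ahead

    def acc_target_speed(set_speed, own_speed, lead=None):
        # All speeds in m/s; lead is (lead_speed, gap_m), or None when
        # no vehicle is currently being tracked ahead.
        if lead is None:
            return set_speed                   # nothing ahead: go to setpoint
        lead_speed, gap_m = lead
        if gap_m < FOLLOW_TIME_S * own_speed:  # closer than the time gap
            return min(set_speed, lead_speed)  # hold back behind the lead car
        return set_speed

    # 70 mph is ~31.3 m/s. Behind a slower car the target drops; the moment
    # the lead car leaves the lane (lead=None), the target snaps back up.
    print(acc_target_speed(31.3, 27.0, lead=(27.0, 40.0)))  # 27.0
    print(acc_target_speed(31.3, 27.0, lead=None))          # 31.3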

    1. Baldrickk

      Exactly.

      Here's why it might have hit the divider:

      https://youtu.be/6QCF8tVqM3I

      And El Reg has (I think) already run a piece on why the car won't stop for stationary objects.

      Here is WIRED's take on it:

      https://www.wired.com/story/tesla-autopilot-why-crash-radar

      And a summary from elsewhere:

      Tesla "Autopilot" cannot see stationary objects. It has a radar, but it is 1-D and low resolution. It uses this to localize other moving cars. In theory, it could use this to see a highway divider or firetruck it's about to hit, but since it's 1-D that would also mean that it would have to slam on the brakes when it sees an overpass coming up, because it can't tell the difference between an overpass and a highway divider or a firetruck. It can assume that overpasses aren't driving at 60mph, so it will see other cars. The Tesla algorithm is "look for lane markings and try to stay between them, and don't hit any moving vehicles. If there is any stationary object, including another a vehicle in your lane, the "Autopilot" will plow right into it.

      1. Nate Amsden

        It wouldn't have needed to stop if it had stayed in the lane. No crossing solid lines. Should be pretty simple.

        1. Anonymous Coward
          Anonymous Coward

          Self Preservation mode

          Accelerate fast enough and you destroy the Evidence.

          1. msknight

            Re: Self Preservation mode

            Obviously not fast enough in this case.

            1. Giovani Tapini

              Re: Self Preservation mode

              I've said it before and I'll say it again: it's a glorified cruise control. Calling it "Autopilot" vastly overstates its abilities and lulls drivers into a false sense of security.

              It does not really provide the long-term, hands-free driving Tesla suggests, yet it can manage for long enough to erode any attention you might need to take over manually - by which point you still have to drop your coffee and put the phone down, etc.

              If they stopped calling it Autopilot, it would remove the inappropriate belief that the car becomes self-driving.

              1. Hans 1
                Facepalm

                Re: Self Preservation mode

                encouraged by the name and marketing, owners think [...]

                I've said it before and I'll say it again: it's a glorified cruise control. Calling it "Autopilot" vastly overstates its abilities and lulls drivers into a false sense of security.

                It does not matter what people think: with an autopilot in an aircraft, the pilot must stay at the controls, ready to take over at any time. People SHOULD KNOW.

                It is rather dumb that people think autopilot means autonomous driving when it does not. You are supposed to keep your hands on the wheel AND your eyes on the road. You can move your feet away from the pedals and enjoy the ride, HOWEVER, keep hands on wheel and eyes on the road.

                Reminds me of the story of the motorhome driver who thought cruise control was autonomous driving, engaged it and went to the kitchen for a nice hot coffee .... or wing mirrors which warn that "objects in (the) mirror are closer than they appear". No f'ing shit, Sherlock ...

                You get a car, you work out what options it has AND what these do PRIOR TO USING THEM.

                The world cannot save all cretins, we are all trying very hard, but, you know, some are just beyond help. You can rename the option to super cruise control or whatever, there will always be cretins who think it means autonomous driving.

              2. steviebuk Silver badge

                Re: Self Preservation mode

                I think it's been called that for too long now. People will still think it's a car that can drive itself.

              3. MachDiamond Silver badge

                Re: Self Preservation mode

                "If they stopped calling it autopilot it will remove the inappropriate belief that the car becomes self driving."

                It's a glorified adaptive cruise control with lane-keeping assist. One of its biggest problems is that it works well enough in many, if not most, average driving situations, but royally screws up from time to time. People get lulled into letting it do too much with too little attention, up until the point where it hits the emergency vehicle.

              4. tim292stro

                Re: Self Preservation mode

                "...I've said it before and I'll say it again. Its a glorified cruise control. Calling it "Autopilot" vastly overstates its abilities and lulls drivers into a false sense of security..."

                I hadn't really noticed that people had such a messed-up definition of autopilot until the last few days, when this crash's report came out.

                This is Autopilot. Not a chauffeur, or a self-driving car. On an airplane, autopilot holds a heading and an altitude. It does not steer around other planes or terrain. On a boat, autopilot will hold a magnetic/true-heading course; it will not avoid other boats, undersea obstructions or the shore. Sure, other systems have been piped into those autopilots to do more advanced stuff - reaching waypoints can trigger the loading of the next waypoint with a new heading - and radars, GPS, TCAS, and transponders add to the overall information picture, but a person still needs to respond when all the alarms start going off.

                If you set a heading on a boat and then go below deck to get a cup of coffee for 10 minutes, the autopilot will drive itself right into another boat and not think anything was wrong with that. We don't have <u>self driving</u> commercial boats and planes yet either... The US Navy had a few crashes in recent history that show what happens on a boat when you don't pay attention to what your automation systems are doing.

                It seems that only in the delusions of people who are literally >>>dying<<< to get a self-driving car has the definition been misunderstood.

                1. sprograms

                  Re: Self Preservation mode

                  With the proper details entered into the system, a jetliner's autopilot does a lot more than simply fly straight-and-level. "Autopilot" is a horrible name for "adaptive cruise control, plus stays-in-a-lane".

            2. Anonymous Coward
              Anonymous Coward

              Re: Self Preservation mode

              That's what the self igniting batteries are for.....

        2. dcathjlmif

          Yep, but the lane markings were worn and not repainted, just as the whole of the crash protection system was missing (basic maintenance of a highway is simple too).

        3. Anonymous Coward
          Anonymous Coward

          "No crossing solid lines. Should be pretty simple."

          ISTR that the markings were mostly washed out and probably not easily recognized. The lane visually just expands to the left.

          The "bug" was reproducible:

          https://www.youtube.com/watch?v=j38PN79X6-Q&t=22s

        4. Anonymous Coward
          Anonymous Coward

          Simple for an experienced human driver, apparently not for an artificial not-so-intelligence. The main theme of the article is correct: Tesla's use of the term "autopilot" was fatally misleading. Sad that a company that could have led the transition from fossil fuel to electric transportation may now be derailed by an irresponsible (juvenile?) marketing decision.

          1. Schultz

            Simple for a... human driver, not for an artificial not-so-intelligence

            I believe Tesla get treated unfairly here. First, humans are also involved in stupid accidents and we accept that as a fact of life. To expect a perfect record from self-driving cars (or the 'autopilot' stages run by Tesla, which can be considered precursors thereof) is not reasonable. And yes, this new type of accident will look stupid to us human overlords, but the human errors often look stupid too - read your news if you don't agree.

            Second, this article makes a complicated case for why the autopilot is to blame (hands on or off the steering wheel, ...). But traffic rules must be simple to follow, and the rule applying here is simple: the human driver must remain in control. KISS, otherwise you just feed the lawyers.

            Finally, a comment on battery fires. Extinguishing a battery fire with water does not work. Not a fault of Tesla. Looks like the firefighters will have to get some training to prepare them for the electromobility era.

            I agree with Kieren that Tesla blatantly oversells their cars and the autopilot function. Just be aware of that when you think about your next car purchase, but don't use it to build a criminal case against Tesla.

          2. Updraft102

            Nah, Tesla is a big (enough) corporation. Those get to do whatever they want in the US.

      2. JDX Gold badge

        So Tesla has follow-distance control but no emergency stop? Isn't that becoming standard even on regular cars?

        1. AndrueC Silver badge
          Meh

          So Tesla has follow-distance control but no emergency stop? Isn't that becoming standard even on regular cars?

          My Honda Jazz has it but only at low speed. I've had it trigger once and it was - sorta - right. Someone pulled out onto a roundabout in front of me. It would have been a cheeky but safe lunge if it wasn't for the fact they were towing a trailer. So I stopped (no panic, just slowed and waited). But I was a bit irritated and concerned about being rear-ended so when the trailer was half way across my front I accelerated. I knew the trailer would clear before I got there but the car disagreed. Cue lots of beeps, lights flashing on the dashboard and the brakes coming on.

          But I think it only triggers below 30mph so it wouldn't help in this scenario.

          1. jeffdyer

            Honda Jazz? Surely pensioners don't read the Register?

            1. DMoy

              I'm 73 and receive a government pension, though I still have to work too. I read The Reg every day between sessions of designing machinery in Autodesk Fusion 360, building and programming projects for my Arduino boards, learning to play guitar, and riding too fast on my BMW K1200R. Though from your remark, I'm pretty sure that by pension age, you'll be too addled to comprehend The Reg, I'm not, and neither are a lot of other Reg readers. Stifle your ageist remarks. Sooner or later you'll be talking about yourself.

              1. Robert Helpmann??
                Pint

                To a productive member of society:

                I'm 73 and receive a government pension...

                Have an upvote and a virtual beverage!

              2. Lars
                Happy

                Born before or after the bomb?

                1. Lars
                  Happy

                  "Born before or after the bomb?"

                  If you are 73 then you were probably born in 1944 or 1945, before or after the bomb. If you try hard you may understand why I know it.

              3. Anonymous Coward
                Anonymous Coward

                I'm 16 and receive pocket money, though I still have a paper round too. I read The Reg every day between sessions of designing machinery in Autodesk Fusion 360, building and programming projects for my Arduino boards, learning to play guitar, and riding too fast on my Gilera Runner 50. Though from your remark, I'm pretty sure that by pension age, you'll be too addled to comprehend The Reg, I'm not, and neither are a lot of other Reg readers. Stifle your ageist remarks. Sooner or later you'll be talking about yourself.

              4. Truckle The Uncivil

                @DMoy

                Obviously not a smoker then.

        2. phuzz Silver badge

          "So Tesla has follow-distance control but no emergency stop?"

          It does have emergency stop if (e.g.) the car in front of you slams its brakes on, but as explained upthread, it might not be able to 'see' stationary objects.

          1. ckm5

            It does have emergency stop if (e.g.) the car in front of you slams its brakes on, but as explained upthread, it might not be able to 'see' stationary objects.

            That's ridiculous - my 10 year old Volvo can 'see' stationary objects and will warn about them loudly, as in windshield flashing red and lots of warning noises.... No automated braking as it's too old for that feature, but cruise control will dramatically slow down the car if engaged, including downshifting for engine braking.

            Happens sometimes if you are in a long left-turn lane cut out of a median and there is a control box or other square-ish object at the other end of the turn lane (but on the other side of the cutout), which you may be approaching rapidly as you reach the left-turn point...

          2. Alan Brown Silver badge

            "but as explained up thread, it might not be able to 'see' stationary objects."

            Tesla is explicitly clear that Autopilot is unlikely to detect and stop for stationary objects in its lane when travelling in excess of 50mph.

            Autopilot is an enhanced cruise control. It's not a robot driver.

        3. MachDiamond Silver badge

          "So Tesla has follow-distance control but no emergency stop? Isn't that becoming standard even on regular cars?"

          I hope not. That would be the first thing I'd want to rip out of a new car. Sometimes it's better to stay at the same speed or go faster and maneuver than to slam on the brakes. I don't see any sort of autonomous car being able to make that decision anytime soon.

          1. Michael Wojcik Silver badge

            Sometimes it's better to stay at the same speed or go faster and maneuver than to slam on the brakes. I don't see any sort of autonomous car being able to make that decision anytime soon.

            Agreed. The automated panic braking ("collision avoidance", which really means "convert a collision at the front end of your car to one at the back end") in my wife's new Volvo is a huge pain in the ass. It triggers inappropriately all the time, generally when some idiot pulls into the lane in front of the car at too short a distance (in the computer's opinion). It's just luck that she hasn't been rear-ended yet by a tailgating vehicle when that happens.

            Volvo's Pilot Assist features are highly rated by reviewers who like this sort of thing. I'd hate to try to drive a vehicle that has a low-rated implementation.

        4. JohnG

          "So Tesla has follow-distance control but no emergency stop?"

          Teslas do have Automatic Emergency Braking - but, as with other vehicle makes, the cars brake for objects that they detect. Like people, cars may drive into things that they don't "see".

      3. Anonymous Coward
        Anonymous Coward

        " If there is any stationary object, including another a vehicle in your lane, the "Autopilot" will plow right into it. "

        If true, then where I live, road deaths will skyrocket compared to today once everyone has this.

        You can find all sorts of stationary things on the roads here: rocks, animals, the idle car of some random idiot taking a phone call, etc ...

        1. Updraft102

          " >> If there is any stationary object, including another a vehicle in your lane, the "Autopilot" will plow right into it.

          If true, in my place, the day when everyone has this, deaths on road will rocket high vs. today."

          How so? Cars today will plow right into anything you aim them towards. How would it be any different?

      4. Anonymous Coward
        Anonymous Coward

        @Baldrickk

        "If there is any stationary object, including another a vehicle in your lane, the "Autopilot" will plow right into it. "

        Sorry, that last line confused me - how come it can spot a moving car/object in front of it and slow accordingly but not a stationary one? That makes no sense and suggests it would plow into a traffic jam which is clearly not the case since Autopilot works in traffic. I suspect the radar signal from the divider wasn't big enough at first to cause a reaction but you'd expect that when fairly close to it the computer would realise there's a stationary object in front and at least try to slow down. The fact that it didn't suggests a serious bug rather than an overall design fault.

        1. Anonymous Coward
          Anonymous Coward

          Re: @Baldrickk

          how come it can spot a moving car/object in front of it and slow accordingly but not a stationary one?

          The sensors can't see far enough ahead to spot objects off in the distance, and by the time those stationary objects finally come into range, at highway speeds, it's far too close for the vehicle to react and the collision is inevitable.
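
          Back-of-the-envelope arithmetic (my own assumed figures, not anything from the report) shows how little margin there is:

          # Rough stopping distance at motorway speed (assumed figures).
          v = 70 * 0.447            # 70 mph in m/s, ~31.3
          reaction_s = 1.0          # assumed system/driver reaction time
          decel = 0.8 * 9.81        # hard braking, ~0.8 g on dry tarmac

          stopping_m = v * reaction_s + v ** 2 / (2 * decel)
          print(round(stopping_m))  # ~94 m needed -- a small stationary
                                    # object may be classified far later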

          Essentially, Tesla's autopilot feature is like driving with Mr. Magoo*--or Carrie Underwood during a snowstorm.**

          The future of self-driving / autonomous vehicles is to prevent those types of drivers from ever being on the road--but we're clearly not there yet, and Tesla isn't helping when it's designing and promoting systems as "autopilot" when they're far worse than most human drivers.

          * - Mr. Magoo was an extremely nearsighted / almost blind cartoon character who frequently got into hijinks as he bumped into things.

          ** - In her song "Jesus Take the Wheel" she hits a patch of ice and skids out of control. At that point she ignores any previous driving instruction she might have received, takes both hands off the wheel in order to pray for Jesus to change from his role as co-pilot to actual pilot, and steer her to safety.

        2. Anonymous Coward
          Anonymous Coward

          Re: @Baldrickk

          Because tracking stationary objects is too hard for its pea-brained AI? I've driven on the Cali freeway system. It's not much different than NYC, but a lot easier to negotiate than, say, Philly. Anyone who uses autopilot (or cruise control, for that matter) in those kinds of road systems might as well point a loaded revolver at their head and hope the next chamber is empty.

          1. Baldrickk

            Re: @Baldrickk

            Seeing the stationary objects is easy enough.

            Determining which of those are actual hazards and which are overhead gantries, street furniture, potholes/cracks, leaves blowing across the road, or stationary traffic in another lane is another matter entirely.

            Lidar + processing power (as seen in autonomous vehicles) maps out the surroundings. With the 1D radar used for autopilot, there is no positional sense - at all.

            You decide if you want to keep slamming your brakes on automatically and unnecessarily when on the freeway, causing someone to ram you up the rear, or filter out all the stationary returns.

            Bear in mind that the driver is meant to be in control at all times, Autopilot or no.

            Traffic slowing to a halt is easy, you can track the change in velocity and match it.
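
            That last part really is the easy bit - a toy follower, with invented numbers, that tracks the lead car's change in velocity:

            def follow_speed(prev_lead_mps, lead_mps, dt_s, own_mps):
                # Estimate the lead car's deceleration from two radar
                # samples and shed the same amount of speed ourselves.
                # Invented sketch, not any shipping controller.
                lead_decel = (prev_lead_mps - lead_mps) / dt_s
                return max(own_mps - lead_decel * dt_s, 0.0)

            # Lead braking from 30 to 28 m/s over one second: we match it,
            # sample by sample, all the way down to a halt if need be.
            print(follow_speed(30.0, 28.0, 1.0, 30.0))  # 28.0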

        3. MachDiamond Silver badge

          Re: @Baldrickk

          "how come it can spot a moving car/object in front of it and slow accordingly but not a stationary one?"

          I wonder if there is a limit on detections (a low-pass filter) to prevent falsing. It can see moving things going in the same direction because the differential speed is lower. When something appears to be closing at 75mph, it gets filtered out as a false signal.

  3. OlaM

    The autopilot probably accelerated because cruise control was set to a higher speed and the car in front made it slow down. When the car moved out of view, it sped up to reach its set speed. And it's not wrong to call it an autopilot. An airliner autopilot will happily fly straight into a mountain if you tell it to. It's just steering automation, not HAL-9000. But I agree that Tesla are jerks for blaming the victim, particularly when it should be extremely easy to detect splitting lanes in the map data.

    1. diodesign (Written by Reg staff) Silver badge

      Re: OlaM

      "An airliner autopilot will happily fly straight into a mountain if you tell it to."

      IMHO if you manually tell it to do a dangerous thing, it stops being an autopilot at that point. Aircraft autopilot follows routes, with set safe altitudes, and terrain-following radar to avoid collisions.

      Tesla's tech shot off into a barrier.

      C.

      1. The Oncoming Scorn Silver badge
        Terminator

        Re: OlaM

        I thought he worked for Apple, that may explain why he was driving it wrong.

        1. Anonymous Coward
          Anonymous Coward

          Re: OlaM

          "I thought he worked for Apple, that may explain why he was driving it wrong."

          I think you were downvoted for a variety of reasons, but I also think there's a sensible point to be made there. Apple have created an expectation of what electronic systems do, and I imagine it's shared by their engineers.

          At any time, people's expectations are related to general state of the art. In a time when a lot of people had coal fires, it didn't seem odd that a steam locomotive needed someone to shovel coal into a firebox. In a world of central heating, it seems a bizarrely dangerous idea.

          Whatever Apple's faults as a company (I'm not going there) Apple stuff does pretty much what it says on the box. If an Apple engineer read "Autopilot" as "pilots car automatically" it would be unsurprising. Transportation technology probably wasn't his thing, or perhaps he wouldn't have bought a Tesla. He wanted an Apple type experience, i.e. pay a whole lot of money for something and then expect to have it do what it seems to claim.

        2. dnicholas

          Re: OlaM

          Fuck it I laughed. Going to hell anyway

      2. Mayday

        Re: OlaM

        "Aircraft autopilot follows routes, with set safe altitudes, and terrain-following radar to avoid collisions."

        Not true. Even advanced autopilot systems like those in a Cirrus SR-type aircraft will happily fly the plane into terrain (whilst screaming TERRAIN and flashing red at you if certain features are added/enabled) or even chase it into a stall* if the pilot does not intervene. This is why these and most other aircraft have a big AP Disconnect button on the control yoke to disable the thing instantly.

        I'm not sticking up for Tesla here, just defining what an "autopilot" actually is. Tesla need to sort that out. They really do.

        *I've tested this "feature" personally. They insist on it on your first flight in the thing.

        1. Anonymous Coward
          Anonymous Coward

          Re: Aircraft autopilot ... terrain-following radar to avoid collisions.

          Being of an antipodean nature, my first thought here is "Mt Erebus" and "Flight 901" (not to mention "orchestrated litany of lies"). However, that was 1979 and presumably aircraft autopilots are now a bit more advanced.

          Are there any more recent inadvertent controlled flights into terrain that might prove useful for this discussion?

          1. Stoneshop

            Re: Aircraft autopilot ... terrain-following radar to avoid collisions.

            Are there any more recent inadvertent controlled flights into terrain that might prove useful for this discussion?

            Here you are. Looking at a few of the recent crashes classed under CFIT that involved modern airliners, they were all caused by crew ignoring warnings, or responding too slowly or incorrectly to them.

          2. Anonymous Coward
            Anonymous Coward

            Re: Aircraft autopilot ... terrain-following radar to avoid collisions.

            The Sukhoi scandal:

            https://en.wikipedia.org/wiki/2012_Mount_Salak_Sukhoi_Superjet_crash

            1. Alan Brown Silver badge

              Re: Aircraft autopilot ... terrain-following radar to avoid collisions.

              "The Sukhoi scandal:"

              The captain of the jet was Alexander Yablontsev (57), a former Russian test pilot.

              Human factors to the fore again - and yet another example of why ex-military fliers are a poor choice for civil transportation. They tend to press on regardless when anyone sensible and cautious would have diverted. Being able to safely land 95% of the time is one thing, but the cleanup after the other 5% is problematic, and unlike in a military aircraft, the people sitting in the back didn't sign on for that risk.

              1. RPF

                Re: Aircraft autopilot ... terrain-following radar to avoid collisions.

                SOME ex-military pilots. Not ALL.

        2. MachDiamond Silver badge

          Re: OlaM

          "I'm not sticking up for Tesla here, just defining what an "autopilot" actually is. Tesla need to sort that out. They really do."

          An aircraft autopilot is just a cruise control with lane-keeping assist. The difference is that somebody else (controllers) is watching where the plane is in relation to other aircraft and directing pilots to make course corrections when there are conflicts. There is also a lot more room in the sky lanes. Planes using autopilot are also more likely to be flying IFR, so they have had a course plotted that doesn't have them aimed at mountains they could crash into accidentally.

        3. JohnG

          Re: OlaM

          Tesla repeatedly tells owners that Autopilot is in Beta, that they need to keep their hands on the steering wheel at all times and that they do not yet have "Full Self Driving". In the vehicle, there are two modes available: Traffic Aware Cruise Control and Automated Lane Keeping - but that doesn't sound as sexy as Autopilot or Full Self Driving - and some apparently intelligent drivers seem to ignore all the warnings and fixate on the marketing terminology.

      3. Anonymous Coward
        Anonymous Coward

        Re: OlaM

        Autopilot on planes is simple. There's a lot more space to work in, mountains tend not to move and there's a lot less traffic.

        1. Hans 1
          Coffee/keyboard

          Re: OlaM

          Autopilot on planes is simple.

          You are joking, right? Do you have any idea of the sheer number of flight parameters there are on an aircraft? Ever heard of lift, roll, pitch, yaw, stall, altitude, air density, thrust...?

          On modern aircraft, an autopilot can land the bloody thing - OF COURSE, PILOT MUST BE READY TO TAKE OVER ANYTIME, because unexpected things can and will happen, most likely at the worst possible moment. It is called Murphy's law ... the most important law of nature for aviation engineers, pilots, and now Tesla drivers, so it seems.

          Tesla is 100% right: the driver was careless when the accident happened, and was NOT ready to take control of his car. Yes, USian highway maintenance shares some of the blame too ... but, behind the wheel, you are responsible for your car. Yes, Tesla's autopilot is nice to have, on USian highways probably not as useful as in Europe ... certainly NOT a replacement for a driver, though.

          1. tim292stro

            Re: OlaM

            "...Autopilot on planes is simple.

            You are joking, right? Do you have any idea of the sheer number of flight parameters there are on an aircraft? Ever heard of lift, roll, pitch, yaw, stall, altitude, air density, thrust...?..."

            I think he means the data entry for autopilot systems and what they are meant to control is simple... On something like an Airbus A310, the autopilot has only a few controls: heading, altitude, speed.

            All of the flight surface controls that affect the airframe's trajectory are manipulated by fairly simple optimized mathematical models and some simple tuned filters to control dynamics - but the user doesn't get a window into those on the display. They just get three settings and a few knobs to adjust "the big picture".
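
            As a feel for how simple those inner loops can be, here's a bare-bones heading hold - little more than a proportional controller (gain and bank limit invented; real systems add filtering, rate limits and turn coordination):

            def heading_hold(current_deg, target_deg, gain=0.8, max_bank=25.0):
                # Commanded bank angle: proportional to heading error,
                # clipped to a sane bank limit. Toy numbers throughout.
                error = (target_deg - current_deg + 180.0) % 360.0 - 180.0
                return max(-max_bank, min(max_bank, gain * error))

            print(heading_hold(350.0, 10.0))  # 16.0: gentle right bank through north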

            Hell, if you set your altimeter's barometric value (which tells the plane what altitude it's at) wrong, your automated landing system (ALS) can fly you right into terrain... The actual skill on a modern commercial plane still occupies a seat and has to RTFM to get the procedures right...

      4. CraPo

        Re: OlaM

        Germanwings 9525 anyone? https://en.wikipedia.org/wiki/Germanwings_Flight_9525#Cause_of_crash

      5. JimC

        Re: OlaM

        I think there's probably an age thing going on here. I grew up in the 60s and 70s, and my father was in the flying business then. My default understanding of an autopilot is something that flies the plane straight and level when there's nothing difficult going on, and needs to be overruled for take-off, landing, evading Messerschmitts or MiGs, and anything else difficult. That fits the Tesla offering pretty accurately. I think of autoland and so on as extra capabilities above and beyond the autopilot.

        Clearly a younger generation is thinking differently, and we have a cultural gap. For them, it seems, autopilot is not the right word.

        1. David Nash

          Re: OlaM

          The difference is that this is aimed at consumers, not at professional pilots.

          Flying takes a lot more training, and people take it a bit more seriously (rightly or wrongly) than learning all the controls of their car and what every bit does.

          So a large number of consumers don't know precisely what an airline autopilot does and does not do? And that makes it their fault if they therefore make a mistake about the capabilities of a system aimed at them for domestic cars?

          I don't think so.

        2. Uffish

          Re: autopilot age gap

          I'm old generation and autopilot means nothing to me in the context of cars but I can quite believe that the Tesla Marketing Dept is very attached to the 'autopilot' designation - for all the wrong reasons.

          As an engineer, if I wanted a self driving car I'd take a taxi or if I was a plutocrat I'd have a chauffeur and a car. I would probably enjoy driving a Tesla but I would rip out the 'auto-da-fé' before I drove it.

          I would rather not have an accident in a car, but the hurt would be worse if it was some sodding, half-baked, marketing-department-driven programmable controller, with its handful of assorted transducers and a huge touch screen, that caused the accident.

        3. Alan Brown Silver badge

          Re: OlaM

          "For them, it seems, autopilot is not the right word"

          It has nothing to do with age.

          _MOST_ non aviators/seamen think that an autopilot is some magical device that can handle everything thrown at it.

          1. Charles 9

            Re: OlaM

            You want my opinion on why people have misconceptions about auto-pilots? Blame the Abrams Brothers...and "Otto".

      6. Bill Michaelson

        Re: OlaM

        AirLINER autopilot (flight management) systems typically do all that good stuff. Aircraft autopilot systems of earlier vintage and lower grades can be as simple as a wing-leveler or a heading or altitude hold device and can indeed direct the airplane into a mountain or other obstacle. It is the pilot's responsibility to know well the capabilities and limitations of the specific system and supervise the operation appropriately at all times. It is also worth noting that the more capable systems require the most complex and nuanced supervision, thus requiring the most training and experience to do safely.

        Yet even the simplest of such aviation mechanisms are called autopilots.

        There is a distinction between the level of training, experience and certification required to operate airplanes as compared to automobiles, owing to several factors. That is a core issue. The other core issue is the regulatory environment that has allowed the deployment of these tools to insufficiently prepared drivers.

      7. tim292stro

        Re: OlaM

        "...IMHO if you manually tell it to do a dangerous thing, it stops being an autopilot at that point. Aircraft autopilot follows routes, with set safe altitudes, and terrain-following radar to avoid collisions..."

        Autopilot is pilot automation - but only REDUCING the pilot workload. If someone or something is watching your heading and altitude, it can give your brain cycles to look into why your #3 engine is running a little warm. It helps you follow your flight plan more easily too, getting you from one waypoint to the next and lining you up for a nice landing on the runway you punch into the flight computer.

        Now realize that they still load navigation data onto those things with 8" floppies on some older variants, and you'll start to get an idea about the technical limitations of what an autopilot can do. Your average 737 doesn't have the latest Nvidia Volta GPU in its telemetry rack - it's a step up from vacuum tubes and loading programs by tape or punch card.

        Aviation autopilot also has the benefit of mandated air traffic control in Class A airspace, so it comes with an external nanny in the event something goes wrong. You may have also noticed the lack of erratic turns and road maintenance issues on your last flight (although the turbulent-air potholes can be a real pain).

        Getting a self-driving car to work without human oversight is a HUGE effort, and it has never even been attempted at commercial scale in the aviation or marine markets (there are recent attempts to get systems working, but nothing is being deployed like Waymo on the water or in the air).

        "...Tesla's tech shot off into a barrier..."

        Sure, that's one damning way to look at it, but as an engineer I also look pragmatically at the longer phrase that was used to describe this situation: "it was following a car and then the car went through an interchange and then the Tesla drove into a barrier". I also noted in the NTSB statement that the following distance was set to minimum...

        So with those two data points, I immediately apply my expertise in driving in the S.F. Bay Area, where our roads are crap, and so are our drivers (on balance).

        I can imagine a scenario where the Tesla was following a car that decided late to take the interchange a different way, and made a moronic move across the gore point, which was poorly maintained (we are lucky if Caltrans fixes a crash barrier or guard rail within a month, let alone a week - now take a guess how bad the paint markings are...). In my imagination I can see the Tesla following closely behind that idiot who jumped across the gore, with the lane markings partially obscured by the closely followed car in front (the idiot's car). In that case, the car was probably simply following the car in front into the poorly marked gore, and once the idiot completed his late freeway change across it, the Tesla, realizing he was crossing a line, decided not to follow him. From there, the line which would have been the left solid line may have come back, and the car thought it was in a lane and tried to center itself (remember, it's narrow at the tip of the gore).

        Meanwhile, another part of the code, facing south and brightly lit head-on by the sun (because that's the direction that interchange faces), no longer saw a vehicle in front. Many reasons are possible - a Tesla has previously failed to see a truck trailer across the lanes because of poor lighting, and I'd speculate that the camera-based vision system Tesla chose still can't see in adversarial lighting conditions. Then, with no one in front, the car attempted to return to the unobstructed cruise control set point (even though the speed limit is 65, people will do 90 because they feel like it - Tesla drivers out here love their electric torque and drive like jerks).

        So the steering would be trying to center itself on the two lines it detected, and the cruise setting would not think there is a car in front so would speed up to the set-point.
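
        As a toy illustration of that centring failure (my own numbers, nothing from the report): at a gore point the two painted lines diverge, and "steer to the midpoint" aims you at the barrier.

        def lane_center_offset(left_line_m, right_line_m):
            # Steering target = midpoint of the two detected lines,
            # relative to the car's current lateral position (0.0).
            return (left_line_m + right_line_m) / 2.0

        # Normal lane, lines at -1.8 m and +1.8 m: hold the middle.
        print(lane_center_offset(-1.8, 1.8))  # 0.0
        # Gore point: the "left line" is really the other carriageway's
        # right line, so the midpoint drifts toward the tip of the gore.
        print(lane_center_offset(-4.0, 1.8))  # about -1.1: steer left, toward the barrier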

        To me, this looks like a failure of localization (GPS alone can easily be off by 3 meters, or about a full lane). Without a high-resolution map and LIDAR to compare your position on the road against known fixed things like concrete barriers, bridges and road signs - and given that relying on radar within a few miles of Moffett NASA Ames Federal Airfield, which hosts the civil aviation surveillance radar for the S.F. Bay Area, isn't a good idea - that pretty much leaves you with visual navigation to keep you in lane.

        See the previous comment about which direction the forward-facing camera was pointing relative to the sun, our terrible road maintenance practices in California, and the obscuring of oncoming obstructions by the car in front. If anything, I'd be surprised if Caltrans doesn't get a bit of a spanking on this for road conditions, with Tesla then dragged over the coals for not having good enough sensory perception and geo-localization.

      8. Alan Brown Silver badge

        Re: OlaM

        " Aircraft autopilot follows routes"

        The Dunning-Kruger is strong in this one.

        The one on my old Cessna 302 simply held heading and altitude.

        Anything more than that is an added feature. Autopilots are very dumb pieces of kit. The more advanced ones can fly in a more-or-less straight line from waypoint to waypoint and load the next waypoint in as they reach each target. Even the altitude is set by the pilots (unless directed otherwise by ATC, that tends to be "as high as you can fly without stalling, for best economy"). Collision avoidance is another system. Automated landing is another system too. NONE of it is a substitute for having someone at the pointy end ready to step in when the mechanisms can't cope - the systems are there because flying is mostly tedium(*) and otherwise people get bored/sleepy and do silly things.

        The moment anything unexpected happens, an autopilot goes "bong" and hands control back to the meatsack - and unlike a car, an aircraft with nothing controlling it will spend several minutes flying in a straight line before it needs a correcting hand on the controls. It's called "inherent stability" and all airliners are designed with lots of it built in(**). You'd be lucky to get 10 seconds hands-off in a car before you need to intervene, mostly because, unlike aircraft, your car _can't_ travel in a dead straight line, roads seldom being straight (and even where they are, they have drainage crowns trying to push the car to the sides). On top of that, cars are in a challenging environment with a lot of other obstacles in close proximity(***) that are NOT being centrally monitored/directed, nor do you have comms with all the other drivers around you(****).

        (*) It's 30 mins to an hour of chaos at each end. The rest is incredibly boring. A _good_ transport pilot may have 1 or 2 incidents in his entire career. Pilots who save the day due to their incredible flying skills are most commonly the same pilots who overconfidently got themselves and their passengers in the shit in the first place.

        (**) Military fast jets and aerobatic aircraft are "unstable" as this is what allows them to be thrown around the sky. Some are so unstable that without a computer constantly correcting the desired flight path they'd be out of the sky in a few seconds or less. (F22 and F35 being prime examples)

        (***) In "crowded airspace", the routinely nearest thing to you is tens of miles away at the same altitude, or at least 1000 feet above/below you. You have several tens of seconds to react to anything out of order. In a car, having somethere between zero and 2-3 seconds is routine.

        (****) All aircraft in an area are using the same frequency, and it doesn't matter whether it's controlled or uncontrolled: all pilots hear the instructions and know what the other guys are doing, or they're telling each other what they're doing. When things start going pearshaped, everyone knows to shut up, listen and get out of the way. Compare and contrast with the average fumducker driving blindly into a crash zone or causing chain reactions by rubbernecking instead of looking where he's going.

      9. astrodog

        Re: OlaM

        Sadly no, airliner autopilots will happily fly the plane directly into other aircraft, above their service ceiling, beyond their fuel range or even directly to the ground: https://en.m.wikipedia.org/wiki/Germanwings_Flight_9525

        “Lubitz deliberately crashed the aircraft. He had set the autopilot to descend to 100 feet (30 m) and accelerated the speed of the descending aircraft several times thereafter.”

      10. Updraft102

        Re: OlaM

        IMHO if you manually tell it to do a dangerous thing, it stops being an autopilot at that point. Aircraft autopilot follows routes, with set safe altitudes, and terrain-following radar to avoid collisions.

        Autopilot existed long before aircraft had any capability of doing any of that. That's not the definition of autopilot.

        If you set the autopilot to follow a given heading at a given altitude, it will do just that, even if there is a mountain in the way. As for whether it is dangerous... does that make the definition of autopilot outcome-based? If you set autopilot to maintain 090 at FL330 and nothing happens, that's safe, therefore it was still autopilot; but if you set it to maintain 090 at FL30 and run into a six-thousand-foot mountain, that was unsafe, therefore it was not autopilot (even though the FDR will indicate that autopilot was on at the time of impact, in a CFIT)?

        Air Inter flight 148 crashed into a mountain while approaching Strasbourg on autopilot in 1992. The captain had accidentally set the autopilot for a 3300 ft/min descent; his actual intent was to dial in a 3.3-degree descent. Right at that moment, there was a little blip in the altitude, probably a little bit of turbulence, which caused the autopilot to enter an emergency mode, which made it descend faster than even the very rapid 3300 ft/min rate. The thing flew right into a mountain... a mountain it would have missed if the plane had descended at the rate the captain intended.
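
        The arithmetic behind that mode confusion is stark (approach speed assumed by me, not taken from the accident report):

        import math

        # A 3.3-degree flight path at a typical approach speed versus the
        # 3,300 ft/min vertical speed actually dialled in.
        speed_ftmin = 145 * 6076.0 / 60.0  # ~145 kt assumed, in ft/min
        intended = speed_ftmin * math.sin(math.radians(3.3))
        print(round(intended))             # ~845 ft/min intended
        print(round(3300 / intended, 1))   # ~3.9x the intended descent rate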

        This happened in one of the most automated planes in existence, the Airbus A320. It didn't fly over the mountain or change course or do any of the things that you suggest autopilots do... it wasn't told to do any of that, and it didn't. It was still on autopilot at the point of impact.

    2. Dodgy Geezer Silver badge

      ...It's just steering automation, not HAL-9000....

      At least one HAL 9000 was even more antagonistic towards humans than the Tesla code is....

      1. Michael Wojcik Silver badge

        At least one HAL 9000 was even more antagonistic towards humans than the Tesla code is....

        Sure, but it was consistent. If Teslas always tried to kill their occupants, we wouldn't have stories like this one. The problem is they're unpredictable; you never know when they'll turn on you.

        Or won't turn, as the case may be.

    3. Voland's right hand Silver badge

      But I agree that Tesla are jerks for blaming the victim, particularly when it should be extremely easy to detect splitting lanes in the map data.

      The right person to blame is the USA highways administration. That particular divider, same as many similar dividers in other places, has NO HATCHING in front.

      Here is an example of how a similar lane split is done in Europe and the UK: http://www.m44.co.uk/NewsImages/60.jpg The hatching in front is designed to produce both noise and shake when you cross it, so it is not just visual: it will shake the driver into taking control. Compared to that, the USA and that particular divider have nothing. The divider appears out of nowhere in the middle of the tarmac. I had a close call in exactly the same place, barely missing that same divider a few years back, driving to San Jose after a LHR-SFO flight. The way it is laid out (same as most similar places on a USA motorway) is criminal in its incompetence.

      1. Anonymous Coward
        Anonymous Coward

        "The right person to blame is the USA highways administration."

        If that was the case someone would already have hit it, and the barrier would be damaged .......... like it was.

      2. Anonymous Coward
        Anonymous Coward

        Blame Jerry Brown and the idiots at CARB

        I've driven that 101/85 interchange since the 1980s. It was perfectly safe until they added that idiotic Diamond Lane (HOV) connector about 15 years ago. That's Jerry Brown's fault. Idiotic idea piled on idiotic idea.

        Some background. Back in the 1970s, during Jerry Brown's last totally shambolic governorship, he had the bright idea of reducing highway capacity at peak hours by 25%/33% by reserving a lane for multiple-occupancy vehicles - a very small percentage of total vehicles at rush hour. The bright idea was to socially engineer people out of single-occupancy vehicles and into car pools by making peak-hours travel even more unpleasant. Over the next 30 years, vehicle-miles almost doubled but the percentage of multiple-occupancy vehicles stayed exactly the same. So about 90% of the Diamond Lane miles at peak hours were basically empty while the other two or three lanes of freeway were stop-and-go.

        So what did the bright sparks behind the Diamond Lanes idea do? Reassess their traffic engineering failure? No way. They doubled down by starting to build more Diamond Lanes, like the one at 101/85, and, to hide the fact that they were a massive failure, gave tens of thousands of special bumper stickers to people who drive hybrids and electric cars to use the Diamond Lanes - and, presto, the Diamond Lanes now have traffic at peak hours. About 90% of it is single-occupancy hybrid vehicles. Before they allowed hybrids in the Diamond Lanes, the 280/85 to 280/17 run at rush hour in the Diamond Lane was 65mph the whole way. Now you'd barely break 35mph due to all the hybrids. All with one occupant.

        The easiest way to make ordinary people in the Bay Area hate electric / hybrid owners is to tell them about the special CARB Diamond Lane stickers. Most have not heard of them. And all get very angry given the patent unfairness of the special rush-hour highway lanes for these very "special" people. Tossers, one and all.

        https://arb.ca.gov/msprog/carpool/carpool.htm

        1. katgod

          Re: Blame Jerry Brown and the idiots at CARB

          Get angry with the wrong people much?

      3. Anonymous Coward
        Anonymous Coward

        The right person to blame is the USA highways administration.

        Pretty certain the road layout was like that before someone tried to use autopilot ....

        Sure, it's a bit crap, but OTOH if the in-car systems can't work in that environment they shouldn't be used, nor sold as suitable for it, which is the alleged issue.

        To paraphrase your logic: 'You didn't rebuild your roads properly for our cars, so it's your fault.'

        Road layouts and road maintenance quality will vary. The big question is whether they vary so much that they can't be handled in a test suite - in other words, whether there are too many variable and unpredictable edge cases for statistical or programmatic methods to automate the task.

        This is why I'm very much a skeptic on autonomous cars, especially in cities like London, until we have much more real-time integration of car, bike, pedestrian and street-level systems, with significant infrastructure investment. Best of luck on the inner ring road ...

        1. AndrueC Silver badge
          Unhappy

          Re: The right person to blame is the USA highways administration.

          This is why I'm very much a skeptic on autonomous cars , especially in cities like London

          Not just London. There are several roundabouts around Banbury that have lane markings and a lot of vehicles ignore them. Sometimes it's just a handful of lazy drivers but at some roundabouts it's endemic. Not that I'm a fan of lane markings but FFS they put them in place for a reason.

          The Brackley A43 roundabout has just been 'upgraded' to a light controlled junction with lane markings and once they wear out I predict problems. I'm really glad I don't have to commute across that. The roundabout over M40 J11 is bad enough :(

        2. Fred Dibnah

          Re: The right person to blame is the USA highways administration.

          "...real time integration of both car, bike, pedestrian and street level systems..."

          What might these bike and pedestrian systems be? I hope you're not thinking that cyclists and peds will be told to carry a beacon to prevent AVs from hitting them.

          1. chrisf1

            Re: The right person to blame is the USA highways administration.

            "...real time integration of both car, bike, pedestrian and street level systems..."

            "What might these bike and pedestrian systems be? I hope you're not thinking that cyclists and peds will be told to carry a beacon to prevent AVs from hitting them."

            I'm saying I have my doubts they can safely (safely enough?) integrate into a mixed environment without such systems, and hence that incremental roll-out of open mixed systems is possibly harder than the public debate might suggest.

          2. Michael Wojcik Silver badge

            Re: The right person to blame is the USA highways administration.

            What might these bike and pedestrian systems be? I hope you're not thinking that cyclists and peds will be told to carry a beacon to prevent AVs from hitting them.

            No, no, no. Automatic pants, so you don't have to direct your feet while you play with your phone.

        3. MonkeyCee

          Re: The right person to blame is the USA highways administration.

          "This is why I'm very much a skeptic on autonomous cars , especially in cities like London"

          My stepdad learnt to drive in London. He drives like he's under fire, with little regard for what the roads are signed as, only for where you can fit a car through. My mum (now) drives much the same way, albeit with a steady stream of apologies for the assorted words of power being cast her way. Missed careers driving white vans or a minicab.

          They do this because it *works* very well off peak, and quite well on peak. Mildly terrifying in the passenger seat, hence why I tend to hide in the back.

          Since an autonomous vehicle will give way to someone driving like my parentals, who would want to be stuck in one when everyone notices they can just cut you up and your car will let them? Even people who don't normally drive like arseholes will start doing it if it gets them through faster.

      4. juice

        "The way it is laid out (same as most similar places on a USA motorway) is criminal in its incompetence."

        That may well be true, but Tesla is an American company and, as other comments have pointed out, this is in an area which has been heavily used for self-driving testing. So they should have been fully aware that this is a potential scenario, and their software/hardware should be configured to address it.

        It's also worth noting that even in other countries, hatching and other visual indications may not always be there - on quieter roads, it may have worn away, or if they're doing roadworks, it may simply not have been repainted yet.

        Auto-pilot mechanisms need to deal with *all* scenarios, not just the best-case ones.

      5. Oneman2Many Bronze badge

        Real-world barriers get damaged, road markings wear out, road signs get covered in gunk, etc. The system should be able to cope with that.

      6. Anonymous Coward
        Anonymous Coward

        Ah, so the road markings are at fault. Well, that's nothing a few billion $ can't put right, so a few people can pretend their adaptive cruise control is an "autopilot", whatever that is.

        The real world is never going to be perfect for autonomous vehicles; even if they kill fewer people than meatbags, they are still going to kill people. Wait until there are 15-year-old autonomous boxes buzzing about with faulty sensors and boss-eyed radar due to shit servicing by lazy owners (we know they are lazy, as they can't be bothered driving themselves).

        Now, where's my Sanatogen and a copy of the Daily Fail?

      7. harmjschoonhoven
        Thumb Up

        Re: How it is done in Europe

        Dutch Rijkswaterstaat designed sophisticated crash cushions called RIMOBs in the 1980s. They are pre-deformed iron tubes which collapse like a harmonica on impact. At the time we were involved in making high-speed films of crash tests, and I remember that my manager had a compressed RIMOB tube on his window-ledge which looked like a piece of abstract art. These and similar infrastructure saved many lives.

        1. Alan Brown Silver badge

          Re: How it is done in Europe

          "Dutch Rijkswaterstaat designed sophisticated crash cushions called RIMOBs in the 1980s."

          Plastic 44-gallon barrels full of water (water attenuators) are much cheaper and just as effective for the most part. They don't require complex engineering works to set up and can be replaced in minutes when someone does drive into them.

          Fitch barriers (same principle but using sand) are equally effective and only slightly more expensive.

          https://en.wikipedia.org/wiki/Impact_attenuator and http://www.plasticjersey.com/ (water-filled plastic jerseys are attenuators too. Their concrete brethren are not.)

          Even if more permanent barriers are deployed/destroyed, these kinds of attenuators are often dropped in temporarily whilst replacement works are arranged. The YouTube video of how dangerous the leadup to that gore is has extra shock value in that there's no kind of attenuator at all - not having one, and not physically coning off the gore, would be a criminal matter in a lot of countries.

      8. EPurpl3

        There have been Tesla related accidents in other countries too, countries with "better" lane splits.

    4. Anonymous Coward
      Anonymous Coward

      "An airliner autopilot will happily fly straight into a mountain if you tell it to."

      BULL EXCREMENT.

      It might do so.

      BUT, it won't do so quietly.

      Trust me, you'll have all sorts of alarms and flashing lights going off in the cockpit and extremely loud voices telling you to "PULL UP. PULL UP".

      These alarms and voices will start in very good time, more than enough time to enable you to take evasive action.

      Thus, the only time an airliner will "happily fly into a mountain" is if you ignore all the warnings.

      Finally, there is the little question of "Minimum Safety Altitude" on the charts. All aircraft will always be above that. The only time they go below MSA is to land.

      1. This post has been deleted by its author

      2. This post has been deleted by its author

      3. Alan Brown Silver badge

        "BUT, it won't do so quietly."

        The autopilot will.

        "Trust me, you'll have all sorts of alarms and flashing lights going off in the cockpit and extremely loud voices telling you to "PULL UP. PULL UP"."

        None of those are connected to the autopilot, nor will they pull up for you.

        The Germanwings aircraft that crashed into a French mountain a few years back had been programmed to do it by the suicidal pilot. It didn't try to avoid the obstacle; all it did was fly in the direction and at the height it was told to.

    5. Prst. V.Jeltz Silver badge

      "While we can assume Huang did not notice the Tesla was in between lanes and heading toward a crash barrier, neither did the Tesla"

      Very telling. You'd think that if you were even half paying attention, you'd notice your car had driven off the road, and maybe press the brake before some off-road obstacle appeared.

  4. Anonymous Coward
    Anonymous Coward

    After the last childish outburst...

    ...I might be hoping for some entertainment from Musk, who seems to have been descending into an overly defensive and emotive mindset. Indeed, a "bunker mentality".

    Having said that, I temper my hopes for entertainment with the fact that somebody has died needlessly due (IMHO) to the ongoing over-promise and under-delivery of autonomous vehicles, and that even if Musk does man up, accept responsibility and apologise, (1) I doubt he'll mean it, and (2) no amount of apologies will help the deceased, their family and friends.

    A very sobering thought for all of us: Most of us don't work in roles where we potentially put human lives at risk. Park that for a moment, and put yourself in a role where your job does. Now extend that to the idea that you messed up, and somebody is dead because of the inadequacy of your efforts. How do you make good from that? I've worked for somebody who, in a previous role, was underground safety manager for a large, deep coal mine. He had multiple experiences of miners being killed, and of having to tell the families that although he was the Underground Manager, their loved ones weren't coming home. His eyes were strangely soulless, almost like the shark in Jaws. I don't think that was the case before he did that job.

    1. Jemma

      Re: After the last childish outburst...

      And he put himself willingly in that position in an industry that has a very poor safety record and a high chance of getting killed even *if* you do everything perfectly.

      I've very little sympathy for the guy. About the same amount for a girl killed on the most dangerous road near my home, bar none (Birch Park) - you *do not* ride a pushbike down there at 3am, without lights or a helmet, wearing black, and expect to live. I won't even take the Wolseley down there in broad daylight, because of the sociopathic soccer moms screaming flat out down the middle of the road.

      I do have sympathy for the driver however and personal experience of deranged cruise control. The Renault Safrane 2.0/2.2 had a very nasty cruise habit. If you were 25+mph off your cruise set speed it'd purr up to speed gradually like a well fed and happy Rolls-Royce. Anything under that and it'd take off in "boyracer mode" like a Siamese cat with a firework rammed up its butt. It could catch you out even if you expected it. I really really hope they changed the cruise control logic for the Biturbo version - because the results would have been similar to what happened to the Tesla. Maximum splat.

    2. Anonymous Coward
      Anonymous Coward

      Re: After the last childish outburst...

      The reality is that these people convince themselves that the pros outweigh the cons. To the extent that the death of a handful of people is worth it for perfecting this technology and what it could do for the human race.

      (I am not saying this is my view, but how people justify it)

      1. rg287 Silver badge

        Re: After the last childish outburst...

        The reality is that these people convince themselves that the pros outweigh the cons. To the extent that the death of a handful of people is worth it for perfecting this technology and what it could do for the human race.

        (I am not saying this is my view, but how people justify it)

        In reality, if your deaths-per-million-miles is lower than the average for entirely manual cars, then that's good.

        You shouldn't treat it lightly, but it's also not black and white. There are grades of safe-safer-safest. How many fatal accidents were there that day involving conventional control vehicles - even when adjusted for number-of-cars in circulation? What about the day before, or the day after?
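
        To make the comparison concrete, here's a toy rate calculation in Python (illustrative, roughly US-scale numbers - not actual NHTSA or Tesla statistics):

            def deaths_per_million_miles(deaths: int, miles: float) -> float:
                # Normalise raw death counts by distance driven, so a small
                # fleet and a whole national fleet become comparable.
                return deaths / (miles / 1e6)

            # ~37k deaths over ~3.2 trillion miles is roughly the scale of US
            # driving in a year; the "assisted" figures are invented.
            manual = deaths_per_million_miles(deaths=37_000, miles=3.2e12)
            assisted = deaths_per_million_miles(deaths=3, miles=3.0e8)

            print(f"manual:   {manual:.4f}")    # ~0.0116 deaths/million miles
            print(f"assisted: {assisted:.4f}")  # ~0.0100 - lower, but not zero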

        As it currently stands, the biggest problem with AutoPilot appears to be the marketing and users refusing to RTFM and understand the system (which is entirely to be expected).

        Aircraft autopilots weren't perfect to start with either. But the pilots were generally better trained in their limitations. Air France 447 flew into the Atlantic precisely because the autopilot bombed out and the human crew had lost all spatial awareness and weren't in a position to take effective control... seems familiar.

        1. Alan Brown Silver badge

          Re: After the last childish outburst...

          "Air France 447 flew into the Atlantic precisely because the autopilot bombed out and the human crew had lost all spatial awareness and weren't in a position to take effective control... seems familiar."

          It was worse than that. The pilots were so disoriented that _they_ flew the aircraft into the deck in their blind panic.

          If they'd let go of the sticks, the aircraft would have returned to level flight. It was only an iced-up pitot (one of 3). They were so busy "trying to regain control" and fighting with each other that they didn't spend any time actually assessing the situation. You may as well have had Minions in the front seats.

    3. Dabbb

      Re: After the last childish outburst...

      "Most of us don't work in roles where we potentially put human lives at risk."

      That's one of the most important points - how many developers working on self-driving cars have a background in developing medical applications or airline systems, or anything that might kill a human being for that matter, and how many of them were building websites and mobile apps before they started programming something that weighs a tonne and moves at 100mph?

      1. Anonymous Coward
        Anonymous Coward

        Autonomous driving system developers

        These tend to be of two types:

        1. hardened embedded developers who have worked on other types of safety-critical system (think industrial control systems, military)

        2. algorithmic developers who work on the code that transforms raw sensor data into information the control system can (hopefully) act on.

        No mobile devs or web-monkeys. IMO the risk comes from the algorithmic side as the algorithms are the product of recent research (IMO a partially solved problem) and the mind-set required to write research code (= push the boundaries) is very different from that of safety-critical code (protect the boundaries).

    4. T. F. M. Reader

      Re: After the last childish outburst...

      somebody has died needlessly

      And let's not forget the people in the other cars that were directly hit by the Tesla - the Audi and the Mazda. It is not clear to me if they were hurt, but they could have been. Even if they didn't suffer as much as a scratch there is a lot of damage to their property. And in this and in other similar situations quite a few other drivers may have to swerve, brake hard, and in general take extreme evasive action, putting themselves, their passengers/families, and yet other people at risk through no fault of their own.

      Of all the people involved, only one bought the new shiny Tesla, engaged the Autopilot, and trusted it to a degree, mistakenly or not... In a situation like this they are all innocent victims, whether or not the Tesla driver was partially responsible. It seems clear from the description that Tesla are at least partially responsible, and all those other people on the road should be a factor in their requirements, design, implementation, testing, marketing, etc. - as much as or more than the driver, probably, because the driver is supposed to have more information and act accordingly. Other drivers may not even realize that the car in the next lane is a Tesla (just that it is blue), that it may be on Autopilot, that its driver may be out of the loop for many seconds, etc.

    5. DropBear
      WTF?

      Re: After the last childish outburst...

      Now extend that to the idea that you messed up, and somebody is dead because of the inadequacy of your efforts. How do you make good from that?

      EASILY. I would say with a shrug, but your death or mine doesn't even warrant that much. No, I'm not talking about me - I'm talking about doctors. If you think any of them will have trouble sleeping at night because you died because of something they did (or more likely, failed to) do, think again; they'll do the exact same thing tomorrow. Those who would have felt responsible are either younglings who'll learn soon enough not to, or aren't doctors any more. If they'd actually care, they'd go mad. So they just don't. Those who are still there don't see you as a person. You're just more meat for the system. A lot of it dies. Plenty remains. See? Easy...

      1. Holtsmark Silver badge

        Re: After the last childish outburst...

        "Those who are still there don't see you as a person. You're just more meat for the system."

        -I feel sorry for you and the healthcare system that you are in.

        I know plenty of extremely experienced doctors who continue to care for their patients as people throughout their career. That they will have sleepless nights due to their work is clear, but they deal with it.

        At the same time, there are clearly some fields of medical work where patient death occurs more often (with or without the intervention of a doctor). Not everyone can continue to function in these fields; however, there are enough "safe" fields that one can move to if it gets too much.

      2. Paul Cooper

        Re: After the last childish outburst...

        " No, I'm not talking about me - I'm talking about doctors"

        I'm sorry, but you are WRONG! I know several doctors and nurses who work with terminally ill children - perhaps the hardest job in the medical world. Each and every one of them cares enormously when a child dies; so much so that some have breakdowns and have to leave that job. And that's when they know that the child is likely to die, have probably spent much time preparing parents and relatives for the death, and have worked hard to ensure that the child's passing is as peaceful as possible. Doctors and nurses do NOT become "immune" to the death of patients, unless they are totally unfitted to be in their profession - or possibly, even be members of the human race.

    6. Goldmember

      Re: After the last childish outburst...

      "the fact that somebody has died needlessly due (IMHO) to the ongoing over-promise and under-delivery of autonomous vehicles"

      This is not the case. Teslas are NOT autonomous vehicles. They don't claim to be such, either. The problem here is that it seems people treat them as though they are. Like the bell end in the UK a few weeks ago who was filmed climbing into the passenger seat of his car with the autopilot engaged. Or the other guy, killed a couple of years ago, who was too busy taking selfies and watching DVDs whilst driving in Autopilot mode to notice a bloody great truck ahead of him.

      Tesla has to change its attitude with regard to the "Autopilot" software; it should be renamed, and the point stressed that it's purely for aiding driving. They really should market this differently, as you can't eradicate the inherent stupidity of humans.

      But for fuck's sake... if you're driving a car, YOU have a responsibility to give your full, undivided attention to the task at hand. It's a huge responsibility. A simple mistake made in a split second can permanently alter or even end lives. Ultimate culpability has to lie with the driver, unless the car is fully autonomous. Which these ones are not.

      Yes, the tech drove the car into a part of the road it should not have driven in. The driving aid failed in its task. But based on the information provided so far, it seems that the driver had transferred too much of his responsibility to the tech. Had he been paying attention, he could have seen the trouble ahead and applied the brake, and things would have worked out very differently.

      1. Alan Brown Silver badge

        Re: After the last childish outburst...

        "The problem here is that it seems people treat them as though they are. Like the bell end in the UK a few weeks ago who was filmed climbing into the passenger seat of his car with the autopilot engaged. "

        Climbing into the back or passenger seat has been a "thing" for quite a while - but it wouldn't be hard to prevent either. The cars have weight switches in the seats and can tell when someone's pulling this shit but it takes someone to program the things to recognise "naughty driver" activities.

      2. Charles 9

        Re: After the last childish outburst...

        To Joe Stupid, autopilot = autonomous vehicle, and they're too dumb to know otherwise, so if you can't fix Stupid, you'll have to fix the perception.

  5. Anonymous Coward
    Anonymous Coward

    When I took some refresher lessons and by accident reached for the wrong stalk on the wheel - the one with the cruise control, on the now-modern car (hence the refresher lessons) - my driving instructor would go spare. What reaction this autopilot would elicit from her, I'm scared to imagine.

    1. Prst. V.Jeltz Silver badge

      You can't rely on the stalks to be consistent from one car to the next. I think your instructor's in the wrong job!

  6. mikeyg
    Meh

    My understanding of the Tesla Autopilot is that it's simply a combination of:

    1- Adaptive Cruise Control

    2- Collision Avoidance Control

    3- Lane Following Control

    4- Lane Change Control

    When all are turned on at the same time it becomes Autopilot - roughly as in the sketch below. Can anyone say if I am right or not?

    It would be fun to have a Tesla to rip apart and play with, but they are still a bit expensive for that!
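
    If that reading is right, the glue might look something like this minimal Python sketch - all names, numbers and thresholds invented for illustration, emphatically not Tesla's actual code, just the "several aids merged together" idea:

        from dataclasses import dataclass

        @dataclass
        class Command:
            accel: float = 0.0   # m/s^2, positive = speed up
            steer: float = 0.0   # steering correction, positive = left

        def adaptive_cruise(gap_m, set_speed, speed):
            # Hold the set speed unless a lead vehicle closes the gap.
            if gap_m is not None and gap_m < 50:
                return Command(accel=-1.0)     # back off the lead car
            return Command(accel=0.5 if speed < set_speed else 0.0)

        def lane_follow(offset_m):
            # Steer back toward the lane centre; None = no markings seen.
            if offset_m is None:
                return Command()               # nothing to follow
            return Command(steer=-0.1 * offset_m)

        def autopilot(gap_m, set_speed, speed, offset_m):
            # Naive merge: one feature owns speed, the other owns steering.
            acc = adaptive_cruise(gap_m, set_speed, speed)
            lane = lane_follow(offset_m)
            return Command(accel=acc.accel, steer=lane.steer)

        # Lose the lead car AND the lane markings at the same moment and the
        # merged command becomes "accelerate to set speed, steer straight on" -
        # roughly the failure mode described in the article.
        print(autopilot(gap_m=None, set_speed=31.0, speed=29.0, offset_m=None))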

    1. Oengus

      1- Adaptive Cruise Control

      2- Collision Avoidance Control

      3- Lane Following Control

      4- Lane Change Control

      FTFY

      1. JeffyPoooh
        Pint

        @Oengus

        You're being generous with #4, unless we accept that the average of two lanes is a reasonable option.

        1. Oengus

          Reading the article, I think the Tesla realised that it was not in the correct lane and was moving to get itself into the correct lane... It just had a hard interrupt in the process.

    2. JohnG

      Correct. Automatic Emergency Braking is an option which, by default, is enabled at all times.

      There are two levels of Autopilot: Traffic-Aware Cruise Control and Autosteer. Automated Lane Changing is an option, which is disabled by default. All of the Autopilot features are in beta, and every time they enable Autosteer, drivers get a warning of this, telling them that they should keep their hands on the wheel at all times.
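
      Put another way, the split described above boils down to a handful of switches - something like this Python sketch (names and defaults paraphrased for illustration, not Tesla's actual settings schema):

          # Hypothetical illustration of the defaults described above.
          autopilot_features = {
              "automatic_emergency_braking": True,    # enabled by default, always on
              "traffic_aware_cruise_control": True,   # Autopilot level 1 (beta)
              "autosteer": True,                      # Autopilot level 2 (beta)
              "automated_lane_changes": False,        # optional, disabled by default
          }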

  7. Fazal Majid

    Lack of LIDAR

    Tesla cheaped out by not including a LIDAR - unsurprisingly, as they are still extremely expensive - but no self-driving car or Level 3+ ADAS should be allowed without one.

    As for Musk, he richly deserves all the opprobrium headed his way for his despicable attempts to pin blame on the victim.

    1. Lars
      Happy

      Re: Lack of LIDAR

      "despicable attempts to pin blame on the victim".

      This is really standard procedure in industry. If you have followed aircraft accident investigations, they always start with the manufacturer, be it of the plane or the engine, blaming everybody else first. I am not sure there is even any real choice about this. (How else would it work?)

      I remember a helicopter crash involving a Sikorsky. It took more than five years and a hell of a lot of money to prove it was the fault of a Sikorsky hardware part. It might have been less expensive to just let it go, but then again there are the insurers and all the lawyers and what not.

      There is a "sobering" sentence in one of the links provided - “Traffic-Aware Cruise Control cannot detect all objects and may not brake/decelerate for stationary vehicles, especially in situations when you are driving over 50 mph (80 km/h) and a vehicle you are following moves out of your driving path and a stationary vehicle or object is in front of you instead.”

      So indeed people expect too much from an autopilot, any brand.

    2. Alan Brown Silver badge

      Re: Lack of LIDAR

      "Tesla cheaped out by not including a LIDAR"

      Which is funny, because my 15yo Nissan has one in its ACC.

  8. Anonymous Coward
    Anonymous Coward

    Not an "autopilot"

    It's about time they banned these driver "aids". If you don't want to drive, take a bus/train, if you want to drive a car then drive the fscking thing properly. Hands on the wheel, 100% of your attention on the road.

    1. Dazed and Confused

      Re: Not an "autopilot"

      > It's about time they banned these driver "aids".

      It's no good the manufacturer saying "but we told you to look where you're going too, so don't blame us". The average car driver is not like an airline pilot plus co-pilot pair. If they don't have to concentrate, they won't concentrate. Even if they are nominally looking where they are going, their minds will wander. So when a problem happens they aren't immediately ready to handle it.

    2. Anonymous Coward
      Anonymous Coward

      Re: Not an "autopilot"

      OK, so what if there's no mass transit in your area, you're still expected to work on time despite something like uncontrolled epilepsy, you live alone, and you can't afford to move?

      1. Anonymous Coward
        Anonymous Coward

        Re: Not an "autopilot"

        "OK, so what if there's no mass transit in your area, you're still expected to work on time despite something like uncontrolled epilepsy, you live alone, and you can't afford to move?"

        Uber, bicycle, car-pooling, charity bus? Whatever the solution, it certainly isn't "drive a car", autopilot or not.

        1. PM from Hell

          Re: Not an "autopilot"

          I'm surprised by the fact you got any up-votes, and can only assume these were from people who live in a metropolitan area and only work at a single site. I'm a contractor who works all over the country; at the moment I'm fortunate in working close to home. I live in a rural location only 17 miles from my office, but I could not use public transport to get to work and actually work a full day. Taking the earliest bus of the day out and the last bus home, I would arrive at work after 9am and have to leave by 4:30. The commute would take about 3 hours a day - and yes, I am serious. There used to be far more frequent buses, but those days are long gone. Unfortunately I have also committed the 'crime' of living in one county and working in the next. This means that the bus from my village actually takes me in the wrong direction, so I need to travel 30 minutes to the next town, wait 15-20 minutes for the bus in the right direction, then off we go again for another 45 minutes (don't forget the first 30 minutes were in the wrong direction). Driving gets me to work in under 30 minutes and costs a fraction of the bus price. This is true for a large proportion of the people who live in my village. Even people living closer to town need to use cars; parents who need to drop children off at childcare or school cannot use public transport.

          And as an aside to the luddites who feel the use of cruise control is insane I do use it to control my speed when traffic is light as I see a benefit in fuel efficiency. This does not mean I give up control or responsibility for driving.

      2. Anonymous Coward
        Anonymous Coward

        Re: Not an "autopilot"

        You seriously think someone with "uncontrolled epilepsy" should be allowed to let an unsafe car "drive" for them, risking the lives of other people? Sorry, the rights of the disabled end where they put my life at risk!

        There's not a good solution for your hypothetical example, but it will come eventually when true autonomous cars are available. Unfortunately Tesla's shitty driver killer products risk setting that back by years.

      3. Adam 1

        Re: Not an "autopilot"

        Let me follow your logic. The person of your concern has a medical condition that prevents them from driving. They have enough money to live on their own rather than share, but not enough money to move somewhere closer to their employment and their employment is such that they cannot easily find a job that is more conveniently located. And the way you suggest that they may make this work is to buy a US$75,000 car which has a feature that can drive by itself up until the moment it can't.

        Yeah, no. Self driving cars will be an amazing source of freedom to many people with medical conditions, elderly, disabled, even people under the influence of alcohol or other drugs. This certainly should not be understated as a benefit, but the problem is where certain people who should know better imply the technology is more advanced than it is, then run for the hills when it isn't.

        I want to see manufacturers put their money where their mouth is before they are allowed to imply the car has an autopilot or similar technology. If they paid you out a million dollars if your car was at fault in an accident whilst self driving, and 10 million if that accident resulted in a permanent injury for anyone involved, and 100 million to the family of anyone killed, you might find that companies such as this are a lot more restrained when making these claims.

        What worries me here isn't the failure of some sensor, but that we see a company not acknowledging that the design of their system (even down to its name) incorrectly encourages people to trust it beyond its capabilities. That is a design flaw. It needs to be rectified. Maybe it needs to pull itself over and stop if the driver isn't paying attention. When a plane crashes, Boeing or Airbus don't sit back and say "well, the pilot/ground crew screwed up here, case closed". No, they figure out why their safety processes and systems didn't fire or were ignored, and implement changes in instructions, design and training programs to make it less likely. I'm seeing none of that here. It's entirely about saving their reputation. Until that culture changes, I don't want these things on my roads.

      4. tim292stro

        Re: Not an "autopilot"

        "...OK, so what if there's no mass transit in your area, you're still expected to work on time despite something like uncontrolled epilepsy, you live alone, and you can't afford to move?..."

        Then statistically that person is a heck of an outlier (specifically: a poor, solitary epileptic living outside a major metropolitan area - I'd estimate less than 0.01% of the total global population), and obviously not the target audience for this type of product. Tough life.

        1. Anonymous Coward
          Anonymous Coward

          Re: Not an "autopilot"

          OK, so you wanna kill people. How Spartan of you...

    3. rg287 Silver badge

      Re: Not an "autopilot"

      It's about time they banned these driver "aids".

      Which ones? All of them?

      Does that include ABS? Traction Control?

      What about <30mph emergency-braking that can prevent fender-benders in crawling or start-stop traffic?

      They're not pretending to drive the car for you, but they objectively improve safety.

      1. Domquark

        Re: Not an "autopilot"

        "They're not pretending to drive the car for you, but objectively improves safety."

        But that's the issue - they ARE "pretending" to drive the car for you. Take the example (here in the UK) where a driver was taken to court because he engaged "AutoPilot" on a motorway (freeway for you chaps across the pond) and climbed into the passenger seat to read a book! That (very stupid) man felt comfortable enough to actually do that - where do you think he got that sense of security from? If you are driving a "manual" car (even with ABS/traction/cruise control etc), you need to do something continually - at the very least steer. If you didn't, you would drive straight off the first bend in the road!

        And the safety systems are questionable too. I know someone who has a car with "auto braking" for avoiding low speed crashes and cruise control which keeps speed/distance to the car in front automatically. It works well, until the sensor at the front of the car gets the slightest bit of dirt on it. 5 or 6 squashed mosquitoes and the whole system fails!

        The future looks like a place where drivers won't know how to drive or control the vehicle, but will have the responsibility to do so when things go wrong.

        1. rg287 Silver badge

          Re: Not an "autopilot"

          If you are driving a "manual" car (even with ABS/traction/cruise control etc), you need to do something continually - at the very least steer. If you didn't, you would drive straight off the first bend in the road!

          Stupid is as Stupid does.

          There is a long list of apocryphal stories about RV drivers climbing into the back to make a coffee. Occasionally they're not urban myths, even when the cruise control is not branded "AutoPilot".

          Nature just creates a smarter idiot.

          1. Chris 239

            Re: Not an "autopilot"

            Holy carp! I followed your link: the woman who turned on the cruise control and got up to make tea lives in a place called CRETINgham!! You couldn't make that up!

    4. tiggity Silver badge

      Re: Not an "autopilot"

      I had a car with cruise control but, after trying it a few times, I stopped using it.

      It was too easy to lose concentration when the CC was doing the work - not a good idea on a 70 MPH motorway.

      Obviously people are different - I'm sure some drivers may be able to stay fully focused with CC doing the work. I can't pull that trick off, so I avoid using CC.

      1. Anonymous Coward
        Anonymous Coward

        Re: Not an "autopilot"

        At least you are thinking while driving.

      2. Gordon861

        Re: Not an "autopilot"

        My only use of 'cruise control' in my car is for average-speed-camera areas: set the speed from the satnav GPS and take my foot off the throttle. Otherwise it's too easy to drift up to breaking what is often an overly low limit.

      3. JohnG

        Re: Not an "autopilot"

        "It was too easy to lose concentration when the CC was doing the work - not a good idea on a 70 MPH motorway."

        True - and this effect can be increased with Autosteer. The best approach is to consider yourself the captain of a ship, and Autopilot a really inexperienced and stupid trainee at the helm, requiring supervision at all times.

      4. Alan Brown Silver badge

        Re: Not an "autopilot"

        "It was too easy to lose concentration when the CC was doing the work - not a good idea on a 70 MPH motorway."

        There's plenty going on around you. ACC is great inasmuch as you don't have to worry about having to regulate your speed, but that simply gives you more time to look out for other drivers hellbent on killing everyone around them.

        Then again if you're losing concentration with CC on, you ARE one of those drivers. Don't touch the radio.

        1. Charles 9

          Re: Not an "autopilot"

          Two problems dovetailing here. One, quite simply: You Can't Fix Stupid. The other is the cruelty factor: being too dumb to live isn't considered by moral standards to be worthy of capital punishment (meaning if someone demonstrates themselves to be too dumb to drive, yet too poor to have it done for them, we can't just tell them, "Then YOU LOSE. Game Over. Better Luck Next Life."). The dilemma makes me keep thinking of those countries with high suicide rates, wondering if their societies actually have socially-endorsed relief valves for "dead-enders".

    5. JimC

      Re: drive the fscking thing properly. Hands on the wheel, 100% of your attention on the road.

      That would be nice if the meatsacks actually did. Unfortunately, as is clearly demonstrated every time there's poor visibility, the average driver is not safe on the road either.

      The sad reality is that an automated car will not be "safe" at the current level of technology. But a manual car is not safe either. The simple fact that the crash attenuator had already been destroyed by a human driver demonstrates that.

      It's not a choice between an imperfect technology that kills 50 people a year and no deaths - that's easy. It's a choice between the imperfect technology and 200 deaths a year caused by human drivers. The other question that could be asked - and will be within some readers' lifetimes, I predict - is how many people, many of them entirely innocent third parties, must die because we permit manually controlled cars?

    6. Alan Brown Silver badge

      Re: Not an "autopilot"

      > It's about time they banned these driver "aids".

      The USA's NHTSA has assessed the aids as reducing the crash rate in Teslas by at least 40% over non-assisted vehicles.

      Most people regard driving as a chore, and the same people letting themselves be killed by adaptive cruise controls would be even more likely to kill themselves or those around them if they were 100% in manual control.

      Back in the 1980s, one of the most common causes of fatal crashes was "driver spent so much time fiddling with the radio and not looking at the road that the vehicle crossed the centreline into oncoming traffic/left the road and hit a tree". Even a radio isn't necessary. In one crash I'm aware of (a car drove under an 18-wheeler on a dead straight road), the driver's hand was still gripping the pack of sandwiches in the bag on the passenger seat when they cut her body out of the vehicle.

      1. Charles 9

        Re: Not an "autopilot"

        "Even a radio isn't necessary."

        Ever considered that monotony is a key element of highway hypnosis?

    7. robin thakur 1

      Re: Not an "autopilot"

      I find them useful within their limits. The problem, as I understand it, is that:

      -The driver has to be of sufficient intelligence and aptitude to read up on them and use them appropriately and build up a feel for how the systems behave in different situations or when they fail. Nobody checks whether this is the case or not

      -They differ by manufacturer and there are no standards enforced by the industry. e.g. Park Assist is different on every car

      -You are dependent on the manufacturer explaining adequately how they function, the limitations and not just being a marketing bullet point

      -They might change over time with software updates

      -Have they been tested to destruction in all world markets the car is on sale in, do the characteristics vary by market, and are these results publicly available?

      By way of an example: if my Audi is in adaptive cruise mode with lane-keeping assist on, it might seem like it's operating autonomously, but if it can't detect the lane markings for a few seconds (for several different reasons), it doesn't sound an audible alarm; it just changes the lane marking on the cockpit/HUD from green to gray and you suddenly notice the car drifting a little.

  9. Anonymous Coward
    Anonymous Coward

    The Tesla was running Windows 10, and got a forced update. Probably for a 20-year-old fax driver that has to be shipped because Windows stops working without it.

    1. Anonymous Coward
      Anonymous Coward

      I think there's a better chance it runs some flavour of Linux. Maybe it panicked because the driver tried to play a DRM-protected song...

      1. This post has been deleted by its author

      2. Anonymous Coward
        Anonymous Coward

        If Linux, then systemd, obvs.

    2. JohnG

      Actually, Teslas run Linux.

  10. katrinab Silver badge

    5 seconds is not enough

    Studies have shown that you need 26 seconds to take over from autopilot: to figure out what the car is doing, what everyone else is doing, what you need to do to correct the situation, and then actually do it.

    1. JustWondering

      Re: 5 seconds is not enough

      26 seconds? You are already behind the wheel, supposedly with your hands on it. How long does it take these people to react to an emergency otherwise?

      1. Anonymous Coward
        Anonymous Coward

        @JustWondering - Re: 5 seconds is not enough

        Then why would I activate the Tesla Autopilot? I'm already driving the damn thing.

        When was the last time you saw airline pilots faking flying the plane with their hands on the controls while the AP is on? Aviation has had a non-negligible number of cases where the AP disengaged and the pilots were never able to regain situational awareness in time.

        1. arthoss

          Re: @JustWondering - 5 seconds is not enough

          From my own experience with adaptive cruise control: with it enabled, you can press the brake pedal faster, as your foot is not on the accelerator; you are safer if you turn your head and another car veers into your path at the same moment; you save more energy; slowing cars at the end of a jam are detected; and in a jam you don't have to constantly accelerate and decelerate.

          1. YetAnotherLocksmith Silver badge

            Re: @JustWondering - 5 seconds is not enough

            In a limited format, that's great. But if you are driving in 5mph traffic in a car that does 0 to 60 in 3 seconds, and it takes you even 1 second to react, how many other cars can your car hit before you press the brake pedal?
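
            A quick back-of-envelope check of that, using constant-acceleration kinematics and the rounded figures above (Python, not a simulation):

                MPH = 0.44704                  # metres per second in one mph

                a = 60 * MPH / 3.0             # 0-60mph in 3s -> ~8.9 m/s^2
                v0 = 5 * MPH                   # rolling along at ~2.2 m/s
                t = 1.0                        # one second of reaction time

                d = v0 * t + 0.5 * a * t**2    # distance covered while reacting
                print(f"{d:.1f} m")            # ~6.7 m - more than a car length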

            1. This post has been deleted by its author

        2. JustWondering

          Re: @JustWondering - 5 seconds is not enough

          Because it isn't really an autopilot? Doesn't an alarm go off if you don't hold the wheel? It's not like you can jump in the back for a quickie while the car drives.

        3. JohnG

          Re: @JustWondering - 5 seconds is not enough

          "When was the last time you saw airline pilots faking flying the plane with their hands on the commands while AP is on ?"

          Pilots trying an autopilot system that is in beta might do something along those lines. The Tesla Autopilot systems are in beta.

      2. rg287 Silver badge

        Re: 5 seconds is not enough

        26 seconds? You are already behind the wheel, supposedly with your hands on it. How long does it take these people to react to an emergency otherwise?

        Too long. That's why people have managed to have serious accidents for the past century all on their own, without the aid of technology.

      3. Anonymous Coward
        Anonymous Coward

        Re: 5 seconds is not enough

        26 seconds.

  11. Lee D Silver badge

    Ignore ALL the auto-pilot stuff. Let's pretend that doesn't exist. Let's pretend it IS just cruise control.

    Every modern car I know with cruise control has auto-braking on it. It can follow any solid object and stop if it detects an obstacle BEFORE it smashes into it and puts your brains through the steering column.

    Whether or not the Tesla was "auto-piloting" by any definition, why the hell did the car not say "Hey, that's a solid object and I have less than a second before I die a burning death of carnage and flame" and at least attempt to brake?

    The answer? Because it's SIMPLY NOT CAPABLE OF DETECTING OBJECTS. Simple as that. It has no idea if there's a solid, impenetrable concrete barrier in front of it, approaching at speed. It simply will not brake.

    Whether or not you think you can trust the thing in a lane on a highway, it can't even do the simplest, most basic thing with all its fancy sensors: detect a solid object directly ahead, within the path of its steering, approaching at normal vehicular speed. Whether that's a lorry or a concrete divider.

    Given that it can't even sort the basics out, using an array of sensors, what the hell are people doing entrusting lane-control, speed management, etc. to it via a camera?

    Notice: IT DIDN'T EVEN TRY TO BRAKE. Not even at the last second. It just doesn't know that the nearest object is 10m away and closing at 10m/s - its own wheel speed - giving it barely a second to DO SOMETHING about it. And it can't even be bothered to beep, brake, swerve, avoid, or display a fecking popup on the dash.
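
    To put a number on that, using the figures in this comment rather than any crash data (Python):

        range_m = 10.0            # distance to the stationary barrier
        speed = 10.0              # m/s (~22mph); the barrier is fixed, so the
                                  # closing speed equals the car's own speed

        ttc = range_m / speed     # time to collision
        print(f"{ttc:.1f} s")     # 1.0 s to beep, brake, swerve - anything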

    These things are deathtraps if they can't do the simplest things. Things that "non-intelligent", non-autopilot cars can do as a matter of course with their cruise control. Things that are increasingly STANDARD on modern cars, let alone on some extra bolt-on to (let's face it) a prototype boy's toy.

    I'm not suggesting reliance on them. I'm not suggesting that the driver was attentive or shouldn't have been in control. But, ffs, come on. It just smashed into a solid object which happened to be approaching it at the same speed as its wheel-speed, and it didn't even notice, care, react or warn of anything. It just went bang. Lanes or not. Accelerating or not. Human or not. Just smack, straight into it.

    Stop trusting your life to this junk.

    1. John Brown (no body) Silver badge

      "Every modern car I know with cruise control has auto-braking on it."

      Not sure what you mean by "modern" but my 2 year old car has cruise control but it's neither adaptive nor capable of applying braking. (I don't get to choose my car, the company supplies it. Sometimes I get to pick a colour)

    2. Ben Tasker

      To be fair, what you seem to be talking about is AEB - which Teslas do have.

      But most (if not all) models of car with AEB disengage it above a certain speed (usually about 30mph), so that false alarms don't lead to cars braking sharply in the middle of the motorway, causing a hazard in themselves.
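
      The gating is roughly of this shape (a Python sketch; the 30mph cutoff and the time-to-impact threshold are assumptions for illustration, not any manufacturer's published numbers):

          AEB_MAX_SPEED_MPH = 30    # above this, auto-braking is suppressed

          def should_auto_brake(speed_mph: float, time_to_impact_s: float) -> bool:
              # Brake only for imminent obstacles at low speed, so that a radar
              # false positive can't slam the brakes on at motorway pace.
              if speed_mph > AEB_MAX_SPEED_MPH:
                  return False                   # defer entirely to the driver
              return time_to_impact_s < 1.5      # imminent-collision threshold

          print(should_auto_brake(70, 1.0))      # False, even with impact imminent
          print(should_auto_brake(20, 1.0))      # True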

      So it's not too surprising that AEB didn't trigger in this case. Though as you say, it's concerning that the car appears to have done precisely nothing, even in the final moments to suggest it even knew what was coming.

      As concerning as that is, though, Tesla's response is far more worrying. That habit of blaming everyone but themselves does not inspire confidence. Yes, the impact attenuator was missing, but it was only needed because Tesla fucked up. Its absence potentially worsened the accident, but did not cause it. Yes, his hands were off the wheel, just as the hands of many people using Autopilot are off the wheel sometimes (his hands were, however, on the wheel for more than 50% of the preceding minute).

      Tesla do a massive disservice to the autonomous car industry (outdone only by Uber, in some respects). Their cars lack hardware that would dramatically improve safety, and their attitude as a company towards accidents and safety is that of a company that should no longer exist.

      1. James Wright

        I own a Tesla; their hands-on-wheel detection is suspect at best. Many times I get the warning even though my hands are on the wheel. For them to try and cast aspersions because hands on the wheel were not detected is wrong. This is just them throwing FUD at the problem.

        1. Baldrickk

          For them to try and cast aspersions because hands on the wheel were not detected is wrong.

          Hands on wheel or not, if the driver had been paying attention and responded to the vehicle's actions, one of the hands-on-wheel, steering or braking sensors would have triggered.

      2. Baldrickk

        Their cars lack hardware that would dramatically improve safety

        A permanently enabled immobiliser would dramatically improve safety.

        A fully autonomous system would improve safety.

        A foot-thick crash barrier of honeycomb aluminium to absorb impacts would improve safety.

        Not all of those are desirable once convenience and cost are considered. Sometimes, when creating a product, you do have to draw a line.

        Not that making safer cars is a bad goal, but a Tesla is also arguably one of the safer cars around.

        I don't know exactly what that driver was doing, but when you get people like the guy mentioned above who was arrested for climbing into his passenger seat to read a book - there's not much you can do for that person.

  12. D. Evans

    Not Surprising

    I put off posting this earlier, but I finally succumbed to my base nature.

    I cycle north on El Camino most afternoons and I'd see this car. The personalized number plates were a give-away.

    The driver never looked at the road, even in peak traffic. His head was always down, on a phone or tablet. And he held up traffic, as the Tesla was generally slow. I heard a number of cars use their horns to signal their displeasure.

    An unfortunate end but not a surprising one.

  13. Anonymous Coward
    Anonymous Coward

    Car's behaviour makes sense

    I, like probably a few others, drive past this junction every day, so I'll take a stab at those questions. First, some more context: the left two lanes of the 101 are HOV lanes (where stickered Teslas are also allowed, don't get me started...). The leftmost lane goes to the 85 (it's an HOV exit); the second-left stays on the 101. People are always figuring this out at the very last minute... Finally, if traffic is flowing here, it's flowing at 80-90mph. Speed limits mean nothing on California freeways.

    Thus:

    What happened to the car that the Tesla was following – did it take the 101 or the 85? That would be critical to understand the car's behavior

    -it probably took the 85 at the last minute, clipping the gore/nose area, dragging the Tesla with it

    Why did the Tesla speed up rather than slow down?

    -as said by someone else, no car in front, resume set-speed

    Did the Tesla detect it was not in a lane? And if so, why did it not do more to get into one of the actual lanes?

    -if you're already in the gore, it looks just like a lane

    Did the Tesla know what route Huang was intending to take - the 101 or the 85? And if so, did that affect its behavior?

    -If he was in the second to left lane, the 101.

    Not to make light of it, but why didn't Huang take the 280?

    1. Anonymous Coward
      Anonymous Coward

      Re: Car's behaviour makes sense

      Traffic at 80mph/90mph on 101? You must be joking. At 4am, maybe. I've been driving this stretch of 101 since the Federal Speed Limit was 55mph.

      The usual speed on this part of the 101 at non-rush-hour is 65-70mph, but once traffic starts building it's closer to 60mph. Just add one or two semis to the mix and it's slow and go. If you need to get down the Peninsula faster you take the 280, although try doing much over 75mph on 280 and the CHP will pick you off. 280/92 and 280/85 are the usual killing zones. The last time I commuted down the 280 to the 880, there would usually be at least two or three traffic stops in each direction, every day. The only time I ever got a speeding ticket was at 280/380, and the freeway speed limit was 55mph - that's how long ago it was. There are fewer safe places to do traffic stops on 101, but the CHP/local PDs do pull people over on a regular basis.

      As for the rest: the 101/85 interchange was fine until they messed it up by adding the Diamond Lane feeder overpass about 15 years ago. As I'm not one of the anointed ones who drive a hybrid/electric car with a CARB yellow sticker, I use the far-right lanes just like all the other plebs to get onto the 85, so it made no difference to me.

    2. Michael Kean

      Re: Car's behaviour makes sense

      "Not to make light of it, but why didn't Huang take the 280?"

      Might have been aiming for the nine and three quarters...

    3. Alan Brown Silver badge

      Re: Car's behaviour makes sense

      "Speed limits mean nothing on California freeways."

      60+ years of traffic studies have proven that speed limits don't mean much anywhere. If they're within 5-10mph of the 'design speed' then drivers cluster around them.

      A limit too high for the road will result in most drivers slowing down below the design speed, and a limit too low for the road will result in most drivers speeding up to something a little above the design speed. In both cases "speed spread" will drastically increase - and that's more dangerous than simple speeding, as it results in people wanting to pass slower traffic. (A slow driver is one of the most dangerous hazards on the road.)

      The _only_ way to change driving speeds on a road is to change the road and a lot of the changes you'd think are obvious (like narrowing lanes to slow traffic) actually have the opposite effect.

      This also applies to the presence of pedestrian fencing, crossings, traffic lights, speed humps and parking restrictions in urban environments - perversely, they usually result in _lowered_ pedestrian safety, because drivers speed up and exhibit tunnel vision. Chicanes only slow drivers down near the chicane; they accelerate away from them, increasing noise and air pollution in the vicinity whilst paying less attention to nearby hazards.

      One of the easiest and cheapest ways to dramatically slow urban traffic is to remove the centreline and turn off traffic lights - but at the same time you'll find that the traffic moves more smoothly and is less likely to snarl up in peak periods. Human factors (aka traffic psychology) turns up a lot of weird shit, and it has only recently been applied to roads, versus the 50 years it has been the most important part of aviation safety.

      1. Charles 9

        Re: Car's behaviour makes sense

        "One of the easiest and cheapest ways to dramatically slow urban traffic is to remove the centreline and turn off traffic lights - but at the same time you'll find that the traffic moves more smoothly and is less likely to snarl up in peak periods."

        Have they tested that in countries where the rules are rarely followed anyway, like southeast Asia?

  14. llaryllama

    Playing Elon's advocate here..

    I am wondering out loud what the net benefit/loss is so far with these self- or pseudo-self-driving cars. I wasn't able to find any proper studies, but admittedly I haven't tried very hard.

    Do road deaths and injuries decrease when they are being used? It's an important question to ask because if the answer is yes then we should support their use and development even when there are mistakes or accidents.

    There are some pro-car people making a noise that even one death in these vehicles proves they are not fit for purpose etc. etc. while ignoring the fact that traffic accidents are a leading cause of death and injury in the modern world.

    I used to enjoy driving when I was younger, but as I get older I am starting to think I am just adding to the world's problems by doing so. I guess I see neither autonomous vehicles nor meatbag drivers as the future; it would be cool to see more towns adopt smart urban planning and less reliance on cars, or on powered transport outright.

    1. Dodgy Geezer Silver badge

      Re: Playing Elon's advocate here..

      ..Do road deaths and injuries decrease when they are being used? It's an important question to ask because if the answer is yes then we should support their use and development even when there are mistakes or accidents....

      In theory, yes. Fully automated transport systems should be much safer than individually human-controlled ones.

      In practice, getting there will involve experimentation, and the risk not only of individual disasters but of occasional major ones, until the engineers have the process well optimised.

      Take a look at the history of rail travel. It's pretty safe (although not 100%) BECAUSE of the accidents in the past...

    2. ArrZarr Silver badge

      Re: Playing Elon's advocate here..

      Absolutely. People seem to be treating Autonomous Cars as if they need to be perfect before they will be better at driving than humans.

      They don't, they just need to be faster than your friend while you're both running away from a bear.

    3. JohnG

      Re: Playing Elon's advocate here..

      Statistics apparently show fewer accidents/fatalities when using Autopilot on a Tesla, which is why DirectLine give a 5% discount on insurance premiums for Autopilot-equipped Teslas.

  15. ecofeco Silver badge

    In America, the driver is ALWAYS responsible

    This is the law, coast to coast: the driver is ALWAYS responsible and liable, ESPECIALLY if it is something that was preventable.

    i.e. keep your damn hands on the wheel and pay attention to all warning instruments.

    This bullshit that new technology is somehow confusing has been a con game since the dawn of time.

    1. Anonymous Coward
      Anonymous Coward

      Re: In America, the driver is ALWAYS responsible

      You're missing a point here, the guy was not the driver. Tesla was driving the car.

      I sometimes see irresponsible ads for cars (Nissan is one of them) where smiling young female drivers take their beautiful hands off the wheel and let progress do its magic and drive the car. This is insane. Your brain should drive the car, not the other way around.

      1. hoola Silver badge

        Re: In America, the driver is ALWAYS responsible

        In which case, ultimately Musk is responsible. It does not matter if he wrote the code; the buck has to stop somewhere. At the moment, with all this self-driving guff, there is no one responsible. Get some CEOs in the courts on manslaughter or dangerous driving charges and all of a sudden things will change.

        That will not happen, due to the amount of money behind these grubby tech outfits - and Tesla is a tech outfit. They provide an (albeit expensive) disposable piece of consumer electronics. If they were a car manufacturer they would not be having all these problems, because safety usually comes first, thanks to experience and all the regulation.

      2. Baldrickk

        Re: Tesla was driving the car.

        No, Tesla was not driving the car. The car wasn't driving the car. The driver was driving the car, and he was responsible for whatever the car did.

        "Autopilot" aka assisted cruise control was on, but those are driving AIDS, not an autonomous system.

        1. Charles 9

          Re: Tesla was driving the car.

          If that were the case, why even call the stinking thing an "autopilot", which evokes specific imagery in people's heads, for better or worse (and image matters a lot - image was partly to blame for the panics that kicked off the Great Depression)? Such things should be barred from being called anything of the sort, seeing as how You Can't Fix Stupid, and leaving them to Darwin leads to lawsuits and cries of being inhumane, so you can't win.

      3. JohnG

        Re: In America, the driver is ALWAYS responsible

        "You're missing a point here, the guy was not the driver. Tesla was driving the car."

        No. These things are driver assistance aids, and Autopilot is in beta - the driver must be aware and in control at all times. Every time a driver enables Autosteer in a Tesla, there is a warning to this effect.

        1. Charles 9

          Re: In America, the driver is ALWAYS responsible

          No amount of warning is gonna help if people are in such a state as to zombie through it. That's why there's such a thing as "click fatigue".

          People NEED to drive to make ends meet; for many, there are no alternatives (no mass transit, taxis cost more than they make, and no coworkers nearby). If they lose the ability to drive, you're essentially telling them FOAD, which can adversely affect crime rates and so on. It's a complicated but very real problem, which is why the increased push for end-to-end automated transportation; there's a significant use case for them.

    2. JohnG

      Re: In America, the driver is ALWAYS responsible

      Also in the UK. It is similar with the daft errors people make when using in-car navigation systems - the driver is the one with the driving licence, not the nav system or autopilot.

  16. Anonymous Coward
    Anonymous Coward

    The sad reality...

    ...of rushed to market, not ready for prime time products.

    Working in the auto industry for many years, I have in recent times observed the rush-to-market mentality of those looking to be the "first" to offer a commercially viable autonomous vehicle (AV). There is a "wild west" mentality in many quarters, due to the potential financial pot of gold. IMNHO, federal governments worldwide have abdicated their responsibility to establish minimum safety, security, design, engineering, validation and sustainability standards for AVs and the tech being used in them.

    Naïve consumers are being used as crash test dummies and the results are not acceptable. Tesla, Uber et al must be held accountable for the deaths and injuries their defective vehicles cause. These vehicles should not be allowed on public roadways until independently tested and certified to be safe and reliable. How many more Tesla Autopilot drivers must die before Tesla suspends its Autopilot programme and fixes all of the technical issues?

    1. Charles 9

      Re: The sad reality...

      They'll just counter that HUMANS aren't fit to drive, given such horrific casualty rates - it's just that you don't know about them, because they're so frequent as to be "normal". Sauce for the goose, sauce for the gander.

    2. JohnG

      Re: The sad reality...

      "The sad reality...

      ...of rushed to market, not ready for prime time products."

      The snag is, these systems need to learn through data gathered in the real world. The nuances of driving in the real world are not all available through the use of simulations and test tracks. The Tesla Autopilot systems are in beta, and Tesla cars collect and send driving data back to Tesla (Tesla cars are always online to mothership.tesla.com).

  17. Anonymous Coward
    Anonymous Coward

    Apple engineer was driving...

    Bzzzzt! Wrong! He wasn't driving.

    1. Lars
      Coat

      Re: Apple engineer was driving...

      "Bzzzzt! Wrong! He wasn't driving.". Yes, but he should have been driving to stay alive..

  18. This post has been deleted by its author

    1. Dodgy Geezer Silver badge

      Re: These are all valid questions

      HumoUr and quips are how the Brits deal with disaster and tragedy. Worked very well during the Blitz...

      1. Lars
        Happy

        Re: These are all valid questions

        "HumoUr and quips are how the Brits deal with disaster". Yes, but what about Brexit, humour and quips?.

        1. Fred Dibnah

          Re: These are all valid questions

          It's a shame the people who made Twenty Twelve haven't made a Brexit comedy series to run alongside the real thing. But I suppose we wouldn't be able to work out which was which.

    2. Anonymous Coward
      Anonymous Coward

      Re: These are all valid questions

      Some people find Darwin stories funny. Others don't. Diversity - it's a cultural imperative.

  19. Shane Sturrock

    My 2 cents

    OK, so a little background on this driver. He had complained about the car's behaviour multiple times at this exact spot, and yet didn't appear to have learned that the software wasn't able to cope. The particular location had previously been hit by another car, so the attenuator was already crushed, which made this much worse; CalTrans hadn't fixed it after the previous crash.

    101 also has quite a light road surface, based on when I used to drive along it, and the markings are often faint to non-existent, so it can be tricky even for a human driver, as others have noticed. Add to that the speed of the traffic along there, and how close together people drive, leaving very little room to actually see the road and react.

    Basically, we have a driver who wasn't paying attention (observed by other road users to be looking in his lap and not at the road), who had experienced problems at this point in the road previously, on a road with poor markings, with an already-crushed attenuator, driving at speed close to other cars. Just an accident waiting to happen, and it could just as easily have been someone in any car running on cruise control and not properly paying attention. But let's place the blame on Autopilot rather than the driver who was actually responsible here. Autopilot is known to be twitchy under tricky conditions and is just a driver assistance package; while it gets better with each release, it isn't self-driving, so the meat sack at the wheel needs to pay attention.

    1. Anonymous Coward
      Anonymous Coward

      Re: My 2 cents

      In what sense could this guy be described as an "engineer"?

  20. Anonymous Coward
    Anonymous Coward

    When will we learn? It's all about the money again.

    As human beings we can't even write fart apps for phones or simple CSV loaders without them screwing up and needing fixes! How the hell are we going to write 100% foolproof software that can operate a 3/4 ton piece of machinery that can move in any direction on a public highway at speeds up to 100mph? FFS!

    We need at least 10 years of solid off road, test track testing of these autonomous vehicles, tested to full destruction in as many situations as can be created. A serious pile-up will kill 20, 30, upwards of 50 people, and yet we allow these companies to play with these "toys" on public highways.

    All these car companies can see is one simple thing: "We need to be the first to full production with an automatic vehicle, beating everyone else no matter what it takes!". They will, like every fucking company on the planet, simply cut corners to make it happen, and this time people will die, lots of people will die. How many people will die while they test these things? Don't quote me the "right stuff" either. When they tested rockets before the manned flights, they first used animals, and later on the pilots were chosen knowing they would most likely die; they agreed, and they tested on private ranges, out of sight and away from populations. These cars are being tested on public roads with drivers to take over when it fails. We all know that a human being cannot out-think a machine fast enough; a machine can think in terms of nano-second timings, while the best we can do is a few milliseconds. We're going to lose every time, and sadly the chosen "test pilot" and others too will lose.

    The "test pilots" will continue to die on public roads and take others with them until one day something really big happens to make governments step in put a stop to it, we'd all better pray we're not in that 50 car pile up when it happens.

    1. Dodgy Geezer Silver badge

      Re: When will we learn? It's all about the money again.

      I remember the Brits being the first to have a jet airliner flying - the Comet, which set the standards for all the jet aircraft which came after it.

      Unfortunately, being first means that you are the first to run into new problems that no one else has encountered. In the Comet's case it was metal fatigue (not well understood at the time) due to constant pressurisation/depressurisation cycles. The aircraft which came second learnt a lot from the Comet...

      1. Lars
        Happy

        Re: When will we learn? It's all about the money again.

        @ Dodgy Geezer

        What a funny and so British thing to claim.

        The Comet was interesting like the Titanic or the first British airship that crashed on its maiden flight.

        Disasters are always interesting and there are often important things to learn from them.

        As you mention pressurization, I assume you believe the Comet was first there too, but all the usual suspects like the French, the Germans, the Americans and the Russians had that before the Comet, as did the British Avro Tudor.

        https://en.wikipedia.org/wiki/Cabin_pressurization#History

        The Comet set no standards, all that could have been learned from that disaster was the importance of good craftsmanship and quality control.

        There are quite a few stories, of variable quality, regarding the cause of the "problem". The square windows are mentioned, and did not help, but a well made plane could have square windows. A more plausible explanation was that the rivets were punched through the skin rather than being pre-drilled, causing small cracks in the aluminum skin.

        Think of thin foil: it's very hard to break by pulling it, but give it even the smallest crack and it will snap easily.

        Claiming metal fatigue is very far-fetched, as those planes were doomed the day they were built; nor was metal fatigue anything new in any field.

        The very self-centred British view is of course that, had it not been a disaster, Britain would have become a leader in that field. I doubt it a lot: Boeing was well on the road. They used the huge experience from their bombers and got it right with the engines under the wings; there was no way the British could have stopped them or won. Looking back at the flying boats, they won that competition too.

        So, Dodgy Geezer, question your information; people who don't might end up being run by caricatures like BoJo, Mogg, Davis and a vicar's daughter.

        The one impressive thing about the Comet was how much effort was put into finding the cause; it's just that those tests should have been done before and not after.

    2. JohnG

      Re: When will we learn? It's all about the money again.

      "We need at least 10 years of solid off road, test track testing of these autonomous vehicles, tested to full destruction in as many situations as can be created."

      Simulations and track testing really don't give adequate data, notably of variations in road signs and markings, the behaviour of other road users, etc.

      1. MachDiamond Silver badge

        Re: When will we learn? It's all about the money again.

        "Simulations and track testing really don't give adequate data, notably of variations in road signs and markings, the behaviour of other road users, etc."

        A test track should have variations in signage and markings to simulate best, average and worst conditions. It would be easy to come up with a suite of tests, akin to a friend/foe shooting range, that would throw at driving systems the worst situations a cynical old bastard like me can think up.

        The DARPA Urban Challenge was conducted in the residential area of a decommissioned military base, and that was a superb place to do the testing. It's a great reuse of the facility. A section of houses could even be used to house people and shops during the tests. Balls and replica dogs could be made to dart out into the street. Overhead lines could dangle bicycles along the road that the cars need to detect and avoid. Etc. Etc. Only cars that have passed the torture test should be allowed onto public streets. Type certification for aircraft is very rigorous, so that a new model is thoroughly tested before the first paying passenger steps onboard. The same should be required for a mode of transportation that is statistically more dangerous.
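
        Sketched as code, such a certification gate might look like this (Python; the scenario names and the zero-failure rule are invented for illustration, not any real regulator's suite):

        # Illustrative certification gate: a candidate driving system must pass
        # every scenario before being allowed onto public streets.
        SCENARIOS = [
            "ball darts into the street",
            "replica dog crosses 5 m ahead",
            "bicycle dangled across the lane from an overhead line",
            "faded lane markings leading into a gore point",
            "crushed crash attenuator ahead",
            "stationary vehicle in lane at motorway speed",
        ]

        def certify(run_scenario) -> bool:
            """run_scenario(name) -> True if the system avoided the hazard."""
            failures = [s for s in SCENARIOS if not run_scenario(s)]
            for s in failures:
                print(f"FAIL: {s}")
            return not failures  # public-road approval only with zero failures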

        1. Charles 9

          Re: When will we learn? It's all about the money again.

          But some things CAN'T be simulated or anticipated: like Murphy. Why do you think the final phases of drug trials involve real people? Because, as they say, "Ain't nothin' like the real thing, baby."

  21. Herby

    Please Drive...

    Yes, I've driven this section of road a few times, some before the silly flyover for the carpool lane. Please note that I drive the vehicle, even when it has cruise control. Sometimes I tap the brake to disengage it, but I still drive the vehicle.

    Sorry, I don't need a silly sparky vehicle that is truly unaffordable for all but people who want to be snobs, or just need to drive in the carpool lane. Yes, I drive a nice SUV that gets terrible mileage, but DOES have a 300+ mile range that I regularly use while driving down I-5 to southern California to visit in-laws (nice people, by the way). I sincerely doubt that I could make it there in any sparky car in under 6 hours like I do in my SUV.

    I see many Waymo vehicles around, and they all have their Lidar beacons scanning the countryside. I have doubts that they will replace drivers soon for longer distances. I'd be willing to try one from the driveway of my house down to my in-laws' parking lot behind her condo. I suspect that it might run out of gas somewhere along the way, and probably take more than the requisite 6 hours. These companies have a LONG way to go. The silly sparky car isn't even close.

    1. YetAnotherLocksmith Silver badge

      Re: Please Drive...

      So I am getting the impression you don't like the electric cars that you've never tried, then?

      The point of an autonomous car is that, if it's working correctly, you'll do that long journey in two parts with a short stop in the middle. It'll take maybe 8 hours, during which you'll sleep overnight, and you'll arrive in the morning. A net gain of 8 hours.

      However, currently, these don't work as well as that, so you're held captive by a crap system that demands you monitor it closely so that when it gives you 0.5 seconds to react, it can blame you for literally ramming a police car.

      Give it ten years.

    2. MachDiamond Silver badge

      Re: Please Drive...

      If you are driving down the 5 freeway, you would have to be driving a Tesla. You would need to take the 99 with other brands to use DC fast chargers. I found it sorta funny when looking at fast charging in The Peeples Republik of Kalifornia to see the two main North/South freeways segregated by EV brands.

      BTW, the trip is pretty easy with an EV and you can find several YouTube trip videos from people traveling those routes. That said, you would not be required to sell your fuel hog if you were to get an EV for daily use. Moonbeam hasn't unilaterally signed a law that mandates that yet, but he does have some months left before he's out. Bog only knows what silliness he will commit before November.

  22. Anonymous Coward
    Boffin

    Nothing is right first time

    Jet aircraft had to go through teething problems, such as the Comet design; I'm willing to cut Tesla some slack here.

    1. Anonymous Coward
      Anonymous Coward

      Re: Nothing is right first time

      They can have their teething problems OFF THE PUBLIC ROADS!

      Or did I miss it and those early jets were taking passengers and crashing into airports killing people while they worked the bugs out?

      1. Pete4000uk

        Re: Nothing is right first time

        I agree about keeping them off public roads, but sadly many people have been killed by jets crashing from some then-unknown issue.

      2. Anonymous Coward
        Anonymous Coward

        Re: Nothing is right first time

        They are only following the illustrious example of Microsoft, which has been treating its users as unpaid beta (and sometimes alpha) testers for decades now.

        But at least Windows doesn't usually kill you.

      3. rg287 Silver badge

        Re: Nothing is right first time

        Or did I miss it and those early jets were taking passengers and crashing into airports killing people while they worked the bugs out?

        Well yeah actually, three De Havilland Comets broke up in 12 months before they grounded the fleet and discovered this new thing called "Metal Fatigue".

        1. Lee D Silver badge

          Re: Nothing is right first time

          Gosh, if only you could trial them at slow speed on things that are lesser risk, in areas where it's safer to go wrong.

          Everything from golf carts ("Drive me to hole 9") to milk floats, theme park transport to warehouses.

          No, nobody did that. Nobody bothered. It was straight into "self-driving car, but it's not really self-driving, but everyone thinks it's self-driving, smack bang on public highways with all the other drivers at 80mph".

          There's a little train for the kiddies, that's actually just a wheeled vehicle, that drives around my local shopping centre. Pay £1. Stick the kids in. You loop around the upper level and come back. There are low-speed pedestrians everywhere, the train makes a noise to let you know it's coming, it travels at about 5mph with some minimum-wage employee on the controls, past other shoppers, in a controlled environment, on a fixed route (that isn't railed or anything, just drives through the shoppers).

          That would be INFINITELY safer to test on, especially as regards "What if someone just steps out in front of us". Worst case, you'd catch a pedestrian on the ankle. I see no evidence that Tesla's etc. code has been tested significantly in such environments. Hell, a self-driving shopping cart! Genius! And a sub-product you can actually sell even if you can't scale it.

          But these things are still making stupid mistakes and are entirely unpredictable.

          1. Charles 9

            Re: Nothing is right first time

            As the saying goes, "Ain't nothin' like the real thing, baby." There's just no substitute for actual, on-the-road testing, just as the final phase of clinical trials always involves actual people.

      4. JohnG

        Re: Nothing is right first time

        "They can have their teething problems OFF THE PUBLIC ROADS!"

        Then the systems will never be ready for public roads, because they will not have been tested in the real world and will have insufficient data/"experience" of the variations in real world road markings, signage and driver behaviour.

        "Or did I miss it and those early jets were taking passengers and crashing into airports killing people while they worked the bugs out?"

        That is precisely what happened with the Comet and numerous other aircraft types. Of course, manufacturers and safety regulators attempt to address all the bugs before the aircraft enter service, but numerous accidents have resulted in recalls and retrospective changes. This is pretty much the story of every accident investigation programme on TV.

  23. Anonymous Coward
    Anonymous Coward

    Still a bit of uncertainty

    The report gives a lot of detail, but there are still unanswered questions, particularly about what the driver did or didn't do.

    This leaves a bit of wriggle-room for those who would rather blame the driver than consider that the vehicle was primarily at fault. For my part I think that in effect the 'autopilot' committed suicide for reasons unknown.

    Why it did so requires a detailed technical investigation, but in the meantime I think it is a gross misrepresentation, leading to a false sense of security, to call the driver-assist function an 'Autopilot'.

    The end result is a system that can fail catastrophically, and that should be sorted out before the public are allowed to drive these things on public roads.

    1. Anonymous Coward
      Anonymous Coward

      Re: Still a bit of uncertainty

      "For my part I think that in effect the 'autopilot' committed suicide for reasons unknown".

      Ever read Frank Herbert's "Destination: Void"?

      1. Anonymous Coward
        Anonymous Coward

        @Archtech - Re: Still a bit of uncertainty

        No, that story passed me by. Either that or I read it so long ago (~50 years) that I forgot it.

        Perhaps people who develop AI for safety-critical uses should spend some time reading SF back to the 50s, to get an idea of the fuck-ups that they should be watching out for.

  24. Chris G

    All or Nothing

    Autonomous vehicles will be fine when they work, but IMHO at present Autopilot and its relatives from other companies are on a par with human drug trials. The drivers are guinea pigs who are part of the process of developing the tech.

    I don't think any system that gives a false sense of security can be allowed on the roads, because it will end in accidents like this one: a tired driver who, like most of us, thinks he can pay attention to the road, relax, text or whatever, will put his trust in these systems and then sometimes pay the price for his inattention. If the system had not been on, or even not installed, the accident would probably not have happened.

    At the current state of development, anyone in control of an autonomous or semi-autonomous vehicle is ultimately responsible for it. If even the testers working for people like Uber and Google are allowing their attention to lapse and subsequently having accidents, then the systems should not be on public roads.

    When they work reliably, with adequate redundancy in their sensory and analytical systems, maybe they will be usable.

    Though it's not something I would want, the simplest way to be able to get in a car, give a destination and sit back and relax is to have either a driver, or control from a central source that oversees and choreographs all of the vehicles in a given zone and hands over to the control for the next zone when passing into it (the next horror: 'The Internet of Vehicles').

    In large cities and conurbations autonomy doesn't really make sense; it's a vanity.

    To some extent governments and authorities are partly to blame in encouraging manufacturers to roll this stuff out early before development is sufficiently advanced.

    1. Charles 9

      Catch-22

      What you propose, however, is a Catch-22.

      Because the ONLY way to make them considered trustworthy on public roads is to TEST them. But the ONLY way to test them reliably is to use public roads. There is NO substitute.

  25. bish

    Fire Department

    I realise that everyone is far more interested in attacking or defending Tesla's flakey autopilot, but can I ask: what were the fire department doing, pouring water on a burning battery? Electric and hybrid cars are pretty common now (more so in the Valley, I'd guess), so either no one has bothered to tool up the fire fighters with suitable extinguishing materials, or they haven't yet realised that pouring water on a chemical battery is probably the second worst thing you can do, behind setting fire to it in the first place.

    1. Ben Tasker

      Re: Fire Department

      The water is used to cool the packs. They actually used foam to try and extinguish the fire.

      1. YetAnotherLocksmith Silver badge

        Re: Fire Department

        Indeed.

        There's not a lot you can put on a few hundred kilos of burning lithium to put it out.

      2. Charles 9

        Re: Fire Department

        The lithium in battery packs, as I recall, isn't raw metallic lithium but rather in a compound. The result is that the material is not nearly as water-sensitive. That's why airliner guidance for a phone battery on fire is to douse it; for something as sensitive to fire as an airliner, they wouldn't be saying this if they hadn't considered the ramifications carefully. In this case, cooling down the battery to prevent further thermal runaway is clearly more of a benefit than the risk of a lithium reaction.

    2. Anonymous Coward
      Anonymous Coward

      Re: Fire Department

      The thing about water and lithium is a bit overstated. Lithium is less reactive than the other alkali metals. Yes it produces hydrogen when wet, but the rate of production of hydrogen depends on temperature and surface area. Apply lots of water to keep the temperature down, because simply denying it air won't work terribly well once it's over 85C.

      1. Lee D Silver badge

        Re: Fire Department

        The problem is the cell compartmentalisation. It takes only one cell to go into thermal runaway and you have flames. But cooling them involves pouring water on the middle of a battery that's almost entirely solid metal contained in a metal box. It's hardly accessible.

        It's not going to go "boom" on contact with water, but it's going to expand, release gas, put pressure on nearby cells, all contained in a fixed size box with thousands of other cells that were - until then - untouched.

        And as shown - even days later the tiniest sliver of stray/punctured/scored metal from the accident shorting out the cell starts a fire again.

        I have seen a MacBook keyboard physically expand upwards and then ping keys off it, over the course of just minutes, because a battery got wet. The battery ended up literally 2-3 times its original size and punched/warped the aluminium casing in a matter of seconds. That's not helpful.

  26. RobertLongshaft

    Why the hatred for Tesla and Musk?

    Is it because he told you the truth about your pantomime profession?

    Did the driver have his hands on the wheel like he was supposed to? No. Case closed.

    1. Anonymous Coward
      Anonymous Coward

      Dear "Robbed of your longshaft",

      Did the valid criticism of Your Beloved Elon cause your tech erection to flag?

      1. DropBear

        While I don't find Elon's reaction particularly tasteful, I don't think he did anything out of the ordinary either. Tesla didn't lie; they were simply quick to point out anything and everything that may have contributed to the crash besides their own role. Hardly surprising, that. Every single person and company I can think of does exactly that immediately whenever blamed for something. It may not be the reaction you're looking for, but it's certainly human nature...

    2. sabroni Silver badge

      re: Is it because he told you the truth about your pantomime profession?

      Oh no he didn't!!

  27. Paul Hargreaves

    There are a lot of non-Tesla drivers in these comments.

    Anyone who actually owns one, having spent a few minutes getting to know Autopilot, quickly learns its limitations. It's pretty damn good on motorways, but you know that as soon as you come to junctions you have to actively tell the car what to do.

    Anyone who is stupid enough to just let the car drive, without paying any attention at all, would probably be the sort of person who would have done the same thing in a car without the feature.

    1. Baldrickk

      Non tesla driver here

      I test drove a Model X and got the sales-person to turn on Autopilot.

      The car immediately accelerated hard and pulled sharply to the left, presumably to find the line, only it was sharp enough to have taken me off the road had I not resisted the turn, which stopped it. Autopilot was immediately turned off again.

      It's not fully autonomous, and I wouldn't be happy to leave it trying to drive without my guidance/overwatch if I were to get one.

      1. JohnG

        Re: Non tesla driver here

        "It's not fully autonomous, and I wouldn't be happy to leave it trying to drive without my guidance/overwatch if I were to get one."

        Which is exactly what the user manual says you should do. The autopilot systems are in beta and full self driving is not yet available (FSD probably won't be available for a long time, probably eons or elons)

      2. DryBones
        Pint

        Re: Non tesla driver here

        It seems to me that the machine vision is being done wrong, and completely backward, and needs to go back to first principles.

        How do I stay on the road?

        - First, find the edges of it. Edge detection is key.

        - Lanes have a mostly standardized width, so it is pretty easy to figure out how many there should be. If the number is sufficiently fractional a lane is probably merging.

        - Next, look at the motions of other cars, they are likely to give a good indication of pathing.

        - Last AND least, look at lane markings, because 101 has too many bloody places where they didn't paint over the old markings so they cross over each other and run straight into barricades.

        How do I navigate unexpected obstacles?

        - My vehicle can be described as a rectangular prism of "Will bend and break if intersected".

        - Around it there is another slightly larger area of "Keep Out Zone" that I want to try to protect.

        - I should choose a path that will allow me to pass without allowing any intersections of my "Keep out zone" with the current and projected paths of objects. It does not matter if it is a ball, a child, bicycle, or car, it is not desirable to hit it.

        - It is easier to identify things like wind-blown paper, bags, etc., which are not a problem, than the myriad things which are, so train for the smaller set and treat the rest kinematically (a rough sketch of this keep-out check follows below).
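
        Here's a minimal toy sketch (Python) of that keep-out-zone idea, assuming flat 2D kinematics, axis-aligned boxes and constant-velocity projection. Every name and number in it is invented for illustration; it shows the principle, not anyone's production planner:

        from dataclasses import dataclass

        @dataclass
        class Box:
            x: float         # centre x, metres
            y: float         # centre y, metres
            w: float         # width, metres
            h: float         # length, metres
            vx: float = 0.0  # velocity x, metres/second
            vy: float = 0.0  # velocity y, metres/second

            def at(self, t):
                # Where this box will be t seconds from now, assuming constant velocity.
                return Box(self.x + self.vx * t, self.y + self.vy * t,
                           self.w, self.h, self.vx, self.vy)

            def intersects(self, other):
                # Axis-aligned rectangle overlap test.
                return (abs(self.x - other.x) * 2 < self.w + other.w and
                        abs(self.y - other.y) * 2 < self.h + other.h)

        def path_is_clear(ego, margin, obstacles, horizon=3.0, step=0.1):
            # Inflate the ego vehicle by the keep-out margin, then check it against
            # every obstacle's projected position across the time horizon. A ball, a
            # child, a bicycle or a car are all treated the same way: kinematically.
            keep_out = Box(ego.x, ego.y, ego.w + 2 * margin, ego.h + 2 * margin,
                           ego.vx, ego.vy)
            t = 0.0
            while t <= horizon:
                if any(keep_out.at(t).intersects(ob.at(t)) for ob in obstacles):
                    return False
                t += step
            return True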

        1. Ken Hagan Gold badge

          Re: Non tesla driver here

          - First, find the edges of it. Edge detection is key.

          Edge not found for unspecified reason. Now what?

          - Lanes have a mostly standardized width,...

          Not on this road. Now what?

          - Next, look at the motions of other cars,

          Road full of nutters who left it too late to be in the correct lane. Now what?

          - Last AND least, look at lane markings

          Computer vision is rubbish and delivers a *clear* identification of a lane marking that doesn't actually exist. Now what?

          Human beings suffer all of these problems, but get around them by understanding the road and the others users at a far higher level, so when they receive implausible information they can reject it and try harder to find some better sources. We find this process so easy that we usually don't even realise we are doing it. Sadly, we've no idea how we do it. The workaround, so far, for autonomous vehicles is to spend shedloads of cash on numerous and exotic sensors that far outstrip the capabilities of our eyes and ears.

          1. DryBones

            Re: Non tesla driver here

            Rubbish. Think about how you stay on the road sometime. If all those things fail, YOU are going off as well. What if you suddenly go blind or have a stroke? Hey, same result.

            The processes I listed were my understanding of how I stay on the road through less than ideal conditions. There are likely more, but they build into a hierarchical process that gives different weights to different types of data and rejects or adjusts if there are contradictions.

            My point was that the behavior I see reported from self driving vehicles seems like it relies most on things like lane markers that go totally awry when the highway department gets involved, so the way the vehicle determines position and navigation may need a rethink.

            1. Charles 9

              Re: Non tesla driver here

              That's PRECISELY the problem. We DON'T think about it. Not consciously, at least. It happens all SUBconsciously in our autonomous mind, and one of the things we've learned through machine learning is that it's bloody hard to teach intuition, because most of the time we don't know how OUR OWN intuitions work. You can't teach something you don't understand. And before you disregard the idea, consider how much conscious thought we put into walking, which we typically learn as babies when our capacity for reasoned, conscious thought was limited to begin with, yet nigh everyone from schoolchildren to adults handle the process with hardly a second thought. If you want to see just how much goes into a basic walking gait, try playing QWOP (look it up with your favorite search engine).

              1. Alan Brown Silver badge

                Re: Non tesla driver here

                "If you want to see just how much goes into a basic walking gait, try playing QWOP"

                or watch someone learning to walk again.

        2. Alan Brown Silver badge

          Re: Non tesla driver here

          - First, find the edges of it. Edge detection is key.

          The first self-driving tech attempts tried that. They found the car freaked out when the edges changed suddenly (like when the roadway became a bridge) and couldn't cope well if the edge wasn't strongly defined (a lot of places don't paint white lines on the edge).

          What people have found is that everything you think you know about driving isn't actually what you know about driving. It's all the edge cases which make it hard.

    2. Anonymous Coward
      Anonymous Coward

      "Anyone who actually owns one, having spent a few minutes getting to know Autopilot, quickly learns its limitations. It's pretty damn good on motorways, but you know that as soon as you come to junctions you have to actively tell the car what to do."

      So in other words it's fucking useless and no better than cruise control, just with a misleading name.

      Mine's the one with a manual gearbox and no flaky ELON9000 trying to murder me. Open the pod bay doors, Elon.

  28. Anonymous Coward
    Anonymous Coward

    Why???

    Why would an engineer - of all people - do such a thing? Riding at the front of a 2-ton lump of metal and plastic travelling at high speed towards other such lumps approaching at equally high speeds, with a head-on crash averted only by some buggy software?

    Even seen as a method of committing suicide, it is excessively complicated.

    1. YetAnotherLocksmith Silver badge

      Re: Why???

      Of course, we don't know which Apple product this guy was working on, do we? If he were the real Miles Dyson, and Elon figured it out... Or, more likely* it was just neutering the competition.

      *unlikely!

  29. 0laf Silver badge
    Holmes

    Self-driving cars will always kill people. The only question will be: do they statistically kill fewer people than people-driven cars?

    The headlines are never going to go away.

  30. Anonymous Coward
    Anonymous Coward

    Everything makes mistakes

    100 people die on US roads per day, likely due to stupid human mistakes, little mention of that.

    1 person dies due to 1 stupid mistake of an autonomous system, the news won't stop going on about it.

    1. Anonymous Coward
      Anonymous Coward

      Re: Everything makes mistakes

      Weird. Humans crash cars all the time and it's not news. Computers hardly ever crash cars and it's news. How does that work? I don't see the difference.

      Oh wait, is it do with how common the things are?

    2. Wayland

      Re: Everything makes mistakes

      Computer-driven cars are in the minority. The crashes are ones that people would not have. A human driver would have no problem with leaving the 101 for the 85 yet should not have allowed the car to attempt this. So it was human error to have allowed the car to do this.

      All car crashes should be regarded as human error. Any time a Tesla crashes on autopilot it's the error of the driver who allowed autopilot to be in control. Alternatively someone hacked the car and murdered the driver.

      1. Baldrickk

        Re: A human driver would have no problem with leaving the 101 for the 85

        As the barrier was damaged from a previous crash, and we haven't heard about a Tesla crashing there before, I think it is safe to assume that a person had indeed crashed there previously.

        So much for human drivers having no problems.

      2. Jediben

        Re: Everything makes mistakes

        "A human driver would have no problem with leaving the 101 for the 85..."

        The damaged barrier present BEFORE this event suggests otherwise...

      3. JohnG

        Re: Everything makes mistakes

        "A human driver would have no problem with leaving the 101 for the 85 yet should not have allowed the car to attempt this."

        The fact that the crash barrier had not been repaired since being damaged in a previous accident indicates that at least one human driver had a problem leaving the 101 for the 85.

      4. Alan Brown Silver badge

        Re: Everything makes mistakes

        "The crashes are ones that people would not have."

        Are they? Are you really sure about that?

        "A human driver would have no problem with leaving the 101 for the 85"

        And yet, the crash attenuator on the gore was removed, because someone had crashed into it in virtually the exact same manner as the Tesla did. Had it been there the crash would have been perfectly survivable.

        More to the point, video of this piece of road clearly shows a misleading lane marker trivially capable of pulling unwary drivers directly into the gore - and I'll point out _again_ that someone had already crashed into the gore, which is why it was in the dangerous state it was in.

      5. Malcolm Weir

        Re: Everything makes mistakes

        Very late, but why oh why do people insist on commenting without bothering to look at the data?

        "A human drive would have no problem..."

        EXCEPT ONE DID. In a Toyota Prius. Which resulted in the crash barrier being damaged. Which resulted in the Tesla not benefiting from that barrier. Etc...

        1. MachDiamond Silver badge

          Re: Everything makes mistakes

          "EXCEPT ONE DID. In a Toyota Prius. Which resulted in the crash barrier being damaged. Which resulted in the Tesla not benefiting from that barrier. Etc..."

          Yes, somebody hit the barrier in a non-Tesla vehicle previously. The big "but" is there isn't any information on what might have caused the crash. I can't count how many times I've seen somebody dart across a few lanes to make an exit they weren't in line for. I expect that some of the time those nimrods don't make it successfully and crash.

    3. Baldrickk

      Re:1 person dies due to 1 stupid mistake of an autonomous system

      Tesla autopilot isn't autonomous.

      1. tom dial Silver badge

        Re: Re:1 person dies due to 1 stupid mistake of an autonomous system

        The fact is that the driver of the Tesla apparently chose to abandon control of the car to the autopilot, so it was autonomous in fact even though it was not Tesla's intention that it should be allowed to operate that way.

  31. DrXym

    When will people learn

    It's not the normal events which confound automated driving systems, it's the abnormal ones.

    The reality is that unless a vehicle is capable of handling all situations on the road safely, the driver must be compelled to pay attention. An alert driver combined with an autonomous vehicle is far safer than an autonomous vehicle by itself.

    1. Wayland

      Re: When will people learn

      "An alert driver combined with an autonomous vehicle is far safer than an autonomous vehicle by itself."

      But there's the rub: how can a driver remain alert if he's not doing anything most of the time?

      In order for computers and humans to drive cars together, the human must be involved all the time whilst the computer assists to make the job easier and more precise.

      For instance power steering makes steering easier and more precise so the car is driven better with less effort. The 'autopilot' should be a co-pilot.

      1. DrXym

        Re: When will people learn

        Well that's the point I was making. If you don't keep the driver engaged and the car does something dumb, then there is no human intervention when the car piles into a truck / tree or whatever. An alert, attentive driver can hit the brakes even when the car is doing something dumb.

        And if necessary that means the car has to force the driver to be alert. Force them to hold the wheel, monitor their face and reaction times, issue activities to perform, keep them engaged with the drive. And start bleeping and slowing down if they don't react.

        The problem is Tesla didn't bother with any of that in the first instance and has only begrudgingly implemented it now.

        They're not alone in this - all autonomous cars have the same issue.

        1. JohnG

          Re: When will people learn

          "Force them to hold the wheel, monitor their face, reaction times, issue activities to perform, keep them engaged with drive. And start bleeping and slow down if they don't react.

          The problem is Tesla didn't bother with any of that in the first instance and has only begrudgingly implemented it now."

          This is incorrect - the Tesla Autopilot does (and did at the time of the accident) monitor whether the driver is holding the steering wheel, and will first warn the driver but will ultimately disengage. If it believes the driver is still not responding, it will engage hazard flashers, pull the car over and stop.
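
          That escalation boils down to something like this little sketch (Python; the thresholds are invented for illustration and are not Tesla's actual values):

          WARN_AFTER = 10.0       # seconds hands-off before a warning (made up)
          DISENGAGE_AFTER = 30.0  # seconds before autopilot disengages (made up)
          PULL_OVER_AFTER = 60.0  # seconds before hazards + controlled stop (made up)

          def escalation_stage(hands_off_seconds):
              # Map continuous hands-off time to an escalation stage.
              if hands_off_seconds >= PULL_OVER_AFTER:
                  return "hazard flashers on, pull over and stop"
              if hands_off_seconds >= DISENGAGE_AFTER:
                  return "disengage autopilot"
              if hands_off_seconds >= WARN_AFTER:
                  return "warn driver"
              return "normal driving"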

          1. Charles 9

            Re: When will people learn

            "If it believes the driver is still not responding, it will engage hazard flashers, pull the car over and stop."

            Is it just me, or am I picturing one of these going into the ditch when it tries to do this on a road with no shoulders?

      2. MachDiamond Silver badge

        Re: When will people learn

        "The 'autopilot' should be a co-pilot."

        A problem happens when you take too much activity away from the person behind the wheel. There has to be a balance between relieving a driver of some tasks while requiring them to perform others to stay involved.

        I use cruise control whenever I can. I get better mileage and I don't skip a heartbeat when I spot a cop and find I've been speeding up and I'm too far over the limit. On the highway in light traffic, I wouldn't be constantly changing speed anyway so letting the car handle staying at the same speed isn't a big deal. I still have to steer and keep an eye on what's ahead and around me. I'm also ready to tap the brakes to disengage the CC if I see traffic slowing ahead or if I'm going to be exiting the motorway. Take the task of steering away and I'm not really doing anything at that point. If I don't need to stay aware of the surroundings constantly to do the job of steering, it's likely my attention will start to wander worse than normal.

  32. Wolfclaw

    Tesla is scared of big lawsuits and of having their autopilot, and maybe even the cars themselves, deemed dangerous to operate/drive. One big thing for the NTSB to look at is the time and requirements needed to put out the battery fire - twice!

  33. Anonymous Coward
    Anonymous Coward

    NOT autopilot

    Whoever decided to call it Autopilot is the idiot that needs to take the blame.

    You called it something that "normal*" people think means the car driving itself. No matter how much you tell them it isn't, they now won't believe you, especially when you keep promising the F*&king moon.

    1 person is to blame and it wasn't the non-driver...

    Let's make it clear: Musk is NOT a fucking genius. He got lucky with PayPal; the rest is just having buckets of money!!!

    1. DrXym

      Re: NOT autopilot

      It should have been called advanced lane keeping or similar. Autopilot is such a vague term that people obviously misinterpret what it does and the limits of such a system.

      Not just Tesla however. No system is remotely close to full autonomy on the open road. It's not the normal that catches them out but the abnormal.

    2. Anonymous Coward
      Anonymous Coward

      Re: NOT autopilot

      As we know people are dumb, so just make the computer a little less so.

      If CarPosition == SafeMotorway Then AutoPilotAvailable = True Else AutoPilotAvailable = False

      If CarPosition == (Junction - 1 Kilometre) Then WakeUpDozyDriver(Now())

      Function WakeUpDozyDriver(StartTimer) {
          Do Until NOT TestDriverDozy() {
              AnnoyingBeepVolume = Now() - StartTimer
              If Now() - StartTimer >= 5 Then ShortSharpElectricShockVoltage = (Now() - StartTimer) * 12
              If Now() - StartTimer >= 10 Then {
                  AssumeDriverDead = True
                  MoveToHardShoulder = True
                  CallParamedics = True
                  PointsOnDrivingLicence = PointsOnDrivingLicence + 1
                  Exit Do
              }
          }
      }

      Function TestDriverDozy() {
          Return NOT (HandsOnWheel == True AND EyeTrackingRoad == True AND DriverSaysCorrectHeadsUpDisplayCode == True)
      }

      Sorted - Elon can just give me a ride on one of his rockets in lieu of royalties!

      1. Charles 9

        Re: NOT autopilot

        How does the car know where it is on a road junction with no lines and poor markings otherwise? Especially at the bottom of a stack in stop-and-go traffic (which can throw off GPS and inertials, respectively)?

        Suppose the driver is in a fugue? Or sleepdriving? Both can result in false positives for driver awareness.

        Put it this way. If WE can't always get it right, what chance does any technology we make have?

      2. JohnG

        Re: NOT autopilot

        Most of that is what it does (or attempts to do), aside from disabling autopilot at junctions.

  34. Frenchie Lad

    Unfunnily Enough

    Cheap jokes when someone is killed are really inappropriate even when dishing out merited criticisms of Tesla's "trial & error in a live context" approach to progress.

    1. DropBear

      Re: Unfunnily Enough

      Fine, I promise not to tell any jokes at the funeral. I promise absolutely nothing about anywhere else. Go take your over-the-top piety somewhere people actually care for it.

  35. Wayland

    And in a final effort to pin the blame for the crash on Huang

    Was Huang the driver or simply riding in the car?

    The 101 is a very scruffy old road with a lot of fast moving traffic through Silicon Valley. You need your wits about you and should not be expecting a computer to drive you. Definitely can't expect your car to follow another car off the 101 onto the 85.

    What concerns me is whether the cruise control failed to release the car to the driver.

    1. Baldrickk

      Re: And in a final effort to pin the blame for the crash on Huang

      He was the driver.

      Why would the cruise control release the car to the driver? By all accounts, he never tried to take control.

  36. Anonymous South African Coward Bronze badge

    And this is why I will still prefer old school cars without autopilot or any such fancy gimmickry. I want to remain in control of the car at all times.

  37. Anonymous Coward
    Anonymous Coward

    Do Tesla use agile?

    Maybe Tesla just isn't agile.

  38. Anonymous Coward
    Anonymous Coward

    Ghost train

    How long before driving resembles a fairground ride where we are all passengers scared witless by the unexpected and no idea of whether we will get to journey's end? I'd rather play on the Dodgems.

  39. The Central Scrutinizer

    It's all rubbish

    To paraphrase James May on Top Gear years ago... it's all rubbish anyway. Self-driving cars were invented years ago... they're called taxis.

    1. JohnG

      Re: It's all rubbish

      I have had a few taxi rides in and around Slough where I would have felt safer in an autonomous vehicle in beta.

  40. Named coward
    Trollface

    Alternate explanation

    The Tesla saw the barrier, realised it had no time to stop or evade, and tried to accelerate to 88mph...

  41. Anonymous Coward
    Anonymous Coward

    The Sukhoi scandal:

    https://en.wikipedia.org/wiki/2012_Mount_Salak_Sukhoi_Superjet_crash

  42. Emmeran

    The forgotten part of the story

    This particular "Apple Engineer" had repeatedly commented that Tesla's Autopilot didn't work well on this portion of the 101 yet he insisted on personally testing it every time a new software release was pushed out.

    Somewhat just rewards for self-testing.

  43. James 36

    Optional

    From what I have read, we have multiple contributory factors:

    1) The driver, even though he had had issues at this spot before, was not paying attention (this happens in manual cars all the time; you make a huge number of decisions when driving, there is an error rate, and accidents occur when one or more bad decisions coincide).

    2) The Tesla SW and HW were unable to deal with the environment at that spot.

    3) The road furniture had not been repaired in a timely manner.

    All three of these contributed to the accident; without any one of them, the accident may not have happened.

    What to do:

    1) Change the sales/marketing guff around "autopilot": call it enhanced cruise control or something else. There will still be idiots who think they don't have to drive whilst driving; I have seen people reading newspapers whilst driving normal cars on the motorway, FFS.

    2) Tesla should look at improving their SW and HW to reduce the risk of this specific issue.

    3) Make sure road furniture is repaired before re-opening the road.

    It will not stop people dying, but it may reduce this sort of accident. People die on the road all the time and there is a constant iterative process to reduce that; it just takes time.

    The main outstanding issue is still one of liability, IMHO.

  44. Rob Fisher

    Room for improvement

    There are no hard and fast rules of thumb that apply. Simply measure accidents of various severity per mile driven. Ensure that these numbers improve with each software release, and that in any case they are better than human drivers' in similar environments.
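
    As a toy illustration (Python) of that bookkeeping, with every number invented rather than taken from any real fleet:

    def crashes_per_million_miles(crashes, miles):
        # Normalise severity-bucketed crash counts by fleet mileage.
        return crashes / (miles / 1_000_000)

    # Hypothetical per-release data: (release, crashes, fleet miles driven).
    releases = [("v8.0", 40, 120_000_000), ("v8.1", 31, 150_000_000)]
    human_baseline = 0.30  # assumed human rate per million miles, illustration only

    for name, crashes, miles in releases:
        rate = crashes_per_million_miles(crashes, miles)
        verdict = "better" if rate < human_baseline else "worse"
        print(f"{name}: {rate:.2f} crashes per million miles ({verdict} than humans)")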

    If this can be done then automation is a benefit. If not then it isn't.

    Debates about capabilities of hardware and software, user interface design and marketing are all secondary to that.

  45. martinusher Silver badge

    You would have thought a software engineer would have known better

    The Tesla autopilot is good, but it's nowhere near true autonomous driving. I'd trust it to keep me in a lane, following traffic on the freeway, just like other adaptive cruise control systems. I would not trust it to make correct course and speed choices at freeway off-ramps. For those of you unfamiliar with California motorway exits, they're characterized by erratic signage, abrupt changes in direction, inconsistent lanes and often poor surfaces. You often find signs on them, yellow diamonds with "Watch for stationary traffic", which is Californiaese for "Blind corner with traffic backed up at a ramp signal ahead, sorry about the (lack of) sight lines". Anyone who comes off at speed without paying serious attention to what's going on is asking for trouble.

    I could understand some non-technical person not appreciating the capabilities of the autopilot but I'd expect someone 'in the trade' to know a bit better.

  46. Anonymous Coward
    Anonymous Coward

    Maybe Autopilot is the wrong name but someone would call it that - Tesla got there first

    I am a Tesla owner, and have been for over two years. I have somehow managed to stay alive whilst sometimes using this supposedly buggy/demonic system that is waiting to drive you into a wall, or lorry, etc. In real life, Autopilot is a nice-to-have; I mainly only use it in heavy slow traffic, as it is entirely at home in this environment. It understands that motorbikes are filtering, people are braking late and accelerating fast. It has loads of limitations: for example, you would never use it on an A road, as it does not understand traffic lights, and really, really doesn't understand roundabouts.

    There have been a lot of comments about the car, the driver and especially the software. The car is fantastic, but should get better. And I can confidently say it will get better. You should not rely on the software features in your day-to-day driving; I brake happily without the need for the collision assistance to emergency stop for me. It has pretty good collision detection, but it still throws a fit in the same place every day, when the road bends round but all it can see is a parked car.

    There seem to be way too many people here who are having way too much fun taking pot shots at Tesla and Elon Musk. Buy the car, or don't buy the car, but don't stand on the side and ridicule the people that did buy the car. For the record, Tesla did not market Autopilot, and it wasn't even available when I bought my car; this feature was enabled while I was waiting the many months for it to be built and shipped to the UK. But I still paid good money for the feature to be enabled, and I also watched the associated stupid videos of people mucking about with Autopilot.

    I do also find it amazing how one car has a crash and it has everyone up in arms. It's not like petrol (or gas, for our American cousins) is nice and safe. How many bikers' injuries are caused by spilled diesel?

    Did I mention, it is still a damned good car.

  47. Piscivore

    ABS is OK

    ABS: Yes, because I can't pulse the brakes fast enough to avoid skidding or panic and just stamp the pedal when I should be doing something else. So ABS brakes better than I.

    CC: No, because I tend to lose concentration and my reaction time suffers.

    Anything else: No, because I don't drive on a racetrack but real roads with other idiots around and potholes, pedestrians and more hazards to avoid.

    If I want not to have to drive, I let someone else do it for me, not a bloody algorithm.

    1. Charles 9

      Re: ABS is OK

      "CC: No, because I tend to lose concentration and my reaction time drops."

      See, it's the opposite for me. Not having to worry about the speedometer and my foot's position on the accelerator allows me to keep my head up and scanning the road better, especially in areas prone to "bear traps" (jurisdictions that live off outsider traffic tickets).

  48. This post has been deleted by its author

  49. DvorakUser

    Already been mentioned, but for summary....

    1. Adaptive cruise control will lower its speed if you are following a slower car (e.g. if you set cruise to 75, it will slow to the speed of the car in front; once that car is 'gone' and the road is clear, the car will accelerate to the desired speed). A rough sketch of this is at the end of this comment.

    2. The lane markings where this accident took place are in HORRIBLE condition, going off of the video footage that I've seen. It's easy to see how someone not actually paying attention could, in fact, follow this Tesla (almost) all the way into the barrier (which it seems someone else had actually done prior to the Tesla).

    I will agree, though, that a radar upgrade beyond 1-D should be performed on all Autopilot-capable vehicles. If it can't tell whether the thing that's not moving is something it will safely pass by, I think that's a good indicator something is wrong. Right now, it's like getting the check engine light to go off by removing the bulb - yes, it makes the problem go away, but at what cost?
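
    For point 1, the setpoint behaviour boils down to something like this toy sketch (Python; illustrative only, not Tesla's actual control law):

    from typing import Optional

    def acc_target_speed(setpoint_mph: float, lead_speed_mph: Optional[float]) -> float:
        # Follow the slower of the setpoint and the lead car; with no lead
        # car detected, the target snaps back up to the setpoint.
        if lead_speed_mph is None:            # lane ahead looks clear
            return setpoint_mph               # accelerate back to the setpoint
        return min(setpoint_mph, lead_speed_mph)

    print(acc_target_speed(75, 62))    # 62 -> matching the car ahead
    print(acc_target_speed(75, None))  # 75 -> speeding up once the lead car is gone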

  50. This post has been deleted by its author

  51. fishbone

    You still are supposed to be smarter than the things you possess, especially if they're moving.

    1. MachDiamond Silver badge

      "You still are supposed to be smarter than the things you possess, especially if they're moving."

      Do you get out much? There are plenty of my neighbors that aren't qualified to watch paint dry.

  52. Luiz Abdala

    Dead man switch?

    Trains are able to follow a precise path, yet we still have accidents with them.

    In one in particular, the man at the controls could not see the appropriate signalling at that particular time of day, and the investigators verified that this fact actually was the causal factor: at that particular time of day, the signals could not be seen.

    On other occasions, the men behind the controls fell ill and were unconscious prior to the crash.

    How could a sane person ignore the fact that his vehicle was headed toward a concrete barrier at 70mph? Was he awake/sober/in command? Did the man have a seizure or heart attack or anything that would impair his ability to swerve away or brake? Could he SEE the barrier, or did he have sun in his eyes, like James Dean?

    If his car crashed on its own due to the autopilot going titsup, why were 3 other vehicles involved? Did the other people also fail to notice the erratic behaviour of the Tesla? Why didn't it crash SIDEWAYS, because that's what people in control would try to do, wrestle the controls?

    Too many questions. Not enough answers. There is even a specific term for when it happens with airplanes, flying a controlled aircraft into the ground... Controlled Flight Into Terrain: CFIT.

    This is a standard CFIT scenario. He crashed into the barrier without explanation; neither equipment nor crew detected the disaster until too late. Finding the likely cause would fall back into the other scenarios (equipment failure, pilot error...) or into what must be changed for ALL cars that are dealing with automated driving, just like Boeing has been doing since the '70s.

    1. Caffeinated Sponge

      Re: Dead man switch?

      Agree that there seem to be operator involvement questions, but my reading of the sequence was that the vehicle crashed, killed its passenger and lost its front portion, then the remainder rebounded back onto the road and the other vehicles were involved. Hitting a hard surface usually involves deflection. If a large portion of the car was sheared off, then probably an erratic, spinning deflection that would be very difficult to avoid at highway speed and close quarters.

  53. Luiz Abdala

    Dead man switch?

    Quick reminder: a CFIT can happen with either man or machine in control; it is not specific. The Tesla crashed just like an airplane, and nobody knows why, nor how either one that could have avoided it failed to do so.

  54. Caffeinated Sponge

    No comment from Musk yet

    He’s probably trying to decide who to call a kiddy-fiddler this time.

    More seriously though, Tesla are going to need to rename that cruise control system at some point. Calling it 'autopilot' is just causing accidents and, through its visibility, dragging down the self-driving field as a whole.

    The fire is less surprising. That's what lithium-based batteries tend to do. It doesn't take much physical damage to set them into a runaway thermal state, and when you have so many cells piled together (sorry!) it's unsurprising that a heavy impact followed by a fire caused other cells not immediately involved to have problems later. Lithium-based batteries can be very energy dense, but they possibly should have limits on size for this sort of reason, although that would stall electric vehicles until the technology can be commercially replaced.

    1. Charles 9

      Re: No comment from Musk yet

      "Lithium based batteries can be very power dense but they possibly should have limits on size for this sort of reason although it will stall electric vehicles until the technology is commercially replaceable."

      You could be chasing unicorns there, since the key element here is the energy density. Meaning: is the main reason they're catching fire the fact they're made of lithium, or the sheer amount of energy they contain? Because if it's the latter, then you've hit a common-mode fault, and anything of comparable (or higher: think hydrocarbons) energy density AND the ability to drain slowly (versus, say, explosively) will have similar problems.
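
      Some back-of-envelope arithmetic on that point (rounded figures, so treat them as illustrative):

      pack_kwh = 100                  # a large Tesla pack, kWh
      pack_mj = pack_kwh * 3.6        # 1 kWh = 3.6 MJ, so ~360 MJ
      petrol_mj_per_litre = 34        # rough energy content of petrol
      equivalent_litres = pack_mj / petrol_mj_per_litre
      print(f"{pack_mj:.0f} MJ stored, roughly {equivalent_litres:.0f} L of petrol")
      # ~11 L of petrol-equivalent versus a 60+ L tank: the pack stores less
      # energy than a full fuel tank, but it can short and self-ignite without
      # air, which is part of why damaged packs reignite days later.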

  55. JBFUK

    Why don't they understand...

    The majority of us do not want self-driving vehicles - yet all manufacturers continue to chase this concept. For those who don't like to or cannot drive: take a taxi, train or bus, or if you have the money, hire a driver...

    Even once the systems are developed further, putting your life in the hands of a computer program which will always have bugs and may not have your survival at the top of its list of priorities is not a good idea.

    I suppose it could be said that those who make such a poor decision as to trust one of these systems and pay the ultimate price are victims of natural selection.

    1. Anonymous Coward
      Anonymous Coward

      Re: Why don't they understand...

      "The majority of us do not want self driving vehicles - yet all manufacturers continue to chase this concept. For those who don't like to or cannot drive, take a taxi, train, bus, if you have the money hire a driver.."

      Are you sure about that? I would imagine a vehicle always on call at a moment's notice (there tends to be a wait for taxis), able to take you from any A to B regardless of weather (buses and trains run to timetables and usually don't get you directly there; bad news if it's raining), without breaking the bank (for those who don't have enough money to afford cars OR taxis), would be a Good Thing.


  56. Recovering Lawyer

    Who’s to blame: US vs European view

    There is a huge difference between commercial/military airplane autopilot and automobile autopilot. Airplane autopilot can actually land the plane. The most interesting cultural aspect of this ability, to me, is the fact (at least, true a number of years ago) that if a large commercial plane crossing the Atlantic and landing in the USA belongs to a US-headquartered airline, the instruction is for the pilot to turn OFF autopilot and fly the plane manually. If the airline is European-headquartered, the instruction is to turn ON autopilot. Not sure which is better, but the good ol' US of A (and Tesla) believe in self-determination to the extreme.

    1. MachDiamond Silver badge

      Re: Who’s to blame: US vs European view

      "turn OFF autopilot and fly the plane manually."

      The most dangerous and busy times are on takeoff and landing. By "hand flying" the aircraft, the pilot is in touch with what the plane wants to do due to wind, temperature, humidity and how much the aircraft weighs at the time. If the pilot is just "ready" to take over from the autopilot should there be an issue, they aren't going to have a feel for how the plane is flying before the proximity of the ground becomes a pressing issue. Once the plane has taken off it's often easiest to just give it instructions and let it figure out how to make it happen. When just cruising at altitude, letting Otto do his thing is a no-brainer. Landings? I'd rather an experienced pilot is doing it.

