US watchdog opens probe into Tesla's Autopilot driver assist system after spate of crashes

A US government agency has formally opened a probe into Tesla's so-called Autopilot system following a spate of well-publicised crashes over the past few years. The investigation covers over three-quarters of a million vehicles, which has got to be a decent chunk of the US inventory shifted by Tesla since the start of 2014 …

  1. jmch Silver badge

    About time too

    Maybe it's an issue with Tesla, or maybe the system works really well, much better than human drivers, and it really is just the idiot drivers not paying attention who are to blame.

    Either way it's good to have an independent agency look into it rather than Tesla themselves saying "nothing to see here". And if, at the very least, Tesla are forced to rename their glorified cruise control to something that doesn't lull drivers into a false sense of security, so much the better.

    1. Gene Cash Silver badge

      Re: About time too

      So can we rename Autopilot as "cruise control" and rename cruise control as "speed assist"?

      I assume everyone remembers all the tired old jokes about people engaging cruise control in their motor home and then crashing because they went back to fix a sandwich or something.

      1. EricB123 Silver badge

        Re: About time too

        "I assume everyone remembers all the tired old jokes about people engaging cruise control in their motor home and then crashing because they went back to fix a sandwich or something."

        I heard it was coffee. But the man who did that sued, and now I see all new car manuals say something to the effect that cruise control is only for speed control, NOT anything more.

        Now about that lady who sued McDonald's for millions when she burned her thigh spilling their coffee in her car...

        1. Moldskred

          Re: About time too

          Sigh. The McDonald's coffee case was not the frivolous mockery of the court system it is usually portrayed as.

          No, the woman did not 'sue McDonald's for millions'. She sued for about $20,000 to cover medical expenses and related loss of income. The amount sought increased once an actual lawsuit got under way, but it was never anywhere near 'millions'. Further, the woman was a passenger in the car and the car was stopped when she spilled the coffee. And she did not "burn her thigh" -- she got third-degree burns over six percent of her body and lesser burns over sixteen percent, had to undergo skin grafts, and was partially disabled for two years.

          Further, McDonald's served their coffee at a significantly higher temperature than other fast-food restaurants, _and_ there was a history of customers being scalded by their coffee which McDonald's knew about -- they had settled claims for many such cases before -- but had not taken any measures to prevent it.

          It's true that the jury awarded her $2.7 million in punitive damages (as well as $200,000 in compensatory damages) but this was reduced by the judge to $480,000. The verdict was further appealed by McDonald's and the parties then settled for an undisclosed amount.

          So, in short, she did not _seek_ millions, she was not _awarded_ millions, and no, she was not driving a car when it happened.

          1. Anonymous Coward

            Re: About time too

            Thank you very much for your debunking, it's really a good read.

          2. Anonymous Coward

            Re: About time too

            Thanks for saving me the time posting the same thing. The kind of crud that's out there almost makes you think it's McD-paid payback for her having the temerity to take them to court over something they had let be, after repeatedly being told it was a major risk.

    2. martinusher Silver badge

      Re: About time too

      There's a third possibility. ElReg is UK-focused so relatively few people here will have significant experience of driving on US roads. Out West where I live they are a mess, especially in urban areas like Los Angeles -- badly designed, badly surfaced, badly lit, inconsistent and erratically placed signs; you name it, and it's a safety nightmare. Obviously we humans adapt as best we can, but it's still very easy to make mistakes, especially at night, where the notion of varying the brightness of signals -- something you take for granted in the UK -- is foreign.

      (...and the road markings tend to disappear completely when the road is wet. None of that nice reflective plastic stuff you get in the UK; it's just paint which 'may or may not work'.) (Road adhesion disappears when the road's wet as well -- cracked and patched concrete isn't an ideal braking material.)

      So I can see how an AI system could be confused. We've got plenty of Teslas here but I'd guess that most owners aren't stupid enough to take the 'Autopilot' thing literally.

      Incidentally, the DMV (our version of the DVLA) has endemic problems with Teslas. See:-

      https://www.teslarati.com/tesla-bias-dmv-inhibits-ev-revolution/

      1. hoola Silver badge

        Re: About time too

        The state of the roads may be a factor, BUT the fault still lies with the technology, how it is sold, and the complete idiots who appear to turn it on and then pay no attention to what the vehicle is doing.

        Look at the lunacy where everything has to be controlled through a touch screen. If using a mobile phone is a distraction then poking at a touch screen for critical functions is equally distracting. Our Golf has a touch screen to control everything BUT driver-critical functions are still on twiddly knobs, real buttons or a stalk by the steering wheel. I know where they are and can find them without looking. You can access some through the touch screen but it is not exclusive.

        For some reason Tesla has a cult-like following in the same way Apple does, the difference being that Apple produces hardware that generally works and, critically, does not kill people.

        1. werdsmith Silver badge

          Re: About time too

          The way they are sold is with warning after warning about being in control at all times and these warnings continue after the sale.

        2. Anonymous Coward

          Re: About time too

          You don't need to use the touch screen. It's the same as you describe in your Golf: on the Model 3 all the driving functions are controlled from the pedals and steering wheel, with the left stalk controlling indicators, headlights and wipers and the right stalk controlling gear shift and autopilot engagement. There are a couple of twiddly knobs on the steering wheel for music and autopilot adjustments (max speed and distance from the car in front). The last time I touched the screen was weeks ago, to use the sat nav. It is annoying to have to look to the side to see your speed but you get used to it.

          I use autopilot on highways as the rules of the road are a bit simpler for the car - no traffic lights, no intersections and one direction of traffic. I pay attention as if I were regulating speed and steering myself, and take over when needed. If the road is in bad condition I would not use autopilot. I agree with you that it is mis-sold and you ALWAYS have to be paying attention.

      2. Muscleguy

        Re: About time too

        That sounds awful. Here in the UK and especially here in Scotland where the roads can be genuinely icy they have been laying down special tarmac, it's golden brown for a start but it's a bit like sandpaper too, extra grippy.

        You find it on corners or in places where braking is expected, such as the approach to roundabouts or other intersections. It also shows up in contrast to the black tarmac and acts as an indicator to 'slow down a bit here'. They don't use it unless it's necessary.

    3. gandalfcn Silver badge

      Re: About time too

      "it really is just the idiot drivers not paying attention that's to blame." So they aren't allowed to use the system because only stupid people buy Tesla's cars and use Autopilot. OK.

      1. werdsmith Silver badge

        Re: About time too

        This is why we can't have anything nice.

      2. martinusher Silver badge

        Re: About time too

        It's all a matter of common sense. When I wrote my comment yesterday I had just returned from a road trip out of state. The first part of the drive was close to 300 miles across the desert on Interstate 40. This is a two-lane dual carriageway with a wide median and relatively sparse traffic. Autopilot would have no problem with this. Then at Barstow I-40 joins I-15, the main drag between Las Vegas and Los Angeles / San Diego. The nearest English analogue to this is merging from a rural dual carriageway onto the M25 around Heathrow, but without the lane discipline, speed limits and traffic rules enforcement you'd get on the motorway. Autopilot -- or any adaptive cruise control -- might have worked but you'd be very unwise to rely on it. (Actually, it was a bit of a nightmare for about 40 miles....)

        The article's title is a bit misleading, anyway. It appears that there have been 11 crashes over the last five years or so, but the Trump administration adopted a laissez-faire attitude to them. So all the Feds are doing is playing catch-up.

        1. David 132 Silver badge
          Happy

          Re: About time too

          Plus, somewhere around Barstow the drugs begin to kick in and it’s bat country…

        2. Anonymous Coward

          Re: About time too

          It should never have been given that name. In the UK they should be taken to court under the Trade Descriptions Act. It isn't really an autopilot, it's merely an advanced cruise control, and Tesla should not use gullible people as guinea pigs.

          It appears that there have been 11 crashes over the last five years or so

          I assume this concerns fatal crashes where Autopilot decided to remove one gullible buyer from the gene pool? AFAIK, TCUUA (Teslas crashing under unsupervised Autopilot) is a far more common event than Tesla marketing would really like us to know. There are some fairly horrific near misses on YouTube alone.

          By the way, for those who seem to think that Elon Musk is somehow the inventor of self-driving: Autopilot was released in 2014, the same year that Audi received a Californian licence for self-driving research - because they had already been running autonomous driving tests in 2010, for instance by having a TT drive itself up Pikes Peak. The problem that other car companies have, especially European ones, is pesky laws that demand they actually keep their customers alive. This is partly why I'm wondering what Volvo is up to - they were also working on HGV autonomy.

    4. Mike 137 Silver badge

      Re: About time too

      "... maybe the system works really well"

      According to the NTSB, on March 23, 2018 in Mountain View, California, a Tesla on "autopilot" treated the diverging gore lines at an off-ramp as a "lane", steered into it, and accelerated towards a large rectangular barrier plate painted with black and yellow chevrons. It hit the barrier at 70 mph, demolishing the vehicle and killing the driver.

      The driver was "to blame" only insofar as he left it to the "autopilot" to decide what to do. Inattentive, yes, but reliant on the technology. In its absence, I guess it's quite unlikely the driver would have left the legitimate lane and accelerated voluntarily towards the obstruction, however inattentive they were.

      1. David Hicklin Bronze badge

        Re: About time too

        The driver was "to blame" only insofar as he left it to the "autopilot" to decide what to do. Inattentive, yes, but reliant on the technology

        Or a classic case of where the driver is expected to suddenly take over/correct the actions of the car having been lulled into a relaxed state where they are no longer aware of what is happening in real time.

        One reason that anything other than level 5 automation will inevitably fail.

  2. Rich 2 Silver badge

    A solution looking for a problem

    I appreciate the Tesla system does not actually make for a fully self-driving car, but on a wider front, I have yet to see a compelling reason for pouring billions into this idea - I certainly wouldn’t trust it.

    What is the “problem” that a self-driving car solves (without also introducing a boat-load of new problems) ?

    1. Gene Cash Silver badge

      Re: A solution looking for a problem

      Have you ever driven in America??

      Windows NT could probably drive better than 60% of the people here.

      A REAL self-driving car solves the problem of all the idiot "drivers" killing people because they don't give enough of a shit to pay attention to what's going on around them.

      Unfortunately, we don't have any real self-driving cars, and I don't think we will for at least a decade. It needs more than hooking a bunch of sensors and a GPS to a computer and attaching a steering wheel motor.

      Edit: God, I can't type.

      1. Grunchy Silver badge

        Re: A solution looking for a problem

        “Windows could drive better.”

        Factually, no.

        You know why they don’t let Mars explorers go bombing full-tilt around the Martian surface without human oversight at key points?

        It’s because computers will always figure out a way to fuck up, that’s how come.

        1. doublelayer Silver badge

          Re: A solution looking for a problem

          "You know why they don’t let Mars explorers go bombing full-tilt around the Martian surface without human oversight at key points?"

          Because it's really hard to teach a dumb computer how to decide on its own what you find interesting when all the robot sees is a bunch of rocks? If I were there, I'd need remote control too; I'm no geologist.

        2. Roboiii

          Re: A solution looking for a problem

          Until it BSODs out.

        3. Anonymous Coward

          “Windows could drive better.”

          So you press the "Start Button" to Stop it & apply the Brakes?

        4. werdsmith Silver badge

          Re: A solution looking for a problem

          It’s because computers will always figure out a way to fuck up, that’s how come.

          Unlike humans who never fuck up, ever.

      2. Scoular

        Re: A solution looking for a problem

        Yes I have driven in America for years but currently live in Australia.

        I also drive a Tesla S.

        I love the car, easiest and most fun car I have ever driven.

        However the so-called FSD is nowhere near being able to drive the car except in very particular, easy situations.

        I would much rather drive it myself than sit there terrified waiting to have to take over.

        Night driving is particularly an issue.

        One day real full level 5 may arrive but it seems very much a future prospect, like fusion power stations.

        1. gandalfcn Silver badge

          Re: A solution looking for a problem

          "I love the car, easiest and most fun car I have ever driven." Really? You mean having everything done for you?

          1. genghis_uk

            Re: A solution looking for a problem

            You may want to read that again... He said he prefers to drive the car manually rather than worry about it going wrong.

      3. gandalfcn Silver badge

        Re: A solution looking for a problem

        Maybe they need Evangelical, bible waving sky/autopilots?

      4. big_D

        Re: A solution looking for a problem

        The other point is that if all vehicles are autonomous, the problem becomes relatively easy to solve, because they communicate with each other and let those around them know what they are going to do next - see the sketch below. Likewise, intelligent "street furniture" that signals bends, lane blockages etc. can help.

        This all reduces the load on the automated system, because most of the chaotic variables (other drivers) have been removed. Wild animals and pedestrians, cyclists etc. remain, but the other lethal weapons are all being controlled and communicate with each other.
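
        To make the idea concrete, here's a minimal sketch in Python of the kind of intent broadcast described above - the message fields are invented for illustration, and no real V2V standard is implied:

            import json, time
            from dataclasses import dataclass, asdict

            @dataclass
            class IntentMessage:
                vehicle_id: str
                lat: float
                lon: float
                speed_mps: float
                next_action: str   # e.g. "lane_change_left", "braking"
                timestamp: float

            # Each vehicle periodically broadcasts its position and next action;
            # nearby vehicles fold these messages into their own planning.
            msg = IntentMessage("EV-042", 53.48, -2.24, 27.0, "lane_change_left", time.time())
            print(json.dumps(asdict(msg)))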

        But, we are at the early stages, which means most of those lethal weapons are being driven by humans. That makes automated driving today many times harder than it will be in the future...

        Add to this sloppy/beta code being tested on the public and inaccurate sensors and you have a recipe for disaster.

        There is a reason why traditional car manufacturers have been working on this stuff since the 70s and still don't have fully automated vehicles in production beyond adaptive cruise control - those systems have to be 100% reliable before they can be given to the public.

        1. Anonymous Coward

          Re: A solution looking for a problem

          Yes, funnily enough they tend to focus on things like not using their drivers as guinea pigs, build quality that actually exceeds how Lada used to build cars*, actual working after-sales support via a dealer network, and not having the vehicle drop 50% in value in two years.

          * You'll understand when you see shots of what quality Tesla deems acceptable to hand over as a new car. In one case, the vehicle converted itself to a cabriolet on its first motorway drive ex dealer by losing its glass roof - it was simply not attached.

        2. Muscleguy

          Re: A solution looking for a problem

          It's a nice idea but shame about the processing power. For eg did you know the cameras on those things only use grayscale since they are not fast enough to process full colour in real time? That kind of helps explain the driving into a white truck across the highway scenario.

          My attitude is when it can process images at least as well as a hooman can then call me. I can see the advantages of a HUD lidar system when it's pissing down at night or worse a blizzard (I live in Dundee). Something which can see through that, is not distracted by large fluffy lumps appearing out of the headlight range to hurtle towards the windscreen would be useful.

          It would certainly increase the speed of safe driving under those conditions. My wife was picking the eldest up from Loch Tay for Xmas and it was dark and v snowy, a real hell drive along narrow, twisting highland roads. Parts beside Loch Earn for eg are very twisty.

      5. Anonymous Coward

        Re: A solution looking for a problem

        Edit: God, I can't type.

        I have that too - the more tired I get, the more lysdexia rules KO. Very annoying.

    2. John Robson Silver badge

      Re: A solution looking for a problem

      The problem that a self driving car solves is that almost all road injuries/fatalities are caused by human error.

      There are very, very few incidents caused by anything else.

      1. starbaby

        Re: A solution looking for a problem

        Maybe more relevant, it solves the problem of paying drivers for e-hailing services.

      2. This post has been deleted by its author

        1. Cybersaber

          Re: A solution looking for a problem

          Just to clarify because of the thumbs down votes: I'm not blaming the victims here, though I realize if you TLDR my post above it could look like that. I can't edit my post to clarify that anymore though.

          All I meant was the autopilot systems have human designers who make human errors too. I was trying to provide counterpoint to the idea that somehow an autopilot system is an entity capable of making decisions. It's a computer running a program, flaws and all.

          Even good drivers (or pilots, like the 737 MAX pilots I referred to) plus a flawed system = a combined system that makes more errors overall than good pilots alone.

          THEREFORE, if you take a group of 'bad drivers' and add an expert system with flaws of its own, you get the malus of having both sets of flaws applied.

          Which was all in support of my conclusion that I can see no set of circumstances where autopilot systems like Tesla's _improve_ safety.

          1. John Robson Silver badge

            Re: A solution looking for a problem

            Autopilot never tires, never gets distracted, never stops doing what it is trained to do.

            It maintains continuous 360-degree awareness (not in the hand-wavy "is it AI?" way, but in the sensor-suite-based ML way).

            It learns from all the miles driven, by all drivers (machine and otherwise), across the entire fleet of Tesla vehicles (since AP is always making decisions, it just gets to say “oh, I’d have handled that differently” and upload).

            It therefore has millions of miles more experience than any human could ever hope to achieve.

            You don’t add the flaws together, you basically entirely eliminate one set. When I see a report that a Tesla was weaving across the street because it went to a back street charger for some hallucinogenic electrons then we can have a different discussion.

            Yes, fundamentally all AP errors are human errors because we made it, but the number of them is vanishingly small. Can you imagine a proper full-scale air crash investigation into each and every fatal RTI (let’s just limit this to fatal for the moment)? Combined with grounding all cars between investigations?

            Would be great, the streets would be empty of tin boxes and motons, leaving people to move around.

            Douglas Adams had a character name themselves Ford Prefect and try to introduce themselves to a car…. They do appear to be the dominant life form on the planet.

            Is Autopilot a level 5 system? No

            Does that mean it’s inherently bad? No

            Does that mean we shouldn’t eliminate the cause of all accidents? Absolutely not.

            Do people hold it to a higher standard than a human? Yes, despite the fact that widespread deployment would already make roads safer - not to mention more accessible for more people.

            1. John Brown (no body) Silver badge

              Re: A solution looking for a problem

              "Do people hold it to a higher standard than a human? Yes, despite the fact that widespread deployment would already make roads safer - not to mention more accessible for more people."

              Where there's blame, there's a claim. Now try blaming a multi-billion-dollar megacorp and see if you can find someone/thing to pin it on.

              1. werdsmith Silver badge

                Re: A solution looking for a problem

                A brick on the accelerator pedal drives more safely than the average Audiot lunatic.

            2. really_adf

              Re: A solution looking for a problem

              Is Autopilot a level 5 system? No

              Does that mean it’s inherently bad? No

              In and of itself, the latter may be true. However, the combination of the former, the name and human nature seems very likely to result in something for which "inherently bad" seems an appropriate description. That result is what is important, and seems to be supported by evidence.

              Another question is, can anything less than a level 5 system avoid that result? If not, does it not follow that as long as the answer to your first question is "no", the answer to your second question is "yes"?

              1. John Robson Silver badge

                Re: A solution looking for a problem

                So you object to an accurate name on the basis of a cartoon understanding.

                FSD is a daft name, I'm absolutely with you on that one. AP is a name for a collection of features, and it has never worked the way that it does in Airplane! (the film).

                Specifically when you enable autosteer in the settings you get this:

                "Autosteer is currently in Beta:

                Autosteer is a driver assistance feature and does not make your vehicle autonomous..."

                And a reminder any time you turn it on.

            3. Anonymous Coward

              Re: A solution looking for a problem

              It maintains continuous 360 degree awareness

              Given that it misses HGVs and police vehicles right in front of it (evidenced by the resulting crashes) I'd have to disagree with that statement. It doesn't, significantly so.

              1. John Robson Silver badge

                Re: A solution looking for a problem

                It maintains awareness, that awareness might not be perfect, but I'll wager it's better than the vast majority of people with a license.

                After all we've never had a human crash into a lorry, or a skip, or a police car, or an ambulance....

                1. Anonymous Coward

                  Re: A solution looking for a problem

                  I beg to differ. IMHO it actively reduces awareness. It gives drivers the entirely mistaken impression that they can go and do something other than keep their mind on traffic, and this kills off their situational awareness.

                  Driving is more than just knowing what's around you; it's also taking into account the vectoring of other road users and planning, even strategising, for it - that's why you can't just look out of the window and re-establish all that in a flash. It takes a few seconds before you have everything dialed into your mental map.

                  Those are the seconds you don't have when I-can't-believe-they-called-it-Autopilot decides to take a nap without any prior warning.

          2. cornetman Silver badge

            Re: A solution looking for a problem

            I suggest that this is a flawed calculation.

            In reality, either Autopilot is controlling the car *or* the driver is. I know that the driver is still responsible while Autopilot is engaged, but they are not actually in control of the vehicle when it is. We can't simply combine the faults of Autopilot and human driver additively: that would only make sense if both Autopilot and driver were actively cooperating to manage the vehicle, and this is simply not the case.

            In this situation, the calculation is simple. If Autopilot is better at driving than the human, on average, then safety is improved when Autopilot is in control. We do know that there are some scenarios where Autopilot is definitely poorer and the meatbag has to take over.

            Most of the worry about the system has been about the driver keeping enough attention to take over when things go awry, and that is a difficult problem to solve since we find it difficult keeping attention when we are not actively engaged. I don't have an answer to that and I admit it is a serious flaw.

            1. Lil Endian

              Re: A solution looking for a problem

              ...we find it difficult keeping attention when we are not actively engaged.... it is a serious flaw.

              Totally agree, cognitive loading is a huge issue in this context.

              Those in jobs that require high attention to detail train, train and train repeatedly[1]. Even then, when they're in the "real" situation ("this is not a drill") they are human and therefore fallible. Any mistakes are fed back into further training. They train to anticipate and handle cognitive loading, so they handle situations better.

              For the hoi polloi, reactions in pressure situations are more likely to be dire. No training. No anticipation. No contingencies prepared.

              So giving partial automation to someone that's happy to "check on their dog" is simply daft.

              [I've never understood why a driving licence is essentially for life. Bad habits form. You can't (UK) hold a forklift licence for more than a few years without re-testing. But 3 tonnes of Mercedes on a public road and off you go.]

              [1] Think: advanced police drivers; fire officers; pilots...

              1. Anonymous Coward

                Re: A solution looking for a problem

                I hold all licences, but retaining the bus/HGV parts requires a medical test every year. In a number of countries there are also age limits that then demand things like eye and health checks, but I agree, there is no check for cognitive ability. Given that the roads keep getting fuller, that is a dangerous omission.

                Especially with all these autopiloting Teslas swerving and crashing all over the place :)

                "The post contains some characters we can’t support" - that's a lie. I didn't talk about Trump at all.

        2. cornetman Silver badge

          Re: A solution looking for a problem

          > All autopilot does is take a flawed driver, and add in the technical flaws created by flawed software/hardware designers to get the worst of both worlds.

          I don't believe that the evidence supports this view.

          Autopilot certainly seems better able to cope with everyday situations of driving that are monotonous and repetitive and this constitutes the vast majority of people doing long distance highway driving. Here the danger is fatigue causing careless maneuvering, wandering out of lane, missing traffic signals, not slowing and stopping when traffic comes to a halt which is what most highway accidents are caused by.

          I suspect the situation is less clear in town though, but I would not expect people to use Autopilot in that scenario much anyway.

          1. katrinab Silver badge
            Unhappy

            Re: A solution looking for a problem

            The problem is that with an autopilot system, you lose situational awareness.

            With a human-driven car, you know what the car is doing and why it is doing it, because you made the decisions.

            If you need to take over because the computer made the wrong decision, it takes 23 seconds to figure out what the car is doing and how to fix it. If you are driving down the motorway at 70 mph (the speed limit in the UK), the car will have travelled roughly 720 metres in that time, assuming it doesn't encounter any obstacles.
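
            As a quick sanity check of that distance (a minimal sketch in Python; the 23-second takeover time is the estimate above):

                # Distance covered at the UK motorway limit during a 23-second takeover.
                MPH_TO_MS = 1609.344 / 3600      # metres per second in one mph

                speed_ms = 70 * MPH_TO_MS        # ~31.3 m/s
                print(round(speed_ms * 23))      # -> 720 metres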

            1. cornetman Silver badge

              Re: A solution looking for a problem

              I agree.

              The dilemma is that those kinds of situations (though potentially catastrophic) may be much rarer than the situations that cause the majority of accidents. Rear-end shunts are a very common kind of accident in traffic and in regular driving, and a problem these systems could virtually eliminate, since they are caused by the simple, common lapses in concentration we are all prone to - even just a glance in the rear-view mirror.

              The high-profile cases, such as Autopilot diving full pelt into the side of a turning truck, make popular stories, but the thousands of smaller accidents that people cause every day - whiplash, and more rarely serious injury or death - hardly raise an eyebrow.

              It is a difficult equation to balance.

              1. The First Dave

                Re: A solution looking for a problem

                Rear-end shunts could be entirely eliminated by a system that was one level LESS sophisticated than this. Which would almost certainly be more reliable.

            2. werdsmith Silver badge

              Re: A solution looking for a problem

              The problem is that with an autopilot system, you lose situational awareness.

              I find it is easier to focus on situational awareness, and watch what's going on around me looking for hazards when the car is lane-keeping, speed maintaining and distance-keeping by itself.

          2. a_builder

            Re: A solution looking for a problem

            I've got a Tesla X and it is interesting to see what the systems miss.

            Various silly things happen with automated braking when it is not needed.

            The steering bins out even with a solid white line painted down the side of the road.

            It is far from foolproof.

            The ONLY feature that I use is the auto distance control - I never let it near the steering. That way I am alert to the situational information.

            Like the build quality of the bodywork it is all very ish.

            It is a shame because the platform underneath it is actually very, very good.

        3. doublelayer Silver badge

          Re: A solution looking for a problem

          "All autopilot does is take a flawed driver, and add in the technical flaws created by flawed software/hardware designers to get the worst of both worlds."

          No, it doesn't. What it does is to substitute the flaws in the software for the flaws in the humans. Depending on the quality of the software, this could be worse or better. In existing tests, it's often better.

          Consider a human who is paid to calculate mathematical answers. They are going to make some mistakes. Now add in a computer which solves the same problems. Every once in a while, something will break and the computer will mess up, but it will get a lot of right answers first. Is it the case that substituting the computer will worsen accuracy because you've combined the worst of both approaches? No, because the human is no longer doing the calculations and thus doesn't make their mistakes anymore. The software running vehicles is more complex and has more problems, but it doesn't stop the human's fallibility being removed from the situation.

          It may be that the software is too flawed to allow, though existing tests are not showing that. Even if that's the case, your argument still isn't the problem.

          1. eldakka

            Re: A solution looking for a problem

            "All autopilot does is take a flawed driver, and add in the technical flaws created by flawed software/hardware designers to get the worst of both worlds."

            No, it doesn't.

            Yes, it does.

            What it does is to substitute the flaws in the software for the flaws in the humans.

            No it doesn't.

            The system can only function in certain limited circumstances, during which time it does substitute for the human flaws, while adding in its own. However, it can suddenly and without warning return full control to the human driver, whose human flaws are now back in the equation - and in fact multiplied, because they have been relaxing, vegetating, losing all situational awareness, and now have to work out what the fuck is going on, why control has been handed back, and then deal with whatever situation caused the control to be returned.

            Therefore during the course of a trip you have to deal at various times with both the AI flaws and the human flaws, with a dash of confusion thrown in for that unplanned, sudden - usually emergency - transition from AI to human control, getting the worst of both worlds.

            1. cornetman Silver badge

              Re: A solution looking for a problem

              You're both right in certain respects. The real question is whether the net benefit is substantially positive and the evidence seems to suggest that it is.

              1) When the Autopilot runs into trouble, it hands control back to the driver who may be flustered and unprepared. This is a big problem, but if it is rare, then it might be an acceptable trade off, particularly if this becomes rarer as the technology develops.

              2) If Autopilot can handle the majority of the driving control, then the common causes of regular traffic accidents caused by boring and fatiguing driving might be avoided.

              The trick is trying to measure the benefits of Autopilot type systems to avoid common incidents, compared to the failure modes of Autopilot itself:

              a) Does not recognise a dangerous situation and careens into the side of a solid object it didn't see without alerting the driver at all (and the driver didn't notice because they were inattentive), or

              b) Does something completely errant, like driving off the side of the road, or misunderstanding where the lane is and swerving off to the side.

              We tend to fear the errant behaviour, even if it is very rare, partly I think because we don't understand it, whereas we accept errant behaviour from humans because we empathise with it. So we rail against Autopilot when it sideswipes a passing car because it misinterpreted the lane markers, but we are more tolerant when a meatbag driver does it because they are tired. Both are equally dangerous, and the other driver won't care either way as they get shunted off the road.

              1. hoola Silver badge

                Re: A solution looking for a problem

                To a certain extent I agree however the big question is what happens when Autopilot does make an error?

                If the only consequence is that the operator of the vehicle is the person at risk then one could say that is acceptable. The issue we have is that when Autopilot does foul up there is a high likelihood of collateral damage to third parties or property. Is having Autopilot make an error and injure or kill an innocent third party acceptable?

                Now if an errant driver makes a mistake it is very easy to unpick exactly what happened, why, and who is to blame. If an automated driving system goes wrong there has to be the ability to prosecute the supplier for negligence - and, in the true style of things, probably claim damages - with a reasonable assurance that, like any other insurance claim, they will pay out.

                The problem we have is that tech is moving into a space where it can routinely cause injury and the law is light-years behind. At the moment I simply cannot see how anyone could successfully sue Tesla (or any of the others) as they do not have the resources. These are well-funded tech outfits that are specialists in avoidance and obfuscation. It does not matter how much of a cock-up the tech makes, they WILL blame everyone else:

                The roads are not clear

                The signs are wrong

                A sparrow farted and confused super secret sensor X1B.

                And so on.

                If you want to sell and deploy self-driving technology in the way Autopilot and future systems will then there has to be liability.

                1. doublelayer Silver badge

                  Re: A solution looking for a problem

                  "Is having Autopilot make an error and injure or kill an innocent third party acceptable?"

                  The problem with this logic is that it works equally well for literally anything else. Is having a human driver of a large vehicle at high speeds make an error and injure or kill an innocent third party acceptable? On that basis, we could well ban, or at least significantly restrict, all driving, because it carries some risk. The better question is what we do when that happens, which must include both having a method to blame the supplier for real problems in their software and not automatically blaming them if something doesn't work. I am more optimistic than you are on that front, as there are bodies specifically set up to investigate and penalize companies for exactly that kind of event. The one investigating Tesla here is one of them, and most countries have something like it. The software will never be perfect, and it will at times crash. It is the responsibility of our governments to investigate that for safety, but we also need to recognize that we don't need a perfect safety record for it to be acceptable; in fact we can tolerate an accident rate significantly above zero before it's worse than the status quo.

      3. cornetman Silver badge

        Re: A solution looking for a problem

        It is an interesting question to me as to whether a few accidents like this are a small price to pay for the avoidance of a large number of other accidents that might otherwise have happened.

        It would be very easy to throw the baby out with the bath water if the net benefit of this feature was profoundly positive. Not that I'm saying it is or isn't, but it is interesting to ponder.

        1. Anonymous Coward

          Re: A solution looking for a problem

          This is the question nobody seems to be asking: How many accidents were avoided due to auto-pilot being more aware than the driver?

          Tesla have stats for accidents on their site. In Q1 2021, for example:

          In the 1st quarter, we registered one accident for every 4.19 million miles driven in which drivers had Autopilot engaged. For those driving without Autopilot but with our active safety features, we registered one accident for every 2.05 million miles driven. For those driving without Autopilot and without our active safety features, we registered one accident for every 978 thousand miles driven. By comparison, NHTSA’s most recent data shows that in the United States there is an automobile crash every 484 thousand miles.
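
          Putting those quoted figures on a common scale (a minimal Python sketch using only the numbers above):

              # Crash rates implied by the quoted Q1 2021 figures.
              miles_per_crash = {
                  "Autopilot engaged": 4_190_000,
                  "Active safety only": 2_050_000,
                  "No Tesla aids": 978_000,
                  "US average (NHTSA)": 484_000,
              }

              for label, miles in miles_per_crash.items():
                  # Convert "one crash per N miles" to crashes per million miles.
                  print(f"{label}: {1e6 / miles:.2f} crashes per million miles")

          That works out to roughly 0.24, 0.49, 1.02 and 2.07 crashes per million miles respectively - although, as others note below, the figures aren't like-for-like.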

          1. cornetman Silver badge

            Re: A solution looking for a problem

            Agreed, but we would have to look further into the details.

            For instance, are the accidents involving people not paying attention while using Autopilot generally very serious or fatal because the driver just plain wasn't looking out the windscreen - compared to the accidents without Autopilot, where drivers merely lacked concentration but were generally able to mitigate the effects, because although they weren't paying enough attention they could at least brake or swerve?

            I would be interested to see a more in-depth analysis. Perhaps that is one of the things this investigation will undertake.

          2. Cragganmore

            Re: A solution looking for a problem

            "By comparison, NHTSA’s most recent data shows that in the United States there is an automobile crash every 484 thousand miles."

            Interesting to see if those stats include all the bogus whiplash claims?

            1. Gene Cash Silver badge

              Re: A solution looking for a problem

              > Interesting to see if those stats include all the bogus whiplash claims?

              Well, even if the whiplash claim was bogus, and you weren't actually injured, you were still in a crash. So yes, that's a crash.

              1. LybsterRoy Silver badge

                Re: A solution looking for a problem

                <<you were still in a crash>>

                In the UK I remember reading an article (don't remember where) where it indicated that the "crash" was engineered - eg pulling onto an exit ramp from the motorway and suddenly applying the brakes. I would not count that as an accident.

          3. werdsmith Silver badge

            Re: A solution looking for a problem

            Of course single anecdotes are not everything but when a lady stepped into the road in front of me, the car was braking before I could get onto the brake and push it. Pulled up 2 metres short although she woke up and hopped back out of the way.

          4. David Hicklin Bronze badge

            Re: A solution looking for a problem

            Tesla have stats for accidents on their site. In Q1 2021, for example:

            But are we comparing like for like here?

            It could be that most drivers only switch on Autopilot on motorways (freeways for the US), where there are fewer accidents per mile driven than in a built-up urban environment with its many junctions, crossing and turning traffic etc.

            1. Anonymous Coward

              Re: A solution looking for a problem

              That is an EXTREMELY good question because you can trust marketing to fudge that figure or the parameters leading to it. Or they can just lie - who's going to know?

          2. Anonymous Coward

          Re: A solution looking for a problem

          I believe a similar judgement had to be made for airbags. There are a few rare types of impact where the airbag will likely kill you or do more serious injury (compared to not having one) but in the vast majority of situations where it deploys, it will save your life or reduce the severity of your injuries.

          1. Lil Endian

            Re: A solution looking for a problem

            I can remember in the UK when seatbelts were becoming legally required. Arguments arose such as "It'll pin me to my seat and I'll die if my car goes under a lorry and I need to duck".

            1. hoola Silver badge

              Re: A solution looking for a problem

              Both are passive until activated at which point they are reactive, mitigating the situation. Neither has any influence on the cause.

              1. Lil Endian

                Re: A solution looking for a problem

                Re: bags'n'belts

                I think it's an indictment of human stupidity in their arguments, rather than a view on the safety devices' causal influence regarding collisions.

      4. gandalfcn Silver badge

        Re: A solution looking for a problem

        The 3 downvotes are interesting. But then I have noted a number of ElRegers who are disconnected from reality.

        1. Steve Davies 3 Silver badge

          Re: ElRegers who are disconnected from reality

          Nah... they are all in their Teslas with FSD engaged, fast asleep as the 'thing' drives them to work/home/wherever.

          The hardcore Tesla fan is in reality, a member of a cult. Just try saying something negative about their Dear Leader in any of the EV forums. You will get shat upon from every angle.

          As a long time EV owner, I applaud what Tesla has done to the industry but... Tesla is no longer the only game in town. Even if I was given a Tesla I'd get rid of it ASAP. I just don't like their design philosophy. I'm entitled to my opinion and no matter how much slagging off or downvoting I might get, I am not going to change my mind.

    3. Anonymous Coward

      Re: A solution looking for a problem

      What is the “problem” that a self-driving car solves...

      Getting me home after a night out in the usual 40 minutes a drive takes instead of the two hours the night bus takes or the £200+ a taxi would cost. But by the time the technology is ready, I'll no longer have a dog waiting for me to get home, and will probably just get a hotel in town for the nights when I do go out.

      To be honest, that's pretty much all I want it for right now. At some point in the future, I may become unable to drive myself for whatever reason and not having to rely on public transport and taxis wouldn't be a bad thing.

    4. werdsmith Silver badge

      Re: A solution looking for a problem

      What is the “problem” that a self-driving car solves

      I've been getting more and more tired lately, after long journeys. But since I've had self driving available for the longer stretches, I've found I've been arriving a lot fresher and less tired at the destination.

      Some of that driving load is taken off me and I'm then free to just concentrate on watching what's happening around me and keeping safe margins.

  3. anthonyhegedus Silver badge

    Also: why?

    The trouble with ‘driver assist’ systems is that the more automated you make the task of driving, the less attention the driver actually pays. If the car can stay in lane, brake, change lane when needed and take the right exit, how much more is there to do?

    And on a further note, I get the impression that true driverless functionality will only be available in controlled zones: where all driverless cars are notified if there’s something unusual ahead (cones, people, light etc.) and the entire area is mapped properly for the autonomous vehicle, and kept up to date on a daily basis. Until we get rid of humans driving the cars though, and humans anywhere in the area in fact, there’s always going to be the difficulty of controlling autonomous vehicles safely.

    1. Version 1.0 Silver badge

      Re: Also: why?

      It would be interesting to see how many Tesla Autopilot accidents happen when a code writer is doing the driving; we know that just because the code compiled with no errors reported, it doesn't mean the code will not crash.

    2. John H Woods

      Re: Also: why?

      "THe trouble with ‘driver assist’ systems is that the more automated you make the task of driving, the less attention the driver actually pays."

      I thought this too until I started using a car with adaptive cruise and active lane control. But I've noticed that as driving is now somewhat less tiring, I am more alert at the end of long journeys. I've also noticed that with the ACC managing the distance between me and the vehicle in front, I can pay more attention to other hazards. So I'm no longer convinced that it is self-evidently true that assistance systems always lessen driver attention.

      1. cornetman Silver badge

        Re: Also: why?

        That's interesting. I've never used one of these systems before so I have no experience to draw on.

        It might be that some people are more prone to distraction when not actively engaged in the driving, or, if there are other people in the car, they may be more likely to engage in distracting behaviours with them (shouting at misbehaving kids, for example).

        1. hoola Silver badge

          Re: Also: why?

          I have ACC and find that it can be beneficial, particularly in the restrictions through roadworks on motorways, where it will manage the speed and distance. In free-flowing traffic one does need to pay attention in a different way, as it can respond in slightly unexpected (but actually predictable) ways.

          The most obvious is if you don't cancel it when turning onto a slip road. If the traffic flow on the main carriageway is below the set speed, the car will now accelerate on the slip road.

          What is interesting though is the distance keeping. It really makes you realise how poor most people are at it, and how optimistic they are about their own, and the vehicle's, ability to react. Even with the distance control dialled all the way down to "Audi" mode (thanks, Jeremy Clarkson) it is still usually a bigger gap than most drivers maintain at 70 (or more.....)

      2. Alumoi Silver badge

        Re: Also: why?

        For me it's the other way around. Cruise control and lane control bore me to death and make me careless. So I usually engage cruise control only on long, empty stretches of highway.

  4. Grunchy Silver badge

    My quadcopter crashed into the neighbour’s tree and was an embarrassing nuisance. I found out it had inexplicably activated a “flight assist” mode. Anybody remember when the 737 MAX was overpowering the pilots' controls in order to crash into the ground?

    Methinks robots are intrinsically unsafe... moreover, this has been a known fact for decades, and still these unregulated robots are thrown into our midst.

  5. Snowy Silver badge
    Joke

    Self driving car 100% safe

    The problem is there are no self-driving cars available to buy now or in the near future!

  6. Eclectic Man Silver badge

    Fake driver?

    "Tests from a US-based consumer rights organisation in April were claimed to show that Autopilot could be enabled by fastening the driver-side seatbelt and hanging a weight off the steering wheel, bypassing safety features intended to ensure a human at the wheel is paying some attention to the road ahead."

    Surely this could be solved by a camera pointed at the driver, using AI face recognition to ensure that only a registered driver, actually sitting in the driver's seat, can engage 'autopilot'? And if the eyes don't blink and there is no appearance of respiration, the car slows and calls the emergency services. And fingerprint sensors in the steering wheel could ensure that someone was actually holding on to it.
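
    As a minimal Python sketch of that gating logic (every check here is hypothetical - nothing reflects any real Tesla system):

        from dataclasses import dataclass

        @dataclass
        class DriverState:
            face_matches_registered_driver: bool
            seated_in_driver_seat: bool
            blinking: bool
            breathing: bool
            hands_on_wheel: bool  # e.g. via fingerprint sensors in the wheel

        def may_engage_autopilot(s: DriverState) -> bool:
            # Only a registered driver, actually in the driver's seat,
            # with hands on the wheel, may engage.
            return (s.face_matches_registered_driver
                    and s.seated_in_driver_seat
                    and s.hands_on_wheel)

        def slow_and_call_emergency(s: DriverState) -> bool:
            # No blinking and no sign of respiration -> assume incapacitation.
            return not (s.blinking or s.breathing)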

    But that would probably be too intrusive, and the car could be used as a 'witness' to any road traffic law violations to state who was actually at the wheel at the time of the incident, officer.

    1. Ken Moorhouse Silver badge

      Re: But that would probably be too intrusive

      Sod that... People's lives are at risk. Car Operators need to be made aware that they are responsible for bad things that may happen.

    2. This post has been deleted by its author

  7. Cragganmore

    Cyclists don't stand a chance

    As someone that spends a fair bit of time on two wheels, I dread the thought of an 'autonomous' Tesla driven by someone with a completely false sense of trust in the vehicle's abilities... as in the basic ability not to plough through slower-moving cyclists during adverse environmental conditions.

    1. Ken Moorhouse Silver badge

      Re: As someone that spends a fair bit of time on two legs...

      Ditto. (Apologies for the plagiarism).

      1. Anonymous Coward

        Re: As someone that spends a fair bit of time on two legs...

        I recall a comment from a car company exec (possibly Mercedes). When asked how they would program their self driving car to respond if it was put in a situation where the car occupants could be injured if the car was avoiding a pedestrian, he was unequivocal. They would protect the driver as they are their customer. Sod the pedestrian.

        1. Anonymous Coward

          Re: As someone that spends a fair bit of time on two legs...

          After all, that's what screen wipers are for.

          No, wait ..

    2. DS999 Silver badge

      Re: Cyclists don't stand a chance

      During adverse environmental conditions?

      I wouldn't trust it not to run me down on a clear day, since that's when most of the incidents of Teslas on "autopilot" running into parked emergency vehicles occurred. It would be nice to think you only had to worry at night, or in a heavy rainstorm, or when you are biking in the same lane as traffic rather than in a bike lane where cars are not supposed to drive, but you would be justified in worrying EVERY time a Tesla on "autopilot" went by.

    3. John Robson Silver badge

      Re: Cyclists don't stand a chance

      Whereas I have far more confidence in their ability to spot me than I do a significant proportion of motorists.

  8. Ellipsis
    Unhappy

    “watchdog opens probe … after spate of crashes”

    It didn’t occur to them to mandate some decent testing before the crashes?

    1. DS999 Silver badge

      Re: “watchdog opens probe … after spate of crashes”

      That's what happens when one party insists regulation is by definition a bad thing. Not that legislators would have had the foresight to consider this beforehand - they probably believed Musk's hype and it was only us techies who warned that self driving wouldn't be nearly as easy as many people wanted to believe. A lot of the general public still believes autonomous vehicles are just around the corner.

  9. Blane Bramble

    "Tesla refused to describe to investigators how the system operated."

    That, right there, should be enough to get the system banned and cars recalled.

    1. Lil Endian

      The phrase that springs to mind is "not road legal".

  10. Tron Silver badge

    I'm sorry I had to do that Dave.

    -Tesla refused to describe to investigators how the system operated.

    No problem. Put them in jail until they have a change of heart.

    Autopilot works well enough in planes, surrounded by lots of bugger all.

    Cars, surrounded by all sorts of stuff, much of it mobile, some of it driving very badly, less so. Too many variables.

    Not sure this tech will ever actually mature to viability. Aside from anything else, whilst everyone ignores endless RTAs due to people driving like idiots, if a computer is driving, they raise merry hell if there are any crashes at all.

    So a really low bar for accidents and far too much data to manage at speed.

    The best it may do is help you park in tight spots, brake more competently on black ice, and assist Michael Knight in fighting crime.

    1. Joe W Silver badge

      Re: I'm sorry I had to do that Dave.

      So.... why not automate trains? I think the London DLR is driverless, right? And some subways (in other countries). To me it seems much simpler to automate driving on a track dedicated to your vehicle, rather than the extreme dynamic environment that is a road downtown during rush hour.

  11. Nunyabiznes

    It's harder than you think

    Tesla has poured a lot of time and money into their systems, and they still fail. This seems to prove that driving is a lot harder than it looks, and programming a driverless car is even harder.

    One example I have to hand is driving to an intersection directly pointed at the sun. The colors of the traffic control lights get washed out by the glare and you have to depend on situational awareness to determine if you have right of way or not. To me (not a programmer, btw) that seems awfully difficult.

    If the statistics provided by Tesla (and quoted above) are correct, they are doing a pretty good job on the whole. It is just that the failures are spectacular and Elon can be a lightning rod due to his marketing claims.

    1. veti Silver badge

      Re: It's harder than you think

      I assume the stats quoted by Tesla are self-serving in several ways. For one, there is a lot of self-selection in the sample of their drivers - they are heavily weighted towards certain demographics by income, age and geography. For another, I assume most Tesla drivers have the wit to engage Autopilot only when it is reasonably safe to do so, so of course the accident rate should be very low.

      I would also speculate that if a terrified driver stamps on the brakes a half second before ploughing into the back of a truck, that would probably immediately disengage the autopilot, so that accident would be excluded from the autopilot stats on a technicality.
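
      For what it's worth, Tesla's safety-report methodology reportedly counts a crash against Autopilot if the system was active within five seconds of impact, which would catch exactly that scenario. A minimal Python sketch of how the size of that counting window moves the numbers (all figures invented for illustration):

          # Seconds between Autopilot disengagement and impact for some
          # hypothetical crashes; None = still engaged at impact.
          crashes = [None, None, 0.5, 3.0, 8.0, 12.0]

          def attributed_to_autopilot(records, window_s):
              # Count a crash if AP was engaged at impact, or disengaged
              # within window_s seconds beforehand.
              return sum(1 for gap in records if gap is None or gap <= window_s)

          print(attributed_to_autopilot(crashes, 0))  # engaged at impact only -> 2
          print(attributed_to_autopilot(crashes, 5))  # five-second window -> 4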

      1. Alumoi Silver badge

        Re: It's harder than you think

        For another, I assume most Tesla drivers have the wit to engage autopilot only when it is reasonably safe to do so...

        And I have a bridge to sell you.

        1. veti Silver badge

          Re: It's harder than you think

          So, how do you account for the extremely low rate of accidents involving autopilot?

          1. nautica Silver badge
            Thumb Down

            Re: It's harder than you think

            "So, how do you account for the extremely low rate of accidents involving autopilot?"

            Compared to what, precisely?

            1. John Robson Silver badge

              Re: It's harder than you think

              Compared with not using autopilot - either just the standard aids, or even worse without those on either.

  12. nautica Silver badge
    Holmes

    Really...

    "US watchdog opens probe into Tesla's Autopilot driver assist system after spate of crashes"

    What the bloody f**k took them so long?

    From the very moment Elon Musk announced he was introducing a "self-driving car", or a car "with autopilot", or whatever bit of dissembling and circumlocution (I am NOT going to use the word, "lying". See, I didn't use it) he did, can, and will, come up with, he should have been immediately put under the microscope, to prove his concepts BEFORE THE FACT; not after his systems killed people.

    1. John Robson Silver badge

      Re: Really...

      Because calling it a spate of crashes is a fucking stupid clickbait headline.

      There have been some, but fewer than would be expected in terms of miles driven. And most of those we hear of were likely easily avoidable by the driver doing what they are meant to be doing - paying at least some attention to what's going on. We simply never hear about other vehicle crashes, because they are so common it's not news... The fact that we still hear about these is a testament to how rare they are.

      There are plenty of dashcam clips available showing autopilot avoiding crashes, frequently side traffic failing to yield, or accelerating out of the way of a vehicle about to rear end them. So maybe we should be looking at their safety in the context of:

      a) Safety assessments of any safety critical hardware/software irrespective of whether it's on a road

      b) Cost/benefit analysis

      And that is data that Tesla has, and should be forced to share (on site, with an NDA-bound official of the relevant inspectorate). The reported information should be detailed enough for the inspectorate, and a public version (further redacted) should be made available periodically.

      For one improvement suggestion: I'd like to see a HUD with a computer-vision overlay, so that the driver has longer to spot when the vehicle has misread something ahead.
