Watch an oblivious Tesla Model 3 smash into an overturned truck on a highway 'while under Autopilot'

A Tesla Model 3 plowed straight into the roof of an overturned truck lying across a highway in Taiwan, sparking fears the driver trusted the car's Autopilot a little too much. The smash occurred on Monday at 0640 local time (2240 UTC) and the drivers of both vehicles were unharmed, according to Taiwan's Liberty Times. You can …

  1. JDPower Bronze badge

    Seriously? The car can't tell the difference between white lorries and the sky? It's about time these so-called self-driving/crashing cars were removed from the roads.

    1. Oliver Mayes

      The human driver didn't brake the car in time either; time to take all those unreliable humans out of the cars too.

      1. Peter2 Silver badge

        When the car is driving, after a while you're going to start trusting it and get complacent and pay less attention to the road. In previous incidents this has included watching DVDs, or climbing into the passenger seat while the car is driving.

        The problem comes when the car does something so absurdly stupid that a human is frozen thinking "what the hell" before getting to "I need to take control, I need to stop, slam foot down on brakes". That takes time, and when the car is moving at 70mph you don't have it. 70mph is ~30 metres per second. If you spot the problem 200 metres away, you have ~6.6 seconds before impact. Since it takes 75 metres to decelerate to a stop from 70mph, that leaves 125 metres of grace, about four seconds; allow for reaction time and you have a hair over three seconds to realise that the car is going to kill you and go for the brakes.
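
        For anyone who wants to check the arithmetic, a minimal sketch (the 75 metre braking figure is the Highway Code value used above; the one-second reaction time is covered below):

            # Decision window at motorway speed, using the figures above
            speed = 70 * 0.44704            # 70mph in metres per second, ~31.3
            sighting_distance = 200.0       # metres at which the obstacle is spotted
            braking_distance = 75.0         # metres to stop from 70mph
            reaction_time = 1.0             # seconds to get a foot on the brake

            time_to_impact = sighting_distance / speed
            decision_window = (sighting_distance - braking_distance) / speed - reaction_time
            print(f"time to impact if nobody brakes: {time_to_impact:.1f} s")   # ~6.4
            print(f"time left to decide to brake:    {decision_window:.1f} s")  # ~3.0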

        It takes about one second for an average driver paying attention to notice a problem and hit the brakes. If you're expecting the car to react because it's driving, then by the time it hasn't reacted as you expected and you realise you need to take control, in most cases it's already going to be too late.

        This is well demonstrated in the video: at six seconds you can see that the Tesla is alongside a human-driven car. Note the difference. The driver in the next lane notices the problem, checks his surroundings, then moves over a lane and considerately slows down to allow the Tesla driver to pull in front of him.

        He finishes doing this at the point the Tesla driver realises he needs to take control. A second later, smoke comes from the Tesla's brakes as the driver picks the simplest route and slams on the brakes; he doesn't have time to evaluate all possible courses of action (which even at this point include switching lanes, but he doesn't have enough time to consider this, check his surroundings and discover that the surrounding traffic has left him room to escape). Hence: brakes applied. Too little, too late. He hits.

        The human driver drives past the "AI"-created crash four seconds later, having slowed down considerably more than the Tesla did, simply to allow it manoeuvring room to escape.

        One can point to accident rates; however, these will rise as more self-driving cars end up on the road and run into conditions like this. It also ignores that the majority of accidents at the moment involve inexperienced and arrogant drivers, mostly within their first few years of driving, racing on the public roads and then discovering a tree or ditch. I would suggest, though, that even if self-driving cars did away with boy racers, they would simply distribute the fatalities more widely through accidents like this that a 16 year old learner wouldn't have committed. :/

        I'm quite happy with adopting some of the useful technology like automatic braking (which has the possibility of all but eliminating the most common UK accident, the rear-end shunt in traffic) but personally, I'd be happier driving myself.

        1. Bbuckley

          Well said. Sorry Oliver, but your comment is idiotic - if I buy technology at a huge premium because it promises me things, I expect it to deliver said things. This is human nature. You can call it stupidity if you like, but if so, take a look in the mirror and see a stupid person. If this AI does not work it should be scrapped. End of story.

    2. John Robson Silver badge

      Given the rate of collisions meatsacks have, it's the nut behind the wheel that should be removed from all cars.

      Here the driver failed to pay attention, and his insurance premiums may rise slightly as a result.

      1. Bbuckley

        Nope. See other comments about the AI promise. If it does not work as promised, the AI creator is responsible, not the victim of its programmers' inability to cover all edge cases.

    3. Anonymous Coward
      Anonymous Coward

      Downvoted, but not because you are wrong.

      Because you thought AI might be able to tell the difference, or that the cameras are of a high enough quality, in the first place!

      If you're asking why this failed, you've already missed the boat that sailed. It's often cost savings, time savings, or failure to actually work within the limits of the current system's outputs.

    4. tfewster
      Facepalm

      Neither car nor driver noticed the man in the road waving down traffic either. Maybe he was the wrong colour too?

  2. Mark192

    I get that the cameras may not have picked out the truck...

    ...but I thought Teslas had radar too.

    Anyone know how it missed this, beyond a shitty algorithm not differentiating between a white truck and light sky?

    1. Sampler

      Re: I get that the cameras may not have picked out the truck...

      I was going to say: how far out in front does its radar/lidar function? Is it less than its braking distance at this speed (which this would seem to confirm)? That makes having radar/lidar seem a bit less useful; might as well strap a chocolate fireguard to the bonnet...

      1. SkippyBing

        Re: I get that the cameras may not have picked out the truck...

        I suspect the radar is more for things like active cruise control where it keeps you a sensible distance from the car ahead. It may not be possible to pump out enough RF to see as far ahead as needed in this instance without getting embroiled in licensing requirements, which would have to be negotiated for each country where you sold the capability.

      2. Lord Elpuss Silver badge

        Re: I get that the cameras may not have picked out the truck...

        In my experience Radar/Lidar works far enough out that at motorway speeds it will reliably *emergency* brake in time to stop you (assuming it recognises the obstacle), but not far enough out that it will comfortably and gently brake you to a halt. And given that I don't want to be thrown into the windshield every time I get behind stationary traffic, I prefer to keep a watchful eye on everything even when on Autopilot.

      3. Anonymous Coward
        Anonymous Coward

        Re: I get that the cameras may not have picked out the truck...

        Radar range depends on the radar band.

        Cops have no problem measuring any car at any speed using a radar gun, and over-the-horizon radars spot incoming missiles half a planet away.

        And in addition to radar, there is also lidar - a self-driving car relying on optical sights only seems a bit archaic to me.

        It should not get road permission until it is foolproof, because it will be fools operating it.

        If you can't rely on "self drive" then don't allow it to exist.

    2. jgarbo
      Stop

      Re: I get that the cameras may not have picked out the truck...

      More to the point, it didn't see a big white "thing" on the black road? Surely an obstruction.

      1. SJA

        Re: I get that the cameras may not have picked out the truck...

        Radar does have issues with recognizing stationary things.

        1. Headley_Grange Silver badge

          Re: I get that the cameras may not have picked out the truck...

          The truck wasn't stationary relative to the car until the car hit it.

        2. Anonymous Coward
          Anonymous Coward

          Radar does have issues with recognizing stationary things

          Nearly - they have no problem spotting stationary things at all. The problem is knowing when they are a threat (as in this case) or just the edge of something like a man-hole cover in the road, or even thick road-marking paint - some radars operate at 76GHz, and the "rumble strips" used in places like the UK at the edges of some roads make very good radar reflectors. Sensor fusion tries to overcome this, but it is a tricky problem to solve, especially if the supporting camera system has not been trained to detect "truck on its side".

          1. Anonymous Coward Silver badge
            Big Brother

            Re: Radar does have issues with recognizing stationary things

            Equally, a bridge/flyover going over the road that you're on, especially when your road dips to go under it. To a human it's quite clear that you're not going to smash into the bridge, but to radar/lidar it's not so clear.

            That's why they generally ignore big stationary objects.

            1. Anonymous Coward
              Anonymous Coward

              Re: Radar does have issues with recognizing stationary things

              Last Friday it was very clear the chimney and light were not going to go under the bridge as the narrowboat approached.

              Still, lovely chap to give me a tow, but I did feel partly guilty. Not my fault, as they had much more experience and should have known the route they often sail has a low bridge. All I heard was "Oi! That bridge is normally taller than that!" when things went crunch.

              If I'd not been there, the same would have happened, just without witnesses.

            2. Anonymous Coward
              Anonymous Coward

              Re: Radar does have issues with recognizing stationary things

              That one is generally not too much of a problem if you have a decent radar that gives (even very crude) elevation data - unfortunately, I'm not aware of any current automotive radars that support this, due to the cost.

              1. Anonymous Coward
                Anonymous Coward

                Care to explain the thumbs down?

                I'm speaking as someone who has implemented algorithms to cope with this using a radar that reports elevation, and I have data that quite clearly shows that "overhead" objects are correctly classified.

          2. wjake
            Joke

            Re: Radar does have issues with recognizing stationary things

            I see more interesting pictures in "Captcha" coming our way!

    3. Dinanziame Silver badge
      Devil

      Re: I get that the cameras may not have picked out the truck...

      As far as I know, Teslas don't have a lidar. They say it's not necessary because the car needs to solve optical recognition anyway, so a lidar is redundant. And as everybody knows, you don't need redundancy for safety systems. Ahem.

      Which I translate to: "Our cars are already expensive, lidars would add on top of that. Also, they don't look cool."

      1. SJA

        Re: I get that the cameras may not have picked out the truck...

        Do humans have lidar?

        1. Danny 14

          Re: I get that the cameras may not have picked out the truck...

          take some acid, shine a torch on things, marvel at the different colours reflected in your eye sensors.

        2. Peter2 Silver badge

          Re: I get that the cameras may not have picked out the truck...

          No, but we do have two visual receivers and very, very good software with neural learning that gives good depth perception on the fly and automatically estimates distances to objects.

          People with damaged equipment that doesn't support depth perception are simply banned from driving in most countries.

          1. ChrisC Silver badge

            Re: I get that the cameras may not have picked out the truck...

            Worth noting however that, at the sorts of distances involved here, it doesn't matter whether a human has two functioning eyes or just one, because we're into the domain of monocular depth perception which is entirely based on our having learned what the world ought to look like as we move through it.

          2. Clunking Fist

            Re: I get that the cameras may not have picked out the truck...

            "People with damaged equipment that doesn't support depth perception are simply banned from driving in most countries."

            I had an elderly neighbour with only one functioning eye (she had lost sight recently in the other eye) and she was apparently allowed to drive. She knew all the local roads so maybe didn't need to worry about depth perception too much. But I swear she was oblivious to anything to the left of her. Scarily so.

            1. ChrisC Silver badge

              Re: I get that the cameras may not have picked out the truck...

              As I noted in my other comment here, depth perception doesn't need two eyes - the distance between our eyes is so small that our ability to judge depth purely from stereo vision diminishes quite quickly as the object gets further away, and you're then into the domain of monocular depth perception where you're judging depth/distance based on your learned knowledge of the scene you're viewing.

              e.g. once you've learned how big something (like, say, a fully grown cow...) looks at a given distance, you can then make a pretty good estimate of how far away another example of said object is if you see it occupying a different portion of your field of view.

              The bigger problem with vision loss in one eye is, as you also noted, the reduced field of view. However, once you've lived with that for even a fairly short period of time, you start to get into the habit of moving your head more to allow your functioning eye to fill in the gaps. It's certainly not as good as the instantaneous full field of view you get with two working eyes, but it's more than good enough.

              1. SImon Hobson Bronze badge
                Joke

                Re: I get that the cameras may not have picked out the truck...

                once you've learned how big something (like, say, a fully grown cow...) looks at a given distance

                Sorry, couldn't resist

                https://www.youtube.com/watch?v=MMiKyfd6hA0

      2. Anonymous Coward
        Anonymous Coward

        Re: I get that the cameras may not have picked out the truck...

        The issue felt with lidar is that you can't progress to an autonomous car if you have a reliance on lidar. It doesn't work in too many common situations such as heavy rain, fog, smoke etc.

        *If* the car can be made to work with just cameras then it would work in all situations that a human could operate in, and possibly better. But it is a massive task.

        Lidar can create remarkable SLAM visualisations and interpretations for a vehicle, but they are currently expensive, very resource-intensive, often ugly (although miniature units are arriving on the market), and can only work in certain situations.

        Visual recognition can work in nearly all situations and gives a permanent all-round view of the car, but the machine learning, data processing and "AI" are not good enough for every scenario until that scenario is "turned on".

        However, systems like Tesla's do have a lot of safety warnings, especially about stationary objects (as opposed to slowing objects), as they are unlikely to be detected at high speed. The AEB, like most such systems, is designed for low speeds or rapidly decelerating vehicles in front of it.

        I am involved in a non-vehicle SLAM project at the moment and lidar has also proved to be unusable for it, even though in great conditions it would work better than anything.

    4. Anonymous Coward
      Anonymous Coward

      Re: I get that the cameras may not have picked out the truck...

      It may not even be an algorithm. It could be a neural network "black box" which has been trained on lots of road images - but unfortunately none with white lorries lying across the carriageway.

      Many people don't seem to realise that the "I" in "AI" is non-existent.

      1. Wellyboot Silver badge

        Re: I get that the cameras may not have picked out the truck...

        The "I" is also rare in those that happily bet their lives on the system.

        A 99.999% accurate system making decisions only once a second (it's far more often in reality) will screw up every 28(ish) driving hours. Most of the time the problem is fixed by the next decision; however, there is a subset (becoming quite obvious now) that just doesn't get fixed in the second or two available.
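
        The 28-hour figure falls straight out of the arithmetic; a minimal sketch, assuming the one-decision-per-second rate above:

            # How often does a 99.999% accurate decision-maker err?
            error_rate = 1 - 0.99999            # 1e-5 errors per decision
            decisions_per_hour = 3600           # one decision per second
            hours_between_errors = 1 / (error_rate * decisions_per_hour)
            print(f"one screw-up every {hours_between_errors:.1f} driving hours")  # ~27.8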

        A simple comparison is a lottery: a reasonable chance of a small win *1, a very small chance of a life-changing win *2, but as we all know, someone somewhere wins big very often.

        A cheap & easy fix is small radar reflectors on all vehicles, built into the design & hidden under the radar-transparent parts. And for the driving-into-the-sun scenario (radar whiteout), just have the car shout at the driver.

        *1 Car might wobble a bit while changing its mind and applying a correction.

        *2 Car decides a blank view ahead is sky and ploughs on.

        1. ThatOne Silver badge

          Re: I get that the cameras may not have picked out the truck...

          > A cheap & easy fix is small radar reflectors on all vehicles

          What about trees, and containers, trailers, piles of snow or fallen rocks, various animals, cargo fallen off a truck, truck fallen off a bridge and showing us a side nobody had thought to place reflectors on (belly)? Not to mention it would take decades for all existing vehicles to be replaced with AI-friendly ones, waving little warning flags.

          While I agree with your analysis, the solution isn't to try to make the world Artificial-Idiot-proof; that would be unrealistic at best. The advantage of cars is their versatility: if they need a carefully prepared environment to move, they become - trains.

          IMHO the only solution is, much as you don't let 2-year-olds drive, not to trust AI with the wheel until it has (one day, eventually) become intelligent and wise enough. Unfortunately we're nowhere near that yet, yet marketing will keep claiming any system that can drive in a straight line in sunny weather is "intelligent" enough to face anything the road throws at it, and we will keep being served headlines like "Autonomous car plows into circus elephant"...

          1. John Brown (no body) Silver badge

            Re: I get that the cameras may not have picked out the truck...

            Maybe the AI just needs to pass the same driving test a human currently has to. I mean a proper test, not one like in some places where driving forwards 100yds and then reversing 100yds successfully is a pass; one with a proper theory test identifying hazards etc. as well as a proper drive out on the road with an examiner. Although an additional factor would be that the AI has to be able to demonstrate a learning capability too. People incapable of learning at a reasonable rate or level are unlikely to pass a driving test.

        2. John Brown (no body) Silver badge

          Re: I get that the cameras may not have picked out the truck...

          "just have the car shout at the driver."

          Maybe the car should be playing a constant tone whose frequency varies with the number of decisions made and the assessed accuracy of those decisions, based on the earlier and later observations of the system and/or other detections constantly happening. This would train the "driver" to realise what is happening and learn when and how often things are going wrong. Many people, even non-techies, could tell when a dial-up modem was going to connect, fail to connect, or connect at a low speed (and would manually hang up and try again) just from the noises it made. It would even have the knock-on effect of keeping the driver more attentive.

    5. bazza Silver badge

      Re: I get that the cameras may not have picked out the truck...

      Teslas do have a radar, but not an imaging radar. They also do Doppler processing. So they can tell how far away something is and how fast it's moving relative to the vehicle, but not in precisely what direction that something lies.

      This is a problem if there's a stationary obstacle, because the radar can't really tell the difference between a stationary obstacle directly in front and a signpost on the side of the road. So it ignores stationary objects in its collision avoidance / mitigation algorithms. It's therefore entirely reliant on the video processing it performs. Which, clearly, is still inadequate.
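
      A toy illustration of why a Doppler-only radar ends up binning stationary returns (not Tesla's actual logic, obviously; the threshold and labels here are made up for illustration):

          # Toy model: a non-imaging Doppler radar gets range and closing speed,
          # but no precise bearing. Anything closing at exactly your own speed
          # is indistinguishable from roadside clutter, so it gets ignored.
          def classify(ego_speed, closing_speed, tolerance=1.0):
              if abs(closing_speed - ego_speed) < tolerance:
                  return "stationary clutter (ignored by collision logic)"
              return "moving target (tracked)"

          ego = 31.0                    # ~70mph in m/s
          print(classify(ego, 31.0))    # overturned truck dead ahead: ignored
          print(classify(ego, 31.0))    # signpost on the verge: the same return
          print(classify(ego, 12.0))    # slower car ahead: tracked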

      Lidar is better, but not that much better. It attempts to build up a 3D map of the local terrain using the time-of-flight of pulses of laser light. Because a laser beam is very narrow (whereas a radar beam is quite broad, unless you have a large antenna), a lidar can more or less paint an accurate, high-resolution picture of the surroundings. That's fine if the obstacle immediately in front reflects back towards the car, but if it were a mirror at an angle or, worse, painted in Vantablack, then a lidar wouldn't see it properly either.
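
      The time-of-flight sum itself is trivial, for what it's worth (a minimal sketch):

          # Lidar ranging: distance from the round-trip time of a laser pulse
          C = 299_792_458.0  # speed of light, m/s

          def lidar_range(round_trip_seconds):
              return C * round_trip_seconds / 2  # out and back, so halve it

          print(f"{lidar_range(1e-6):.0f} m")  # a 1 microsecond echo: ~150 m away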

      It's difficult to fool a human brain with mirrors, or indeed anything "odd". A human will almost always spot the mirror. The Pepper's ghost illusion relies largely on making it difficult to see the edges of the mirror used in it.

      One thing that I'd like to try (in safe, controlled circumstances obviously) is letting off a glitter bomb in front of a car with Lidar; that should be spectacularly confusing for it. If we do ever get self driving cars with Lidar en masse, it won't take kids long to realise they can have a lot of fun on bridges above highways with nothing more dangerous than a tube of glitter. It'd be an improvement on the bricks they currently throw.

      1. ChrisC Silver badge

        Re: I get that the cameras may not have picked out the truck...

        I'd hope that stationary object detection does at least make use of the information generated by the radar when doing the image processing, to inform it as to which parts of the image are likely to contain stationary objects, so it can then do a better job of trying to classify them as "roadside object, not in my way" vs "ooh crap, this looks bad".

  3. SJA

    Seriously?

    Seriously? The driver can't tell the difference between white lorries and the sky? It's about time these human driving/crashing cars were removed from the roads.

    1. JDPower Bronze badge

      Re: Seriously?

      Nice try but if the driver had been in a regular car he'd have been more attentive and more in control.

      1. SJA

        Re: Seriously?

        Nice try, but a car's driver must always be in control and give his full attention to the traffic.

        1. Anonymous Coward
          Anonymous Coward

          @SJA - Re: Seriously?

          Unless of course when you're not actually driving the car. Nice try, Elon!

          1. SJA

            Re: @SJA - Seriously?

            Since Tesla has no autonomous cars, someone must be driving the car.

            1. Jonathon Green
              Coat

              Re: @SJA - Seriously?

              If (as I completely agree) Tesla don't have autonomous vehicles, they really ought to stop throwing words like "autopilot" around quite so casually...

              1. SJA

                Re: @SJA - Seriously?

                Why? Autopilot is the correct term.

                1. John Robson Silver badge

                  Re: @SJA - Seriously?

                  He'll claim, having never driven a Tesla nor engaged Autopilot, that it pops up an inflatable driver so you can go and make coffee.

                  It warns you repeatedly, and requires that you regularly assure it of your attention.

                  You can lie to the car of course, but that rather puts the fault somewhere other than the car. What would be interesting is eye tracking for attention detection.

                2. Anonymous Coward
                  Anonymous Coward

                  Re: @SJA - Seriously?

                  > Autopilot is the correct term.

                  It may technically be the correct term, and the implications of the term may be correctly understood by people who have undergone rigorous training, education, and dozens of hours of flight time. But most people are not trained pilots, and have been educated by entertainment media to understand "autopilot" as meaning "set it and forget it".

                  A statement doesn't have to be inaccurate to be misleading.

                  1. TheRealRoland

                    Re: @SJA - Seriously?

                    The best kind of correct!

    2. Anonymous Coward
      Anonymous Coward

      @SJA - Re: Seriously?

      As can be seen in the video, none of the human drivers of regular cars on the road failed to detect the obstacle.

      1. SJA

        Re: @SJA - Seriously?

        As can be seen in the video, the driver of the tesla failed to detect the obstacle.

        1. Anonymous Coward
          Anonymous Coward

          @SJA - Re: @SJA - Seriously?

          The driver of the Tesla was NOT driving the car. Come on, admit it!

      2. Anonymous Coward
        Anonymous Coward

        Re: @SJA - Seriously?

        While true in this instance, there are a ton of others where that is not the case. Still, I don't *sell* my driving ability to others; Tesla does!

  4. Numen
    FAIL

    Didn't notice a person either

    Did you notice the truck driver standing in the lane near the center divider, about where the car's brakes show the puff of smoke, trying to warn off the car? The car didn't notice him either, not just his truck.

    1. fidget

      Re: Didn't notice a person either

      I noticed the person, and saw him get out of the way. It was probably the truck driver, doing their civic duty.

      1. Martin Gregorie

        Re: Didn't notice a person either

        Yes, I saw the man on the road too, standing still until the Tesla passed him and then walking into the central hedge. I agree that this is probably the truck driver, quite possibly having walked back to see what debris on the road caused his crash. Something like this would explain why he was standing, or moving very slowly and somewhat dazed, on a busy highway, rather than trying to flag down on-coming traffic or get off the road. Something drastic must have happened to the truck, since it's on a straight road and doesn't seem to have hit the central barrier.

        I also notice that the Tesla driver hit the brakes about the time he passed the person, so there is a good possibility that seeing the driver distracted him enough to stop him spotting the dead truck. His thought processes may well have been something like:

        - Spots driver and thinks "Geez, there's somebody on the road! Better brake", all the while with eyes on the truck driver.

        - Then, as he flashes past, still looking at the driver "Phew, missed him, I can stop braking".

        - Followed in short order with "Bloody hell! Shit..shit..shit" BANG.

    2. Robert Forsyth

      Re: Didn't notice a person either

      I thought that was just a bit of street furniture. You are right: he waves at the car, and steps back.

    3. lglethal Silver badge
      WTF?

      Re: Didn't notice a person either

      My thoughts exactly! There was a person on the road in front of the car, and the car didn't notice them either! They're lucky he got out of the way just in time.

  5. Anonymous Coward
    Mushroom

    Call it what it is

    It's not Artificial Intelligence that's driving the car, it's Artificial Guessing.

    To put it in charge of driving is deadly.

    For that matter, to put it in charge of anything is questionable.

    1. trevorde Silver badge

      Re: Call it what it is

      Wonder how many people would use it if they called it 'Artificial Guessing'?

    2. MajDom

      Re: Call it what it is

      Indeed. We need to make a distinction between AI (even narrow AI) and neural networking. The latter is very good at guessing, but just because it uses a mechanism that is similar to that of a brain, it is not exhibiting any kind of intelligence. You need at least another two layers for that.

      1. Wellyboot Silver badge

        Re: Call it what it is

        One layer needs to be enlightened self preservation.

    3. vtcodger Silver badge

      Re: Call it what it is

      "It's not Artificial Intelligence that's driving the car, it's Artificial Guessing."

      Perhaps Clippy's driver's license should be suspended until he learns a bit more about geometry and physics?

    4. BenDwire Silver badge

      Re: Call it what it is

      It's not Artificial Intelligence that's driving the car, it's Artificial Guessing.

      Rather, it's Real Guessing that's driving, as opposed to (Artificial) Fake Intelligence.

  6. Anonymous Coward
    Anonymous Coward

    White truck, light sky.

    Yet another problem with the cloud

  7. JassMan

    Makes you wonder about Dragon.

    It may have had an event-free journey to the ISS, and it may be excellent at minimatoasttl fuel trajectpries to an orbiting target, but I'm guessing that if another satellite crosses its path - it will be toast.

    I know there are not that many other things whizzing round at such a low LEO, and space is a big place, but things will be a bit more complicated when Dragon's big brother takes a trip to the Moon.

    1. Gordon 10
      FAIL

      Re: Makes you wonder about Dragon.

      Why will it be more complicated? By your own admission there will be fewer things whizzing around Earth-Moon transfer orbits than LEO.

      FWIW said transfer orbit is no more or less complicated than a rendezvous with a space station, possibly less, as the Moon is a bigger target :D. (Probably the same, as Lunar Gateway would be the target.)

      1. Danny 14

        Re: Makes you wonder about Dragon.

        My kerbals manage it at least twice for every three failures, so it can't be that hard.

      2. JassMan

        Re: Makes you wonder about Dragon.

        My argument is that there are orders of magnitude more satellites in orbits above the ISS than below it. I.e. the ISS is in an orbit which is low even for LEO. It is so low that it needs boosting every year to ensure it doesn't fall back down because of the drag of the atmosphere. VLEO has only been in general use since 2017 because of the problems with drag.

        You also say that navigating to the Moon is no more complicated, but it relies on an automated system being able to do celestial navigation (by taking sightings and comparing them to star maps) rather than just using GPS, which is how Dragon knows where it is.

        I have to assume most of the downvotes are because of stupid word prediction on my phone putting toast in the middle of minimal and screwing up the spelling of trajectories.

    2. ThatOne Silver badge

      Re: Makes you wonder about Dragon.

      Space is much easier to navigate for an AI: the weak point of AIs is reacting to the unexpected, and while "the unexpected" is very common on the roads, it is rather rare in space.

      Space navigation never faces terrestrial issues like "what are the chances a child could jump out in front of my car from behind this school bus?".

      1. Anonymous Coward
        Anonymous Coward

        Re: Makes you wonder about Dragon.

        My kerbals beg to differ.

        Also they want to know what a schoolbus is.

  8. tygrus.au

    This is a driver assist not an autopilot !

    They need to combine techniques and multiple systems over larger areas for redundancy and increased AI. At least 4 sensors: daylight vision, IR, LIDAR & GPS. If just one of these systems can see a possible danger, then assume it is correct. If a system has unreliable readings that don't match expectations, then slow down and proceed with caution within the limits of the other sensors. If there are conditions or objects obscuring a sensor's view, then don't assume everything is perfect. Don't take the sky or lane lines for granted. Watch the driver at all times; if they stop paying attention, then alert and slow down until the driver reacts correctly. This is a driver-assist system, not an autopilot.
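
    That "if any one sensor sees a possible danger, believe it" policy is easy to state; a hypothetical sketch (the sensor names and degraded-mode handling here are illustrative, not any vendor's implementation):

        # Hypothetical OR-fusion policy: one credible danger report is enough to
        # brake, and an unhealthy sensor downgrades the system to caution.
        from dataclasses import dataclass

        @dataclass
        class SensorReport:
            name: str
            healthy: bool        # readings consistent with expectations?
            sees_danger: bool

        def fuse(reports):
            if any(r.healthy and r.sees_danger for r in reports):
                return "BRAKE"
            if any(not r.healthy for r in reports):
                return "SLOW AND PROCEED WITH CAUTION"
            return "CONTINUE"

        reports = [
            SensorReport("camera", healthy=True, sees_danger=False),  # white roof, white sky
            SensorReport("ir",     healthy=True, sees_danger=False),
            SensorReport("lidar",  healthy=True, sees_danger=True),   # solid return dead ahead
        ]
        print(fuse(reports))  # BRAKE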

    1. SJA

      Re: It is autopilot but not autonomous

      When was the last time you boarded a plane? Didn't you wonder why there's a flight captain and co-captain? I mean, autopilot has been in planes for decades. Surely they wouldn't need pilots anymore if autopilot meant autonomous.

      So, why did you never wonder why there's still a captain and co-captain despite the plane having an autopilot?

      1. Gordon 10

        Re: It is autopilot but not autonomous

        Both your questions can be answered with the phrase "Meatbags are stupid". In Tesla's case, the driver meatbag; in airplanes' case, always the passenger meatbags and sometimes the pilot meatbags.

        1. Jonathon Green
          Coat

          Re: It is autopilot but not autonomous

          ...and whoever it is (I can't imagine who that might be...) in Tesla's management who insists on persisting with the "autopilot" branding in the face of clear evidence that a significant proportion of their customers *are* treating it as a fully autonomous capability...

          1. SJA

            Re: It is autopilot but not autonomous

            Please provide proof for that "significant proportion of their customers" claim. Also, as Tesla's stats show, accidents happen far less often with AP activated than without.

            Btw, every day human drivers cause thousands of accidents and crashes. I'm sure you're also vigorously fighting to ensure no humans are allowed to drive cars anymore, right?

            1. Jonathon Green

              Re: It is autopilot but not autonomous

              There are bucketloads of YouTube videos and EV forum posts with Tesla owners proudly boasting of doing ridiculous things with Autopilot engaged, and companies have actively marketed devices to defeat the safeguards which are supposed to prevent that sort of thing.

              Are Tesla's accident figures broken down by geographical territory? Given the standard of driver training and the level of driving regularly demonstrated in their home market, it probably isn't at all difficult to outperform the meat there...

              1. SJA

                Re: It is autopilot but not autonomous

                There are 1 million Teslas sold. How many "bucketloads of YouTube videos and EV forum posts with Tesla owners" are there? From how many different owners?

                Also, plenty of drivers from other brands do stupid things: https://www.slashgear.com/mercedes-active-lane-assist-fooled-with-soda-can-02339580/

              2. Anonymous Coward
                Anonymous Coward

                Re: It is autopilot but not autonomous

                Oh come on, there are bucketloads of YouTube videos of people doing stupid things, in a Tesla or not. Even if you narrow it to cars, there are loads of videos of people doing stupid things in cars and ending up crashing - it doesn't mean they think their car was incapable of crashing!

                The very fact they have posted it on YouTube, and that people watch it, is because they know it is risky/dangerous. Otherwise it would be a very boring video. No-one is going to post a video called "man drives car down road" showing someone driving normally down the road. Doing something normal that a vehicle is supposed to do isn't 'interesting'.

                It is very unlikely that anyone actually believes their Tesla is fully autonomous. There are enough warnings to tell you it isn't, and you'd find out within a few miles of leaving home that it isn't. Does that mean that people don't abuse the system? Of course not. They try to cheat the system and take a risk, a big risk. In reality, most of the time it will work fine - it's not every day you meet a truck on its side or a fire engine stopped in a live lane - but the times it happens, those cases outside normal daily driving, the system is likely to fail.

            2. Anonymous Coward
              Anonymous Coward

              Re: It is autopilot but not autonomous

              "as Tesla stats show: Accidents happen far less often with activated AP than without."

              Ah, stats from a company that is very selective about what stats it releases. Also consider:

              1. Autopilot is usually engaged on the kinds of roads with significantly fewer accidents per mile travelled, for all car models.

              2. Many, many more miles are travelled with autopilot deactivated.

              1. Peter2 Silver badge

                Re: It is autopilot but not autonomous

                3. Standing on the brakes turns off cruise control in most cars. Since the human braked, did that turn Autopilot off? If so, then technically, if one were so inclined, this could be omitted from the list of Autopilot accidents, since the accident occurred while the human was in control. Technically.

                Of course, that'd be absurd, but that's why the saying is "lies, damn lies and statistics" and frankly, I wouldn't trust a marketing department not to do it.

                1. Alan Brown Silver badge

                  Re: It is autopilot but not autonomous

                  " Since the human braked, did that turn autopilot off?"

                  Since the human braked, why didn't he try to steer around the obstacle?

                  (Hint: if you watch the video, the twat was cruising down the road lane-hogging. One might call it karma.)

                  1. Peter2 Silver badge

                    Re: It is autopilot but not autonomous

                    Probably because the Tesla's Autopilot was driving, and the human, dulled into complacency by an "autopilot" doing the driving, wasn't paying as much attention to the road as he would have done if he was actually driving?

                  2. SImon Hobson Bronze badge

                    Re: It is autopilot but not autonomous

                    See the very first post on this. The driver would not have been very alert, as most of the stimulus that keeps a driver alert and aware of his surroundings has been removed - that's the whole point of the high-end adaptive cruise control. It makes no difference who says what or what it is called - this cruise control is designed and marketed as a way of letting the car take over a lot of the work and decision making, and that does mean that the driver is less involved in the task than he would be without it.

                    So you are cruising along, enjoying the scenery as you don't need to concentrate on the driving - the car is doing that for you. Then "what ?", "err ?", "oh sh*t !" - and the driver simply does NOT have the time to assess the situation, work out where the other vehicles are on the road, consider available exit strategies, pick one, and execute it. It's easy to sit in comfort, knowing what's going to happen, and watch a video - and say "what an idiot, all he needed to do was ..."

                    In the time available to him, getting the brakes on hard enough to emit smoke was pretty good going.

                    As an aside, one of the biggest problems in commercial aviation (well, apart from most of it being on the ground at the moment) is crew alertness. Short-haul flights probably aren't too bad, but on long haul it's largely a case of take off, wheels up, engage flight management - then sit back for a few hours, a few radio calls with ATC, perhaps a few course changes into the FMC, and wait till you arrive at your destination. There have been a number of occasions when flight crew have, let's remain polite, become distracted from the job of flying the aircraft - I recall one where the crew claimed to have been discussing rostering and failed to hear ATC calling them repeatedly as they over-flew their destination and carried on for a while, before turning round and flying back to where they were supposed to have landed.

                    AIUI, with the best of the management systems these days, the pilot can line up on the runway, then the management systems can take off, fly the route, and land at the other end with the pilot only required to brake and then taxi off. I doubt that it's done very often, but think of the challenge of staying alert for hours on end with that level of automation.

                    1. SImon Hobson Bronze badge

                      Re: It is autopilot but not autonomous

                      Oops, correction - it's the THIRD post in the comments.

      2. Alan Brown Silver badge

        Re: It is autopilot but not autonomous

        "I mean the autopilot has been in planes for decades."

        "Autopilot" in an aircraft ranges from something that will keep the wings level/heading constant (AND NOTHING ELSE - you have to watch your own altitude, etc) to something that can take off, route and land all by itself.

        As such it's a stupid name for a cruise control - Winnebago found that out a long time ago.

    2. Anonymous Coward Silver badge
      Facepalm

      Re: This is a driver assist not an autopilot !

      Autopilot is exactly a driver/pilot assist technology. No more, no less.

      In aviation terms, an autopilot will follow a heading at a set altitude until it is told otherwise (whether direct pilot input or a trigger from a pre-planned route). Aviation autopilots will also happily fly into another plane in their path if there is one (unless augmented with other systems, but it's then not just an autopilot).

      I'm sure they would also fly straight into an overturned truck floating in their flight path.

      1. vtcodger Silver badge

        Re: This is a driver assist not an autopilot !

        "I'm sure they would also fly straight into an overturned truck floating in their flight path."

        My understanding is that commercial aircraft tell the "driver" when they think he/she is about to fly into things. For mountains, it is the Terrain Awareness and Warning System (TAWS). For other aircraft it is the Airborne Collision Avoidance System (ACAS).

        Perhaps Teslae, and perhaps some other autonomous vehicles as well, need an obstacle awareness and warning system.

        1. ThatOne Silver badge

          Re: This is a driver assist not an autopilot !

          > Perhaps Teslae, and perhaps some other autonomous vehicles as well, need an obstacle awareness and warning system.

          Problem is the implementation, as there is much less clutter in the sky. The absence of landscape and constructions, plus full 360° visibility, means that an obstacle awareness system has a very easy task on a plane.

          Sky: "Object's trajectory intersects our trajectory, alert pilot and change heading/altitude till collision is no longer possible."

          Streets: "Big object I'm about to pass will soon swerve to avoid a small object, there is a turn without visibility coming up, there are road works in the opposite lane, yet I need to turn left, unfortunately there is a stopped object and other objects are swerving into my lane. Where did that small object appear from? Ah, it was hidden by that big, long object. The object in front of me stops, do I pass it or is there a traffic light somewhere?"

        2. The First Dave

          Re: This is a driver assist not an autopilot !

          From what I understand, collision avoidance on aircraft is an extension of the transponder tech that identifies them to ground-based radar. Basically, if one plane hears another transponder nearby, it tracks where that transponder claims to be, does the maths, and adjusts course if the track is too close.

          IIRC there was an exploit mentioned very recently that relied on this to turn another aircraft, by repeatedly lying (slightly) about the location of the attacking transponder.

          1. SJA

            Re: This is a driver assist not an autopilot !

            That didn't work in Überlingen where two planes crashed into one another:

            https://en.wikipedia.org/wiki/2002_%C3%9Cberlingen_mid-air_collision

            1. amateriat

              Re: This is a driver assist not an autopilot !

              That story is even more interesting than it appears on the surface:

              https://en.wikipedia.org/wiki/Vitaly_Kaloyev

            2. SImon Hobson Bronze badge

              Re: This is a driver assist not an autopilot !

              Ah, but the TCAS worked - its warnings just weren't followed.

              Bear in mind that at the time of this crash, TCAS was fairly new and it would appear that there was some confusion on the part of one crew as to whether to follow the TCAS or ATC. These days it's very clear - you follow TCAS and then tell ATC what you've done.

              As an aside, the reason TCAS uses climb/descend for Resolution Advisories is that vertical position is (or certainly was back then) a lot more precise than horizontal position. These days with extensive use of GPS and Mode-S, fully equipped aircraft know where they are to high accuracy and transmit this via Mode-S broadcasts. Other aircraft can pick these up and do the maths to gain accurate situational awareness.

              But back then, position was largely a case of "the signal came from that direction", which is not very precise, and "the signal was X strength", which is also very imprecise, as received signal strength depends on both the transmitted power and the orientation of both transmitting and receiving antennae. But with a properly calibrated pressure sensor, (relative) vertical position is fairly accurate - you don't need to know your height above ground or MSL (changes in atmospheric pressure change the relation between pressure and height), only the difference between yourself and the other aircraft.

  9. Anonymous Coward
    Anonymous Coward

    So the driver was in the fast lane of a highway, but wasn't paying attention to what was on the road ahead of him? That's the main cause of this accident, clearly.

    However, we do seem to keep hearing about these incidents happening with Teslas in Autopilot mode. It's almost as if anything in between full human manual control of the vehicle and full autonomous control of the vehicle is a bad idea, as it can cause the driver to stop paying attention.

    1. Anonymous Coward
      Anonymous Coward

      Would you say there are more incidents involving (claimed) autopilot or with other vehicles and drivers sleeping or texting at the wheel?

      They'd be interesting stats, but I suspect that humans will work out more fallible than the Tesla, and certainly more fallible than a Tesla on Autopilot which is used properly (i.e. car driving, human paying attention and not overriding the detection warning system).

      1. ThatOne Silver badge
        Devil

        > I suspect that humans will work out more fallible than the Tesla and certainly more fallible than a Tesla on autopilot which is used properly

        So, a well-behaving AI is better than a misbehaving human?... Well, while we're at this kind of logic, I also bet a living snail is much faster than a dead cheetah.

    2. mevets

      almost as if?

      Nothing almost about it. Advanced Driver Assistance Systems give themselves away with the first word -- if they were truly Advanced they wouldn't need to try to convince you that they were; they just would be. That they don't work when they are most needed, i.e. in poor driving conditions, is reason enough to suspect them. Why reduce a driver's skills only to betray them when they need those skills the most?

      They do help increase the sticker price, and ensure that the vehicle's lifetime is substantially shortened.

    3. SJA

      You keep hearing this for Teslas because of two things:

      a) "sexy" electric cars are still a novelty and there are so few crashes that every single one of them is blown out of proportion;

      b) with traditional cars this happens so often, it's not even newsworthy anymore.

      1. Anonymous Coward
        Anonymous Coward

        Dear SJA, you should just post this:

        "All Lives Matter".

  10. Mike 137 Silver badge

    "...not paying attention"

    That's actually an offence in the UK if you happen to be driving at the time, and with damned good reason. If you want to daydream en route, take a taxi.

  11. dvd

    I read a very good analysis of the Tesla accident where the car ploughed into the side of the truck.

    The explanation, as far as I remember, was that when a human encounters a novel situation they will take care, slow down, be suspicious, whatever. Whereas an AI only has its training data, so it will always just pick the best fit from that data.

    I'm willing to bet that the training data has no images of trucks on their sides. So the AI's best fit was an overpass or something else that it had seen before. Bang.

  12. TheProf
    WTF?

    Flip

    How did the driver of the truck manage to put it on its side? The road doesn't look particularly dangerous.

    1. RPF

      Re: Flip

      You clearly haven't seen Chinese driving before :-)

    2. Anonymous Coward
      Anonymous Coward

      @TheProf - Re: Flip

      How about wind? In this part of the world where I happen to live, it's not at all unusual to see large empty trucks having problems with wind gusts.

  13. 45RPM Silver badge

    Perhaps if Tesla spent less time on easter eggs, gags and games, and more time on safety, issues like this could be resolved. In the short term at least, it seems to me that they need to put more effort into checking driver awareness and disengaging automatic cruise control functionality if the driver is not paying attention whilst, at the same time, limiting the top speed to something inconveniently slow.

    They'd better act quickly though - Volvo, through Polestar, is coming to eat their lunch - and Volvo really does understand safety (and the build quality could teach Tesla a thing or two too).

  14. Anonymous Coward
    Anonymous Coward

    GIGO.

    If you train an AI model only on perfect driving... your AI won't know what to do when it "sees" (it won't, as it's not trained on it, but it will get a feed from the cameras/lidar/sonar) an overturned truck.

    "There is an over turned truck in front of you."

    AI, blankly "OK, so I continue on into it?"

    At least with the rocket science they have a scope/outline of all possible outcomes, even unlikely ones, and try to cover all the failure modes. With self-driving cars, I've yet to see anyone cover the basic failure modes, let alone an exhaustive set!

    1. You aint sin me, roit
      Stop

      Re: GIGO.

      As someone pointed out above, the human driver in the middle lane did notice the lorry and both slowed and changed lane so that the Tesla had an escape route. I doubt that another Tesla on Autopilot would do that - is its AI taught to be considerate, or would it just carry on regardless? "The road ahead of me is clear, continue at speed limit."

      Equally, would the first Tesla, if it had detected the lorry, just slam on the anchors? Or would it consider other options?

      And then you get into philosophical realms... if it can't stop should it plough on, or should it consider a manoeuvre (switching lanes) that might put others at risk?

  15. c1ue

    The subset of Tesla fanbois is pretty interesting: they're all trying to redirect by saying humans cause accidents too/more accidents.

    Except that the problem isn't humans causing accidents - it is Teslas on Autopilot causing accidents where a human would not have.

    Isn't the whole point of autonomous driving that it is better? And therefore numerous and public examples of the opposite are a serious problem?

    1. FeepingCreature Bronze badge

      "Better" and "worse" are not a one-dimensional spectrum. The Autopilot (apparently) causes less, but different kinds of accidents. It's "superior on net", not "strictly superior."

    2. batfink

      No, that's false thinking. The responsibility is clearly with the drivers.

      This is like saying "the plane caused an accident when it flew into the mountain because the pilot wasn't paying attention".

      It is very clear that it's the driver's responsibility to avoid accidents. If the driver is stupid enough to think they don't need to pay attention, that doesn't mean the car is "causing" accidents.

      Yes, Tesla's marketing shouldn't be calling it "autopilot". However, how dim do you have to be to think that means you have to do fuck-all?

      Yes, there's a problem of attention if you're not making inputs.

      But, the bottom line is that these are NOT fully autonomous vehicles.

      So, stop trying to blame the car, and start blaming the fuckwits who are driving them. Just like the fuckwits who do stupid things in ordinary cars.

      1. druck Silver badge

        But it's the car which is allowing the fuckwit to achieve even higher levels of fuckwittery.

      2. SImon Hobson Bronze badge

        The responsibility is clearly with the drivers

        Technically yes, but as I've already explained in another post, when you reduce stimulation, driver alertness will reduce. As the first poster pointed out, another car driver spotted the obstacle a reasonable distance out and took avoiding action. But that driver would have had situational awareness already - not taken time to acquire it when the "oh sh*t" moment happened.

        So yes, there is a big problem with these advanced cruise controls - it is inevitable (basic human factors) that even the best driver will be less aware and alert when the car is cruising along under "autopilot". It's not the "fault" of the driver; it's basic human factors that make this inevitable. The only thing under driver control is how much effort (yes, positive effort) he puts into keeping alert, and then you get into the question of "if you are expending that much mental effort, why not just drive the darn thing yourself?"

  16. Roger Kynaston
    Mushroom

    what is really scary

    The IMO is rushing towards enabling autonomous ships at the moment. I went to a talk where the speaker was getting all excited about how a GoPro-type camera can be hooked up to a 'puter and will have incredible target discrimination. I envision lots of fishing/pleasure boats being mowed down.

    1. Steve Foster
      Joke

      Re: what is really scary

      It's the lighthouses that won't get out of the way that'll really get hurt!

    2. Francis Boyle Silver badge

      The first rule

      of operating small vessels is and will always be "Stay the fuck away from big ships".

  17. Anonymous Coward
    Anonymous Coward

    But he missed the HUMAN on the road too

    I don't know if you noticed, but there was a PERSON on the road as well who tried to wave him down - I suspect that's why the Tesla initially slammed on the brakes.

    As for visibility of the truck, I don't think we can judge that without video from the driver's perspective. As far as I can tell, the sun was at 9 o'clock for the driver, which may have played a part, but I suspect the man was not paying attention at all, given that he didn't slow down after seeing a man in the lane.

  18. aks

    In the video, there's a human about 100 yards in advance of the truck who's attempting to wave down the car.

    Neither the human nor the car seem to take any notice.

  19. Anonymous Coward
    Anonymous Coward

    A good point

    ...is that the driver was able to walk away (according to the story, though I'm not sure what was on what looked like a stretcher being carried away in the video still).

  20. Anonymous Coward
    Anonymous Coward

    He braked. He also had 2 other lanes to swerve into to avoid the truck.

    1. Cederic Silver badge

      I fear the point isn't that the driver braked or failed to swerve, it's that the autonomous driving software in control of the vehicle didn't brake (or swerve, or even apparently warn the driver).

      People that dislike Elon Musk's disingenuous marketing find this amusing, people that love the brand make excuses for it and people that want a fully autonomous car they can safely sleep in while driving sigh in disappointment.

      I'm in two of those groups.

      1. SJA

        What disingenuous marketing are you referring to?

        1. Anonymous Coward
          Anonymous Coward

          You are tiring.

        2. Anonymous Coward
          Anonymous Coward

          Do you vote and/or like Trump?

    2. amateriat

      The Tesla *also* drove into the roof of the trailer; had the car driven into the *underside*, the aftermath would likely have been grimmer.

  21. Nifty Silver badge

    I'm suspicious of the white roof against a white sky excuse. Surely the outline of the overturned truck could be seen against the road by the Tesla's cameras. The outline would've been getting larger.

  22. Alan Brown Silver badge

    Duh....

    It's an advanced cruise control.

    More importantly, the fucking owner's manual EXPLICITLY STATES that the car's autonomous systems CAN NOT DETECT AND STOP IN TIME FOR STATIONARY OBJECTS IN THE VEHICLE'S PATH when travelling in excess of 50mph (80km/h for the sensible).

    There are too many Tesla owners out there who clearly got their driving licence out of a packet of Rice Krispies.

    1. Boris the Cockroach Silver badge
      Happy

      Re: Duh....

      There's something wrong with your final statement.

      Quote:

      There are too many car owners out there who clearly got their driving licence out of a packet of Rice Krispies.

      There, fixed it for you.

  23. Anonymous Coward
    Anonymous Coward

    Since there is no pilot in the car there cannot be such a thing as an autopilot in the car.

  24. amateriat

    “As a species, human beings do not switch in an emergency from being observers to being proactive very easily, and babysitting automatic systems probably is pretty toxic to your situational awareness.” - Tom Wisker, 2011
