Man killed in gruesome Tesla autopilot crash was saved by his car's software weeks earlier

An investigation was launched today after the driver of a Tesla was killed in what is believed to be a malfunction of the car's autopilot. Joshua Brown, a 40-year-old Ohio man, was killed on May 7 while driving his 2015 Tesla Model S on Route 27 in Florida. The car was using the optional autopilot system, which controls the …

  1. Mark 85

    I guess my concern is "did he become complacent?". Hell, I've seen drivers hit the cruise control and basically just hang on for the ride and on the highway, it's real easy to be distracted unless one is making an effort to pay attention to the road. These autonomous systems are not completely autonomous yet and I believe they still have a ways to go. If the sensors failed to sense the trailer, why didn't the driver?

    From his previous encounter, he says he wasn't actually watching that side of the car. Chances are, if he were in full control, he would have.

    Yes, I read the article and others... other drivers slowed and avoided the truck.

    My sympathies to his family. He was too young to go.

    1. Anonymous Coward

      > Yes, I read the article and others... other drivers slowed and avoided the truck

      Thanks for pointing that out. Instead of "biting the hand.." as usual, this time El Reg left out some minor details that look bad for this technology.

      - This "divided highway" isn't a freeway, it has intersections.

      - The truck made a left turn from the oncoming left turn lane (think right turn, Brits).

      - He should have seen it and anticipated that the trucker might turn in front of him.

      - Time was 3:40pm on May 7 so glare likely wasn't an issue.

      - "White truck, white sky" is the lamest bullshit excuse...

      P.S. some of the other articles say a lot of other autopilot users reported that complacency is a huge problem, and the autopilot occasionally does really dangerous things like switching off during a lane change. Personally I would feel safer with a texting drunk driver at the wheel than ANY 'autopilot' or self-driving car. AI is bunk.

      1. Charles 9

        "- Time was 3:40pm on May 7 so glare likely wasn't an issue.

        - "White truck, white sky" is the lamest bullshit excuse..."

        Unless the car was facing west, meaning the car was oriented toward the sun. I don't know of too many sensors yet that can properly handle sun-blindness.

        1. Eddy Ito

          Sorry, sensors shouldn't have to handle sun-blindness, drivers should. It's a simple case of the mistaken belief "I don't have to do my job because 'tech' will do it for me".

          1. Alan Brown Silver badge

            "sensors shouldn't have to handle sun-blindness, drivers should."

            Drivers can't(*) - and if a trucker pulls out across traffic flow he's supposed to be giving way to, then there's a huge degree of culpability.

            (*)Sunstrike is a large factor in crashes in the half hour before sunset and after sunrise. Silver/white/grey vehicles are disproportionately involved in them.

        2. Oengus

          Time was 3:40pm on May 7 so glare likely wasn't an issue

          "The FHP said the tractor-trailer was traveling west on US 27A in the left turn lane toward 140th Court. Brown’s car was headed east in the outside lane of U.S. 27A."

          Even if he was headed west at 3:40 PM in Florida the Sun would have been high enough in the sky to not be a glare issue.

          1. Charles 9

            Re: Time was 3:40pm on May 7 so glare likely wasn't an issue

            Unless the truck's white side was highly-reflective, creating a mirror effect.

            1. Anonymous Coward

              Re: Time was 3:40pm on May 7 so glare likely wasn't an issue

              These cars will be using mostly radar and ultrasonics, not cameras.

        3. Anonymous Coward

          Sensor failure

          "Unless the car was facing west, meaning the car was oriented toward the sun. I don't know of too many sensors yet that can properly handle sun-blindness."

          I'm pretty sure radar and laser systems aren't bothered by sun glare. The question is why do Tesla apparently rely solely on cameras and visual recognition systems, when they should be using a belt-and-braces approach for something this safety-critical?
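The "belt and braces" point reduces to a simple OR-gate over independent sensors. A toy sketch (the function and sensor names are hypothetical, not Tesla's actual logic):

```python
def obstacle_ahead(camera_sees, radar_sees, lidar_sees=None):
    """'Belt and braces' sensor fusion: treat the path as blocked if ANY
    sensor reports an obstacle, so a dazzled camera cannot veto the radar.
    A real system would weight detection confidences; this is only the
    OR-gate idea from the comment above."""
    readings = [camera_sees, radar_sees]
    if lidar_sees is not None:
        readings.append(lidar_sees)
    return any(readings)

# A white trailer against a white sky: the camera misses it, the radar
# (unaffected by glare) does not - the fused verdict is still "blocked".
print(obstacle_ahead(camera_sees=False, radar_sees=True))  # True
```

The trade-off is a higher false-alarm rate: any one noisy sensor can trigger braking, which is presumably why production systems cross-check sensors rather than OR them blindly.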

        4. JeffyPoooh

          "...don't know of....sensors yet that can properly handle sun-blindness."

          Radar. ...Like, obviously.

          With some effort, one can design a radar receiver that can detect in-band RF noise from the Sun. But it's much easier to build a radar that works perfectly normally even when it's boresighted on the Sun in the background.

          The essential 'System Design 101' requirement is to remember that the vehicle is about 4 or 5 feet tall. The system design needs to consider the vertical clearance along the proposed route, not shoot radar under the truck and thus crash into it.
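The vertical-coverage point can be illustrated with a little trigonometry. All the numbers below are hypothetical for illustration, not Tesla's specs:

```python
import math

def beam_top_height(mount_height_m, elevation_beamwidth_deg, range_m):
    """Height of the upper edge of a level, bumper-mounted radar beam
    at a given range, given its vertical (elevation) beamwidth."""
    half_angle = math.radians(elevation_beamwidth_deg / 2)
    return mount_height_m + range_m * math.tan(half_angle)

# Hypothetical figures: radar mounted 0.5 m up with a 4-degree elevation
# beamwidth; a trailer floor roughly 1.2 m above the road surface.
TRAILER_FLOOR_M = 1.2
for rng in (20, 40, 60):
    top = beam_top_height(0.5, 4.0, rng)
    verdict = "sees the trailer" if top >= TRAILER_FLOOR_M else "passes underneath"
    print(f"at {rng} m the beam top is {top:.2f} m -> {verdict}")
```

With these made-up numbers the beam only reaches trailer height beyond about 20 m, so a high-sided obstacle can drop out of radar coverage exactly as the car closes on it - the "shoot radar under the truck" failure mode.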

        5. twilkins

          Thought the Tesla had radar?

        6. Dazed and Confused

          Re: sun-blindness

          > I don't know of too many sensors yet that can properly handle sun-blindness.

          If they can't, then they shouldn't be used in this application. Sunshine (even in Blighty) isn't an unusual phenomenon. This is something that should have been tested for.

        7. Anonymous Coward

          I don't know of too many sensors yet that can properly handle sun-blindness.

          Does anyone know if it's just a standard video camera?

          Maybe an HDR camera would help in this situation - it would be more like the human eye.

        8. Anonymous Coward

          Unless the car was facing west

          At 3:40pm on May 7, even facing west - glare shouldn't have been a problem.

      2. Voland's right hand Silver badge

        More likely "Clark Griswold driving"

        "White truck, white sky"

        My old car (now my wife's car) is shiny bright blue metallic. It has been pranged twice with the other person not seeing it because it becomes literally invisible at some angles in a bright (and especially low) sun.

        In any case, based on the description of the trailer and the crash, I suspect a "National Lampoon's Christmas Vacation" incident, just one without a happy ending.

        https://www.youtube.com/watch?v=ozksR8QLWzM

        The autopilot did not see anything on the side because there was nothing to see - it was _UNDER_ the level of the tractor trailer and not looking up, but looking at the gap between the front and the back.

        None of it excuses the driver from being complacent too by the way.

        1. Anonymous Coward

          Re: More likely "Clark Griswold driving"

          Maybe the radar needs to point slightly higher, too.

          Of course, that may cause EM emissions problems, but may provide some protection against birdstrikes or hitting giraffes.

      3. MrXavia

        "- He should have seen it and anticipated that the trucker might turn in front of him."

        I partially agree: he should have seen it and realised the guy might be an idiot and pull out. The accident still sounds like the fault of the trucker to me...

        1. sabroni Silver badge

          er: White truck, white sky

          Why is it only looking at the picture from a camera? Shouldn't there be some kind of object sensing sonar/radar that bounces off trucks of any colour?

          Basing automatic driving on a single mechanism seems incredibly stupid, however clever the image processor is...

          1. Anonymous Coward

            Re: er: White truck, white sky

            My thoughts entirely - I can't believe this system could be set up in such a way. Surely that makes it useless against black trucks at night, green trucks in heavily forested areas, grey trucks in the rain? For something as safety-critical as this, at a minimum, it should use radar and IR. Do we really have the full story here?

          2. Brewster's Angle Grinder Silver badge

            Re: er: White truck, white sky

            @sabroni On the radio this morning, Noel Sharkey said the Tesla has both radar and ultrasound, but that they point down and would have missed the trailer because it was so high. He was pretty critical of the Tesla having such holes in its sensor coverage and said another, German manufacturer has complete coverage.

          3. S4qFBxkFFg

            Re: er: White truck, white sky

            "Why is it only looking at the picture from a camera? Shouldn't there be some kind of object sensing sonar/radar that bounces off trucks of any colour?"

            It has a forward-facing camera (top of the windscreen, near the mirror), forward-facing radar at the nose, and 12 short-range (5 metres) ultrasonic sensors all around the car. *

            Note that apparently the camera is used for lane-keeping and detecting speed limits - I am not sure if it is actually used to detect traffic and obstacles.

            At a guess, I would say the radar beam is too narrow in its y-axis (perhaps it isn't even 2-D) - it perhaps saw a gap underneath the trailer and thought "all clear", at the same time as the camera software was fooled by the lighting conditions, or wasn't even checking for things like the trailer.

            This may not necessarily have been a crazy way to design the car's software - how many of us would think that it should be able to detect whether it can do a limbo manoeuvre or not?

            * This was taken from an owner's post on a forum - not claiming it's reliable.

          4. Jaybus

            Re: er: White truck, white sky

            There is. That Model S has one front-facing radar, one front-facing near/far IR optical camera, and 360-degree ultrasonic sensors. Not nearly enough for autonomous driving, although to Tesla's credit, they do not advertise autonomous driving, or even semi-autonomous driving, but rather autonomous steering. Huge difference! After all, the car did indeed steer straight down the highway. The ability to avoid the unfortunate accident was beyond its design parameters.

          5. Anonymous Coward

            Re: er: White truck, white sky

            Why is it only looking at the picture from a camera? Shouldn't there be some kind of object sensing sonar/radar that bounces off trucks of any colour?

            There is, but it was looking for cars on the road, not the gap between a trailer and the road (radar didn't go high enough).

            Sonar would have to be very loud (loud enough to kill birds?), or wouldn't be very useful.

        2. Anonymous Coward

          I partially agree, he should have seen it, and realised the guy might be an idiot and pull out, accident still sounds to be the fault of the trucker to me....

          Given how fast a full-sized truck accelerates, it pulled out way beforehand.

          The trucker, not the autopilot was at fault for not yielding the right-of-way.

          1. Kiwi

            Given how fast a full-sized truck accelerates, it pulled out way beforehand.

            The trucker, not the autopilot was at fault for not yielding the right-of-way.

            If something is blocking the road ahead, slow down or crash.

            If you can't see if something is blocking the road ahead, slow down or die.

            If you have any doubts about your ability to see clearly ahead, you're driving too fast.

      4. IsJustabloke

        @tnovelli

        "Personally I would feel safer with a texting drunk driver at the wheel than ANY 'autopilot' or self-driving car."

        sigh.....

      5. Just Enough

        Machine Learning

        " Personally I would feel safer with a texting drunk driver at the wheel than ANY 'autopilot' or self-driving car. AI is bunk."

        Can confirm. AI was driving my car the other day when it leaned over to me and burped with a beery breath, "Whatch thish..". It then attempted to take a corner at 80mph and spun into a ditch. Then it messaged Facebook "oopz i crased, lol" with a dashboard video attached, submitted a crash report to the manufacturer, notified the emergency services and then fell into a drunken slumber.

        This is learnt behaviour and so much worse than any texting drunk driver.

      6. Marshalltown

        The facts - just the facts

        Looking at the other stories covering this, the "white side" of the truck is irrelevant. It looks as if the truck driver might be found partially at fault (in my state he definitely would be). He could see oncoming traffic that was clearly moving fast enough to be a hazard and decided to make the turn or pull across the road anyway. The turn is evidently not a signaled intersection, so the rules of the road require the truck driver to make a "safe" turn, which means he can't rely on the kindness and alertness of strangers to handle safety for him. Getting hit by oncoming traffic, even traffic on autopilot, shows that he failed to judge the time it would take to cross the road, or failed to wait until the turn was truly "clear". You see behaviour like that often, where one driver gets impatient and grabs the intersection regardless of safety.

        1. Anonymous Coward

          Re: The facts - just the facts

          > You see behaviour like that often

          It's totally routine in New England. In the more congested areas, you can't get anywhere without breaking a few laws. Autopilot should be disabled if GPS says you're in this region. Maybe it is and that's why we haven't heard about any autopilot fails around here.

      7. c3me

        drunk texting drivers vs AI.... and superiority bias

        It would be interesting to test that hypothesis by knowing the deaths-per-million-miles statistic for the subset of drivers who habitually text whilst driving under the influence of alcohol (or other drugs). I fear all we may find is additional evidence confirming the existence of superiority bias, and that it may be even higher amongst those who believe it is acceptable to impair their abilities or divert their attention unnecessarily whilst driving.

    2. Anonymous Coward

      Yes, I read the article and others... other drivers slowed and avoided the truck.

      This is actually one of the things that an (alert) driver has over an AI - you don't just watch traffic, you also watch for anomalies. I have a simple algorithm for driving at motorway speeds in that I have to be able to scan well ahead, and by that I don't mean two cars in front of me, it means several hundred yards, and all the lanes, not just the one I'm in. If I can't do that I don't feel I'm driving at a safe speed (and I do a fair bit of German Autobahn, I'm used to high speeds).

      In this case, it would have helped the AI if it had been able to detect that there were unexplained avoidance manoeuvres ("unexplained" as the AI had not picked up on the truck), which should have been cause to alert the driver. This situation may improve in the future when auto-drive cars are able to signal each other (which also widens sensor range), but for now, a human driver would have picked up on the flow anomaly caused by people swerving around an object, even if the object itself wasn't that visible. I suspect that will be a hard one to embed in an AI, but if they could it would certainly improve its ability to spot problems - even if it's just to wake up the driver.
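The "flow anomaly" idea can be sketched crudely: if several tracked vehicles make an unexplained lateral deviation around the same stretch of road, alert the driver. This is a toy illustration with an invented data format, not any shipping system:

```python
def flag_flow_anomalies(tracks, lateral_threshold_m=1.0, min_vehicles=2):
    """Return road positions where several vehicles swerved.

    tracks maps a vehicle id to its observed path as (longitudinal_m,
    lateral_m) samples. A position is flagged when at least min_vehicles
    deviated there by more than lateral_threshold_m from their own mean
    lateral position - i.e. traffic is steering around something.
    """
    swerves = {}  # 10 m bucket of road -> ids of vehicles seen swerving
    for vid, path in tracks.items():
        mean_lat = sum(lat for _, lat in path) / len(path)
        for lon, lat in path:
            if abs(lat - mean_lat) > lateral_threshold_m:
                swerves.setdefault(round(lon, -1), set()).add(vid)
    return sorted(pos for pos, vids in swerves.items()
                  if len(vids) >= min_vehicles)

# Two cars ahead both jink left near the 100 m mark: flag it even though
# our own sensors report nothing there.
tracks = {"a": [(0, 0.0), (50, 0.1), (100, 2.0), (150, 0.0)],
          "b": [(0, 0.2), (50, 0.2), (100, 2.2), (150, 0.1)]}
print(flag_flow_anomalies(tracks))  # [100]
```

A single swerving vehicle is ignored (it may just be drifting), which is the "unexplained by one driver, confirmed by several" intuition from the comment above.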

      Personally I'm not comfortable with using something labelled "beta" at motorway speeds, but each to their own - if others didn't use it we would not get live data. However, I've seen a few sudden glitches that make me avoid the idea. Assist, fine, control, no, not yet, thanks.

      1. AndrewDu

        Yes, you're right AC.

        Anticipate, anticipate, anticipate.

        Watch the road as far ahead as you can see - any sign of brake lights is a warning signal.

        When driving on motorways I plan ages ahead if I can: "I'll overtake that truck before the fast guy in the BMW catches up with me, then I'll move into that space there, just in time to move out again before the slow guy who's gradually catching up blocks me in behind the next truck..." and so on and on. Result smooth steady progress, no panics unless really necessary, nobody else held up too much.

        This is what human drivers can do, which AI is really going to struggle with imho.

        Truly autonomous vehicles are much further in the future than people think.

        1. Anonymous Coward

          @AndrewDU

          That's because you have had years of experience on the road behind the wheel, by the sounds of it.

          Experience and the ability to predict other drivers' actions is a skill that cannot be taught; it is learned, usually over 10-20 years of driving.

          1. Alan Brown Silver badge

            Re: @AndrewDU

            "Experience and the ability to predict other drivers' actions is a skill that cannot be taught; it is learned, usually over 10-20 years of driving."

            It can be taught, but it requires the recipients have a will to learn. Most monkeys are far too impatient for that shit.

            The fact that it _can_ be taught means that AIs can learn it, which means that their driving ability and anticipation factors will improve. They're already better than most drivers most of the time(*)

            (*) Most drivers only think 3-4 seconds ahead at most, which explains the infamous 3 second cycle (3 seconds on the loud pedal, 3 seconds off it - and bus drivers are particularly bad for this habit(**)). Training them to look 12 seconds ahead and think 30 seconds ahead is what gets much smoother traffic flows and less annoyed passengers.

            (**) Now you're aware of this phenomenon, you'll notice it more often. It occurs most often on the open road, where the driver "hunts" on either side of the speed limit rather than using more gentle throttle movements to keep more-or-less on speed. Motorcyclists are more likely to exhibit this kind of behaviour when driving cars, for some reason.

            1. Charles 9

              Re: @AndrewDU

              "The fact that it _can_ be taught means that AIs can learn it, which means that their driving ability and anticipation factors will improve. They're already better than most drivers most of the time(*)"

              Do we have reason to believe road smarts is something that CAN be taught rather than something intuitive we just pick up without realizing it, meaning we don't know HOW we know it and therefore can't pass it onto a car system?

              1. Teddy the Bear

                Re: @AndrewDU

                Yeah - the IAM train people in how to improve their driving significantly, mainly by anticipating what's going on a long way ahead, and focusing on small, smooth inputs and corrections. It's recognised by insurance companies to reduce accident rates - so if you pass the IAM test, you will get cheaper insurance because you're proven to be a safer driver.

            2. I am the liquor

              Re: @AndrewDU

              "Motorcyclists are more likely to exhibit this kind of behaviour when driving cars, for some reason."

              Probably because they're accustomed to feeling/hearing the engine RPM, which a motorcycle rider will rely on to hold a perfectly steady speed without ever looking at the instruments. In some cars these days it's pretty hard to tell if the engine is even running, never mind what RPM it's at.

            3. Kiwi

              Re: @AndrewDU

              Motorcyclists are more likely to exhibit this kind of behaviour when driving cars, for some reason.

              There's a number of factors, including listening to the engine, as mentioned above. The size, relatively pitiful responsiveness and relatively poor handling of cars can also contribute, as would driving a vehicle one is not used to, especially as the controls are in quite different places (for the uninitiated, most bikes have hand controls for the throttle, clutch and front brake, with foot controls for the gears and rear brake).

        2. Anonymous Coward

          "Truly autonomous vehicles are much further in the future than people think."

          This. We seem to have leapt into self driving vehicles without me ever noticing that bit where software became provably reliable, developers think like safety engineers, and humanity developed something akin to 'AI'.

          Do you trust industrial software development? The Toyota unintended acceleration case showed the quality of one major vehicle manufacturer: over 2,000 global variables, nearly a thousand buffer overrun vulnerabilities and much more. And that was industrial safety-critical vehicle software. The behavior of Volkswagen and its emissions-faking software also makes me question the VW development processes.

          As well as being able to drive as safely as a human, car-driving software needs to be bug-free. Given that software shipped by some car manufacturers has been pwned by playing MP3s, I think we have some way to go before their software can be relied on enough that we can start thinking about how well we trust the AI.

          I will trust self driving AI when it and the vehicle's software is open sourced and the AI can autonomously pass a human driving test ;)

          "OK KITT, I want to you to drive until you find a safe spot and perform a turn-in-the road."

        3. Alan Brown Silver badge

          "This is what human drivers can do"

          However it's not what most human drivers _DO_, which is why motorways end up with mysterious tailbacks with no crash in sight.

        4. Anonymous Coward

          Anticipate, anticipate, anticipate.

          Watch the road as far ahead as you can see - any sign of brake lights is a warning signal.

          So, the exact opposite of drivers in the U.S.A., then?

          1. Anonymous Coward

            Visible light cameras only?

            There is a lot more spectrum than the portion humans see, the car should be looking in those as well as using some form of radar or sonar (or better yet both) so "white car, white sky" should NEVER be a problem.

            If it is, then Tesla's autopilot isn't fit for purpose and shouldn't be used because it is obvious that people will become complacent and not pay attention when it is in use regardless of Tesla's instructions on the matter.

      2. mdava

        "Personally I'm not comfortable with using something labelled "beta" at motorway speeds"

        This is exactly my thought. It seems insanely reckless to do so.

    3. Yugguy

      Indeed. Like many of these systems, they should be viewed as an AID to, and not a replacement for, your own senses.

    4. Anonymous Coward

      So... the truck driver heard audio from a Harry Potter movie coming from the Tesla post-crash, and police found a portable DVD player in the vehicle... looks like the driver may have not been paying attention.

    5. Voland's right hand Silver badge

      Updated: He was watching Harry Potter

      The only comment is that if that was on the car's main entertainment screen, Tesla is in seriously hot water. Federal safety regulations and car insurers explicitly prohibit movie playback on the front screen of in-car entertainment systems while moving (this is why the aftermarket ones have the hand/parking brake sensor).

      1. Anonymous Coward

        Re: Updated: He was watching Harry Potter

        Was it Harry Potter and the Deadly Tesla of Elon Muskevort?

    6. Anonymous Coward

      Did he become complacent?

      Well, apparently he made a habit of posting videos taken when he should have been driving, one showing him with both hands on his thighs letting the car drive itself.

      At the time of this crash he is said to have been watching a movie on a portable DVD player.

  2. JustWondering

    Yes but ...

    "Tesla said this is the first autopilot-related fatality in 130 million miles driven by its autos."

    But how many of those miles were on autopilot?

    1. Anonymous Coward

      Re: Yes but ...

      And how many were on a test track?

      2. Anonymous Coward

        Re: Yes but ...

        ...and how, exactly, do Tesla know that figure? I wouldn't have one unless you could turn off all the data going back to the mothership.

        1. allthecoolshortnamesweretaken

          Re: Yes but ... / telemetry

          Well, regarding telemetry, Teslas are a bit like Windows-10-on-wheels:

          MIT Technology Review - Tesla knows when a crash is your fault

          But you can also diddle your driving data yourself: TeslaMS tools for telemetry data visualization

    2. Random Handle

      Re: Yes but ...

      >But how many of those miles were on autopilot?

      All of them - last time they released figures (AFAIK) was when customer autopilot activated miles hit 100 million in May. Probably higher as not all owners choose to share their logs with Tesla.

    3. Schultz

      Yes but ... this process will make autopilots safer

      The cause of the accident will be investigated and the sensors or software will be updated to ensure this type of accident will not occur again. Ultimately, this will make autopilots safer drivers than humans -- because humans are not very prone to learning from other people's mistakes.

      We humans are vain and may not like to hear the truth, but on average we are lousy drivers. And I am not talking about that annoying lady that cut you off last week, I am talking about you when you [switched lane without looking | almost hit that bike | drove after that late night beer | ...].

      I know you guys like to drive your cars, so bring on the shitstorm. But consider the statistics: "You have a one in four chance of being in some kind of auto accident within any five-year period. You have a 30% chance of being in a serious car crash in your life. You have one chance in 98 of dying in an auto accident in your lifetime."
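Taking those quoted figures at face value, one can back out the implied annual risks, assuming (crudely) that years are independent - an illustrative assumption, not a sourced statistic:

```python
# "One in four chance ... within any five year period": solve
# (1 - p)**5 = 1 - 0.25 for the implied annual accident probability p.
p_annual = 1 - (1 - 0.25) ** (1 / 5)

# "One chance in 98 of dying in an auto accident in your lifetime",
# spread over a hypothetical 60 years of driving.
p_fatal_annual = 1 - (1 - 1 / 98) ** (1 / 60)

print(f"implied annual accident risk:  {p_annual:.1%}")    # ~5.6%
print(f"implied annual fatality risk: {p_fatal_annual:.3%}")
```

Even a roughly 5% annual chance of some kind of accident supports the poster's point: on average, driving is riskier than most drivers assume.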

      1. Ralph B

        @Schultz Re: Yes but ... this process will make autopilots safer

        > The cause of the accident will be investigated and the sensors or software will be updated to ensure this type of accident will not occur again.

        Would this mean Tesla will start using LIDAR like the Google self-driving cars then? The current Tesla solution employing only cameras (subject to dazzle), ultrasonic sensors (short range) and radar (too directional) would otherwise appear to me to be lacking the technical means to ever avoid accidents such as this one.

        1. JeffyPoooh

          Re: @Schultz Yes but ... this process will make autopilots safer

          RB "radar (too directional)"

          Radar can be as directional or not as you wish. Wider angle means less antenna gain, but there are plenty of signal processing tricks to provide significant coding gain ratios.

          Systems Engineering 101 indicates that the beam should at least probe whether the gap ahead is big enough for the vehicle to fit through.

          Something has gone very wrong in the system design process.

          This crash is very revealing. It exposes a disturbing and dirty truth about such self-driving vehicles.

        2. Sporkinum

          Re: @Schultz Yes but ... this process will make autopilots safer

          My first thought was that LIDAR, or some other active sensor type, would have not been confused like a passive camera. I would guess that Tesla will make a software change that requires more active participation by the driver. More frequent hands on wheel checks, etc.

      2. Anonymous Coward

        Re: Yes but ... this process will make autopilots safer

        "You have a 30% chance of being in a serious car crash in your life. You have one chance in 98 of dying in an auto accident in your lifetime."

        I've been in a serious crash caused by me (no excuses, I was young and stupid), but at the last second I knew I'd fucked up badly and prepared - head back, hands off wheel. If an auto-driven car fucks up while you're reading a book, you don't even get that luxury.

      3. I am the liquor

        Re: Yes but ... this process will make autopilots safer

        "You have one chance in 98 of dying in an auto accident in your lifetime."

        What's the chance of dying in an auto accident outside your lifetime?

        It seems we're much better off in the UK from that point of view. Here only around 1 in 300 deaths is by road vehicle accident.

      4. Public Citizen

        Re: Yes but ... this process will make autopilots safer

        One of the principal reasons for the abysmal statistics is the low standard for licensing.

        Think about this for a minute:

        80% of the injuries and fatalities are caused by 20% of the drivers.

        We could more than halve the injury and fatality rates simply by denying the lowest 10% of those licensed the ~privilege~ of a motor vehicle operators license [until they can demonstrate a higher level of competency] and toughening the penalties [such as automatic jail time for a first offence] for those who drive [and cause accidents] without a license.

        If this seems harsh, consider how much you personally pay every year for insurance, the bulk of the premium being used to pay out claims caused by the Bad Drivers.

        The other area that would show an immediate improvement is adequately dealing with the habitual drunks who get liquored up and then get behind the wheel of an automobile. I've been involved in two major accidents caused by drunk drivers, one of which left me with permanent debilitating injuries that have affected not just me but my family for nearly 10 years now. 70% of multi-vehicle accidents [in the US] have alcohol involvement.

        1. John Brown (no body) Silver badge

          Re: Yes but ... this process will make autopilots safer

          "toughening the penalties [such as automatic jail time for a first offence] for those who drive [and cause accidents] without a license."

          The US already has 0.7% of the population in prison. You want to raise that to 7%? ;-)

        2. Charles 9

          Re: Yes but ... this process will make autopilots safer

          "We could more than halve the injury and fatality rates simply by denying the lowest 10% of those licensed the ~privilege~ of a motor vehicle operators license [until they can demonstrate a higher level of competency] and toughening the penalties [such as automatic jail time for a first offence] for those who drive [and cause accidents] without a license."

          You keep saying "privilege" when many people MUST use cars to get to work because NO other form of transportation is open to them. Denying these people cars is depriving them of their inalienable right to the pursuit of happiness. Not to mention life, if it's the only thing keeping food on their table.

      5. Tikimon

        Re: Yes but ... this process will make autopilots safer

        Second that. Most people have no idea how dangerous driving really is, and don't give it the attention it deserves.

        I ride my motorcycle (wearing screaming neon yellow jacket and full armor on a bike with extra lighting) with the assumption that everyone in a car is trying to kill me. Paranoia is the price I pay for going two-wheeled. It's helped me avoid at least four people who otherwise would have hit me in spite of being almost a visual hazard to air traffic. I drive my car at only a slightly lower level of paranoid, since airbags and such.

        1. Kiwi

          Re: Yes but ... this process will make autopilots safer

          I ride my motorcycle (wearing screaming neon yellow jacket and full armor on a bike with extra lighting) with the assumption that everyone in a car is trying to kill me.

          Much the same, including very bright extra headlights (must get some rearward and side lights of some sort as well). I've found the extra headlights significantly cut down the risk of people pulling out in front of me. I still get the odd person who makes me hit the brakes a bit harder than I'd like when I have the right of way, who I notice have been almost entirely (all but one!) elderly ladies. Without the extra headlights I'd still get several people pulling out each week; with them it's often as low as once a month. I do 200-300 miles per week on average, some weeks closer to 2,000 and some 0.

          Paranoia is the price I pay for going two-wheeled. It's helped me avoid at least four people who otherwise would have hit me in spite of being almost a visual hazard to air traffic. I drive my car at only a slightly lower level of paranoid, since airbags and such.

          ISTR that there was a study done in the US many years back where it was found that kids who started out on dirt bikes/scramblers/whatever were far less likely to be in accidents later than those who started out just driving. IIRC the study found that such kids learn that "crashing hurts" and thus avoid it, whereas those who start their driving in an SUV think they're in an impenetrable tank and drive it as such. Likely (not sure if this was part of the findings or not) the bike kids also learnt a great deal about traction and skid control.

          One of the issues I have with a lot of "safety" systems is the more someone thinks the car will protect them, the less they do to protect themselves. Personally I think it should be a requirement of getting a car license that you've spent at least 2 years on a bike (or ridden x miles, say 20,000). Or 10 minutes in Auckland traffic on a bike.

          One thing I think should NEVER be removed is a direct physical link between the steering wheel and the front wheels and the brake pedal and the brakes. The electronics may be really reliable these days, but electronics still fail far more often than simple mechanical linkages.

  3. ZootCadillac

    It does not matter how many million miles it has tested previously. The moment it fails when a person is relying upon it then it is not fit for purpose in the current state.

    However, Jesus, be alert enough not to end up under a truck. If you can't do that, get a taxi.

    1. daemonoid

      Depends how you define fit for purpose. An AI that does the job better than the standard meat controller would certainly fit most definitions of fit for purpose.

      It's a shame someone died. On the plus side, it looks like Teslas are safer than people based on very early data so maybe, instead, we should be celebrating that 4/9ths of a person haven't died!

    2. Alan Brown Silver badge

      "However, Jesus, be alert enough not to end up under a truck. If you can't do that, get a taxi."

      The number of crashes I've seen where the meatsack at the wheel _SHOULD_ have seen an obstruction on the road but didn't gives me no confidence in most meatsacks.

      Tesla will analyse and fix this, and that fix will go out to _every_ Tesla on the road. Such lessons are not passed on nearly so easily to meatsacks, which is why "car drove under truck" is a relatively common occurrence. The only thing newsworthy in the story is that it was a Tesla.

    3. Doctor Syntax Silver badge

      "If you can't do that, get a taxi."

      And let the taxi driver not be alert instead. On one of the few times I've ever taken a taxi I watched for several hundred yards whilst the driver ignored a car pulling out from a junction in front of him and eventually T-boned it.

      1. Gene Cash Silver badge

        One of the few times I've taken a taxi, I watched as the "driver" plopped a paperback book on the steering wheel and proceeded to read it.

        1. Doctor Syntax Silver badge

          'One of the few times I've taken a taxi, I watched as the "driver" plopped a paperback book on the steering wheel and proceeded to read it.'

          I've also watched a long-distance coach driver doing his paperwork as he threaded his way through SW1.

    4. Nextweek

      > The moment it fails when a person is relying upon it then it is not fit for purpose in the current state

      WTF?!?! How do you travel? Someone comes out with a safer car and you call it not fit for purpose?

      Your problem is your ego, you think you and all humans are better than a computer. However you aren't taking into account the collective stupidity of the human race.

    5. Kristian Walsh Silver badge

      Fatality rates

      Once again, Tesla PR is playing fast-and-loose with statistics. They cite the US National fatality rate for ALL ROAD CLASSES to try to minimise their involvement in an accident that occurred on a divided, limited-access freeway - the safest road class. To go on to compare to a "Global" figure is meaningless and insulting.

      For a more valid comparison, the German Autobahn network, including those small sections where no upper speed limit applies, has an overall fatality rate of 1.9 deaths per billion-travel-kilometres (one fatality per 326 million miles travelled).
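
      The conversion in that figure is easy to check; a quick sketch (the numbers are the ones quoted in this comment, not from an official source):

```python
# Sanity-check the quoted Autobahn figure: 1.9 deaths per billion
# vehicle-kilometres, converted to miles travelled per fatality.
KM_PER_MILE = 1.609344

deaths_per_billion_km = 1.9
km_per_fatality = 1e9 / deaths_per_billion_km        # ~526 million km
miles_per_fatality = km_per_fatality / KM_PER_MILE   # ~327 million miles

print(f"{miles_per_fatality / 1e6:.0f} million miles per fatality")
```

      which lands within rounding of the "one fatality per 326 million miles" stated above.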

      None of this absolves the driver. Unless the car acted deliberately against the driver's control to cause the accident, it's still the driver's failure - it was their responsibility to control their vehicle. Had the other vehicle been a passenger car, and not an 18-wheeler, the Tesla's very heavy weight would have meant the driver would have probably survived... and then stood trial for second-degree murder.

      1. MonkeyCee

        Re: Fatality rates

        Maybe I misunderstand something, but it wasn't a "divided, limited-access freeway" otherwise the truck wouldn't have been crossing the lane the chap was driving in.

        I'm also curious about the truck crossing the road, since it seems it wouldn't have right of way, and crossed in front of other vehicles requiring them to brake.

        1. Anonymous Coward
          Anonymous Coward

          Re: Fatality rates

          It was divided, yes, but an arterial, not a freeway.

          As for trucks turning across a roadway, many times they can't help but make cars stop. Simple physics makes this inevitable. Turning across takes time, so the rules of the road play a little different. The trucker has to look for a space lengthy enough where the truck at least has enough time to fully present itself to the cross traffic (IOW, long enough so they clearly see you're turning). Once the truck is fully presented, the cross traffic IINM is supposed to yield to it until it clears.

    6. Kiwi

      It does not matter how many million miles it has tested previously. The moment it fails when a person is relying upon it then it is not fit for purpose in the current state.

      A number of passengers have found that their driver fails quite miserably.

      (Some quick and extremely dirty math - the most I know I've driven in a single week is 1,243 miles (or 2,000 km), x 52 wks = 64,625.6, x an unlikely 50-year driving span = 3,231,280 - safe to say most people would not get even close to that amount. Working on a 70-mile round trip x 6 days x 50 wks x 50 years gives 1,050,000 miles; x maybe a more realistic 30 years gives 630,000 miles... IE most drivers fail seriously at least once in well under a million miles!)
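
      That back-of-envelope arithmetic can be redone in a couple of lines (all inputs are this comment's guesses, not measured data):

```python
# Lifetime-mileage guesses from the comment above: how many miles does
# a driver cover in a lifetime, against "one serious failure per driver"?
def lifetime_miles(miles_per_week, weeks_per_year=52, years=50):
    return miles_per_week * weeks_per_year * years

# ~2,000 km (rounded to 1,243 miles) every single week for 50 years:
heavy = lifetime_miles(1243)                            # ~3.23M miles
# 70-mile round trip, 6 days a week, 50 weeks a year, 30 years:
commute = lifetime_miles(70 * 6, weeks_per_year=50, years=30)

print(f"{heavy:,}")    # 3,231,800
print(f"{commute:,}")  # 630,000
```

      so even an implausibly heavy driver stays near 3 million lifetime miles, and a typical commuter well under one million.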

  4. Deltics

    White truck camouflaged against a bright sky ?

    Seriously ? You put your life in the hands of an optical system, not RADAR ?

    If this is really how this thing works then you had better not be driving a Tesla into a low setting/rising sun, or on wet roads, or toward oncoming traffic at night with high-beam on, or in other circumstances where such an optical system might be compromised by high glare.

    1. JeffyPoooh
      Unhappy

      Re: White truck camouflaged against a bright sky ?

      Too few bits in the colour depth? I mean seriously, can't see a truck? No radar? This is clearly a FAIL of monumental proportions. The stupid thing drove straight into a truck! The Tesla Autopilot should have its license suspended.

      The human 'driver' was obviously also on 'autopilot' (not paying attention). Human eyes can either see the truck, or at least realize that they CAN'T see empty road ahead. Is Tesla Autopilot programmed to look for obstacles, instead of the correct inverse logic of looking for clear and empty road?

      This incident is very revealing. Hopefully all those with such naive faith in the near-term future of self-driving cars will be able to comprehend the obvious message implicit in this tragic failure. But I'm not confident about that. The nutters will leap to Tesla's defense, even when this failure is clearly indefensible.

      To be clear, it's not about relative safety and failure rates. If the system drives straight into the sides of trucks because they're white, then it's not ready for prime time. Period. This is a revealing 'process' failure. What other 'blind spots' exist? This is on the very edge of justifying yanking any certification, and starting over with better processes.

      Sincere condolences to his family.

      1. Goldmember

        Re: White truck camouflaged against a bright sky ?

        "If the system drives straight into the sides of trucks because they're white, then it's not ready for prime time. Period."

        That's why the Autopilot system is very clearly labelled "beta" and is disabled by default. Tesla drivers have to explicitly activate it, and are told to keep their hands on the wheel at all times etc.

        We are currently heading for a dangerous time. In the future, the majority of vehicles will be autonomous, and the world's roads will be much safer as a result. But in this interim period, most vehicles on the road are controlled solely by meatsacks. The machines and their makers have to learn. We have to help them for the good of ourselves and everyone else.

        It's incredibly sad that this guy died. But for fuck's sake... use driving aids as just that; an aid. Don't become complacent and certainly don't rely on automation, especially on a highway/motorway, using software labelled as "beta." In his previous video, he states that he "didn't see" the boom truck. I was always taught that driving was 10% making the car move, and 90% observation. You should always know which vehicles are behind, in front of and to the side of you - not just immediately around you, but further afield. And with experience you can spot telltale signs and can predict which types of drivers are more likely to do something stupid.

        He really should have been paying more attention both in that video and during the accident that cost him his life.

  5. Grumpy Fellow
    Stop

    White Truck?

    Most trucks here in the US, even white trucks, have 18 large black tires (5 visible per side in a distinctive 1-2-2 pattern). I think a possible tweak to the software would be to disengage the autopilot and apply the brakes, rather than driving at high speed between adjacent sets of black truck tires. The algorithm to identify sets of truck tires could be similar to the detection of the EURion Constellation that is used to prevent color copiers from duplicating US and Euro currency. This wouldn't absolutely eliminate running into the sides of trucks, but it would prevent running into the 18 wheelers, the only ones in the US that could appear to have a gap between wheels large enough to drive through. Before you down vote me for this comment, please give me credit for the correct spelling of Brakes (not Breaks) used in this vehicular context.
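
    A toy sketch of that tire-pattern idea (purely illustrative - the function, thresholds and geometry are invented here, and real perception stacks work nothing like this):

```python
# Given x-positions (metres) of dark round blobs detected along a
# vehicle's side, check for the 1-2-2 wheel grouping of a US
# 18-wheeler: steer axle, tractor tandem pair, trailer tandem pair.
def looks_like_18_wheeler(blob_xs, tandem_gap=2.0):
    if len(blob_xs) != 5:          # need exactly 5 wheels visible per side
        return False
    xs = sorted(blob_xs)
    gaps = [b - a for a, b in zip(xs, xs[1:])]
    # Gap pattern for 1-2-2 is long, short (tandem), long, short (tandem)
    return (gaps[0] > tandem_gap and gaps[1] < tandem_gap and
            gaps[2] > tandem_gap and gaps[3] < tandem_gap)

print(looks_like_18_wheeler([0.0, 6.0, 7.3, 16.0, 17.3]))  # True
print(looks_like_18_wheeler([0.0, 3.0, 6.0, 9.0, 12.0]))   # False
```

    Matching a spacing signature like this is the same flavour of trick as the EURion detection mentioned above: a cheap pattern check layered on top of ordinary blob detection.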

    1. Anonymous Coward
      Anonymous Coward

      Re: White Truck?

      "Most trucks here in the US, even white trucks, have 18 large black tires (5 visible per side in a distinctive 1-2-2 pattern). I think a possible tweak to the software would be to disengage the autopilot and apply the brakes, rather than driving at high speed between adjacent sets of black truck tires."

      Unless they're hidden against a black road and the bottom of the trailer manages to line up with the horizon...

      1. naive

        Re: White Truck?

        White Truck/White Sky.. seems like marketing talk to me. What about ordinary radar like the Mercedes S-class uses? The echo would prevent collisions.

    2. This post has been deleted by its author

    3. pffut

      Re: White Truck?

      U.S. semi trailers could also adopt side guards, thus not presenting the void the Tesla apparently tried to drive through.

      As a bonus, any collisions /'drive unders' that do occur would be of lessened severity as the colliding vehicle could use its major deformation zone instead of taking the full impact on the A-pillars.

  6. Anonymous Coward
    Anonymous Coward

    "...claiming his Tesla S's autopilot had saved his life..."

    The embedded YouTube video shows a fender bender being averted.

    It would be extraordinarily bad luck for such relatively slow and gentle vehicle contact to lead to death.

    Condolences.

    1. Anonymous Coward
      Anonymous Coward

      Re: "...claiming his Tesla S's autopilot had saved his life..."

      If you are in Nascar and expecting to be side swiped then yes. For a normal human being on a road doing 60mph then a side swipe like that will be a considerable inconvenience.

      You are likely to be thrown into the path of another vehicle or off the highway itself, which can easily prove fatal, and the average driver's response is also generally going to be to over-compensate and start the car fishtailing or go into a full spin. Side swipes are pretty effective at unsettling a car; that's why the cops use them to remove the "bad-guys" from their present tarmac/rubber combination.

      1. Deltics

        Re: "...claiming his Tesla S's autopilot had saved his life..."

        Instead of which the auto-pilot took care of the "swerving off the highway" part.

        Does the auto-pilot have the smarts to know when the verge/off-road area being driven into is safer than adequately dealing with/responding to the glancing impact?

        Given that a white truck against a white sky seemingly poses such a problem, it would apparently be fooled by a wall painted with a picture of a street receding into the distance, so one would suspect that it couldn't tell a ditch filled with water from a smooth, wet road either.

        1. TRT

          Re: "...claiming his Tesla S's autopilot had saved his life..."

          As ane fule no, a road tunnel painted onto the base of a sheer rock wall becomes an actual road tunnel providing that the object attempting to traverse it is not a coyote.

      2. Anonymous Coward
        Anonymous Coward

        Re: "...claiming his Tesla S's autopilot had saved his life..."

        @AC

        You missed the "It would be extraordinarily bad luck..." part? Your mention of the PIT maneuver proves the opposite point. It's ~99.9% non-fatal.

        The Autopilot did NOT save his life from the menacing boom truck. It saved his fender. His claim was BS.

        1. Anonymous Coward
          Anonymous Coward

          Re: "...claiming his Tesla S's autopilot had saved his life..."

          I stated it would be a "considerable inconvenience".

          However, PIT maneuvers are done with knowledge of when it is safe to do so and where to hit the car to control it. They are also done at very low speed in most states, and death or serious injury is considered a reasonable consequence for the driver and/or passengers as they are choosing to evade the police. Even so, the stats show that some states have 2.5% death and 27% injury rates from using it.

          Without those controls in place there could be considerable inconvenience from being side swiped especially by a heavier vehicle.

  7. LaeMing

    I recall not seeing an oncoming NSW State Rail heavy vehicle until it was disturbingly close a few decades back, as it was a perfect 'Eucalyptus green' against a forested background.

  8. Anonymous Coward
    Anonymous Coward

    He obviously hasn't watched Final Destination. Death will find you....

    1. Rich 11 Silver badge

      Game of Drones

      "The God of Death will not be denied."

  9. Herby

    Darwin award??

    Just thinking...

    Or as Forrest Gump says: "Stupid is as stupid does".

    1. This post has been deleted by its author

  10. partypop69

    I will NEVER use Autopilot

    Computers will always be flawed. Imperfectly created ourselves, it's not within us to create a perfect machine. Mistakes will be made, accidents will happen; let it be a lesson to car manufacturers to innovate but not replace human reaction. Innovation is nice, but ignorant. Less is more. Tesla should disable autopilot permanently. Condolences to the family...

    1. Charles 9

      Re: I will NEVER use Autopilot

      The thing is, though, will they be MORE flawed than us? You DID note the statistic about how long it took for this to happen. We demand perfection of our machines yet are content to throw our own flawed selves behind the wheel.

      1. Anonymous Coward
        Flame

        Re: I will NEVER use Autopilot

        The upvote/downvote ratio shows that most people are indeed morons, among the Reg commentard demographic at least.

        If all the idiots start driving autopilot cars, there'll be MORE serious crashes.

        1. Charles 9

          Re: I will NEVER use Autopilot

          "If all the idiots start driving autopilot cars, there'll be MORE serious crashes."

          I don't know about that. I'd sooner trust a computer behind the wheel than most of those idiots who would probably insist on driving home after a night at the pub, drive while putting on makeup or even changing clothes. Trust me, the world is FULL of stupid (which you just can't fix), so you know what? I'll take my chances with the computer.

  11. Schlimnitz

    I wonder...

    Wouldn't it be a better idea to inverse the roles?

    The driver driving, and the autopilot watching out for danger?

    At least to start with?

    I've averted quite a few accidents because the person next to me also had their eyes on the road and spotted something before I did (like "YOU'RE ON THE WRONG SIDE OF THE ROAD!").

    1. Hollerithevo

      Re: I wonder...

      I'd like that. I had rear sensors fitted to my ancient Ford a while back to help me reverse park in very tight London parking spaces, and it really relieved my stress. If I could have something beep when a cyclist was in my blind spot or a motorcyclist was using that invisible third lane or whatever, I could use more of my attention for the road ahead. I'd like to step into my car and become a cyborg.

    2. Vic

      Re: I wonder...

      spotted something before I did (like "YOU'RE ON THE WRONG SIDE OF THE ROAD!")

      That's going to get really annoying during an extended overtaking manoeuvre...

      Vic.

  12. Mystic Megabyte
    FAIL

    No bars?

    AFAIK European trucks and trailers have side impact bars to prevent this sort of drive-under crash.

    1. ecofeco Silver badge

      Re: No bars?

      Those are still rare in the U.S.

      1. Richard 12 Silver badge

        Re: No bars?

        That is the cause of death.

        Drive-under crashes are extremely deadly, as they avoid all possible crash safety equipment and peel off the top half of the car.

        Nobody can duck far enough or fast enough.

        The driver/autopilot combination caused the crash, but it's that truck design that killed.

        1. Anonymous Coward
          Anonymous Coward

          Re: No bars?

          > "The driver/autopilot combination caused the crash, but it's that truck design that killed."

          Did he/it?

          A few people have made this assertion, but when you pull across a carriageway it is your duty to make sure that it's clear. The driver/autopilot combination appears to be a compounding factor rather than a cause as such.

          It is entirely possible that this would also have been prevented had the truck driver been using an autopilot as well.

          1. Richard 12 Silver badge

            Re: ZanzibarRastapopulous

            You're right of course.

            I missed the word "perhaps".

      2. Alan Brown Silver badge

        Re: No bars?

        "Those are still rare in the U.S."

        They were rare in the EU until legally mandated. US federal regulations really should be amended.

        Ditto on rear under-ride bars. A lot of research went into finding the best ways of setting these up and it turned out that a crumple model worked better than totally rigid bars.

        The current problem is that if a fold-under tail lift is used you can't fit the bars. I'm expecting to see a mandate on some kind of crash mitigation being required for these installations sooner or later.

        1. John Brown (no body) Silver badge

          Re: No bars?

          "They were rare in the EU until legally mandated. federal regulations really should be amended."

          Not only would all the trucking companies lobby against the extra cost, but they'd probably have all their customers lobbying with them because those extra costs would have to be added on to the trucking fees.

          No one will be looking at the overall savings to the economy if there are fewer fatal crashes.

      3. Paul Kinsler

        Re: Those are still rare in the U.S.

        Really? I recall a conversation I had with an American guy on IRC who told me he was inputting some gruesome traffic statistics along car-under-truck/decapitated-adult lines... some twenty years ago. On the plus side, he did tell me that small children often survived because their heads were lower down.

    2. Anonymous Coward
      Anonymous Coward

      Re: No bars?

      "AFAIK European trucks and trailers have side impact bars to prevent this sort of drive-under crash."

      Ah, Brexit, TTIP, and the freedom to kill ourselves under trucks just like Americans.

      1. Anonymous Coward
        Anonymous Coward

        Re: No bars?

        Ah, Brexit, TTIP, and the freedom to kill ourselves under trucks just like Americans.

        Not necessarily.

        About 15 years ago I remember seeing a news report on French TV after someone was killed in a crash like this. The reporter showed a French lorry with its flimsy ½" side bar, and compared it to the adjacent lorry from "our British neighbours", with its 2x4" reinforced steel crash rail.

        There's a reason that UK road traffic death/injury figures are the best in the EU, and it isn't down to EU laws...

    3. Anonymous Coward
      Anonymous Coward

      Re: No bars?

      EU trucks are generally a lot safer: they have flat front cabs so that you can see small kids, old people etc in front of you, better mirror combinations for seeing cyclists etc alongside, and safety bars to stop a car entering the troublesome area under the trailer.

      It doesn't make for such interesting car v truck movies (see previous Christmas Vacation clip) but it does help to save lives, and would have ensured that the Tesla saw the truck in question.

      1. stungebag

        Re: No bars?

        Trucks in the EU have flat fronts because of EU length limits: trucks couldn't carry the full load if they had a bulbous nose. It adds to the danger for truck drivers, which is why there's talk of allowing longer trucks this side of the pond.

      2. John Brown (no body) Silver badge

        Re: No bars?

        "they have flat front cabs"

        Yes, what the yanks call a "cab over" truck. The fact it's a safety feature of sorts in some circumstances is pure serendipity. The reason for that truck design being the most popular across Europe is the legal limit on the maximum length of a tractor/trailer combo: putting the cab high up and over the engine allows for a longer trailer, so more goods can be carried.

    4. Charles 9

      Re: No bars?

      "AFAIK European trucks and trailers have side impact bars to prevent this sort of drive-under crash."

      But you have to consider the kinds of roads trucks may traverse in the US. There's a reason we have ride height warning signs: because trucks have occasionally gotten their trailers stuck on a hump. Often the hump is a railroad crossing, and cameras have shown trucks get stuck on those humps and then get smashed by trains. At least one of those trucks was a propane truck, with tragically obvious results.

      Anyway, side impact bars can aggravate ride height issues, so there's a tradeoff here, and I think in the US's case, hump sticks happen more often than trailer guillotines (plus they're not always fatal; some HAVE survived; you CAN duck under it and into your seat).

      1. Anonymous Coward
        Anonymous Coward

        Re: No bars? - because trucks have occasionally gotten their trailers stuck on a hump

        Solution: Fix the humps. Or put warning and diversion signs around them.

        Perhaps if the US put less effort into political infighting and more into infrastructure, the road fatality rate wouldn't be twice that of the UK (that is per vehicle kilometre, so is pretty directly comparable.) There have been big improvements in the US, but technological fixes like autopilots can only achieve so much; the actual infrastructure, vehicle design and driver regulation have to be addressed too.

        1. Charles 9

          Re: No bars? - because trucks have occasionally gotten their trailers stuck on a hump

          "Solution: Fix the humps. Or put warning and diversion signs around them."

          Not gonna happen. First, many of these humps have signs and truckers STILL get stuck on them. As for diversions, usually they're across railroad tracks which means the only diversion is to another hump. IF there's a diversion.

          As for fixing them, we're talking a big country full of roads that are LOCALLY maintained by communities with tight budgets. Plus it takes a lot of convincing to get state and federal authorities to look into them, usually because it then gets other localities to complain, "What about US?!"

          Sometimes, you just gotta deal with what you get dealt, American trucks NEED the ride height.

      2. Alan Brown Silver badge

        Re: No bars?

        "hump sticks happen more often than trailer guillotines"

        Hump sticks happen when rig drivers aren't paying attention. If they were, they wouldn't go over the hump in the first place - most humps large enough to snag a rig tend to be signposted for precisely this kind of reason.

        A classic example of rig operator stupidity got reported a couple of months back where a rig operator took an (IIRC) 89,000 pound rig onto a historic bridge rated for 10 tons maximum - with utterly predictable results. Her excuse was that she thought 89,000 pounds was less than 10 tons, but that didn't prevent her being stuck with the bill for not only being rescued, but for replacing the bridge (insurance policies tend not to cover acts of gross negligence).

        When you start digging into the real reasons for resistance to safety features on US rigs it comes down to 2 main ones:

        1: Extra weight == less payload

        2: Tractors and trailers tend to be changed around a lot, so side impact rails suitable for one combination may not be effective on another or may snag the rear of the tractor in extreme cases.

        It's interesting to note the IIHS crash videos linked elsewhere in this thread which show US trailer rear bars simply snapping off at 50% and 35% impact overlap. That wouldn't be tolerated in the UK, and liability would fall back on the fabricators.

        With regard to the Tesla: this guy drove around with his dashcam on most of the time and was a regular YouTube poster. Presumably the camera was running that day and this will help the NHTSA determine actual faults. I'm picking that the "other cars which saw and slowed down" were further away than the Tesla was - at some point a T-boning just happens "in your lap" and there's not much which can be done.

    5. S4qFBxkFFg

      Re: No bars?

      Instinctively, I'd prefer the option of increasing the height of the gap, so that crashes are avoided rather than made safer - I suppose that just ends up transferring the danger to higher vehicles like Transit-type vans or big 4x4s.

      1. Anonymous Coward
        Anonymous Coward

        Re: No bars?

        I'd prefer the option of increasing the height of the gap, so that crashes are avoided rather than made safer

        Can't be done...you'd have to rebuild every warehouse in existence; the load would be higher and you'd be killing a lot of lorry drivers on corners (plus whoever they fell over onto); and also -as you say- the van thing.

  13. Public Citizen

    Not First Report of

    The Tesla Autopilot System not correctly identifying a van type trailer.

    For the information of the European Readers:

    More than half of the van type trailers in use in the USA are 53 feet long, with a typical kingpin to front axle distance of 40 feet.

    There seems to be a problem with the Tesla system detecting objects directly in front of it that don't go all the way to the ground, such as the center portion of the van trailer that was involved in this accident.

    As stated above, an optical based system cannot be relied upon to function perfectly over the broad range of color and lighting conditions that are found in everyday driving situations. An optical system should only be relied upon as an auxiliary to a radar based system that is capable of scanning the entire ~volume~ of space, plus a clearance safety factor, for the entire frontal silhouette of the vehicle. Anything less is an inherently flawed system, as tragically demonstrated by Tesla [and its unintended crash test victim].

  14. Anonymous Coward
    Anonymous Coward

    Radar

    Radar isn't a perfect answer either (I've worked extensively on sensors for collision avoidance) and cannot always discriminate between objects on the ground and those that are high enough to pass under (e.g. a pedestrian walkway). They also suffer from the problem of reporting parts of the road as stationary obstructions (e.g. manhole covers), resulting in the potential that the brakes can be applied hard when there is nothing actually in front of the vehicle, with the possibility of causing a rear-end shunt.

    A high-spec automotive radar can be tracking several hundred "objects" at any one time in some environments (e.g. round town where there are lots of static items - poles, railings, grids).
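
    A minimal sketch of the pass-under discrimination problem described above (the fields and thresholds here are invented for illustration; a real automotive radar fuses far more cues than this):

```python
from dataclasses import dataclass

# Given a crude elevation estimate for each stationary radar return,
# decide whether it is a genuine obstruction, an overhead structure
# the car can pass under, or road-surface clutter (manhole cover, grid).
@dataclass
class RadarReturn:
    range_m: float
    est_height_m: float   # estimated elevation of the reflection point

def is_braking_target(ret, vehicle_height_m=1.5, min_clearance_m=0.5):
    if ret.est_height_m > vehicle_height_m + min_clearance_m:
        return False      # overhead walkway or sign gantry: pass under
    if ret.est_height_m < 0.2:
        return False      # ground clutter: manhole cover, grid, rail
    return True           # in the swept volume: treat as an obstruction

print(is_braking_target(RadarReturn(80.0, 4.5)))   # walkway: False
print(is_braking_target(RadarReturn(60.0, 0.05)))  # manhole cover: False
print(is_braking_target(RadarReturn(40.0, 1.2)))   # trailer side: True
```

    The hard part in practice is that the height estimate itself is noisy, which is exactly why both false brakes (manhole covers) and missed trailers happen.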

    1. Anonymous Coward
      Anonymous Coward

      Re: Radar

      Autopilot systems would need a "cooperative environment" where each vehicle transmits its position, speed and direction, and avoidance procedures are coordinated - something like the CAS systems on airplanes. Otherwise a complex battery of sensors would be needed - and radars alone could not be the solution. Sensors working on several different parts of the EM spectrum would be needed, to identify objects by correlating the different "views" from each sensor.
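
      A minimal sketch of the sort of broadcast such a cooperative scheme implies (field names here are invented for illustration, not taken from any real V2V standard):

```python
import math
from dataclasses import dataclass

# One periodic beacon per vehicle; every receiver fuses these with
# its own sensors instead of having to detect everything optically.
@dataclass
class SafetyBeacon:
    vehicle_id: str
    x_m: float          # position in some shared local frame
    y_m: float
    speed_mps: float
    heading_deg: float  # 0 = north, 90 = east

def predicted_position(b, t_s):
    # Dead-reckon the sender's position t_s seconds ahead.
    heading = math.radians(b.heading_deg)
    return (b.x_m + b.speed_mps * t_s * math.sin(heading),
            b.y_m + b.speed_mps * t_s * math.cos(heading))

# A truck crossing from the left, broadcast 3 seconds before the
# paths meet: its predicted position lands directly ahead of us.
truck = SafetyBeacon("truck-1", -30.0, 90.0, 10.0, 90.0)
print(predicted_position(truck, 3.0))
```

      The point is that a white trailer against a white sky is irrelevant once the trailer announces itself; the crossing conflict falls out of simple dead reckoning.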

      1. Charles 9

        Re: Radar

        But even the best sensor systems have trouble with inclement weather. Heavy rain or snow is likely to beat even the best radar systems by producing too much radio AND optical noise.

        It raises the prospect of autopilot moral dilemmas, which are a very hot topic as driverless cars are tested more and more. For example, how does a driverless car handle itself when it turns a corner on a mountainside road and suddenly sees an out-of-control school bus full of kids tearing between the lanes at high speed? It goes right into Book of Questions territory.

  15. Anonymous Coward
    Anonymous Coward

    Problem is - it's not really an autopilot

    It's effectively cruise control with some extra sensors. It is not a driverless car.

    It's not really suited to leaning back in your seat watching the in-flight movie.

    A long time ago it was learned that the safer cars were made, the more complacent drivers became, so the benefits of ABS, airbags and seat belts were reduced by people who simply took more risks because they perceived they were safer...

    There may well be some technical improvements Tesla can make to the sensors, blind spots, sensitivity etc., but ultimately, unless we have a genuinely self-driving car, the driver really needs to be in charge, i.e. remain holding the steering wheel...

    1. Alan Brown Silver badge

      Re: Problem is - it's not really an autopilot

      " unless we have a self driving car, the driver really should need to be in charge i.e. remain holding the steering wheel..."

      Given some of the stupidity we've seen (climbing into the back seat, etc.) it's arguable that the autopilot should disengage if the seat is vacated or the seatbelt unbuckled. (Let's not get into arguments about your "right" not to wear a seatbelt. If you want to opt out of the safety system then I want to opt out of paying for your medical care when you go through the windscreen/out the side windows/out the rear windows/crush and kill the guy in the seat in front of you - all of which I've had to deal with in many, many miles of driving on rural roads and being first on scene at a crash.)

    2. Anonymous Coward
      Anonymous Coward

      Re: Problem is - it's not really an autopilot

      Aircraft autopilot isn't full auto either; same problems. I know a few older pilots who all say airline pilots are becoming dangerously overreliant on it. Even well-trained, very experienced pilots can become complacent.

  16. Anonymous Coward
    Anonymous Coward

    Can it cope with farmers?

    I've had a close shave on a motorbike when I met two muck spreaders on an A-road at night. The lights, number plate, reflectors and back of the trailer were covered in black matter, so the only warning I had was when the oncoming car's lights (seen under the trailer as it crested a slight rise) blinked out as they went behind the trailer wheels.

    My brain could not assimilate the information, so I braked while trying to process it; then I saw the back of the trailer, far too close to do anything other than brake heavily, almost losing control.

    Nothing is foolproof. We should not assume that other road users are predictable, sensible or observe the law; accidents will happen. But as Schultz pointed out, there is a chance for other users to learn from this driver/car accident, and as with aircraft incidents it should lead to safer systems.

    If the bloke had just driven into the side of a truck without the Tesla we'd not even hear about him, and nothing would be learned, I expect. Commiserations to his family, and may his tragic passing save many others.

    1. SImon Hobson Bronze badge
      Stop

      Re: Can it cope with farmers?

      I've been on the other side of that equation, sort of.

      <cough> decades ago I used to do some farm work, and one farm I worked at was "out in the middle of nowhere" and ... let's just say legal compliance was a bit lax a lot of the time.

      Anyway, I was carting silage - trailer of freshly chopped grass back to the farm, empty trailers back to the field. The weather forecast wasn't good, and the farmer was keen to get as much in as we could before it rained. As it got dark, I'd been pointing out that there were no lights at all on the tractor - but we only stopped when I declared that I could no longer see the road. It really did get to that stage - clear skies, no moon, middle of nowhere so none of that "orange haze" you get around town - just darkness.

      The upside is that I'd have known if anything else was coming (as long as it had some lights on it); the downside is that there would have been SFA I could have done about it, as the road was not much wider than the trailer (and bounded by hedges).

      All I could have done would be to stop and wait for the bang.

      I was young and naive back then - sure as heck wouldn't do it now.

    2. Public Citizen

      Re: Can it cope with farmers?

      If the guy had been on manual control this accident most likely would not have happened.

      He would have registered the tractor unit of the vehicle as it started to turn, tracked both it and the trailer, and taken the appropriate braking action as a matter of course.

      This can be demonstrated by the actions taken by other drivers on the same road at the same time who were not relying on an experimental technology to do the thinking for them.

  17. Anonymous Coward
    Anonymous Coward

    Lawsuit..

    in 5... 4... 3... 2... 1...

    Cynical? Moi?

  18. rmason

    Will happen again

    As has been said this is basically a VERY clever cruise control, not a driverless car system.

    That's how it gets treated, though. We've got two people who work here who drive Teslas; they're often checking emails on a laptop etc. while driving. There was also a video on YouTube doing the rounds recently of a fella asleep behind the wheel of his in stop-start traffic. Explaining that to your insurance company after rear-ending something, or hitting another person/pedestrian, would be interesting, to say the least.

  19. wolfetone Silver badge

    Question

    When this Tesla is in autopilot mode, can the driver slam on the brakes and in doing so turn the thing off? You know, akin to stepping on the brakes when your car is in cruise control mode?

    1. Anonymous Coward
      Anonymous Coward

      Re: Question

      No, not unless he turns it off with the key switch under the back bumper.

  20. Lee D Silver badge

    My car has cruise control - I consider that the most dangerous function on the car, being the only one that can accelerate against my will (and no, it does NOT do it smoothly enough in my opinion, and will happily burst forward to reach its set speed when resumed after manual braking).

    Now, my car is a late 2015 model, so it's not exactly early tech for such a thing, but I just don't trust it. I deliberately refused the option for "lane control", etc. because I just consider it far too dangerous when I'm the one who's going to die (not the car) and I'm the one responsible for other's deaths (not the car).

    Given this guy's previous video, I'm going to say he was inattentive and complacent. In the US, you sit on the side of the car that the first truck approached him from, and he doesn't see it, doesn't notice it approach, until LONG after the FRONT WINDSCREEN DASH CAM (obscured by the left-hand window support!) can see it could be a threat - and the car still takes a little while to decide it's a danger.

    And what does the driver do / notice in that video? Nothing until the car swerves out of its way.

    1. Phil O'Sophical Silver badge

      will happily burst forward to reach its set-speed if it resumes from after manual braking

      Any cruise control I've had required me to push a resume button after brake-operated disengagement. Are you saying that yours doesn't? That does sound dodgy, but I don't have a problem with a manually re-engaged one, since I'd be expecting the accelerator response.

      1. fruitoftheloon
        Happy

        @Phil O'Sophical

        Phil,

        I believe our fellow commentard was pointing out that after they pressed the 'get back to the speed you previously set' button, it did so with great alacrity, rather than taking its time.

        I had a 1999 BMW 535i auto which, when you pressed the aforementioned button (after braking to significantly reduce the road speed), would drop down a number of gears and really f'ing go for it. That was the first & last occasion I used CC on that vehicle - not my idea of a good time...

        Our current family wagon/van/taxi doesn't have that issue (2.7 diesel Jeep)!

        Cheers,

        Jay

        1. SImon Hobson Bronze badge

          Re: @Phil O'Sophical

          I find the same thing in an Avensis. So I manually accelerate to about the required speed before resuming.

          But I agree, it's easy to let one's concentration drop with automation - and it's a big issue in commercial transport. One of the big challenges in commercial aviation is keeping the crew alert - a more extreme example of the problem being this one (yes, we all believe they didn't fall asleep, don't we?):

          https://en.wikipedia.org/wiki/Northwest_Airlines_Flight_188

          I recall some years ago suddenly realising that I'd become too reliant on the GPS. I'd got so used to putting in a route or destination and following the displayed course that I'd got "rusty" at following the map and keeping track of exactly where I was. It was something of a wake-up call to realise that I was actually uncertain of my position while flying over familiar territory! It took a certain amount of effort to get my visual navigation skills back up to standard and keep them there.

        2. Alan Brown Silver badge

          Re: @Phil O'Sophical

          'get back to the speed you previously set'

          One of my cow-orkers has a late model VW Golf (not a diesel) and the degree of acceleration when you press the button is based on the "mode" the car is in.

          Under normal circumstances it accelerates slowly. If left in sport mode then it'll ram you back into your seat.

          The same thing happens when it's come up behind a slow car (adaptive cruise control) and you pull out to pass. As he always "drives it like he stole it", it stays in sport mode permanently.

  21. Matt Bryant Silver badge
    WTF?

    Wouldn't have happened in the UK.

    By law, 18-wheelers in the UK have side bars to stop vehicles going underneath a trailer. A perpendicular hit like the Tesla accident would have resulted in the bumper being the first part of the car to hit the lorry's sidebar, meaning all the usual safety features of the Tesla (airbags, crumple zones, safety cell) would have triggered and possibly saved the driver. I am still surprised when I see the US still hasn't implemented this simple safety feature. Whilst it's technically interesting that the Tesla's autopilot didn't spot the turning truck in time (and worrying, given that quite a few trailers have white sides!), this type of fatal accident was quite common in the UK with manually driven vehicles until the sidebars on trailers were introduced.

    1. GreggS

      Re: Wouldn't have happened in the UK.

      Yeah, but then The Fast & The Furious would have to come up with a whole slew of new ways to avoid the chasing bad guys.

    2. Charles 9

      Re: Wouldn't have happened in the UK.

      "I am still surprised when I see the US still hasn't implemented this simple safety feature."

      Simple. Too many humps. Even without crash bars, too many stories arise of trailers going over a hump and getting the underside stuck on it with the wheels off the ground, leaving them stranded. Worse, many of these humps are railroad crossings, and Murphy is notorious in these scenarios. Many a trailer has been smashed by a train in such a setting... if not detonated because it was carrying gas.

      1. Anonymous Coward
        Anonymous Coward

        Re: Wouldn't have happened in the UK.

        "Simple. Too many humps."

        I keep hearing all this stuff about how the US is the most advanced nation on earth. We are in the 21st century. And yet amazingly very few people seem to ponder how exactly it is that other nations in the world manage to have significantly fewer humps in their roads - which are, by definition, man-made interruptions in the natural terrain.

        1. Gene Cash Silver badge

          Re: Wouldn't have happened in the UK.

          These roads were probably put in back when "yew kin gitcher trackta ovah dat, no worries!" and there's been no money or concern to improve them in the meantime.

          Hell, the stupidass rich portion of Orlando PAID MONEY TO REMOVE the asphalt from their roads to expose the 1920s-era bricks ('cuz it looks kewl and leet), and they consider that a safe surface. I avoid the area as much as possible, but I've lost count of the potholes I've hit that were camouflaged by the brick pattern, as well as just plain loose or missing bricks.

          > I keep hearing all this stuff about how the US is the most advanced nation

          Where'd you hear that? My American-made bike has been in the shop over 3 months now, so I'm depending on my Japanese-made one to get where I need to go.

          I consider "American-made" to be a pretty hefty insult.

          1. Alan Brown Silver badge

            Re: Wouldn't have happened in the UK.

            "I avoid the area as much as possible "

            Which is exactly what an area with pavers or bricks is intended to make you do. It's a very effective way of keeping traffic speed down and traffic volume low.

        2. Charles 9

          Re: Wouldn't have happened in the UK.

          Simple. It may be the 21st century, but (a) the United States is a big country with LOTS of roads (more than Canada and Russia IIRC, the only two countries bigger), and (b) the roads are LOCALLY maintained by communities who distrust the federal government (SOCIALISM!).

        3. I am the liquor

          Re: Wouldn't have happened in the UK.

          "I keep hearing all this stuff about how the US is the most advanced nation on earth. We are in the 21st century. And yet amazingly very few people seem to ponder how exactly it is that other nations in the world manage to have significantly fewer humps in their roads - which are, by definition, man-made interruptions in the natural terrain."

          They know exactly how other countries do it: national infrastructure planning, or as they call it in the US, communism.

    3. Public Citizen

      Re: Wouldn't have happened in the UK.

      "this type of fatal accident was quite common in the UK with manually driven vehicles until the sidebars on trailers were introduced."

      But it isn't common in the US, where the van trailers are even bigger than those used in the UK and Europe.

      The trailer manufacturers and the operators have argued, based on the low occurrence of this type of accident, that it is unnecessary parasitic weight. All trailers have a bar structure on the rear, so the possibility of an overtaking vehicle being involved in a run-under accident is minimal.

      1. Richard 12 Silver badge

        Re: Wouldn't have happened in the UK.

        I suspect that it probably is quite common in the US, because most people are really bad at understanding statistics and legislators know this.

      2. Richard 12 Silver badge

        Re: Wouldn't have happened in the UK.

        And this article would tend to support this, albeit sadly lacking in sources or detail.

        http://www.cnbc.com/2014/07/30/truck-accidents-surge-why-no-national-outcry.html

  22. Mutton Jeff

    First person to be killed by (semi)autonomous vehicle ?

    The Henry Bliss of the modern era? (Ironically, he was done for by an electric vehicle too, according to 'pedia.)

    1. Putters

      Re: First person to be killed by (semi)autonomous vehicle ?

      Ahh, but was Henry Bliss at the time the William Huskisson of the modern era?

    2. Putters

      Re: First person to be killed by (semi)autonomous vehicle ?

      You are also being a little US-centric ...

      https://en.wikipedia.org/wiki/Mary_Ward_%28scientist%29 (1869)

      or if you want to be a little more pedantic and want a car, not a private steam carriage

      https://en.wikipedia.org/wiki/Bridget_Driscoll (1896)

      Mr Bliss was 1899

  23. Anonymous Coward
    Anonymous Coward

    Risk Compensation Theory

    "Risk compensation is a theory which suggests that people typically adjust their behavior in response to the perceived level of risk, becoming more careful where they sense greater risk and less careful if they feel more protected."

    Now if all cars had a 6" steel spike that comes out instead of an air-bag...

    1. Charles 9

      Law of Unintended Consequences

      Two words: Ghost Drivers. Some accidents (even head-ons) aren't your fault, and I'd hate to be the car manufacturer who has to defend the spike when a driver is tragically skewered because a drug-crazed maniac thought playing chicken was cool.

      PS. Risk is subjective. There are those who ignore risk (the pathologically stupid) and those who are turned on by risk (thrill seekers).

  24. Jos V

    Autopilot...

    Maybe the whole autopilot thing for cars should get some rethinking.

    For an airplane this is fine. There are hardly any strange objects thrown in its path when airborne; even when there is another airplane on a collision course, the pilots are warned and guidance is given for avoidance; and take-off and landing are done manually.

    For cars, not so. You will encounter all kinds of situations out of the blue, and it's better to stay alert by handling the car at all times. Where an autopilot should help is in noticing a dangerous situation and alerting the driver - and, if the driver then ignores it or handles it wrongly, only then taking over and performing evasive action within safe limits.

    In other words, only autopilot for edge-cases.

    1. Gene Cash Silver badge

      Re: Autopilot...

      From "This Day in Failure, July 01"

      2002: A Bashkirian Airlines Tupolev 154 collides in midair with a DHL International Boeing 757 cargo plane over Ueberlingen in southern Germany, killing the 69 passengers and crew on the Russian plane and the two-person crew on the cargo jet. The collision occurs even though each plane had TCAS (Traffic Collision Avoidance System) collision-avoidance equipment onboard.

      So no, even for airplanes, it's not always "fine"

      1. GreggS

        Re: Autopilot...

        If I remember rightly, the Russian pilot overruled the TCAS system and also got conflicting information from the sole air-traffic controller working that night - so humans caused that, not the automated TCAS, which gave the correct instructions to avoid the collision to both sets of pilots.

        1. SImon Hobson Bronze badge

          Re: Autopilot...

          That's how I recall the documentaries about it. The controller gave conflicting advice, the Russian pilots followed it, and flew into the path of the other aircraft, which had avoided them.

          There's a reason why the rules are "if the controller says something different to the TCAS, you follow the TCAS".

          As an aside, the reason why avoidance action is always given as climb/descend is simply that the vertical axis (height) is the most accurate - there's much less accuracy in position and heading, although modern GPS systems have probably improved that significantly. When TCAS was being developed, the only really reliable information was the height reported by the other aircraft in the vicinity - the horizontal position being (originally) "fairly vague", based on radio bearing (a crude measurement) and signal strength (a crude proxy for distance).

          The choice of who climbs and who descends is made by fixed rules (who's got the higher transponder code), so unless there are bugs in the software you'd never get two aircraft being given the same advisory.
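
          That fixed-rule tie-break can be sketched in a few lines. This is a toy version only - real TCAS coordinates its choice over Mode S air-to-air messages, and which side climbs here is an assumption; the point is just that both aircraft derive complementary advisories deterministically from the same shared data, so they can never both be told to climb:

```python
def resolution_advisories(addr_a: int, addr_b: int) -> tuple:
    """Toy climb/descend tie-break keyed on a shared identifier.

    (Hypothetical sketch: which side climbs is an assumption for
    illustration; real TCAS coordinates via Mode S messages.)"""
    if addr_a == addr_b:
        raise ValueError("transponder addresses must be unique")
    # Higher address climbs, lower descends - complementary by construction.
    return ("CLIMB", "DESCEND") if addr_a > addr_b else ("DESCEND", "CLIMB")
```

          Because the rule is a pure function of the two addresses, both boxes compute the same answer independently - no negotiation needed, and no bug-free way to issue the same advisory to both.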

          But as has been pointed out, the aviation space is relatively benign. Once in controlled airspace (particularly Class A, where equipment and qualification requirements are fairly tight), you don't need to worry about pedestrians, kids on bikes, pet dogs, cattle, ... There is still the issue of (typically) light aircraft without all the hi-tech (and expensive) TCAS gear, but even then it takes some effort to get two aircraft very close - hence why we insist on herding them into smaller areas around airports and into airways to justify the effort of keeping them apart :-)

          So given that there will probably never be 100% "clever" cars with all the comms gizmos, and we'll always have other road users (people, animals, stuff that's fallen off things, ...), autopilots for cars are always going to be quite a lot harder to implement. I look forward to when they start properly throwing them into urban environments - that's going to be a lot of fun for other drivers who know how to "game" the automation ;-)

          1. Charles 9

            Re: Autopilot...

            "There's a reason why the rules are "if the controller says something different to the TCAS, you follow the TCAS"."

            I think they only made that ruling AFTER the crash. At the time, it was not defined which took priority.

            1. GreggS

              Re: Autopilot...

              You're correct, it was clarified after not only this but an earlier near-miss involving two Japanese airliners. The Japanese asked for clarification after that incident, but it wasn't until the final report on this accident that it was actually specified that TCAS takes priority. The accident report also found that the two sets of advice were contradictory: the DHL pilots were instructed to go with TCAS, the Russian ones to go with ATC (their last instruction).

      2. Alan Brown Silver badge

        Re: Autopilot...

        "The collision occurs even though each plane had TCAS (Traffic Collision Avoidance System) collision-avoidance equipment onboard."

        That's hardly fair. The air traffic controller and TCAS were barking conflicting warnings, and the Tupolev crew countermanded the TCAS control inputs.

    2. Mark 85

      @Jos V -- Re: Autopilot...

      And given this... do we still want flying cars? While it would be cool, I think I'd stick to the roads and let everyone else fly about and run into each other.

      1. Charles 9

        Re: @Jos V -- Autopilot...

        But airplanes have three dimensions to work with (unlike cars which are normally limited to two). Anyway, the big problems with flying cars are their complicated operations and their operating costs.

      2. John Brown (no body) Silver badge

        Re: @Jos V -- Autopilot...

        "And given this... we still want flying cars? While it would be cool, I think I'd stick to the roads and let everyone else flyabout and run into each other."

        I agree. So long as the flight "roads" are not above me or where I want to go!!!

  25. This post has been deleted by its author

  26. IHateWearingATie
    Go

    130 million miles?

    Wish I could drive that far without doing something really really stupid.

    1. Alan Brown Silver badge

      Re: 130 million miles?

      "Wish I could drive that far without doing something really really stupid."

      Most people average something stupid every 10 miles.

      It's only because there's so much "space" built into our road rules that we routinely get away with it - and that's also why most of the time a crash is caused by a combination of multiple foul-ups.

  27. Gene Cash Silver badge

    FYI

    The GPS coordinates of the crash intersection are 29°24'39.1"N 82°32'23.4"W in case anyone wants to streetview it.

    This is not very far from where I grew up, as a matter of fact.

  28. Anonymous Coward
    Anonymous Coward

    Multiple Teslas?

    At the moment cars on autopilot using radar and other active sensors are probably so rare as to never interfere.

    Would there be any issues if lots of vehicles in close proximity were relying on active sensor technology (analysis of reflections)? Can these sensors become blinded or confused by the emissions and reflections from other emitters?

    1. Nixinkome

      Re: Multiple Teslas?

      Everyone reading this article must have seen that the deceased driver, Joshua D. Brown, was an ex-Navy SEAL with high technical qualifications and, latterly, the person who started Nexu. He could afford a Tesla and had previously uploaded a video demonstrating Autopilot.

      I presume he was a capable driver.

      His death this way does not make sense to me. It's ludicrous!

  29. Sporkinum

    Two things. The truck driver was the one who caused the accident, by pulling out in front of traffic. The other is that many trailers have devices to smooth airflow under the trailer that would block light underneath; possibly the software was relying on that instead of an open undercarriage. When I checked on the trucking company, it turned out to be just one guy and a truck.

    1. regreader42

      Surely it's not beyond the wit of man to design truck sidebars that don't risk grounding?

      A horizontal impact is very different to a strike from below.

      Bars on hinged arms suspended by a tether? Bars on telescopic arms with shear bolts?

      1. Anonymous Coward
        Anonymous Coward

        Yes, I think it is beyond the wit of man. First off, any non-rigid design will likely get sunk because a low-riding car will slip under it and push it up, "submarining" it. Second, if it hits a hump it may go up, but there's no guarantee it'll come back down, due to stuck pivots and so on.

    2. Charles 9

      If he pulled out well ahead of the traffic, then no, it's not his fault, because he exercised due diligence with a large, slow vehicle, making clear to the traffic that it was turning. If other cars nearby had stopped, then the trucker gave enough notice and the cars had yielded right of way to the truck.

    3. Grumpy Fellow
      Go

      Not so fast there

      Before deciding that the truck driver caused the accident, I would like to hear how fast the Tesla driver was going. If this is a 55 mph zone and the Tesla was going twice that (for example), then all bets are off. I find it curious that the speed of the Tesla hasn't been released at this point - at least I haven't heard it, and Tesla the company certainly knows that number. I'd be interested to hear from any Tesla driver whether or not you can use the autopilot mode to exceed the posted speed limit. I'm thinking that a properly functioning autopilot would obey the speed limit just like it obeys a stop sign. Or would it? I'm genuinely interested to hear.

      1. Anonymous Coward
        Anonymous Coward

        Re: Not so fast there

        " I would like to hear how fast the Tesla driver was going."

        Mr Brown was making a long drive from Florida to Ohio and, according to the WSJ, had numerous speeding tickets.

        The truck driver says the Tesla was flying and never hit the brakes - take that for what it is worth, considering the source.

  30. Anonymous Coward
    Anonymous Coward

    One near miss and a not miss within a couple months?

    Seems to me the problem was the driver relying on the car too much. In the "near miss" video, the truck coming from the left was visible long before it tried to turn into the car. If I'd been driving I would have sped up a bit to keep that truck behind me. I never trust someone changing lanes exactly parallel to me, and neither should any decent autopilot software.

    1. Anonymous Coward
      Anonymous Coward

      Re: One near miss and a not miss within a couple months?

      The truck driver claims he heard audio from Harry Potter (witnesses who arrived after say they didn't hear this), but police did report that a portable DVD player was found in the car's wreckage.

      If there's a Harry Potter DVD in the player then I think the truck driver's account is pretty solid, and substantially all of the blame must be placed on the moron driver for thinking "autopilot = autonomous driving". I'd put the rest of the blame on Tesla for calling the feature "autopilot" when it is nothing of the sort (yeah, I'm sure they caution people against treating it that way, but theirs is hardly the only car with such a feature - they're just the only ones using a name that implies it can do the driving for you).

      The truck driver's driving record isn't very good (neither is the Tesla driver's) so it is possible he did something wrong too, but that's why the car's driver needed to pay attention. If an accident resulted every time I made a driving mistake (pulling out in front of someone I didn't see, that sort of thing) I'd have been in a thousand at-fault accidents in my life instead of just one, as would just about anyone else if they are honest with themselves.

      1. Vic

        Re: One near miss and a not miss within a couple months?

        I'd put the rest of the blame on Tesla for calling that feature "autopilot" when it is nothing of the sort

        On the contrary - it is quite appropriately named.

        In an aircraft, the autopilot takes some of the load out of flying - it maintains speed, altitude, direction, that sort of thing. It doesn't avoid mountains or other traffic. It isn't a magical flying box - it's simply a way to deal with the simpler feedback loops a pilot needs to keep on top of.

        And so it is with this - it does some of the simpler stuff, just as an aircraft autopilot would. But it doesn't mean you can't hit stuff, even if it makes its best effort to avoid that.

        Vic.

        1. Kiwi
          Devil

          Re: One near miss and a not miss within a couple months?

          I'd put the rest of the blame on Tesla for calling that feature "autopilot" when it is nothing of the sort

          On the contrary - it is quite appropriately named.

          Unfortunately humans aren't generally that intelligent, and look for the simplest level of understanding; therefore "auto" = "it does it all for me automatically".

          (I'm someone who likes to do my own wrenching (and 'lectrics), so I have all sorts of levels of dislike for modern vehicles!)

  31. earl grey
    Joke

    well, at least it gave him a "heads up"

    getting my coat

  32. colinb

    Feel sorry for the developers

    They work where a bug/design flaw can cause the loss of a human life; my bugs just result in screaming frustration, which I can handle.

    I work in an industry where you are either the f**ker or the f**kee, and on the roads, until all cars/trucks are computerised, as an autonomous-car rider you are the f**kee.

  33. nilfs2
    Terminator

    All or nothing

    For autonomous cars to be successful, all the vehicles on the road have to be autonomous, so they can communicate with each other and be predictable. Computers don't like unpredictability, and humans behind the wheel are very unpredictable.

    1. John Brown (no body) Silver badge

      Re: All or nothing

      "humans behind the wheel are very unpredictable."

      So will the many different AI algorithms seem to us more careful meat-sack drivers - not to mention that they will change/be upgraded over time.

      "Race condition" might take on a whole other meaning on a busy road with different AI cars. Or, more likely, I'd expect they'd "fail safe" and probably end up all stopped, waiting to see what the others will do, maybe inching forwards every now and then like the Google car and the cyclist.

      1. Charles 9

        Re: All or nothing

        I think this is one of the conundrums AI researchers have posed to car manufacturers:

        "If four cars pull up to the four legs of a four-way stop simultaneously, who goes first?"

        This is actually a conundrum for drivers in general, not just for AI drivers. It's just that with human drivers, someone eventually runs out of patience and breaks courtesy.
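    The standoff above only stays a standoff if every car waits for the others. A minimal sketch (purely hypothetical, not any real V2V protocol or manufacturer's scheme): if each car broadcasts its arrival time and a vehicle ID, every car can locally compute the same total ordering, and exact ties break deterministically instead of by whoever loses patience first.

    ```python
    # Hypothetical four-way-stop tie-break: each car shares (arrival_time,
    # vehicle_id); sorting those tuples gives every car the same ordering,
    # since ties in arrival time fall through to the unique vehicle ID.

    def crossing_order(cars):
        """cars: list of (arrival_time, vehicle_id) tuples.

        Returns vehicle IDs in the order they should cross. Python's tuple
        sort compares arrival_time first, then vehicle_id, so the ordering
        is total and identical for every car that computes it.
        """
        return [vid for _, vid in sorted(cars)]

    # Four cars arriving simultaneously: the ID tie-break resolves the
    # standoff with no negotiation and no inching forward.
    simultaneous = [(12.0, "N"), (12.0, "E"), (12.0, "S"), (12.0, "W")]
    print(crossing_order(simultaneous))  # ['E', 'N', 'S', 'W']
    ```

    The point isn't the sort itself but that a shared, deterministic rule removes the ambiguity; the hard engineering problem is getting four independent vehicles to agree on the inputs.
    
    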

  34. wayne 8

    Imagine this crash with, instead of a Tesla car, an 18 wheeler tractor/trailer combo loaded up and moving out. How about a convoy, Rubber Duck?

    http://m.theregister.co.uk/2015/05/06/robot_lorries_on_a_roll_in_vegas/

    "The Freightliner Inspiration Truck operates on a system called Highway Pilot, an automated cruise control setting which is intended to allow the driver to relax and, for instance, read The Register while avoiding Decepticons road hazards."

    Note the date of the article, 2015/05/06, nearly a year to the day before the Tesla fatal incident.

    Wonder if that truck is still on the road here in Nevada. There is no news since the 2016/05/06 PR.

    Tin foil hats ON. The day after the PR, a trucker whose livelihood could be threatened by automation crosses the road in front of a speeding Tesla.

    "Why did the truck cross the road?"

  35. x 7

    Don't American trucks / trailers have under-run bars?

    In the UK, trucks and trailers have bars along the side to prevent vehicle under-runs such as this. It's a basic legal safety requirement. Don't trucks in the USA have something similar? Damn stupid if they don't.

    The bars wouldn't stop the crash, but they may at least make it survivable

    1. Charles 9

      Re: Don't American trucks / trailers have under-run bars?

      US roads tend to have more humps, and even without the bars, trucks frequently get stuck on them. Furthermore, many of these humps are at railroad crossings, meaning a trailer stuck on a hump runs the risk of getting subsequently smashed by a train.

      And no, the humps can't be fixed because the roads are locally maintained by communities on tight budgets.

  36. razorfishsl

    So it is functioning correctly.

    You get a semi-random chance of surviving in a life-vs-death situation.

    Unless all vehicles are controlled and capable of communicating with each other, people will die, and in a mass pile-up the dynamics are just going to be out of the realm of most current software.

  37. Nano nano

    Regression test

    No doubt this scenario will be added to the Regression Test suite for the car's "autopilot" ...

  38. Anonymous Coward
    Anonymous Coward

    He *thought* it was just that good...

    Tesla Autopilot

    Actually I know exactly what happened here. I was waiting for El'Reg to dig up the truth since they obviously have agents embedded deep in Musk's operations.

    The build of control software that Mr. Brown's Tesla Model S was running was not the autopilot 'beta' as reported by the news agencies, but was in fact build 0.79disDsheitAlpha.1, which was actually compiled by a group of Tesla Roadsters that WERE running the autopilot 'beta' themselves.

    I have inside information that this Alpha build was the first to include the fully autonomous routines that will eventually make it into production. One of these routines, which runs in the background at all times after init, tries to optimize the system by looking for opportunities to activate or remove the sensors providing the least valuable input.

    Having been unable to engage a fully grown man who was watching Harry Potter for the 87th time with rapid vector shifts, Tessie0.79disDsheitAlpha.1 identified an opportunity for a full cephalectomy of the useless sensor with little collateral damage, and simply went for it.

  39. JeffyPoooh
    Pint

    Another idiotic Tesla Autopilot crash, this time in China

    It's all over the news this morning.

    The Artificially Stupid thing ignored the red triangle and then gently sideswiped a car stopped alongside the road.

    Not. Ready. For. Primetime.
