Tesla Full Self-Driving 'fails' to notice child-sized objects in testing

The latest version of Tesla's Full Self-Driving (FSD) beta has a bit of a kink: it doesn't notice child-sized objects in its path, a campaign group has claimed. In tests performed by The Dawn Project using a Tesla Model 3 equipped with FSD version 10.12.2 (the latest, released June 1), the vehicle was given 120 yards (110 …

  1. Anonymous Coward Silver badge
    Stop

    Comparison

    Are they going to compare these results to another car with cruise control and lane assist active, or is it purely someone's vendetta against Tesla?

    1. Anonymous Coward
      Anonymous Coward

      Re: Comparison

      I think if the CEO of a company says "I would be shocked if we do not achieve full self-driving safer than human this year. I would be shocked" as Musk did 7 months ago, it's not unreasonable to test its functionality. And it's not as if the test should have stressed the software: straight lines, daylight, about 10 times the required human reaction/braking distance at that low speed, and about 8 seconds to see and react.

    2. katrinab Silver badge
      Facepalm

      Re: Comparison

      Those cars don't claim to be self-driving.

      1. TRT Silver badge

        Re: Comparison

        The cars don't claim ANYTHING. It's the marketing bods that go around making spurious claims. And verifiable claims too, I hasten to add - they're not all a bunch of charlatans, though I do sometimes wonder if they have just the one moral compass, which they pass amongst themselves like The Graeae did with that eye.

        1. katrinab Silver badge
          Childcatcher

          Re: Comparison

          I've never seen a car with cruise control or lane assist claim it will stop you mowing down children, so if it does, that's on you.

          Tesla, on the other hand, does make those claims, or at least people think they do, which is what matters.

          1. tfewster
            Terminator

            Re: Comparison

            Toyota's Pre-Collision System with Pedestrian Detection: I have no knowledge of its effectiveness, but it may be comparable.

            On the other hand, the Dawn Project test was a little odd and the article didn't describe the Tesla's behaviour very clearly. Driving between a lot of child-sized objects (cones), the Tesla apparently slowed down. When it "saw" an opening, it missed the solid white (stop?) line and the (stationary?) mannequin, and sped up again? Again, I have no knowledge of what Teslas are "looking" for, but maybe the computer or its sensors were overwhelmed.

            It's a bad outcome, but would an adult-sized mannequin or a dummy car/truck have been detected and avoided? How about if the cones were replaced with parked vehicles, or the lane was completely open?

            1. heyrick Silver badge

              Re: Comparison

              "but maybe the computer or its sensors were overwhelmed"

              And?

              Meatsacks can't use that excuse if they were "overwhelmed" trying to look at the road and a phone at the same time.

              Remember it's an x-ton moving mass on a public highway. If it gets "overwhelmed", it's not fit for purpose, get it off the roads until it is.

            2. Anonymous Coward
              Anonymous Coward

              Re: Comparison

              > It's a bad outcome, but would an adult-sized mannequin or a dummy car/truck have been detected and avoided? How about if the cones were replaced with parked vehicles, or the lane was completely open?

              What you're asking here is if it'd have performed better in a completely different scenario.

              I don't think Tesla are going to run with a marketing line that's effectively "yes, it'll mow down small children, but it's excellent at detecting adults and other vehicles".

              If it can't handle complex scenarios, then it's not ready.

            3. MachDiamond Silver badge

              Re: Comparison

              "but would an adult sized mannequin or dummy car/truck have been detected and avoided? "

              Not if the adult was on a bicycle.

          2. NeilT

            Re: Comparison

            No, actually, people who don't have a Tesla, don't have the software, and have never read the terms and conditions think that Tesla FSD, a piece of beta software which is unfinished and very much NOT "certified" for Level 5 self-driving, won't mow down something it does not recognise.

            That doesn't make Tesla bad, evil or lying to people. It just proves that a hell of a lot of people are just plain STUPID.

            BTW, Tesla does not market and does not have "advertising bods". Musk, the company CEO, talks about the progress they are making and the expectations for delivery.

            People who DO have Teslas, HAVE bought FSD and HAVE read the terms and conditions are overwhelmingly in favour of FSD, and are glad that they are getting to beta test it instead of waiting for the company to do it themselves and get it certified.

            But that doesn't sell media or press. It also doesn't drive US senatorial bids.

            1. John 104

              Re: Comparison

              @NeilT

              Fanboi anger much?

              The fact that it is Beta and being 'tested' by owners in the real world where live human beings exist should be frightening enough...

            2. Pascal Monett Silver badge
              FAIL

              Re: It just proves that a hell of a lot of people are just plain STUPID

              And, in other news, bears shit in the woods.

              Sorry, but you're not exactly revealing a scoop here.

              1. aerogems Silver badge
                Facepalm

                Re: It just proves that a hell of a lot of people are just plain STUPID

                I see what you did there.

            3. heyrick Silver badge

              Re: Comparison

              Downvote because, FFS, there's surely a clue in the name "Full Self Driving" that might make one understand the car to be capable of driving itself. And no, you can't hide "doesn't actually drive itself" on page thirty-seven of a sixty-page licence agreement.

              1. Only me!
                Facepalm

                Re: Comparison

                FSD says nothing about stopping.

            4. and I

              Re: Comparison

              It seems the CA DMV disagrees and has sued Tesla on the basis that the names FSD and Autopilot are misleading consumers. FYI, Tesla does have a marketing department; check out who works there on LinkedIn if you doubt this.

              The NHTSA are also getting increasingly interested in the rate of Tesla accidents, especially Teslas hitting parked emergency service vehicles; it appears that's another blind spot for them.

              1. Mr. V. Meldrew

                Re: Comparison

                I wonder......

                Just my small grey cells working overtime.

                Do you think that "parked emergency vehicles" maybe with various radio frequencies in use at the same time could affect the Tesla car?

                Just a thought.

                1. ThatOne Silver badge
                  Facepalm

                  Re: Comparison

                  > radio frequencies in use at the same time could affect the Tesla car?

                  So what? Please don't tell me this is supposed to be an excuse?...

                  1. Mr. V. Meldrew

                    Re: Comparison

                    Not an excuse. Just a simple question for the more informed. Unlike yourself.

                    1. ThatOne Silver badge
                      Meh

                      Re: Comparison

                      > the more informed. Unlike yourself.

                      Jeez, how childish. You probably don't care since you think I've insulted you, but my point was that even if it was true, it is irrelevant since it simply should not happen, period.

                      Besides, no, parked emergency cars don't emit anything; they do listen to various radios (police, emergency), but that's passive. They definitely do not EMP the Tesla computer into road rage.

                      1. M.V. Lipvig Silver badge

                        Re: Comparison

                        "Besides no, parked emergency cars don't emit anything, they do listen to various radios (police, emergency), but that's passive. They definitely do not EMP the Tesla computer into road rage."

                        Are you sure about that? It's not the 1970s where the cops were using CB radios. Today the cops have location transponders that are always on. In fact, some well known radar detector manufacturers are looking into trying to use this transponder data as a police car detector. They also have encrypted comm links now. I don't know specificaly how their systems work, but encrypted links in general are always on. Always on gives a potential spy tons more data to sift and the spy won't know what is inteligence and what is dead air. If this is the case, then the cop car may be confusing the Tesla. I'm not saying it's an excuse, but if this is the case then it means Tesla overlooked the possibiity of outside transmitters causing interference with its systems. This would be a huge miss.

                2. MachDiamond Silver badge

                  Re: Comparison

                  "Do you think that "parked emergency vehicles" maybe with various radio frequencies in use at the same time could affect the Tesla car?"

                  Highly doubtful. Teslas use cameras to sense the environment as Elon doesn't believe in RADAR or LiDAR (They may not believe in him either). If transmitting radios are swamping the CAN bus in a car, that's a huge issue and should prompt an immediate recall.

            5. Kane
              Facepalm

              Re: Comparison

              "No, actually people who don't have a Tesla, don't have the software and have never read the terms and conditions think that Tesla FSD, a piece of Beta software, which is unfinished and very much NOT "certified" for level 5 self driving, won't mow down something it does not recognise.

              That doesn't make Tesla bad, evil or lying to people. It just proves that a hell of a lot of people are just plain STUPID.

              BTW, Tesla does not market and does not have "advertising bods". Musk, the company CEO, talks about the progress they are making and the expectations for delivery.

              People who DO have Tesla's, HAVE bought FSD and HAVE read the terms and conditions; are overwhelmingly in favour of FSD and are glad that they are getting to beta test it instead of waiting for the company to do it themselves and get it certified.

              But that doesn't sell media or press. It also doesn't drive US senatorial bids."

              Ok, Elon, whatever you say...

            6. Triggerfish

              Re: Comparison

              Musk is Tesla marketing, he is one of the more powerful social media influencers out there.

              1. MachDiamond Silver badge

                Re: Comparison

                "Musk is Tesla marketing, he is one of the more powerful social media influencers out there."

                Stop it, you're making me bilious. Mostly because it's true. He got rid of their PR department possibly so it would be tough to get anybody at the company to talk to the press.

            7. An_Old_Dog Silver badge
              Flame

              Re: Comparison

              Keep me and my loved ones out of this "beta test" program.

            8. HkraM
              Stop

              Re: Comparison

              You, as a driver, may well have agreed to Tesla's Terms and Conditions (probably without reading them all) to test their beta software - but I, as a pedestrian, have NOT agreed to be part of that beta test.

              So stay the fuck away from me.

            9. Anonymous Coward
              Anonymous Coward

              Re: Comparison

              > People who DO have Teslas, HAVE bought FSD and HAVE read the terms and conditions are overwhelmingly in favour of FSD, and are glad that they are getting to beta test it instead of waiting for the company to do it themselves and get it certified

              Here's the thing though.

              It's not just you beta testing it.

              We're all beta testing it too - anyone who happens to be sharing the road with your multi-tonne killing machine is part of that test, and some of us don't consent to being put at risk so that a company can cut corners on testing and certification.

          3. Roland6 Silver badge

            Re: Comparison

            Given the level of driver assist and collision detection Volvo were shipping in some of their cars years back, I suspect their cars with the relevant systems (labelled as driving aids, not self-driving mumbo-jumbo) have had this capability for a few years now, and probably real-world feedback that the systems work.

            1. An_Old_Dog Silver badge

              Re: Comparison

              Volvo's systems working !=> Tesla's systems working.

        2. Anonymous Coward
          Anonymous Coward

          Re: Comparison

          In this case, when you say marketing bod, you mean CEO.

      2. jollyboyspecial Silver badge

        Re: Comparison

        Musk predicted Tesla would achieve full self-driving this year, so I think it's reasonable to test how close they are. As it stands, they are a country mile away.

        But the thing that puzzles me is why call it Autopilot and then constantly moan that people mistake it for self-driving. It is perfectly reasonable to assume that Autopilot would mean self-driving. If they didn't want people to make that assumption, why not call it something else - like maybe driver assist?

        1. Zack Mollusc

          Re: Comparison

          Yeah, I think that the nerds at Tesla chose the name Autopilot because its functions and limitations are very similar to those of an autopilot on an aircraft. They didn't take into account that most of the public imagine that autopilot is a magical intelligent robot that can do anything a human pilot can do.

          Heck, most of the public probably have no idea how the car they themselves drive every day does anything it does.

          1. heyrick Silver badge

            Re: Comparison

            Well, given that I'm alive because an autopilot had no problems whatsoever landing an Airbus in fog so dense it was impossible to see the flashing lights on the wings from the cabin windows (the airport was technically closed but we didn't have fuel for a reroute), I think your analogy might need some work.

            1. Roland6 Silver badge

              Re: Comparison

              But, quite reasonably, you don't expect there to be objects such as children on the runway; an assumption that can't be reasonably made about our roads...

              1. Richard Pennington 1

                Re: Comparison

                Still less would you expect to encounter a child in front of you at 30,000 feet.

                1. BebopWeBop
                  Devil

                  Re: Comparison

                  Ahhh - see 'Addams Family Values'

            2. Anonymous Coward
              Anonymous Coward

              Re: Comparison

              The important difference being that it wasn't really the autopilot that landed you; autopilots were on airplanes for literally decades before the automated landing systems were flight rated and used with real people.

              That's not to zing you, it's just not common knowledge. Another important difference is that those automated landings take place with half a dozen support systems, on dedicated runways, with a team of air traffic controllers, under specific and favorable wind conditions where the automated landing system is safer than human pilots. Human pilots that are supposed to be hands-on and ready to take over the moment something beeps funny. But in most regards there are different "levels" of airplane assists/autonomy, just like cars, and "autopilot" just needs to hold a basic course, altitude, and speed in open airspace. So it is actually still less capable than most of the driver assists, because they have to deal with roads and ground stuff. Less complicated might be a better way to put it.

              1. Doctor Syntax Silver badge

                Re: Comparison

                "The important difference being that it wasn't really the autopilot that landed you, autopilots were on airplanes for literally decades before the automated landing systems were flight rated"

                Which really proves the point. If people over-estimate what autopilot does on planes and confuse it with automated landing systems, it's very likely that they'll over-estimate what it does in cars.

            3. SCP

              Re: Comparison

              ... autopilot had no problems whatsoever landing an Airbus in fog so dense it was impossible to see the flashing lights on the wings ...

              But then that autopilot would be using radio/microwave signals not obscured by the fog to follow a fairly standard landing flight profile and will be "assuming"** that the runway is where the signals indicate it will be, and that the runway is unobstructed.

              That autopilot, for all its cleverness, is still performing a relatively straight-forward control task - which was the essence of the previous poster's statement.

              I think that Tesla, for all the progress it might be making, is muddying the waters far too much and this is likely to have long term negative effects on car driving automation (which IMO still has a long way to go).

              [** the assumptions will have been built into the design of the system and the pilot training for the use of the system.]

            4. Dave314159ggggdffsdds Silver badge

              Re: Comparison

              "the airport was technically closed but we didn't have fuel for a reroute"

              [citation needed]

              I assume you can point us to the air safety report that was issued after that major incident?

              1. Richard Pennington 1

                Re: Comparison

                Presumably something along the lines of this:

                https://en.wikipedia.org/wiki/Gimli_Glider

                1. Dave314159ggggdffsdds Silver badge

                  Re: Comparison

                  Well, quite. A really major incident. I think the OP may have been exaggerating a bit. I can't think of any incident that fits the description, although there are lots that could be massaged to fit for the sake of a good story.

                2. Richard Pennington 1

                  Re: Comparison

                  Not sure who is downvoting all these posts (or why).

                  The Gimli Glider is quite a well-known case. They ran out of fuel at 41,000 feet, and flew it like a glider to get down at an old military airstrip.

                  Also, as a bonus:

                  [1] They put the plane down at an airstrip which had been closed ... and decommissioned ... and converted for drag-racing ... and which was hosting an event at the time;

                  [2] They had children on the runway (including two on pushbikes near the end of the runway ... who panicked and tried to outrun a 200mph aircraft ... and eventually took the better choice to go off the side of the runway);

                  [3] The [human] pilots managed to put the plane down and bring it to a stop without causing a serious injury (let alone a fatality).

            5. ICL1900-G3

              Re: Comparison

              Airliners fly Instrument Flight Rules, a tightly controlled and monitored system. It bears no relationship at all to the chaotic world of ground-based motor transport.

            6. Zack Mollusc

              Re: Comparison

              Would the Airbus have detected a child on the runway? Or any other obstacle?

              Not saying that a pilot would have done any better at landing a plane blind in fog, but I am sure the autopilot was not relying only on its own sensors to land the aircraft, and was making a lot of assumptions.

            7. ThatOne Silver badge

              Re: Comparison

              > I'm alive because an autopilot had no problems whatsoever landing an Airbus

              This isn't even apples and oranges, it's apples and baseballs. "Autopilots" work fine on planes and ships which mostly travel through vast empty spaces. Cars on the other hand rarely travel through empty spaces, so a simple "autopilot" system won't do.

              On a car the main issue isn't so much to stay on track on a journey from A to B, but to avoid hitting all kind of objects constantly doing all kind of unexpected and stupid things around you. That is the real challenge, and clearly one Tesla's "Autopilot" is still very far from having mastered.

              Of course you can expect a system able to crash full speed into a huge truck to miss a small object like a child; I would have been surprised if it didn't. The only surprise here is why this clearly unready system is allowed to be used on public roads, putting innocent people's lives in danger. Test it, improve it, make sure it won't even hit a stray cat, and only then sell it to the public.

              1. Dave314159ggggdffsdds Silver badge

                Re: Comparison

                "why this clearly unready system is allowed to be used on public roads"

                Because human drivers are so bad that it's still a net gain despite all the things wrong with it. It's a bit like the way we can use experimental drugs on people who are terminally ill - desperation justifies it.

                The UK is one of the best countries in the world for driving standards, and yet they are utterly appalling. Most people driving around simply don't know the basic rules of the road.

                In countries like the US, where there is no driving test at all, and little enforcement of road laws, driving standards are unimaginably low. A large minority of US drivers say they are unable or unwilling to _reverse_. They literally cannot back into or out of a parking space, let alone parallel park. And they drive around blithely ignoring their complete lack of skills, and kill people at an insane rate - roughly 3x the rate per mile the UK manages with our standard, such as it is.

                Self driving cars don't have to be very good at all to be better than that. From a quick google, Tesla appear to have a fatal accident rate (during Autopilot use) roughly comparable with the UK, despite most miles being done in the US.

                So the real question is whether it would be lawful for governments to forbid the use of Autopilot. Certainly it wouldn't be in the UK or EU due to the European Convention on Human Rights, and I think the US Constitution also prevents it. It would be akin to banning airbags because they can make accidents worse in rare cases.

                1. ThatOne Silver badge

                  Re: Comparison

                  > Because human drivers are so bad that it's still a net gain despite all the things wrong with it.

                  All right, let's talk about this again when we've reached that point.

                  Right now AI is as good as a narrow-sighted drunk teenager with no glasses and no permit. Which is a very low standard. Let's wait till self-driving is not necessarily better than the average human, but at least not worse than the worst human drivers. I'm pretty sure the aforementioned drunk teenager would have managed to miss at least one child in three.

                  1. Dave314159ggggdffsdds Silver badge

                    Re: Comparison

                    "All right, let's talk about this again when we've reached that point."

                    You aren't entirely wrong, sometimes AI is pretty much as you describe - but then, so are humans, sometimes, which is why we get wrong-way drivers, people who stop on motorways to let ducklings cross, people who mow down cyclists by driving on the wrong side of the road, etc, and most of the time it isn't even newsworthy. Most of the rest of the time, AI is better than your typical driver because it can, e.g., make a right turn correctly.

                    Really we're just arguing about how bad the current standard of driving is, and, as I said, the statistics suggest Autopilot is roughly at that level when measured by deaths per mile.

                    If we allow self driving cars to be used quite widely we have every expectation that they will improve over time. Whereas clearly that isn't happening with human drivers.

                    1. ThatOne Silver badge

                      Re: Comparison

                      > we get wrong-way drivers, people who stop on motorways to let ducklings cross [...]

                      You're confusing exceptions and standard performance. Yes, there are some human drivers who are appallingly bad, but it seems* the average human is still a better driver than the AI in its current state.

                      Also it's not like "sometimes AI is pretty much as you describe"; it's actually always like that. Unlike humans, AI is consistent: it has no good or bad days, whatever it does today it will do tomorrow and the day after. Meaning, if today it's not up to a task, it will not be up to that task on all following days, until some future version fixes that specific shortcoming. And apparently there is still a lot to fix.

                      Now, dumping a system which on average is better because of some outliers does not make any sense. To caricature: some humans are nasty criminals; does that mean we should replace all humans with robots?...

                      * Take the number of motorized vehicles using the streets each day and divide by the number of accidents. You might say "too many", but statistically there are few, given the number of vehicles.

                      1. Dave314159ggggdffsdds Silver badge

                        Re: Comparison

                        "You're confusing exceptions and standard performance."

                        No, I'm talking about both. The exceptions exist with both human and electronic drivers. The standard performance is unquestionably better for the computers. By a wide, wide margin at this point.

                        "the average human is still a better driver than the AI in its current state."

                        I've repeatedly cited the stats here. That is simply not true. The observed data is that Tesla's 'autopilot' system is significantly better than the global average, and roughly equivalent to the UK, measuring by fatal crashes. Looking at minor crashes, the stats are overwhelmingly in favour of the computers.

                        "it has no good or bad days, whatever it does today it will do tomorrow and the day after."

                        No. On any given day the AI may or may not encounter situations which cause it problems. If it doesn't, it'll work flawlessly.

                        "apparently there is still a lot to fix"

                        No disagreement there. The point is that even with all that stuff to fix - which, as I said, means we can expect things to improve over time - the computers are already doing better than the average human driver.

                        1. ThatOne Silver badge

                          Re: Comparison

                          > Tesla's 'autopilot' system is significantly better than the global average

                          I think we'll have to agree to disagree on that. I do not believe that a system that routinely plows into trucks or stopped vehicles is better than the global average of human drivers.

                          I also deny that there are "exceptions" with Tesla's AI like there are for humans. Unlike human drivers, where there are huge differences and undeniably good or bad drivers, Tesla's AI is one and the same in all cars. Any shortcoming it has, is present in all cars.

                          Which means that if Tesla's AI were a human driver, it would have done some pretty dumb things in the few years it's been on the streets, things most humans don't do. And I insist on the "most". I don't know where you live (there are countries where people get creative with the rules of the road), but around here we have 80% adequate drivers, 15% borderline, and 5% really bad ones. As opposed to those 100% inadequate (as in "not yet fit for duty") Tesla AIs. I rest my case.

                          1. MachDiamond Silver badge

                            Re: Comparison

                            "I think we'll have to agree to disagree on that. I do not believe that a system that routinely plows into trucks or stopped vehicles is better than the global average of human drivers."

                            I'm with you on that. I don't see the point in swapping out the accidents that a system might avoid for ones that mostly never happen. I haven't seen anybody post the stats on how many lit-up emergency vehicles get plowed into on an annual basis. I'm sure it happens. The road service trucks here often have crash absorbers on the back, so I'd guess idiots bang into them often enough to make them worth the cost.

                        2. Roland6 Silver badge

                          Re: Comparison

                          >The observed data is that Tesla's 'autopilot' system is significantly better than the global average

                          It hasn't occurred to you that perhaps the wrong benchmark is being used?

                          I suggest that rather than the global average, which is obviously going to be a very low bar and open to interpretation, the benchmark needs to be something a little more precise. I suggest the average UK Advanced Driver. Only by achieving this level of competence will we actually see a significant drop in accidents etc.

                          Why? Because if we accept the current state-of-the-art claim (and we are talking about art, not science, when it comes to machine learning) that the system is "roughly equivalent to the UK (average)", the result is equivalent to taking all the above-average drivers off the road, and thus we can expect the number of incidents to increase; potentially double. By setting the benchmark at the Advanced Driver level, we effectively take all average and below-average drivers off the road, significantly reducing the number of incidents.

                        3. M.V. Lipvig Silver badge

                          Re: Comparison

                          How many miles has Tesla racked up with Autopilot vs us average drivers, and how much of that was during ideal conditions vs us Luddites who have been driving in all conditions for decades? Yes, all that matters. While it can't be expected that Tesla has the trillions of miles under its "belt" that we poor meatsacks have, it does matter when a large part of the miles it did travel were in favourable circumstances. And even in perfect weather, the car still treats cops and ambulances like 10 points. Outside test conditions, I really doubt anyone other than a Tesla tester has pushed the Autopilot button on a rainy or icy day, while quite a few of those meatsack accidents have happened on days with horrible weather conditions. So Tesla having a slightly better accident rate on sunny days than people have 24x7, regardless of weather, does not look good for Autopilot.

                2. Graham Dawson Silver badge

                  Re: Comparison

                  "Because human drivers are so bad that it's still a net gain despite all the things wrong with it."

                  Teslas drive like they're from Birmingham.

                  So no, it's not a net gain.

            8. hoola Silver badge

              Re: Comparison

              Whilst I agree, there is significant feedback to the system to achieve this and, crucially, the runway is a fixed location with (usually) no unexpected obstructions. The expectation is that the runway is ready to receive the aircraft that is landing.

          2. gandalfcn Silver badge

            Re: Comparison

            I first used an autopilot in ships in the 60s.

            It simply meant a set course was steered. Sort of.

          3. Wellyboot Silver badge

            Re: Comparison

            So what you're saying is that Tesla just needs humans performing 'road traffic control'* and all the problems will be fixed.

            * for everything that is going to be on the road including wildlife.

          4. Cuddles

            Re: Comparison

            "Yeah, I think that the nerds at Tesla chose the name Autopilot because its functions and limitations are very similar to that of an autopilot on an aircraft."

            The thing about Autopilot is that it's possible to have these arguments about exactly what it means and what it's reasonable to expect the general public to think about it. Full Self Driving is a little less ambiguous, and no amount of small print or claims about being in beta is going to change the message that name gives to the average person.

        2. Anonymous Coward
          Anonymous Coward

          Re: Comparison

          Well, he predicted it would be done 4-5 years ago. Bit of a moving target for him.

          It's a 'solved problem' don't you know.

        3. gandalfcn Silver badge

          Re: Comparison

          "But the thing that puzzles me is why call it Autopilot and then constantly moan that people mistake it for self driving."

          Leon the Paedo is a liar and whimpers and bleats when he is proved to be one.

        4. MachDiamond Silver badge

          Re: Comparison

          "If they didn't want people to make that assumption why not call it something else - like maybe driver assist?"

          They could just call it "cruise control +".

          I know there is a difference between Autopilot and Full Self-Driving, but the descriptions are vague and the names are misleading. If you know what autopilot does in a plane, it is close. In an aircraft, a simple autopilot maintains course and speed. A more advanced version will turn at programmed waypoints and adjust speed and altitude. Since flight plans are submitted and approved in advance, with air traffic controllers watching the skies, IFR flights on autopilot work very well. There isn't a parallel with a car on a road. Being off by 3 meters in the air is not a big deal. In a car, 3 meters can mean you're driving on the pavement.

          So what the hell is FSD if it isn't Full Self Driving? How can an average owner, no matter how safely they drive, be allowed to beta test the system?

      3. Adam Foxton

        Re: Comparison

        You're claiming that calling a product "Full Self Driving" doesn't amount to a claim that the thing is fully self driving?

    3. Richard 12 Silver badge
      Terminator

      Re: Comparison

      They don't call it "Full Self Driving".

      When you call something "Full Self Driving", there is a certain expectation that it is in fact capable of safely doing so.

      1. NeilT

        Re: Comparison

        When you say something is in BETA, there are certain expectations set by precedent.

        Beta is In Test, Not Tested, Not certified, expected to have bugs and fully expected that things won't work as advertised.

        Not until it is Out of Beta should any expectations of Full Self Driving be set.

        Even then, RTFM. The information on the website, before you purchase FSD, says that the product is not yet ready for release, and that even when it is ready for release it may not be used until the regulators approve it.

        I know that some people can't read English and that some people are hard of thinking. Apparently they have "expectations too". Should I listen to them???

        1. DS999 Silver badge

          Re: Comparison

          That doesn't excuse them when Musk claims "there will be a million Tesla robotaxis making $100K a year for their owners on the road by the end of 2020" and how he'll be shocked if it isn't truly self driving by the end of this year. He's been pushing the date back since 2016 and will still be doing so in 2030.

          I hope the California DMV slaps them down hard, he needs to be taught a lesson about false advertising.

          1. Doctor Syntax Silver badge

            Re: Comparison

            "I hope the California DMV slaps them down hard, he needs to be taught a lesson about false advertising."

            Not just the California DMV. Isn't this stuff that could affect the share price? In that case the SEC should be interested, particularly given that he has form on this.

            1. DS999 Silver badge

              Re: Comparison

              What the California DMV is concerned with isn't really something the SEC would be interested in, and the SEC probably has their hands full with Musk related investigations already.

              The SEC doesn't get involved with anything "that could affect the share price", because pretty much anything a publicly traded company does could. They are only concerned with actions taken to illegally manipulate stock price or profit from it. Misrepresenting the capability of a product would fall under the remit of the FTC, but the rules for false advertising in the US are a much higher legal bar than in the UK.

              1. Dave314159ggggdffsdds Silver badge

                Re: Comparison

                "They are only concerned with actions taken to illegally manipulate stock price or profit from it."

                Er, no. Misleading investors is also a crime. Look at the Theranos case.

                I don't think Musk has done so in an actionable way, but it is possible to do so.

        2. Anonymous Coward
          Anonymous Coward

          Re: Comparison

          > Not until it is Out of Beta should any expectations of Full Self Driving be set.

          No, Beta testing normally means we're pretty damn close and just ironing out the last few wrinkles. You're thinking of Alpha testing, where the programmers are testing "Hey, does any of this shit work?" or "Is this a viable algorithm?"

          You'd hope that no one was alpha testing self-driving systems on the road without professional test drivers checking and observing the behavior of the system.

          1. Zack Mollusc

            Re: Comparison

            Beta testing normally means we're pretty damn close and just ironing out the last few wrinkles?

            Really?

            Observation of the industry leaders indicates:

            Alpha :It compiles.

            Beta : It compiles and doesn't crash much.

            Release : It compiles and doesn't crash much and we put splash screens on it.

            Updates: It compiles and doesn't crash much and maybe we tweaked the splash screens and we have increased telemetry.

            EOL : The frameworks and libraries have altered enough that it no longer compiles.

            1. Someone Else Silver badge
              Pint

              @Zack Mollusk -- Re: Comparison

              Nicely done. One for you! - - - ->

          2. MachDiamond Silver badge

            Re: Comparison

            "You'd hope that no one was alpha test self driving systems on the road without professional test drivers checking and observing the behavior of the system."

            A system to drive a car in Alpha should not be allowed on public roads regardless. Who's to say the system doesn't lock the driver out and do something very deadly? Elon has been fascinated with acceleration, so many Teslas can go from zero to 100kph in just a few seconds. A test driver might take that long to locate the E-stop and kill the car... by which time it's already done a whole bunch of damage or become airborne.

        3. heyrick Silver badge

          Re: Comparison

          "When you say something is in BETA, there are certain expectations set by precedent."

          Sure. For domestic software.

          Not for a moving lump of metal on a public road.

          If a failure has a real risk of killing or maiming, all tests should be done off of public roads until it can be shown to work reliably enough to match the wits of an average human. It won't ever be perfect - edge cases like going around a bend and being blinded by sunlight affect us, and I don't expect an AI to fare much better.

          But this? Repeatedly driving over a child-like object? That's some real Gen-Alpha hate right there...

          1. Anonymous Coward
            Anonymous Coward

            Looks like exactly the sort of failure you'd expect from beta firmware

            One, this was a deliberately structured test that was designed for the car to fail, on camera. Other tests have reproduced similar concerns, though at a less consistent failure rate. From the description you can see that they specifically set up the cones, which were interfering with the car's model and caused it to slow down and then rapidly accelerate. The "child" in this test was mounted on a base that was visually similar to one of the traffic cones. One of the dummy's outfits was also literally in the colour palette of Caltrans road markings. There was also a prominent white stripe across the road, level with the dummy and the last line of cones.

            To be clear, I'm not saying it mistook a "kid" for a "cone", or at least if it did, there are extra layers to that mistake. The car should have slowed or stopped, and alerted the driver or exited self-driving. It's not clear from the description whether the car alerted the driver during the approach. It should have, but it also should have stopped or avoided the obstacle at the same time.

            Considering the source of the video and the specific setup, a reasonable possibility is that they figured out how to create a reproducible failure in the car's driving and visual detection models. This would be different in the specifics, but similar to prior attacks that caused the car to rapidly accelerate or stop in response to road signs that had been modified with small but specifically located stickers. The model essentially wasn't using enough decision points to reliably identify the road sign correctly. Bad when that is a stop, yield, or speed limit sign.

            During the compression phase of many ML models, the system can over-prune its recognition and decision trees, which was the basis of the sticker attack and forced the various players to modify their training to better prevent that class of problem; but the vulnerability is still inherent to that class of model. If that is correct, the specific setup of the test is creating what is essentially a blind spot where the dummy is, or the base of the dummy is confusing the object detection, and the cones are triggering the acceleration as the car "sees" the end of the line of cones. Essentially the test seems built to place the target right at the edge/branch of a bunch of critical decisions.
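
            To make the over-pruning point concrete, here's a minimal, self-contained sketch (Python/scikit-learn). To be clear, this is an assumption-laden illustration, not Tesla's actual stack, which isn't public; cost-complexity pruning of a plain decision tree merely stands in for whatever compression they use, and all the names and numbers are made up. A rare class sitting between two common ones simply disappears once pruning gets aggressive enough:

            # Hypothetical illustration only: aggressive cost-complexity pruning
            # can erase a rare class that sits between two common ones.
            import numpy as np
            from sklearn.tree import DecisionTreeClassifier

            rng = np.random.default_rng(0)

            # Two common classes ("cone"-like clusters) and one rare class
            # ("child") in the gap between them.
            cones_a = rng.normal([0.0, 0.0], 0.5, size=(500, 2))
            cones_b = rng.normal([4.0, 0.0], 0.5, size=(500, 2))
            child = rng.normal([2.0, 0.0], 0.3, size=(20, 2))  # rare class

            X = np.vstack([cones_a, cones_b, child])
            y = np.array([0] * 500 + [1] * 500 + [2] * 20)

            unpruned = DecisionTreeClassifier(random_state=0).fit(X, y)
            pruned = DecisionTreeClassifier(random_state=0, ccp_alpha=0.05).fit(X, y)

            probe = child.mean(axis=0).reshape(1, -1)  # a typical "child" sample
            print(unpruned.predict(probe))  # [2] - the rare class is recognised
            print(pruned.predict(probe))    # [0] or [1] - pruned into a "cone"

            Fiddle with the numbers and the rare class survives; the point is only that compression trades away exactly the kind of infrequent, boundary-straddling case a mannequin at the end of a line of cones represents.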

            That said, the Tesla was in another similar test with three other manufacturers and crushed a lot of styrofoam pedestrians, as mentioned above. So regardless of the exact mechanisms this is still a fail, but it will be important to let independent testers get in there and find out why, and how general this problem is.

            1. LybsterRoy Silver badge

              Re: Looks like exactly the sort of failure you'd expect from beta firmware

              -- One, this was a deliberately structured test that was designed for the car to fail, on camera. --

              and the problem with that is? I can see something of the sort happening in real life.

              1. awavey

                Re: Looks like exactly the sort of failure you'd expect from beta firmware

                Something of the sort happens multiple times per day, every day, just on UK roads, and no one cares, because we tolerate humans making these life/death mistakes.

                But put a computer in control and suddenly everyone demands perfection from it.

                If their autopilot system kills or injures even 1% fewer people on the roads than now, it's actually an improvement.

                1. hoola Silver badge

                  Re: Looks like exactly the sort of failure you'd expect from beta firmware

                  Yes, but if that happens, those responsible are usually prosecuted for their error. At the moment there is no chain of liability or precedent for self-driving.

                  Just accepting that it has killed or injured someone is not acceptable.

            2. Anonymous Coward
              Anonymous Coward

              Re: Looks like exactly the sort of failure you'd expect from beta firmware

              Even if they were dressing the dummy up to look similar to a cone, you shouldn't expect it to flatten the dummy/cone

              1. MachDiamond Silver badge

                Re: Looks like exactly the sort of failure you'd expect from beta firmware

                "Even if they were dressing the dummy up to look similar to a cone, you shouldn't expect it to flatten the dummy/cone"

                It shouldn't even hit the cone. What if there were a bridge out or flooding and a minimum set of cones was set out to warn drivers before proper barriers could be erected?

            3. FatGerman

              Re: Looks like exactly the sort of failure you'd expect from beta firmware

              >> Essentially the test seems built to place the target right at the edge/branch of a bunch of critical decisions.

              That's *exactly* what you should be doing if you're doing your QA properly. This looks like a case of a third party doing what Tesla is (negligently?) not doing. Fair enough if Tesla are relying on independent testers to do this for them; many people in many industries do. But if this is being found in production code, then that's appalling.

              What's now important is how Tesla respond to these findings. Will they be responsible and address the issue, or will they give Elon's standard response Number 3 and call the testers a bunch of paedophiles?

            4. hoola Silver badge

              Re: Looks like exactly the sort of failure you'd expect from beta firmware

              Self-driving means that the vehicle is capable of driving under normal road conditions. That is highly variable, with fixed and moving objects. Then you have all the signs, markings, and the GPS that is actually telling it where to go.

              At the moment self-driving is barely functional under ideal test conditions, let alone in the real world.

              Arguments that road markings and signage need to be improved to (try to) make self-driving vehicles work rather defeat the object of self-driving.

              Perhaps Musk would like to pay for all the upgrades, say in the US or UK, so that one of his vehicles is less likely to go round destroying things or people. Crucially, Tesla (and preferably the developers/management as well) have to be liable when it goes wrong. It is this last step that seriously pisses me off. Nothing has been done to resolve liability when something goes wrong.

          2. LybsterRoy Silver badge

            Re: Comparison

            -- edge cases like going around a bend and being blinded by sunlight affect us, and I don't expect an AI to fare much better. --

            Personal favourite - a guy at the company I worked for had just bought a Porsche. A bunch of us were going out for lunch and I was "volunteered" to go in the Porsche. Everything was fine until we whizzed round a corner straight into the back of a car waiting at roadworks - poor Porsche <G>. There had been no sign that roadworks were being carried out.

            1. Dave314159ggggdffsdds Silver badge

              Re: Comparison

              Whether there's a sign or not, that's 100% on the Porsche driver. You don't go round a blind corner faster than a speed from which you can stop in time if necessary. That may be very slow.

          3. MachDiamond Silver badge

            Re: Comparison

            "all tests should be done off of public roads until it can be shown to work reliably enough to match the wits of an average human."

            These systems have the potential to be so deadly that they should be required to pass independent testing by something like UL/CSA/etc in line with standards approved by SAE or similar body before they can be operated on a public street or motorway.

            The actress Anne Heche recently managed to get a Mini up to around 90mph and crashed into a building, burying the car 30ft inside. If she had been in a Tesla, which is much heavier, can accelerate faster, and hits a higher top speed, it would have gone in even further, or all the way through the structure. It illustrates just how dangerous a car can be even on an urban street.

        4. Anonymous Coward
          Megaphone

          Re: Comparison

          If Microsoft released beta software with a flaw that was responsible for people dying, putting a clause into the terms and conditions saying they aren't responsible would not fly with the public (or El Reg commenters including, I suspect, you).

        5. doublelayer Silver badge

          Re: Comparison

          "When you say something is in BETA, there are certain expectations set by precedent."

          Yes. They are as follows:

          1. The developers have designed and implemented everything.

          2. They did the customary unit testing, and it passed.

          3. They tested on their defined inputs, and it worked.

          4. They tested in practice on their own (alpha testing), and it at least mostly worked.

          5. Now you can try it, but you're likely to find bugs and there are known ones you'll experience that won't be fully documented.

          6. You shouldn't rely on it in production, where important things like your un-backed-up data or humans may be damaged if it goes wrong.

          It's called "full self driving". This is an oddly specific name if it means "limited automatic actions". I'm not asking it to be perfect, but it should be tested a lot before people start activating and possibly relying on it; the license may tell them not to, but that's not sufficient if critical safety systems are still missing. In addition, when it's completed, it should offer full self driving. If it doesn't, they'll get a lot of people thinking it does and using it as such.

          1. M.V. Lipvig Silver badge

            Re: Comparison

            What's bad is people are already using it as a fully self driving car. Idiots have actually turned it on then settled in for a nap.

            https://www.nbcnews.com/news/us-news/tesla-driver-slept-car-was-going-over-80-mph-autopilot-n1267805

            1. Ken Moorhouse Silver badge

              Re: Idiots have actually turned it on then settled in for a nap.

              They will have to review the rules for obtaining a Darwin Award, clearly it will be far too common an occurrence in the future.

        6. LybsterRoy Silver badge

          Re: Comparison

          Neil, from your comments I wonder if you've ever bought a car. When I've done so, I might read the manual AFTER I've bought the car, and then only if I'm feeling bored or want to find out what a specific feature does - but probably not. There should be no need to do so.

          Another fun fact - the vast majority of the human race (those not in IT) would be unlikely to have any idea that BETA meant not ready for use.

    4. JohnG

      Re: Comparison

      Funnily enough, O'Dowd doesn't seem keen to make a side-by-side comparison with his own company's (Green Hills Software) offerings.

  2. Randy Hudson

    As soon as I saw this

    I immediately told my wife we'll have to stop dressing our kids like Paddington Bear

    1. Yet Another Anonymous coward Silver badge

      Re: As soon as I saw this

      Dress them like little STOP signs?

      1. Piro Silver badge

        Re: As soon as I saw this

        Then it'll only roll over them at 5.6 mph.

    2. Version 1.0 Silver badge

      Re: As soon as I saw this

      Give the kids plenty of Vitamins and Minerals every morning and they will be able to jump out of the way of the vehicle.

      1. Yet Another Anonymous coward Silver badge

        Re: As soon as I saw this

        Tesla is an American company; the American solution would be to arm the little darlings with anti-tank weapons.

        1. Jedit Silver badge
          Joke

          "the American solution would be to arm the little darlings with the anti-tank weapons"

          To be fair most problems with a Tesla can be fixed with a rocket launcher. For best effect you need to use it before the ignition is turned on, though.

      2. Kane
        Thumb Up

        Re: As soon as I saw this

        "Give the kids plenty of Vitamins and Minerals every morning and they will be able to jump out of the way of the vehicle."

        Lots of vitamins and minerals in marmalade on toast!

        1. Anonymous Coward
          Anonymous Coward

          Re: As soon as I saw this

          Maybe if you gave the kids enough Ready Brek they'd glow in the dark sufficiently that even the car's sensors couldn't fail to notice them.

          1. ThatOne Silver badge

            Re: As soon as I saw this

            A hat with an emergency rotating light might help*. Just tuck the battery in a backpack.

            * Except in bright sunlight. It's best to stay hidden in those conditions; that's when Teslas are out to hunt.

            1. MachDiamond Silver badge

              Re: As soon as I saw this

              "A hat with an emergency rotating light might help*. Just tuck the battery in a backpack."

              Would they use it like a thumper? Plant it off to the side so the worm, err, Tesla, ignores them?

            2. M.V. Lipvig Silver badge

              Re: As soon as I saw this

              And a Tesla will see a child with a rotating light on his noggin as different from an emergency vehicle how, again? They already aim for emergency vehicles, this would just turn the little tyke into a Tesla target.

  3. TRT Silver badge

    BUT...

    If you replace the child dummy with a traffic cone, does it stop?

    I see a whole new raft of YouTube videos along the lines of "Will it blend?"

    1. Anonymous Coward
      Anonymous Coward

      Re: BUT...

      The good question is:

      if you replace the dummy with Elon Musk, will it stop?

      You have 2 hours to write an essay explaining why the car decided to accelerate.

  4. Valeyard

    assumption

    "Tesla Autopilot fails to notice child-sized objects in testing"

    All we know is that it hit them; we don't know it was an accident... Musk does have quite a few sprogs who'll want money, and developing "child-seeking road missiles"™ is the cheaper solution.

    1. IGotOut Silver badge

      Re: assumption

      Well, as one of his kids has already completely disowned him for being a complete asshole, I'm sure he can count one off the list.

      1. MachDiamond Silver badge

        Re: assumption

        "Well as one of his kids has already completly disowned him, for being a complete asshole, I'm sure he can count one off the list."

        Hated him so much they decided to have themselves altered as it's the sperm that casts the vote on the sex of the child.

    2. Snowy Silver badge
      Facepalm

      Re: assumption

      That could be true, if it were not for the fact that Elon thinks that not enough children are being born and the world is heading for a population crash.

      Or that could be the excuse he's using when pumping out some more children; unlike some other "tech gurus", he does at least acknowledge his spawn.

    3. ChoHag Silver badge
      Joke

      Re: assumption

      Worse, it seems that the car *did* possibly notice the child; as explained in the report, the car would "start to stagger as if lost and confused, slow down a little, and then speed back up as it hit and" obliterated the enemy.

      Icon for the comedy impaired. Except it's not, really.

      1. veti Silver badge

        Re: assumption

        Except it wasn't a child. Possibly the car was smart enough to know that.

        Short of live testing, ideally using Elon's own kids, I don't know how to resolve that.

        1. heyrick Silver badge

          Re: assumption

          Correct: Stop the vehicle, then notify the human inside that the car needs help evaluating an obstacle.

          Wrong: Uh, um, maybe, oh fuck it, let's continue <crunch>

        2. ChoHag Silver badge
          Thumb Down

          Re: assumption

          I don't want my car hitting anything, thanks. It doesn't when *I'm* driving it, no matter how small or how many points small children are worth.

          Edit: Or, in recent news, if the brakes are failing and I need to limp to the garage on the hand-brake and clutch, because as a human being I can cope when the faeces flies without activating the murderbot. No small children or mannequins were in danger.

          1. deep_enigma

            Re: if the brakes are failing

            I literally had to do this once. Driving down the highway, about to turn left, push on the brake pedal... Nothing. Push a lot harder. Nothing. Since I'm not a complete maniac, I had space to use the hand brake before running out of turn lane, and instead of continuing on to work I diverted slightly in the other direction to the dealership to get the brakes fixed.

        3. doublelayer Silver badge

          Re: assumption

          "Except it wasn't a child. Possibly the car was smart enough to know that."

          Correct me if I'm wrong, but isn't it still kind of dangerous to drive around consistently hitting child-sized objects that aren't children? Even if it's a lightweight large object, you can knock it into a more dangerous position. If it's denser, that will likely damage your car and the object. When they finally get working self-driving cars, I'd prefer one that avoids collisions with all objects, animate or not.

          1. MachDiamond Silver badge

            Re: assumption

            "If it's denser, that will likely damage your car and the object."

            Given the cost to repair a Tesla, the insurance company will just write the car off. You can then get a newer one. The downside is you'll have to find a new insurance company and the premiums will be higher.

        4. Claverhouse Silver badge

          Re: assumption

          Or ideally using Elon, every time.

        5. Anonymous Coward
          Anonymous Coward

          Re: assumption

          Can we try and see if it manages with a full size adult first - maybe Elon himself?

        6. GruntyMcPugh Silver badge

          Re: assumption

          'Smart' doesn't come into it. Something 'smart' might wonder if the child-sized object is leaning up against, and obscuring, something behind it, like a concrete bollard.

        7. hoola Silver badge

          Re: assumption

          Are you volunteering then?

          I certainly won't be. We are light years off this tech being remotely usable.

  5. Tom66

    The problem is Tesla chose the word "autopilot" (and "full self driving") rather than "supervised self driving" or something like that, but then Elon is a four-letter-word when it comes to marketing.

    You have to always be paying attention -- because it's still level 2 supervised driving. So what's the point? Well, it is a pretty good way to get training data in the real world without spending much $ so obviously attractive to Tesla there. And it gives the user a 'taste' of what an FSD car might be capable of. Still not worth the $10k or whatever they're currently asking for it, but fools and money I guess.

    I drive a car with adaptive cruise control, from VW; in the manual there are about 20 warnings along the lines of what the system won't do. It won't stop for totally stationary vehicles. It doesn't see pedestrians. It doesn't adapt to the weather conditions. It might brake sharply if you get cut off. It might accelerate unexpectedly if you leave the road (e.g. motorway exit). It might accelerate unexpectedly if it loses the radar signature from the car in front and you're on a sharp bend. Despite these limitations, I would not buy a car without this function. Provided you're aware of the system's limitations, and are *always* supervising it, it's a good way to reduce driver fatigue (your brain is no longer running the complex "regulate-speed-and-distance" algorithm; instead you're just doing "stay in lane, watch out for weird ACC stuff"), which brings more safety benefits than disbenefits.
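
    For the curious, that "regulate-speed-and-distance" algorithm is conceptually tiny. A toy sketch of one control tick might look like the following (all gains and limits are invented for illustration; real ACC stacks add filtering, actuator limits and failure handling):

        def acc_step(ego_speed, lead_speed, gap, set_speed,
                     time_gap=1.8, k_gap=0.3, k_speed=0.5):
            """One tick of a toy adaptive cruise controller.

            Speeds in m/s, gap in metres; returns an acceleration
            demand in m/s^2. All gains and limits are made up.
            """
            desired_gap = time_gap * ego_speed        # keep ~1.8 s behind the lead
            gap_error = gap - desired_gap             # positive: too far back
            speed_error = min(lead_speed, set_speed) - ego_speed
            accel = k_gap * gap_error + k_speed * speed_error
            return max(-3.5, min(accel, 2.0))         # clamp to comfort limits

    Note there is nothing in that loop about pedestrians or stationary objects, which is exactly why the manual needs its twenty warnings.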

    I think the Tesla AP/FSD beta should be treated this way; it's a fancy lane-assist and speed-assist system, which means you can turn your brain from "driving mode" into "monitor mode", but you cannot be using your phone or idly ignoring the road conditions. Dan O'Dowd might as well say, "look at this modern car with ACC, it ignores children in the road" ... failing to note that it's a supervised system that isn't designed to respond to that specific circumstance.

    It might be interesting to note that O'Dowd also has competing interests that he doesn't disclose in the video, or in general. He owns a company selling safety-critical software and has been critical of Tesla's use of Linux (instead promoting his own operating system, 'Integrity'), despite no obvious issues arising from Tesla's choice of OS.

    N.B. It seems many in the press confuse 'AP', 'Nav on AP' and 'FSD Beta'. 'AP' is the original system based on the Mobileye chipset; it's pretty much the same as 'lane hold' in a modern car. 'Nav on AP' is based on the Tesla AP computer and is rated for highway use only. 'FSD Beta' (also known as 'City Streets') is a closed beta for individuals with a certain attentiveness score, based on a few disclosed factors like time to take over after AP requests. Most 'FSD' issues are actually 'Nav on AP' or 'AP' issues. I'm reasonably sure the video is showing regular Nav on AP; the system is simply not *trained* to process the idea of a small child appearing on a highway.

    1. Charlie Clark Silver badge

      Musk doesn't really do marketing. He does PR for the capital markets knowing that the media will generally interpret favourable capital market reaction as endorsement of the product, even though that isn't really what the markets are doing, not least because they don't understand it.

      The point for him is to keep the cost of capital down and thus maintain a competitive advantage.

      1. Anonymous Coward
        Anonymous Coward

        Musk doesn't really do marketing

        He relies on the Tesla 'cultists' to do it for him.

        I've encountered them when charging my EV. Their 'I'm so superior because of my computer on wheels' attitude IMHO sucks big time.

        I did rent a Model 3 when I was in the USA pre-COVID and after 500 miles, I'd had enough. It was just so basic even compared to my Leaf.

        Now I have an EV-6 that is superbly finished and drives better than a Model 3.

        The only Tesla that I'd willingly drive is the Model S (not plaid)

        1. Charlie Clark Silver badge

          Re: Musk doesn't really do marketing

          Quite a few of the cultists are also investors. They liked the "crowdfunding" aspect of the whole thing and also follow the financial markets, so there's a kind of feedback loop.

          I've never even been inside a Tesla, let alone driven one. But it is important to acknowledge that there has been a lot of innovation, both in the car and in its production, especially the Giga press. Less so in the marketing, which at times looked a lot like a snowball system. And, in an era where VCs were splashing cash on me too "network effect" bollocks, it was refreshing to see money being invested in new car architectures while the traditional manufacturers continued to build bigger and heavier vehicles and shun and actively lobby against lean burn motors, fuel cells and EVs.

          But none of that justifies the kind of valuation that allowed Musk to bail out his brother's failing company or accurately reflects the importance of trading emissions certificates to the bottom line.

          1. MachDiamond Silver badge

            Re: Musk doesn't really do marketing

            "both in the car and in its production, especially the Giga press."

            Somebody was very kind in flying their drone over a Tesla factory to show all of the Giga-pressed failures stacked up outside. I don't think it's ready for prime time. The concept is interesting, but there are so many points of failure in the process that I'd not want to implement it. It makes production planning and line cadence hard to do.

            The Tesla cars seem to be gadgetfests. Everything that moves has a motor or solenoid, meaning it's going to wear out and/or fail at some point. Many people have already hit their frustration ceiling with the exterior door handles not working. Iced-up windows mean they might not lower properly when the doors are opened - another horrible thing, just to have a frameless window. Many "features" are there because "everybody demands them", which is bollocks. Nobody I know makes a glass roof a make-or-break buying decision. I'd prefer not to have a glass or sun roof. It's hot where I live and I like the shade and lack of leaks in the wet.

            I have a background in aerospace and K.I.S.S. is a big thing. We were always looking to see if what we designed was as simple as it could be. Simple works and keeps on working.

    2. Peter2 Silver badge

      Provided you're aware of the system's limitations, and are *always* supervising it, it's a good way to reduce driver fatigue (your brain is no longer running the complex "regulate-speed-and-distance" algorithm, instead you're just "stay in lane, watch out for weird ACC stuff") which brings more safety benefits than the disbenefits of the system.

      I have a 20 year old car that has cruise control. On long motorway drives, it takes little intelligence to realise that the fuel efficiency on a diesel engine is really, really good at ~60 MPH. As in, something like 8 minutes per hour slower nets me a (fuel) cost saving of >20%.

      Hence you find an HGV doing 60MPH, sit a few hundred yards behind it, set the cruise control and then, as you note, your brain is no longer running the complex "regulate-speed-and-distance" algorithm; nobody pulls in close behind an HGV (or in front of you), so you can just cruise onwards, basically avoiding driving off the road.

      So what's the improvement in the super lane-keeping cruise control over this ye olde fashioned version? Personally I'm happy with limiting my problems to "avoiding other idiotic drivers trying to kill me"; adding my own car trying to kill me is not the sort of progress that I personally want.

      1. The Oncoming Scorn Silver badge
        Go

        Cruise Control - Long Journeys

        I have an older, not exactly green, truck which I actually enjoy driving as opposed to being in Robocar. I usually find an optimum speed/efficiency point (usually just above the limit), set CC, then just watch the road & steer.

        Saves fuel, avoids accelerator creep & the associated speeding tickets.

      2. katrinab Silver badge

        HGVs do 60mph because that's their speed limit. If they were able to drive an extra 10 miles per hour, that's maybe 50 miles in a shift, so an individual driver + lorry could do more jobs per day.

        Buses are allowed to do 70mph on the motorway, and they mostly do drive at that speed when they can.

        1. Anonymous Coward
          Anonymous Coward

          > Buses are allowed to do 70mph on the motorway, and they mostly do drive at that speed when they can.

          Bzzzzzt. In the UK and EU, passenger vehicles capable of carrying more than 8 passengers are limited to 100km/h which is a smidge over 62mph.

        2. Peter2 Silver badge

          HGVs are probably limited to 60MPH for two reasons. First is safety; they don't want huge HGVs playing racing drivers.

          Second is obviously going to be fuel efficiency; if I gain somewhere over 20% fuel efficiency (on flat[ish] roads) by going 10MPH slower then that's going to be as a result of reduced air resistance. My car is relatively pointy and therefore one assumes reasonably aerodynamic; an HGV is as aerodynamic as a breezeblock and so would probably suffer considerably worse fuel consumption at higher speeds.

          1. Dave314159ggggdffsdds Silver badge

            Fuel efficiency vs driver pay is an interesting calculation. It's an overall saving with a full truck and ample time in a driver's day - gallons per hour vs 8 mins driver pay - but massively the other way around if taking longer results in needing a second driver who has to be paid for an entire shift to finish the job.

        3. Charlie Clark Silver badge

          That's a bit optimistic. Lots of HGVs spend their time in HGV traffic jams because there are too many of them on the road. The more of them there are, the more damage they do to the roads, meaning the more road works there have to be.

          The limit is set to help reduce accidents and limit their effect. The faster a 30-tonne (and 40-tonners are worse) HGV is going, the more damage an accident is going to do. Then there are also the reaction time and braking distance to consider. Higher speeds should lead automatically to greater distance between vehicles, something I've yet to see.

      3. Piro Silver badge

        My car has one small improvement over basic cruise control: it's not adaptive, and it doesn't automatically brake, but if I've got cruise on and the computer reckons I might hit something, it disables cruise. It rarely ever happens, but why not; it's a worthwhile improvement without a disadvantage.

      4. Dave314159ggggdffsdds Silver badge

        "the fuel efficiency on a diesel engine is really, really good at ~60 MPH. As in, something like 8 minutes per hour slower nets me a (fuel) cost saving of >20%."

        There's a simple trick to save more while going faster: start off at 90, then slow down to 70. The proportional saving is larger, and you won't lose 8 minutes per hour.

        FWIW, I value my time at more than the cost of the fuel to go faster.

        1. MachDiamond Silver badge

          "FWIW, I value my time at more than the cost of the fuel to go faster."

          If you can get >20% better mileage by slowing down a bit, that means fewer fuel stops on a long trip saving time. I'd also rather spend my money on something other than fuel so I leave plenty of time to get where I want to go. Rarely does driving faster net me any benefit.

    3. Richard 12 Silver badge

      Tesla market it as "Autopilot" & "FSD"

      That's the problem. It makes a claim that it does not live up to.

      If they marketed it as "advanced driver assistance", there wouldn't be a problem.

      They don't. One has to wonder why that is.

    4. Anonymous Coward
      Anonymous Coward

      You have to always be paying attention -- because it's still level 2 supervised driving. So what's the point? Well, it is a pretty good way to get training data in the real world without spending much $ so obviously attractive to Tesla there. And it gives the user a 'taste' of what an FSD car might be capable of. Still not worth the $10k or whatever they're currently asking for it, but fools and money I guess.

      So what you're saying is that it's perfectly acceptable to run trials on public roads with other road users, none of whom have been asked if they want to have their lives risked by something that's at best a beta - but, given the errors made, one that has more the smell of a hasty alpha test to keep the investors happy - with the Musk zealots zapping video when it doesn't support the Musk narrative du jour, or where the lack of adulation carries the risk of losing employment.

      I'd also note that Tesla choosing to call it "Autopilot" was by no means an accident - that choice was (and is) IMHO wilfully misleading marketing.

      1. Anonymous Coward
        Anonymous Coward

        So what you're saying is that it's perfectly acceptable to run trials on public roads with other road users, none of whom have been asked if they want to have their lives risked by something that's at best a beta,

        From my reading of his post this was not what he was saying. However, it is perfectly acceptable to run planned and controlled trials on public roads with other road users, none [or pretty much a close approximation to that] of whom have been asked if they want to be involved. In fact this is necessarily a part of any sane development plan. What would the alternative be - "we tested the system on closed tracks, it passed - so we started selling it for public use."? At some point the systems need to be exposed to the general public; public road trials seem to be the best way to do this.

        Part of the planning and control should be to mitigate the risks to those other road users.

        Entrusting the oversight and monitoring of the automated systems to an unregulated, untrained [in the specifics of the automation and its potential failure modes] public, particularly after numerous examples of them not behaving responsibly, seems imprudent at best. Previous articles on Tesla crashes suggest to me that both Tesla and their customer/drivers share a great deal of responsibility for the losses of life involved.

    5. MachDiamond Silver badge

      "the system is simply not *trained* to process the idea of a small child appearing on a highway."

      ... which is where the human brain is going to work exceptionally well. We can adapt much more readily to off-nominal situations. We don't need to know what or how right away, only that there is something big lying across the road that we later, after we have stopped, figure out is a crane that just collapsed. It's highly unusual but not unprecedented, just as a child on a motorway would be highly unusual, except in California, where they need to post signs warning drivers about people running across the freeway near the Southern border.

    6. M.V. Lipvig Silver badge

      "I drive a car with adaptive cruise control, from VW; in the manual there's about 20 warnings along the lines of what the system won't do. It won't stop for totally stationary vehicles. It doesn't see pedestrians. It doesn't adapt to the weather conditions. It might brake sharply if you get cut off. It might accelerate unexpectedly if you leave the road (e.g. motorway exit). It might accelerate unexpectedly if it loses the radar signature from the car in front, and you're on a sharp bend. Despite these limitations, I would not buy a car without this function."

      That makes one of you. I won't buy a car WITH this stuff ever again. I have it all turned off on my current car. Hit construction or bad roads on a windy day and the car is CONSTANTLY warning you that you aren't driving correctly. Narrowed lane for construction? Thread the needle perfectly with no more than half an inch variance or the car starts hooting at you. Passing/being passed and the other guy can't stay in his lane? Car screams at you for staying between the large vehicle and the concrete barrier instead of the paint lines. The last thing I need when navigating a rough, tight road is the car throwing a fit or trying to take control at the worst time.

      I'm just glad that I'm as old as I am, and that I can build myself a car that will last me till I die. And I feel sorry for my kids and grandkids, who will be stuck with self-wrecking cars.

  6. IGotOut Silver badge

    So...

    If you are somewhere in between a small child and a huge truck, you may be OK... Unless you're pushing a bike, that is.

    1. Charlie Clark Silver badge
      Joke

      Re: So...

      Well, neither children nor bike riders are likely to be customers. So, who needs them? More road for the rest of us!

      1. Anonymous Coward
        Anonymous Coward

        Re: So...

        Down here we call the spandex crowd organ donors, mainly because we've already run out of motorcyclists.

    2. Anonymous Coward
      Anonymous Coward

      Re: So...

      Or an emergency services vehicle; they've got a good track record of hitting those too, especially police cars.

  7. Persona

    Blind

    As far as I know the Tesla doesn't have Lidar but it does have some radar as do many new cars with automatic emergency breaking. I wonder if the dummy was transparent to radar. Even so the optical system should have detected this otherwise it's going to be a bumpy ride.

    1. TRT Silver badge

      Re: Blind

      Would be interesting if the car stopped for an adult dummy. Or a life-size cardboard cutout of Elon.

      So the only way to survive crossing the roads when they're full of Teslas... hm... I sense a plan...

      1. Roland6 Silver badge

        Re: Blind

        >Would be interesting if the car stopped for an adult dummy.

        Missed a step: someone needs to conduct a moose test on a Tesla. If it can't pass that, its systems are at least 20 years behind Volvo's.

    2. Anonymous Coward
      Anonymous Coward

      Sensors

      Yeah, that's not what the radar is for, and neither the cones nor the dummy probably had a radar return (unless there was a plate or rod we didn't see). Under those conditions the radar is going to see a flat road. The front stereo cameras then have to "see" the obstacles, which they manage well enough, and the car slows to a safe speed for a single-lane stricture marked by cones. IMHO it should have kicked out at that point, but lane closures are common enough in the Bay Area that I suspect the cars are trained to keep cruise control on and creep with traffic.

      When it gets close to the end of the line of cones it starts to speed back up to its target speed. Seems like it was doing that when there was no dummy, so no surprise there. Not sure if it has been trained to treat rubber cones as smashables or as hard obstacles; that's the sort of tricky decision that anything on a roadway must make at least semi-smart choices about (the reason being that you don't want cars causing a 10-car pileup because someone else clipped a cone and it bounced into traffic).

      This is why I think the higher level autonomy needs to be limited to designated roadways, possibly with assists to keep the cars herding along safely.

    3. Piro Silver badge

      Re: Blind

      Radar isn't used in FSD and is being omitted entirely from newer cars.

    4. Anonymous Coward
      Joke

      Re: Blind

      > but it does have some radar as do many new cars with automatic emergency breaking.

      I think you meant "braking" but your comment was accurate as far as this Tesla goes. ;-)

    5. MachDiamond Silver badge

      Re: Blind

      "As far as I know the Tesla doesn't have Lidar but it does have some radar as do many new cars with automatic emergency breaking."

      My car just has standard breaking. I've never come across a time when I needed something broken for an emergency.

      Elon said he was deleting the RADAR in Teslas and going with a 100% camera vision system. I'm not sure where that change took place, but I expect that's all that's being used for FSD.

  8. fidodogbreath

    You'd think Musk would be aware of the existence of children, since he cranks out so many of them with his various girlfriends and mistresses.

    1. MachDiamond Silver badge

      "You'd think Musk would be aware of the existence of children, since he cranks out so many of them with his various girlfriends and mistresses."

      Most recently he only seems to be there for the easy and pleasurable part, and not so much afterwards. He also boasts of only having a small bungalow in Boca Chica, TX, which is not a good place for kids - especially not for the number he has fathered so far (that are known).

      There are more than plenty of humans on the planet. The only excuse for more is to fulfill the infinite growth business models. From a fresh water and plentiful food supply aspect, we've overshot the carrying capacity. As petroleum fuels become scarcer, high intensity agriculture will become too expensive for the vast majority of the world's population. "Modern agriculture is the use of soil to turn petroleum into food" ~ Dr. Al Bartlett.

  9. Paul Kinsler

    You're supposed to keep your hands on the wheel and be able to take over at any time.

    I recently saw a conference talk by someone from a group which had run the sudden human-driver takeover scenario as an actual experiment; i.e. going from "not driving while the car does its thing" to "you must drive now". I can't find a publication to check the details, but two of the data points that stuck with me went something like this:

    At 20mph (or maybe kph), it takes 2-3 seconds to get hands/feet properly on the controls ... and then about another 10s to regain situational awareness and be able to drive appropriately.

    At 50mph, to regain situational awareness and return to driving took over a minute (essentially, poor performance under sudden stress in a potentially unsafe situation, i.e. "panic", more or less).

    The lesson seemed to be that unless you are actually driving, "taking over the driving" in a self-driving car is emphatically not something that can be done quickly, and that any assumption that it can is entirely without foundation.

    A warning to a not-driver of these cars that they must be ready to take over at very short notice is, IMO, actively malicious, if - as it would seem - essentially impossible for most people to do so.

    1. Steve Button Silver badge

      Re: You're supposed to keep your hands on the wheel and be able to take over at any time.

      ... aye, there's the rub. There is a massive chasm between "it drives itself, but you may need to take over at any time" and "it drives itself, and you'll pretty much never have to take over. On the one occasion every 10 million miles where someone has to take over the car stops safely first".

      Crossing that chasm will squish a lot of meatbags.

      1. The Oncoming Scorn Silver badge
        Joke

        Re: You're supposed to keep your hands on the wheel and be able to take over at any time.

        If it drives into a chasm, it will (hopefully) only squish the occupants.

        I want to die peacefully in my sleep, like my father. Not screaming and terrified like his passengers! - Bob Monkhouse

      2. ThatOne Silver badge

        Re: You're supposed to keep your hands on the wheel and be able to take over at any time.

        > where someone has to take over the car stops safely first

        That would be a hoot on the highway...

        Sorry, there isn't really a safe way to transfer responsibility back to the human, especially since the reason to do this will always be sudden, and most likely due to a rather complicated situation.

        There is no reason the AI would need to give you control on a straight empty country road. It most likely will happen while speeding down some complex road and traffic configuration, in which the suddenly-to-be driver will have a long "Sorry what?" moment before he even knows what he's supposed to do. By then he will have run over/crashed into whatever the software wasn't able to process.

        Hitting the brakes and suddenly stopping the car in the midst of that complex traffic situation isn't an option either: It would result in chaos in the best, and in a huge pileup in the worst case.

        1. MachDiamond Silver badge

          Re: You're supposed to keep your hands on the wheel and be able to take over at any time.

          "There is no reason the AI would need to give you control on a straight empty country road. "

          But they do, and your heart tries to burst through your chest as you scramble to figure out what the emergency is/was. Just like a jumpy passenger that freaks out at everything and makes you, the driver, nervous and shagged out by the end of the drive.

      3. SCP

        Re: You're supposed to keep your hands on the wheel and be able to take over at any time.

        There is also the problem that as/when automated driving takes over, manual driving experience will fall off. Consequently the "average driver" (whatever that might mean in the future) will be poorly practised in driving under nominal conditions let alone suited to take over in a complex emergency situation - even allowing for the problems of reaction and orientation [to the situation].

        I believe [it has been a while since I was following what was going on] that most of the serious research in "full self driving" is looking at systems that do not require a manual back-up/emergency driver.

        I find several of the drive-assist/alert functions very helpful - particularly when driving abroad when I find my workload raised in dealing with driving on the 'wrong' [right] side of the road and different styles of road signage. I find the speed limit function good for keeping me the right side of the posted speed limit - especially as I am unaccustomed to driving on uncongested roads.

    2. Gene Cash Silver badge

      Re: You're supposed to keep your hands on the wheel and be able to take over at any time.

      Yes, there is a similar situation in aircraft crashes. There have been a lot of times when the autopilot goes "IT'S YOURS" and the pilot takes too long to figure out the situation, or performs the wrong corrective action.

      You might want to check FAA or NTSB reports.

      1. Fifth Horseman

        Re: You're supposed to keep your hands on the wheel and be able to take over at any time.

        Absolutely. Air France flight 447 now sadly being the classic textbook example.

        1. Dave314159ggggdffsdds Silver badge

          Re: You're supposed to keep your hands on the wheel and be able to take over at any time.

          Not sure I'd call that classic and textbook - there seems to be a very strong implication that the pilots had been on a 3-day bender in Rio before the flight, and that it was a major contributing factor in the mistakes made. Air France made major changes in monitoring layovers at around the same time, but that was a complete coincidence :)

      2. Triggerfish

        Re: You're supposed to keep your hands on the wheel and be able to take over at any time.

        I was just about to say the same; pretty sure I read of one major crash where the aircraft went "it's yours now" all of a sudden, and in the ensuing confusion, as they tried to get their bearings on the situation, it was all too late. If that's happening with a properly trained crew, then us normal folks are probably not going to do better.

      3. MachDiamond Silver badge

        Re: You're supposed to keep your hands on the wheel and be able to take over at any time.

        "There's been a lot of times when the autopilot goes "IT'S YOURS" and the pilot takes too long to figure out the situation, or performs the wrong corrective action.

        "

        There are any number of pilots that vlog their flights and I've seen the autopilot kick out on them plenty of times with no crash. The difference is they are much more professional and always have things to do during a flight so they are more alert. The autopilot reduces their workload so they can get to everything they need to get to. If the flying is not nice and peaceful, a seasoned pilot is going to know at what point the autopilot is going on strike so they will be ready to take over. I've never seen a video where the AP just kicked out and caused an "oh, shit" moment. Not saying it couldn't happen, just not something I've seen.

    3. Doctor Syntax Silver badge

      Re: You're supposed to keep your hands on the wheel and be able to take over at any time.

      "actively malicious"

      Not really. Just blame shedding.

    4. Martin-73 Silver badge

      Re: You're supposed to keep your hands on the wheel and be able to take over at any time.

      Actively malicious describes Musk and Tesla to a tee, and to be frank, most tesla drivers also

    5. Anonymous Coward
      Anonymous Coward

      Re: You're supposed to keep your hands on the wheel and be able to take over at any time.

      If you read the Tesla forums, you will soon find that there are lots of Tesla cult members who want to be able to sleep on their commute. We've already had people sleeping at the wheel of their Tesla while it drives itself; there are dozens of dodges described on the Internet that allow this to happen.

      IMHO, it is these people who will effectively make self-driving a legal non-starter. The insurance companies will view it as too high a risk.

      1. david 12 Silver badge

        Re: You're supposed to keep your hands on the wheel and be able to take over at any time.

        40 years ago one of my friends was a car delivery driver. He'd deliver a car somewhere in England or Wales, then hitchhike home again. He did so much driving for so many hours that he actually had a dream about it.

        When he woke up, he found he was driving...

        1. Anonymous Coward
          Facepalm

          Re: You're supposed to keep your hands on the wheel and be able to take over at any time.

          A colleague once told me that he'd been driving long, early-start commutes on essentially empty motorways, and one day had a dream where he was driving along and there was a slight rumbling sound from somewhere within the car. He concentrated but couldn't quite work out where the rumble was coming from, but it got a bit louder and a bit louder, until eventually he woke up and found he'd swerved off into the central reservation and was scraping along the crash barrier.

    6. Anonymous Coward
      Anonymous Coward

      Re: You're supposed to keep your hands on the wheel and be able to take over at any time.

      A warning to a not-driver of these cars that they must be ready to take over at very short notice is, IMO, actively malicious, if - as it would seem - essentially impossible for most people to do so.

      You're missing the point.

      The "You must take over" clause is legal speak for "it's not our fault, we can blame you for killing the child".

      As yesterday's https://www.theregister.com/2022/08/08/tesla_california_autopilot/ story points out, they claim that there have been zero accidents with Autopilot in control because AP cuts out milliseconds before impact, and the get-out-of-jail clause says the human driver had to take over from there.

      1. Anonymous Coward
        Anonymous Coward

        legal speak for "it's not our fault, we can blame you for killing the child".

        Deliberately trying to transfer blame to someone for something outside their ability to control sounds malicious to me; irrespective of whether it is couched in "legal speak" or not.

        1. Anonymous Coward
          Anonymous Coward

          Re: legal speak for "it's not our fault, we can blame you for killing the child".

          My point precisely

          (meanwhile a real lawyer, locked in a dark cupboard out of harm's way, will be banging their head on a table, moaning that it's not going to work, you can't really just avoid responsibility that easily)

  10. ChoHag Silver badge

    Sample size is fine

    The sample size might be a problem if in a dozen or so tests there was one almost-touch (or none); that would demand a few hundred more to assess the replicability. According to their report, three samples were more than enough to prove the software 100% unsafe.

    1. druck Silver badge

      Re: Sample size is fine

      Repeating the experiment until it eventually manages not to stimulate killing a small child, probably isn't going to be much consolation.

      1. druck Silver badge

        Re: Sample size is fine

        'Simulate' even - that'll teach me to post from a phone.

  11. Snowy Silver badge
    Joke

    Child sized object.

    How big is a child-sized object? Richard Osman has a Mum, therefore he, at 2m tall, is a child-sized object, and if it misses him the system is junk.

  12. Ramis101
    Joke

    1 in 10 Left turns is bad enough but 100% of children????

    Following on from "the Tesla can turn left 90% of the time without crashing", I am relieved to hear it can at least do something 100% perfectly, even if that something is running over children. Give them credit, they nailed something!

    1. MachDiamond Silver badge

      Re: 1 in 10 Left turns is bad enough but 100% of children????

      "Give them credit, they nailed something!"

      Now, if there were only a setting they could invert......

  13. Version 1.0 Silver badge

    Cats and Dogs?

    So the Tesla code is written to avoid cats and dogs, hopefully, but they didn't check the code with kids in the videos? This is just a typical code-writing environment - the programmers were given a task which they achieved (the Tesla drives in a straight line) but were not coding for the universal environment. It sounds like the functionality was verified in testing with adults but not in the real world. I wonder how this will work in the countryside when you drive around a corner and meet a flock of sheep?

    Essentially Tesla helps you drive but doesn't drive 100% without a sober human holding the wheel.

    1. Doctor Syntax Silver badge

      Re: Cats and Dogs?

      This is what's apt to go wrong with specifications. Specify exactly what the system is supposed to do and it might do that and nothing else.

      Never mind the flock of sheep, can it cope with just one sheep that looks as if it's about to head into the road? And how does it recognise "looks as if it's about to head into the road"?

      1. DS999 Silver badge

        Re: Cats and Dogs?

        Around here it is "a deer that looks like it is about to head onto the road" but the issue is the same. In both cases you need to be ready to brake immediately, and slow down if you wouldn't be able to stop in time at your current speed.

        Driving at night, the only hint of a deer you might have is a single eye reflecting your headlights, or if you're lucky a faint outline against the background in the ditch alongside the road. Presumably an autonomous car with radar would have a much better picture, but its AI still has to recognize it as a deer - and err on the side of caution so if it works properly you should find your car slowing a bit when it sees an oddly shaped bush.

        Until autonomous cars really are ready putting that sort of AI into manually driven cars could still help. If the radar sees movement on the side of the road the car could adjust the matrix headlights (which the US is finally going to allow) and basically "circle" it for you. That would probably significantly reduce the nighttime car/deer crashes around here (and the car/moose crashes in Canada which can easily be fatal for the driver)
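
        The "slow down if you wouldn't be able to stop in time" test is simple enough to write down. A back-of-envelope version, with assumed reaction-time and braking figures (real values vary with surface, tyres and load):

            def can_stop_in_time(speed_mph, sight_distance_m,
                                 reaction_s=1.5, decel_ms2=7.0):
                """Rough check: does reaction distance plus braking distance
                fit within how far ahead you can actually see?
                All figures are assumed for illustration."""
                v = speed_mph * 0.447                    # mph -> m/s
                reaction_dist = v * reaction_s           # covered before braking starts
                braking_dist = v * v / (2 * decel_ms2)   # v^2 / (2a)
                return reaction_dist + braking_dist <= sight_distance_m

            # At 60 mph with ~65 m of headlight throw:
            print(can_stop_in_time(60, 65))   # False - outdriving your lights

        Any system, silicon or meat, that isn't doing this sum against its actual detection range is going too fast for what it can see.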

        1. heyrick Silver badge

          Re: Cats and Dogs?

          Around here, there's a little town that has made "people" from scrap metal and wheels and such. Painted them brightly, placed them by the sides of the roads as art.

          Given their locations, I'm surprised that no children have been injured yet (unlikely to be killed; it's a 30kph zone), but still, placing one right beside where kids wait for a school bus is dumb. It's literally training drivers to ignore potential targets, er, hazards, because "it's just that metal thing", until maybe it's too late to stop in time?

          I would hope a properly functioning Tesla would crap itself and drive at, like, walking pace through that town.

          1. Claverhouse Silver badge

            Re: Cats and Dogs?

            A village around here in Suffolk has a not-at-all-lifelike [except perhaps to speeders; I would prefer a perfect replica] dummy of a policeman by the side of the road.

            1. Peter2 Silver badge

              Re: Cats and Dogs?

              And a printed copy of one of those "your speed "X" MPH" signs, with a fixed value of something like 34MPH just up the road from it, as I recall.

              1. DS999 Silver badge
                Trollface

                Re: Cats and Dogs?

                A small town near where I live used to have a radar speed sign just before entering town, below a "speed limit 25" sign. They took it down because in the middle of the night someone kept sticking a piece of cardboard on the signpost below the speed indicator listing their "high score".

                That was many years ago, before photo radar was a thing.

          2. MachDiamond Silver badge

            Re: Cats and Dogs?

            "Around here, there's a little town that has made "people" from scrap metal and wheels and such."

            Going bang into a pile of scrap metal isn't the same as running over a child, but most drivers won't want to experience either one. I don't see how the art craze inures people to the point where they might not worry about hitting something.

      2. Roland6 Silver badge

        Re: Cats and Dogs?

        It also indicates what can go wrong with "Agile" and "let the developers write the requirements" - probably find that the car will perform an emergency stop for a high-value Pokemon...

  14. Jan K.

    Meh... children are cheap and easy (and fun!) to reproduce. ^ ^

    "Tesla says even FSD is not a fully autonomous system..."

    So what is that then to be called? Full fully autonomous system? Really Full FSD? Extreme FSD?

    Btw. can't wait to see the soon to be released 58 kg Tesla robot taking on jobs out here in "real life". Kindergarten teacher?

    1. Doctor Syntax Silver badge

      Will the Tesla cars run down Tesla robots? And which will come off best?

      1. F. Frederick Skitty Silver badge

        Tesla's answer would be to arm the robots, this being the USA.

        "You have twenty seconds to comply..."

      2. MachDiamond Silver badge

        "Will the Tesla cars run down Tesla robots? And which will come off best?"

        The robot. It's imaginary and unlikely to ever be a tangible product in our lifetimes.

    2. Anonymous Coward
      Angel

      > Meh... children are cheap and easy (and fun!) to reproduce. ^ ^

      Ah, the typical male view of childbirth.

      [Icon: because there's no longer any icon picturing a real-life mother.]

      1. Jan K.

        Ah! Being a typical male, I of course wasn't thinking _that_ far! :)

        Bows down in respect and admiration to the mums everywhere!

  15. BobC

    Road Awareness and Secondary Sensors

    The underlying issue is situational awareness, or lack thereof. Tesla FSD has problems with motorcycles and children, especially at the edges of the lane.

    When such things ARE detected, the correct responses are clear and proven. But none of the responses can be activated if no detection occurs!

    In part, I believe this is due to Tesla's removal of front-facing radar. Not that radar itself is a cure-all, but having ANY secondary detection method (something non-camera in this case) will help, even if it is less than perfect on its own.

    There are several other relatively inexpensive technologies that can help with the secondary detection role, including infrared ranging/imaging, LIDAR, ultrasonic sensors and more.

    To me, this is fundamentally a sensing problem, which falls way upstream of the software, the vehicle, and the driver. If you miss noticing something, you can't avoid it.
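
    The case for ANY second sensor is simple arithmetic: if the failure modes are reasonably independent, two imperfect detectors miss together far less often than either misses alone. A toy illustration, with invented detection rates:

        # Probability an obstacle is missed by every sensor, assuming
        # independent failure modes (numbers invented for illustration).
        p_miss_camera = 0.05   # camera alone misses 5% of child-sized obstacles
        p_miss_radar = 0.20    # a mediocre secondary sensor misses 20%

        p_miss_both = p_miss_camera * p_miss_radar
        print(f"camera only:    {1 - p_miss_camera:.1%} detected")  # 95.0%
        print(f"camera + radar: {1 - p_miss_both:.1%} detected")    # 99.0%

    The catch is the independence assumption: correlated failures (glare blinding the camera while spray scatters the radar) are where the real engineering effort lives.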

    1. Doctor Syntax Silver badge

      Re: Road Awareness and Secondary Sensors

      "The underlying issue is situational awareness"

      Or even just plain awareness.

    2. ThatOne Silver badge
      Devil

      Re: Road Awareness and Secondary Sensors

      > having ANY secondary detection method [...] will help

      What about a white cane?

  16. Ken Moorhouse Silver badge

    FSD

    Full Scale Deflection of anything in its path.

  17. Someone Else Silver badge

    You're supposed to...

    You're supposed to keep your hands on the wheel and be able to take over at any time.

    You're supposed to not drink and drive. You're supposed to maintain a safe interval between you and the car in front of you. You're supposed to use turn signals. You're supposed to come to a full stop at a stop sign. You're supposed to not text while driving. You're supposed to change one lane at a time on a multi-lane highway. You're supposed to keep an eye out for motorcycles (and you're supposed to not run into/over them). You're supposed to not leave the scene of an accident you're involved in. You're supposed to move over and yield the right of way to emergency vehicles. You're supposed to put your kid in a car seat appropriate for their age. You're supposed to....

    And yet, the doofusim, day after day after day, don't do any of these things -- often at the same time. Who, with an IQ above room temperature, could possibly believe that anybody who owns a shiny shiny Tesla is going to keep their hands on the wheel using their "Full Self-Driving" autopilot cruise control?

    And for that matter, that they're not going to do any of the other brain-dead things on the above list, either?

    1. Pascal Monett Silver badge
      Coat

      Re: You're supposed to...

      Oh of course not.

      And they're definitely not going to go take a nap on the rear seats while letting the car drive itself.

      Right ?

      I mean, how stupid would you have to be to imagine that that would be a good idea . . . oh wait.

    2. DS999 Silver badge

      Re: You're supposed to...

      If they really expected people to do that they'd have sensors in the steering wheel to detect your hands and have a camera in the car to make sure your eyes were looking at the road.

      Gaze detection should be a law for any sort of "L2+" system that can slow/stop and change lanes on its own - because it is too easy for people to turn that on during their commute and start fiddling with their phone or even read email.

      Meanwhile Tesla fans post videos of themselves sitting in the backseat while the car drives them, since their steering wheel sensors are easily defeated and they don't have hardware to make sure the driver is looking at the road.

      1. JohnG

        Re: You're supposed to...

        "If they really expected people to do that they'd have sensors in the steering wheel to detect your hands and have a camera in the car to make sure your eyes were looking at the road."

        Funny you should say that.... For some time now, Tesla cars have required drivers to exert pressure on the steering wheel at intervals based on the car's speed (essentially, drivers prove their humanity by resisting the sensible course chosen by the AI). Newer Tesla cars have cabin cameras and some hackers have posted videos which show that Tesla is experimenting with software to detect if the driver is paying attention to the road.

        Tesla's software refuses to engage autopilot below 30mph and disengages (with a warning) if the driver forces the car past 90mph.

      2. MachDiamond Silver badge

        Re: You're supposed to...

        "Gaze detection should be a law for any sort of "L2+" system that can slow/stop and change lanes on its own - because it is too easy for people to turn that on during their commute and start fiddling with their phone or even read email.

        And nobody is going to worry about having a camera pointed at them while they are driving and sending the video to ..... where does it go again? Who has access? Will the car company rat me out or will they require a court to issue a subpoena first? (it's the former, btw). No, yeah, no. I'll skip the creepy camera in the car thing.

  18. DenTheMan

    But ....

    Children do not pay road tax.

    1. Roland6 Silver badge

      Re: But ....

      Neither do owners of electric vehicles currently in the UK...

      1. Anonymous Coward
        Anonymous Coward

        Re: But ....

        As Road Tax was abolished in 1937 this should come as no surprise. Vehicle Tax, on the other hand...

    2. midgepad

      Re: But ....

      Nobody does.

  19. Anonymous Coward
    Anonymous Coward

    I don't make a habit of betting against technological innovation. People said chess would never be played well by a computer, right up until it was. I have no doubt that one day, self-driving cars will be better at driving than humans ever were.

    But I think Tesla has pretty definitively shown that one day is not today. Not yet.

    1. Doctor Syntax Silver badge

      "I have no doubt that one day, self driving cars will be better at driving than humans ever were."

      It might vary depending on where you live, but if you take the number of vehicles on UK roads, reasonable estimates for annual mileages, and the statistics for fatal accidents, the bar for self-driving cars is fairly high; higher still when you realise that it should match or beat the experienced driver, and that the accident statistics are skewed by inexperienced drivers.
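
      As a back-of-envelope illustration of that bar (figures are roughly the right order for the UK; check the DfT statistics for current numbers):

          # Order-of-magnitude UK figures, assumed for illustration:
          annual_vehicle_miles = 330e9   # ~330 billion vehicle miles per year
          annual_road_deaths = 1700      # ~1,700 fatalities per year

          miles_per_fatality = annual_vehicle_miles / annual_road_deaths
          print(f"~{miles_per_fatality / 1e6:.0f} million miles per fatality")
          # ~194 million miles: the bar a self-driving system has to clear,
          # before adjusting for experienced drivers doing better than average.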

      1. Dave314159ggggdffsdds Silver badge

        "the accident statistics are skewed by inexperienced drivers."

        OK, boomer. Still ain't true, but keep telling yourself that while being a terrible driver, we're used to you lot ruining everything.

        1. Peter2 Silver badge

          This suggests that he's right:-

          Research from road safety charity Brake found that despite 17 to 19-year-olds only making up 1.5% of driving licence holders, they are involved in 9% of all fatal and serious crashes in the UK. It is for this reason that new driver insurance premiums are extremely expensive.

          At ALA we conducted some of our own research into the issue of young and new driver accidents, considered reasons behind the phenomenon and looked at how all new drivers can protect themselves.

          New Driver Accident Rates

          From our research we found that just over one in five (21.6%) new drivers had been involved in an accident during their first year of driving. 26.12% of new drivers aged between 18 and 24 admitted to having an accident in their first year.

          (https://www.ala.co.uk/connect/tackling-young-driver-accidents/)

          Which to be fair is intuitively right; people with less experience make more mistakes.

          On the other hand, the mistakes that younger drivers make are rarely fatal, especially with improved vehicle designs; young drivers tend to end up in rear-end shunts, or in accidents pulling out of junctions: having checked it was safe left, then checking right, pausing too long, and then pulling out into the path of a vehicle that had come along since they decided that left was clear. The more fatal accidents are generally connected to driving under the influence of drink or drugs, or racing to impress their mates.

          Honestly? The worst human drivers are considerably better and safer than an unsupervised Fully Self Driving Car. Which we know; which is why they aren't allowed to drive without a human driver in the seat. My issue is that the car can put you in an inescapable position and then throw control at you. It's fundamentally unsafe.

          1. Dave314159ggggdffsdds Silver badge

            Yes, that's the classic boomer line - and it's akin to the 'covid vaccines don't work' nonsense. If you adjust for passengers carried - a huge risk factor - the stats are entirely the other way around. Which makes sense, because the younger drivers have passed a proper test, whereas the boomers were released onto the roads for (pretty much) just showing up at the test centre.

            1. Someone Else Silver badge

              If you adjust for passengers carried - a huge risk factor - the stats are entirely the other way around.

              Citation, please (or it never happened.)

              If you want to play the "OK, boomer" card, Dave, then I'll counter with: this is a typical Millennial/GenZ response -- to make a statement without supporting evidence, and assert it's true simply because you made it.

              1. Dave314159ggggdffsdds Silver badge

                https://abcnews.go.com/Health/Healthday/story?id=4507310&page=1

                A popular report. There is a whole body of scientific evidence here.

                ""Drivers with passengers were almost 60 percent more likely to have a motor vehicle crash resulting in hospital attendance, irrespective of their age group. The likelihood of a crash was more than doubled in the presence of two or more passengers," "

                That is a standard result.

                1. Someone Else Silver badge

                  Nice -- thanks for the link. However, that doesn't support your assertion that

                  If you adjust for passengers carried - a huge risk factor - the stats are entirely the other way around.

                  ...in other words, that 17-19 year olds do not have some 6 times higher rate of fatal or serious accidents.

    2. Anonymous Coward
      Anonymous Coward

      DaaS - Uber on steroids

      Rather than SDC, implement drone technology in vehicles so your car can be remotely driven as a service by someone qualified to do so. What could possibly go wrong?

      1. Roland6 Silver badge

        Re: DaaS - Uber on steroids

        See another EE 5G network ad here...

        Not sure if that counts as "going wrong"....

    3. MachDiamond Silver badge

      "I have no doubt that one day, self driving cars will be better at driving than humans ever were."

      Possibly, but at least in the beginning, the roads will need to evolve to accommodate autonomous cars. Where I live there is pavement, dirt, and varying combinations of both, with not much in the way of painted striping that isn't a long-lost memory only a trained archeologist could find. Coupled with winds that can blow drifts of dirt over even the paved sections, an autonomous car doesn't have a chance. Sand, snow and heavy rain can lead to needing the more flexible human brain to determine where the road is. The cues that people can use aren't going to be valid for a silicon brain, so methods will have to be implemented so autonomous cars remain on the road.

  20. mevets

    Mr. Freud, your patient is waiting.

    Maybe if Elon didn't have those Daddy issues he would insist that his vehicles not target children.

  21. Winkypop Silver badge
    Devil

    Why didn’t the little kid make it across the road?

    FSD in beta

  22. Potemkine! Silver badge
    Trollface

    It's not a bug, it's a feature. As fewer children means less pollution, it's an innovative way to save the planet.

  23. Mike 137 Silver badge

    So that's alright then...

    "Tesla says even FSD [which supposedly stands for 'Full Self-Driving'] is not a fully autonomous system"

    So might it really stand for 'falsehood seduces drivers'?

  24. JohnG

    Is there a video showing these tests which includes a view of the interior of the Tesla (i.e. the driver, the steering wheel and the Tesla screen)? I haven't found one and I would have thought this would be essential for O'Dowd's claims to have any credibility (especially given that he isn't just someone interested in automotive safety - he owns a company which touts an alternative to Tesla's Autopilot & FSD).

  25. John Robson Silver badge

    Presumably they’re investigating abs, airbags and seatbelts…

    “ In early June, the US National Highway Traffic Safety Administration upgraded a probe of Tesla Autopilot after it found reasons to look into whether "Autopilot and associated Tesla systems may exacerbate human factors or behavioral safety risks." The investigation is ongoing.”

    On the basis that it is well known that motorists take more risks with any safety feature on a vehicle.

    It’s pretty easy to demonstrate that seatbelts, for example, had negligible impact on traffic fatalities.

    Their undeniable safety benefit was simply eaten up by drivers taking more risks, at the expense of those outside the vehicle.

    There is nothing new under the sun.

  26. Tim99 Silver badge

    Not perfect

    My base 2019 model VW Golf 7.5 with "Driver Assist" applied the brakes (suddenly) when a child (8 years old?) ran around the back of an SUV in a shopping centre car park. We were travelling at <20 km/hr and would not have hit the child, but the car seemed to err on the side of caution; no damage other than putting my heart rate up. Is it perfect? Of course not. When the radar cruise control is on, it doesn't understand that undertaking is legal here, so I have to touch the throttle to stop the car decelerating (fairly quickly); it also does this if a vehicle turns into a right-turn lane. The lane centring gives a slight writhing effect on curved roads so I've turned it off, but just using the lane assist does give useful feedback if you get close to the line without indicating.

    One thing that I have found useful is driving in slow stop-start traffic on a well-marked freeway; it will control the stopping and starting of the car, but it leaves a bigger gap to the car in front than a human driver would, and takes about 4 seconds longer to move off from stationary than I would (I learnt to drive in London). If we are stopped for more than about a minute, the dash indicates that you will need to depress the throttle to move off. If the driver’s hands are taken off the wheel for more than a few seconds, a warning sounds, and if no action is taken the car is brought to a halt. Self-parking is sort of OK, but I hardly use it except occasionally to reverse-park into a tight spot. The car has an automatic transmission (they nearly all do here); YMMV with a manual (pun, sorry).
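
    That stop-start behaviour reads like a simple time-gap controller with a hold timeout. A minimal sketch of the idea is below; the gap and timeout values are illustrative guesses, not VW's actual parameters:

        # Toy stop-and-go follow logic: hold a time gap to the car ahead,
        # and require driver confirmation after a long stop.
        # All thresholds here are illustrative guesses, not VW's values.

        DESIRED_TIME_GAP_S = 2.5   # deliberately bigger than a typical human gap
        HOLD_TIMEOUT_S = 60.0      # after this, the driver must tap the throttle

        def target_speed(gap_m: float, lead_speed_mps: float,
                         stopped_for_s: float, throttle_tapped: bool) -> float:
            """Return a target speed (m/s) for the following car."""
            if stopped_for_s > HOLD_TIMEOUT_S and not throttle_tapped:
                return 0.0  # hold until the driver confirms moving off
            # Speed at which the current gap equals the desired time gap:
            gap_limited = gap_m / DESIRED_TIME_GAP_S
            # Never plan to close on the lead car faster than a small margin:
            return min(gap_limited, lead_speed_mps + 2.0)

        # Creeping in a queue: 8 m gap, lead car doing 2 m/s.
        print(target_speed(8.0, 2.0, stopped_for_s=0.0, throttle_tapped=False))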

    If all vehicles had transponders to broadcast where they were, and what they were doing, it seems possible that an autonomous vehicle would be safer than a flawed human - Particularly here, where I have seen people on the freeway shaving (thankfully an electric razor), putting on make-up; and my personal favourite, undoing a Dewer flask and pouring a hot drink into its cup placed on the dashboard.
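
    As a rough illustration of the transponder idea, a minimal "here I am" beacon might look like the sketch below. The field names and JSON encoding are invented for illustration; real V2V message sets (such as the SAE Basic Safety Message) use compact, signed binary formats instead:

        # Sketch of a hypothetical vehicle position beacon.
        # Field names and the JSON wire format are assumptions.
        import json
        import time
        from dataclasses import dataclass, asdict

        @dataclass
        class VehicleBeacon:
            vehicle_id: str     # anonymised, rotating identifier
            lat: float          # latitude, degrees (WGS84)
            lon: float          # longitude, degrees (WGS84)
            speed_mps: float    # speed over ground, metres/second
            heading_deg: float  # 0 = north, clockwise
            braking: bool       # whether the brakes are applied

            def encode(self) -> bytes:
                """Serialise for broadcast; a real system would also sign this."""
                msg = asdict(self) | {"timestamp": time.time()}
                return json.dumps(msg).encode()

        beacon = VehicleBeacon("anon-4f2c", 51.5072, -0.1276, 13.4, 90.0, False)
        print(beacon.encode())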

    1. MachDiamond Silver badge

      Re: Not perfect

      "If all vehicles had transponders to broadcast where they were, "

      The issue is how the car would determine where it is before broadcasting its location. GPS is only good to within about 3m and can be much worse. That's fine for the satnav to give you directions, but not good enough to navigate the car. Compound that margin with another car fitting itself onto the road using that data, along with its own error factor. There are versions of GPS that are much more accurate, but they require a fixed transmitter for differential corrections, and an unlock code, bought with a government agency's permission, to turn on real-time kinematics. In the event of a military emergency, or even just a drill, the consumer side of GPS can be shut off or have errors introduced to fool guided ordnance and aircraft.
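
      To put a number on that compounding: if each car's fix carries an independent ~3m error per axis, the error in where the cars think they are relative to each other grows as the root-sum-square. A quick simulation (the 3m sigma and the Gaussian model are assumptions, not a receiver spec):

          # How two vehicles' independent GPS errors compound.
          import math
          import random

          def simulated_fix(true_x, true_y, sigma=3.0):
              """A position fix with ~sigma metres of Gaussian error per axis."""
              return (true_x + random.gauss(0, sigma),
                      true_y + random.gauss(0, sigma))

          random.seed(42)
          trials = 10_000
          total_error = 0.0
          for _ in range(trials):
              ax, ay = simulated_fix(0.0, 0.0)    # car A, truly at the origin
              bx, by = simulated_fix(10.0, 0.0)   # car B, truly 10 m ahead
              total_error += abs(math.hypot(bx - ax, by - ay) - 10.0)

          print(f"mean error in the measured gap: {total_error / trials:.2f} m")
          # Combined per-axis sigma is sqrt(3^2 + 3^2), about 4.2 m, so the
          # cars' shared picture of the gap is worse than either fix alone.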

  27. NeilT

    You couldn't make it up. Their own video shows that FSD was NOT engaged. So the driver drove the car up to speed and tried to engage FSD; when FSD failed to engage, the driver, not the software, ran over the model child.

    Classic. Foot, aim, shoot. Reload. Shoot again.

    There are other systems on board the Tesla which should try to avoid the collision. But they are NOT Full Self Driving. The human was still driving the car.

  28. SJA

    FSD was NOT ACTIVE

    Electrek shows that FSD was NOT ACTIVE:

    https://electrek.co/2022/08/10/tesla-self-driving-smear-campaign-releases-test-fails-fsd-never-engaged/

    1. Roland6 Silver badge

      Re: FSD was NOT ACTIVE

      Interesting link; however, it does highlight something that is going to be a big concern, namely the ability of a user to turn stuff on and off whilst driving. Okay, this is a beta, but we (El Reg readers) know about beta and production...

      I was a little concerned that the driver enabled FSD mode whilst the car was moving rather than from stationary, given that engagement seems not to be instantaneous; it also suggests the warnings (display and audible) are insufficient.

    2. MajDom

      Re: FSD was NOT ACTIVE

      I wonder when The Register is going to rectify their misreporting.

    3. MachDiamond Silver badge

      Re: FSD was NOT ACTIVE

      The camera work shows the testing was performed very poorly. Why wouldn't they have solidly fixed a camera inside the car instead of using something obviously handheld? Even YouTubers do better, and some are experts in rigging cameras in a car. They spent more time getting the cones placed exactly.

  29. Fursty Ferret

    Not sure what this is meant to show. FSD isn't even engaged in the demonstrations, so it's just a test of the standard AEB system fitted to Teslas (which is admittedly not the best compared with Volvo's, etc).

    If you're going to use something in a media campaign against Tesla then you should at least turn on the feature you want to complain about.

  30. Anonymous Coward
    Anonymous Coward

    Every cloud has a silver lining.

    It appears that Ronald McDonald bought it in one of those photos.

  31. Johnb89

    Is FSD beta in the UK?

    I've looked around a bit and haven't been able to determine whether Teslas in the UK can run the FSD beta. Anyone know definitively?

    If so, under what law or regulation? How many cars have it? (How can we identify them to put "No kill I" stickers all over the windscreen?)

    I ask as a concerned/terrified cyclist/pedestrian/occasional driver.

    1. RS_Zurich

      Re: Is FSD beta in the UK?

      No. My information is that it's currently USA only, and soon Canada.

  32. MachDiamond Silver badge

    Competent approvals

    While the California DMV is threatening to yank Tesla's teeth on AP/FSD, it was also one of the first states to give blanket approval for testing these systems on public roads, before even working out a safety protocol. I remember being unhappy when the governor proudly made a big deal of signing the documents on camera. The highway patrol was supposed to then work out what the rules would be, but until then it was open season.

    Since autonomous cars have such big potential to do massive amounts of harm, manufacturers should have to perform and document extensive testing, and have their results verified by an independent testing agency against codified standards, before getting a permit to operate the vehicles on open public roads. Before the cars can be sold to the public, there should be another round of approvals. One of the reasons air travel is as safe as it is can be traced to the myriad testing and standards every single component must go through to be certified for use on a plane. It's possible to get an "experimental" certification if you'd like to home-build your own aircraft, but it still has to pass inspection before it's allowed to legally fly. EVs that can accelerate from 0-100 km/h in mere seconds, weighing as much as they do, have shown that getting airborne doesn't require wings, just a software glitch: the perfect storm of bad shit happening all at once.
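
    To put the potential harm in perspective, a back-of-envelope sum (the mass and speed are illustrative assumptions, not figures for any particular model):

        # Kinetic energy of a heavy EV at motorway speed.
        # 2,200 kg and 100 km/h are illustrative assumptions.
        mass_kg = 2200
        speed_ms = 100 / 3.6                 # 100 km/h ≈ 27.8 m/s
        ke_kj = 0.5 * mass_kg * speed_ms**2 / 1000
        print(f"kinetic energy ≈ {ke_kj:.0f} kJ")   # ≈ 849 kJ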

  33. RS_Zurich
    FAIL

    Full Self Driving (FSD) was not even engaged!

    Be aware that this has been debunked! In all the tests on that video, FSD was not engaged. Full explanation here: https://electrek.co/2022/08/10/tesla-self-driving-smear-campaign-releases-test-fails-fsd-never-engaged/.

    tl;dr:

    The display changes colours when FSD is engaged - it is clearly not engaged

    The camera angles seem deliberately chosen to hide the warning message, which probably says "FSD not available" or similar

    The screenshot shown is just the standard warning that FSD is still in beta.

    I am a long-time Tesla owner (who does not yet have FSD) and love my car. Tesla is not perfect, but this is just another Ralph Nader-style hit piece by a wannabe US senator.

    1. diodesign (Written by Reg staff) Silver badge

      Re: Full Self Driving (FSD) was not even engaged!

      FWIW we've added a large update to our piece. I'd like to highlight that we drew attention to the weirdly small sample size and the incorrect use of Autopilot in our initial reporting, which may leave readers rightly and healthily skeptical of the project's claims.

      C.

      1. Kapsalon

        Re: Full Self Driving (FSD) was not even engaged!

        Thanks for the update; it is probably the best update I have ever seen.

        FSD is not easy, and only time will tell how we get there.

        For the time being, extreme caution is needed when using these features.

  34. midgepad

    Quite often human drivers

    manage to miss pedestrians.

    Quite often these automated systems manage to miss pedestrians.

    It is nice that the gap where neither manages it keeps getting smaller.

    The large overlap only becomes apparent in retrospect.
