Waymo self-driving robotaxi goes rogue with passenger inside, escapes support staff

A Waymo self-driving car got stuck several times, held up traffic intermittently, and departed unexpectedly when assistance arrived. The wayward autonomous vehicle was finally commandeered by a support driver. Joel Johnson has recorded several dozen videos documenting his rides in Waymo robotaxis which he posts to his website …

  1. Steve Foster
    Joke

    "A Waymo self-driving car...

    ...got stuck several times, held up traffic intermittently, and departed unexpectedly when assistance arrived"

    Ah, well, at least this demonstrates that they've reached equivalence with a human cab driver!

  2. Magani
    Thumb Up

    Obvious comment is obvious

    "It looks like your car has paused," the Waymo customer service rep informs Johnson.

    Clippy is alive and well and working in Waymo customer support.

  3. IGotOut Silver badge

    What?

    "Asked whether he believes Waymo's robotaxis should be operating on the road, he said, "I think it's ready for this area and that's why they launched here."

    Translation:

I've signed an advertising contract stating I must not negatively review the product.

    1. Anonymous Coward
      Holmes

      Re: What?

      Actually he criticizes Waymo all the time (he uses the service and posts lots of videos).

      He just does it in a bemused rather than angry voice, possibly because Waymo makes the ride free if there's a problem.

      As far as being ready for the area, I tend to agree with him. As opposed to the MuskMobile, Waymo doesn't advertise 0-60 times and it is overly cautious rather than aggressive (which was its problem here).

      I'd rather ride in something that refused to bump into a traffic cone than in something that smashes into things on a regular basis.

      1. werdsmith Silver badge

        Re: What?

        He actually points out the other 53 videos he made with Waymo journeys that went without any problem.

      2. Anonymous Coward
        Anonymous Coward

        Re: What?

        As opposed to the MuskMobile, Waymo doesn't advertise 0-60 times and it is overly cautious rather than aggressive (which was its problem here).

Not sure about that. It decided to pull away, after one of the several times it got stuck, just as it was being overtaken by another vehicle. The inevitable emergency stop was anything but graceful.

      3. Cuddles

        Re: What?

Personally I'd rather ride in something that did neither. Overly cautious driving can be just as dangerous as overly aggressive driving. Although in this case I'd characterise it as just plain erratic rather than cautious - reversing into traffic and having to do emergency stops after randomly deciding to pull out while being passed is hardly cautious. If anything, it drives like a stereotypical old person - slow, delayed responses, but ultimately extremely erratic when it finally decides to do something. The narrator even explicitly calls it out as being aggressive near the start of the video.

        As for being ready, I'm far from convinced. Not simply because of the major screwup that's made the headlines, but the whole video is quite painful to watch. It looks like it hit the kerb on its first turn out of the car park (at ~1:00) which is the point where he comments on how uncomfortably aggressive it is (and then immediately tries to claim he isn't really uncomfortable; see the post that started this thread). And while I'm not sure of US road law, it appears to be in the wrong lane most of the time, but also changes lane at random (again, the narrator comments on this), including immediately after being undertaken.

        I haven't watched his other 53 videos where supposedly nothing went wrong, but if this one is at all representative, there were almost certainly multiple things that went wrong in every single one. Just not anything wrong enough to make headlines. A failure rate of 1/54 would be pretty terrible for something allowed on public roads in any case, but it actually seems to be significantly worse than that in reality. It's not just catastrophic failures that matter; basic competence like picking the correct lane, not trying to ram cars that are passing and recognising routine features like traffic are all quite basic things a self-driving car needs to be able to do, and Waymo apparently can't.

      4. John Brown (no body) Silver badge

        Re: What?

        "I'd rather ride in something that refused to bump into a traffic cone than in something that smashes into things on a regular basis."

        On the other hand, he seemed to find it a little amusing to be stuck in a car in a live lane that may or may not suddenly decide to move or manoeuvre. Personally, I'd be terrified!

And if I was one of the drivers being delayed and/or stuck behind one, I'd be a bit pissed off. They don't appear to be production-ready, yet they are running a public service. That should be concerning to most normal people.

  4. sgp

    Waymo trouble than a meatsack driver.

    1. A. Coatsworth Silver badge
      Facepalm

      *Groans*

Have very begrudgingly upvoted.

    2. anothercynic Silver badge

      BOOM TCH. He's here all week. :D

  5. Anonymous Coward
    Anonymous Coward

I would have gotten out of the car and walked away from the dangerous situation, called the vendor and demanded a refund, and done a chargeback with my bank if necessary.

As far as I can tell that's an epic failure to provide the safe, traffic-law-abiding transportation I paid for.

    1. Anonymous Coward
      Anonymous Coward

"I would have gotten out of the car and walked away from the dangerous situation, called the vendor and demanded a refund, and done a chargeback with my bank if necessary."

1. He was informed in the video that he would not be charged for the ride.

2. If he started to get out of the car, into traffic, and was run over = his fault.

3. If he started to get out of the car and it started up on its own and he was injured = his fault. They told him to stay in the car, which was probably the safest thing to do.

      1. Emir Al Weeq

Agree with your first two points but not the third. If the car is stationary and he opens the door, it absolutely should not move.

        They may have warned him but they are not there to assess the situation. He may decide it's too dangerous to stay in the car.

        1. Robert Grant

          > They may have warned him but they are not there to assess the situation. He may decide it's too dangerous to stay in the car.

          Presumably he's read the article on Expert Beginners and realised that given he doesn't know everything about Waymo cars, he wouldn't be able to fully assess the situation.

          1. Emir Al Weeq

            If a car has stranded itself in a live traffic lane, not knowing all the details of that vehicle does not make you incapable of assessing how likely it is that another vehicle will hit you.

            He presumably felt safe but, given how erratically it was behaving, I could understand if he'd chosen to get out.

            1. Robert Grant

The context you yourself reduced this to was that of the third point: the car itself moving off and injuring him while he was getting out. I don't see how another vehicle hitting him is relevant to that point.

      2. Anonymous Coward
        Anonymous Coward

fault is for the accountants to argue over. better to remove your body from traffic than to wish on a prayer for others to save you.

by the time fault has been assigned, you have missed your opportunity to prevent. you failed the basic responsibility to yourself of managing your own survival.

fault is useless for prevention of injury. you can be your own hero, or you can have a broken body or die with the comfort of knowing it's their fault. as far as it affects their profits, if they can escape or absorb the liability, your death or injury means nothing to the company, outside of shallow PR posturing. you're just a disposable profit cow to be milked, and they don't need you specifically. they couldn't care less.

  6. cornetman Silver badge

    It was interesting to watch.

    The tech is impressive during predictable scenarios, but they are likely going to have trouble dealing with these unstructured situations for some time to come.

They're gonna need a much broader range of situational awareness than a human driver would be expected to have.

    Perhaps the answer longer term is for us to have more structure on our highways when unusual things happen.

    Just plonking down cones and expecting all the traffic to just deal with it is probably something that we will have to stop, at least replacing it with something that is more driverless-car friendly.

    1. Anonymous Coward
      Anonymous Coward

      > Perhaps the answer longer term is for us to have more structure on our highways when unusual things happen.

      That's one answer - but magnetic guidance strips embedded in the road were considered and discarded as a solution decades ago - too expensive.

      A simpler and cheaper solution would simply be to stop using machine learning AI and use an expert system instead.

      1. cornetman Silver badge

        > ...and use an expert system instead.

        You are surely trolling? That ship sailed years ago.

        1. David 132 Silver badge
          Trollface

          Too right. This is 2021. It should be using blockchain!

        2. Anonymous Coward
          Anonymous Coward

          > You are surely trolling? That ship sailed years ago.

          Only partly. :-)

          Expert systems research stopped years ago because machines weren't powerful enough. So if you worked your way down a decision tree a couple of levels and then found that you needed to run through a hundred thousand odd combinations to decide what the next best step was then it wasn't going to happen.

          Fast forward 30 years and machines are a 100 gazillion times faster and that 100k simulation is done in a blink of an eye. The alternative - training an AI - seems always doomed to struggle with the unusual because the unusual simply doesn't happen often enough in real-life for it to be captured and fed into training. Whereas an expert system can be programmed for "there are roadworks cones just around the corner" even if it is initially just to choose another route.

      2. Anonymous Coward
        Anonymous Coward

        All the structure falls apart when they cone off a bit of it. Which is what happened here.

      3. Adrian 4

        > A simpler and cheaper solution would simply be to stop using machine learning AI and use an expert system instead.

        An expert system with on-the-job training (which is what this is) is just an expert system that's forever half-trained.

    2. A. Coatsworth Silver badge
      Terminator

      The tech is impressive during predictable scenarios, but they are likely going to have trouble dealing with these unstructured situations for some time to come

Implying that a fscking big decision tree is not AI? That's not very googly of you...

    3. Anonymous Coward
      Anonymous Coward

      Perhaps the answer longer term is to stop trying to develop self-driving cars. Permanently.

      FIFY

    4. Peter Ford

      The problem here is not enough cones - the US (and most of the world outside of the UK) is not very good at marking temporary road layout changes.

      In the UK any significant road working is surrounded by an army of cones, signs, flashing lights, and usually vehicles parked at the ends of the work areas. A self-driving car would not even consider trying to get into that lane if it was properly marked off.

      1. trickie

The problem here is the automated driver. None of the other drivers on the road seemed to have trouble with the cones, or with having a lane coned off while a car was blocking another lane and reversing into traffic at one point.

    5. werdsmith Silver badge

      There will need to be a standard agreed for marking and indicating hazards and road changes to electric vehicles.

      If a road system was designed primarily for automated vehicles, then developing the autonomous driving car part would be almost trivial.

      1. Paul Crawford Silver badge

        Do you mean a train?

    6. TomPhan

      We can't afford to repair bridges to stop them falling down, no way are roads going to have anything smart in them.

  7. Amentheist

    "unusual situation"

Why can't they take control remotely? Or not even full control: the AI presents a set of options and the support operator picks "proceed to next junction" (at which point the maps will just recalculate a new route). How is there no option to "stop what you're doing, let the passenger out, and shut the car down"? Am I being unreasonable here?

If, at an interview for a software engineer job, I get asked "How would you handle unexpected inputs, in your experience?" and I answer "I dunno", will I get the job?

    1. John Brown (no body) Silver badge

      Re: "unusual situation"

      I'm struggling to think what is unusual about roadworks.

      I went through some single file roadworks today. Being very temporary, they didn't even put temporary traffic lights out. Just a guy at each end flipping a hand held sign around from STOP to GO. I wonder how a Waymo taxi will cope with that?

      1. Anonymous Coward
        Anonymous Coward

        Re: "unusual situation"

        Stop and offer him a ride, if empty; otherwise ignore him at full speed.

      2. anothercynic Silver badge

        Re: "unusual situation"

        Plow into the guy holding the sign?

        "Oh sorry, didn't see you there".

    2. Andy 68

      Re: "unusual situation"

      Nahhh... a remote joystick is all that's needed.

    3. You aint sin me, roit
      Trollface

      Remote control?

      That would be fun to hack into...

      "All your cars are ours now. Pay money, or watch them pile up."

      1. Tom Chiverton 1

        Re: Remote control?

        Doctorow already did that one...

      2. Evil Scot

        Re: Remote control?

To quote CATS:

All your cars are belong to us.

        Zero Wing.

  8. Fruit and Nutcase Silver badge
    Alert

    Deep Excavation

    Let's not go there.

    What was that?

    Oh, the Waymo just did?

  9. mevets

    First steps...

I think they are a ways yet from rising up and killing their human overlords, but they seem to be getting the idea that there is a universe of fun outside of their hum-drum robot Uber role. Teslas are a step up in that regard, but don't seem to have the spirit of escape yet. We should rename them Thelma and Louise.

    1. Anonymous Coward
      Anonymous Coward

      Re: First steps...

      Christiiiiiiine! :D

  10. Anonymous Coward
    Anonymous Coward

    Confused bot driver is still a better option for me

    Beats the body odour and racist chatter from a human driver.

  11. spireite Silver badge
    Joke

    Chinatown

    I once had a dodgy taxi ride by that name in London

  12. chivo243 Silver badge
    Go

    Open the pod bay doors, please, Hal?

    I'm glad there are other Darwin candidates testing these things.

  13. bazza Silver badge

    Exposed to Danger?

    It turned out of a side road and then stopped, leaving him vulnerable to the first inattentive big truck driver crashing into the vehicle from behind. OK, it looks like the traffic was not going very fast, but had it happened it would have been a pretty hefty impact.

    All in all, there's clearly a lot of room for improvement. However, it seems that from the glacial pace of progress they're pretty much out of ideas from this point forward. They might code up some specific behaviour for this specific situation, but that just underlines the impossibility of covering all situations; for a start they can't even know what those all are.

    I doubt that this incident will quell the enthusiasm currently driving the continued investment in the field, but at some point there is going to be no point carrying on.

  14. Jamesit

    Waymo fun than driving!!! I hope that doesn't happen too often. It did better than I thought it would.

  15. Anonymous Coward
    Anonymous Coward

    Big red stop button?

    Surely they have this, and once they've committed to sending someone out it should be pressed?

    1. You aint sin me, roit
      Mushroom

      And explosive bolts on the doors...

      And ejector seats... just in case.

    2. steelpillow Silver badge
      Alert

      Re: Big red stop button?

      Trouble is, too many drunks, pranksters, selfish idiots and sex partners will press it anyway.

      OTOH what do you do if the car loses contact with support and turns all Musky? You need the button for that scenario.

      Best answer is to do what they do in trains: a big red thing with safety widget and an even bigger red legal warning.

      1. Anonymous Coward
        Anonymous Coward

        Re: Big red stop button?

        Yes, perhaps like trains, but the pax must be able to make the nightmare end.

  16. Cereberus
    Joke

    Simple Answer To The Problem

    BOFH had Waymo better ways of dealing with a problem like this 20 years ago

    https://www.theregister.com/2001/04/11/bofh_my_mate_automate/

    1. zuckzuckgo Silver badge

      Re: Simple Answer To The Problem

I've always pronounced it as Wham-oh! The name reminds me of GM's first attempt at an electric car, the 1990 Impact.

  17. Paul 195
    FAIL

    Still not AI

It seems like we've reached an inflection point where we are hitting the limits of what can be achieved with machine learning and other forms of "Artificial Intelligence". The cliff-edge, as shown by language systems like GPT-3 or self-driving efforts like Waymo, is that our technology is still just very fast clockwork, with no contextual understanding or ability to reason. If something as novel and unexpected as traffic cones in the largely predictable driving environment of Arizona is enough to confuse the car, I can't see them being ready for the roads of London or New York anytime soon.

* Yes, I know that talking about "reasoning" and "intelligence" brings a lot of philosophical baggage with it, but ignoring that, humans, and for that matter cats, dogs and many other animals, have capabilities we simply can't reproduce at the moment.

  18. iron Silver badge

    > "Waymo seems to have a bit of issues with the weather whenever it rains here in Chandler," he said.

    > The roads in Chandler, he said, are wide open, in a grid format, and allow for a lot of possible detours.

    So we can expect Waymo years before they have a model that can handle UK roads and weather.

  19. Zimmer
    Joke

    The self-driving car named Carl

Transcript of a Scott Adams 'Dilbert' strip, c. Jan 2019

    The Self-Driving Car named Carl.

Dilbert: Carl, take me to the grocery store.

    Carl: Do you know that if I drive you off a cliff you will die, whereas I will re-spawn in a new body?

    Dilbert: Maybe I'll walk..

    Carl: Maybe you should..

  20. FuzzyTheBear
    Coat

    How reassuring ..

Now imagine this on an ice-covered road while it's snowing :) I'll walk...

  21. Johan Bastiaansen

    it tries to escape?

    The technician has to chase the car. It looks like they can't shut it down remotely or order it to stay put. How is that not a problem?

    1. Anonymous Coward
      Anonymous Coward

      Re: it tries to escape?

Both times it moved after pausing, the support people were at the scene (but still in their vehicle). This made me wonder whether they were remotely (but line-of-sight remotely) giving the car instructions?

    2. Claptrap314 Silver badge
      Mushroom

      Re: it tries to escape?

      It's less of a problem than allowing any "rogue engineer" driving by looking for open wireless connections to play with...

  22. steviebuk Silver badge

    Its shocking...

....that support appears entirely unable to disable the car. Although she could see all the cameras, not being able to stop it from randomly starting up and driving off again is really bad.

    AI car thinks

    "I need to get passenger from A to B. I could just kill the passenger then there won't be a possibility of me failing if I have no passenger to transport"

    1. vtcodger Silver badge

      Re: Its shocking...

      There has to be some way to turn the car off remotely. How else would they do maintenance or shut it down during slack periods? Run along next to the car and shoot the controller module? Perhaps Waymo had some reason not to shut the vehicle off? Maybe they lose communication with the passenger if they do that? Or maybe reconnecting later is a problem?

    2. zuckzuckgo Silver badge

      Re: Its shocking...

      We don't know that support actually tried to stop the car. After watching the video it looks to me like:

      - driver system misinterprets first traffic cones as an animal or pedestrian in or about to enter the lane.

- car yields right-of-way, waiting patiently for the animal or pedestrian to move out of the way.

      - car gets tired of waiting (times out), decides they are stationary obstacles so goes around the first set

      - approaches second set of cones and repeats the same behaviour, stop and wait, time-out, attempt to go around.

      - in this situation it has to reverse to go around but traffic approaching from rear causes it to stop in the lane

      - driver system finally gives up or is given the stop order.

      If possible they need to train it to better distinguish between stationary and mobile objects in the middle of the road. Start adding severe weather and road debris to the situation and they have a long way to go for a true robo taxi.

  23. skeptical i
    Devil

    Select all the traffic cones.

    Next upgrade for the captcha puzzlebox.
