"A Waymo self-driving car...
...got stuck several times, held up traffic intermittently, and departed unexpectedly when assistance arrived"
Ah, well, at least this demonstrates that they've reached equivalence with a human cab driver!
A Waymo self-driving car got stuck several times, held up traffic intermittently, and departed unexpectedly when assistance arrived. The wayward autonomous vehicle was finally commandeered by a support driver. Joel Johnson has recorded several dozen videos documenting his rides in Waymo robotaxis which he posts to his website …
Actually he criticizes Waymo all the time (he uses the service and posts lots of videos).
He just does it in a bemused rather than angry voice, possibly because Waymo makes the ride free if there's a problem.
As far as being ready for the area, I tend to agree with him. As opposed to the MuskMobile, Waymo doesn't advertise 0-60 times and it is overly cautious rather than aggressive (which was its problem here).
I'd rather ride in something that refused to bump into a traffic cone than in something that smashes into things on a regular basis.
> As opposed to the MuskMobile, Waymo doesn't advertise 0-60 times and it is overly cautious rather than aggressive (which was its problem here).
Not sure about that. It decided to pull away, after one of the several times it got stuck, just as it was being overtaken by another vehicle. The inevitable emergency stop was anything but graceful.
Personally I'd rather ride in something that did neither. Overly cautious driving can be just as dangerous as overly aggressive driving. Although in this case I'd characterise it as just plain erratic rather than cautious - reversing into traffic and having to do emergency stops after randomly deciding to pull out while being passed is hardly cautious. If anything, it drives like a stereotypical old person - slow, delayed responses, but ultimately extremely erratic when it finally decides to do something. The narrator even explicitly calls it out as being aggressive near the start of the video.
As for being ready, I'm far from convinced. Not simply because of the major screwup that's made the headlines, but the whole video is quite painful to watch. It looks like it hit the kerb on its first turn out of the car park (at ~1:00) which is the point where he comments on how uncomfortably aggressive it is (and then immediately tries to claim he isn't really uncomfortable; see the post that started this thread). And while I'm not sure of US road law, it appears to be in the wrong lane most of the time, but also changes lane at random (again, the narrator comments on this), including immediately after being undertaken.
I haven't watched his other 53 videos where supposedly nothing went wrong, but if this one is at all representative, there were almost certainly multiple things that went wrong in every single one. Just not anything wrong enough to make headlines. A failure rate of 1/54 would be pretty terrible for something allowed on public roads in any case, but it actually seems to be significantly worse than that in reality. It's not just catastrophic failures that matter; picking the correct lane, not trying to ram cars that are passing, and recognising routine features like traffic cones are all quite basic things a self-driving car needs to be able to do, and Waymo apparently can't.
"I'd rather ride in something that refused to bump into a traffic cone than in something that smashes into things on a regular basis."
On the other hand, he seemed to find it a little amusing to be stuck in a car in a live lane that may or may not suddenly decide to move or manoeuvre. Personally, I'd be terrified!
And if I was one of the drivers being delayed and/or stuck behind one, I'd be a bit pissed off. They don't appear to be production ready, yet they are running a public service. That should be concerning to most normal people.
I would have gotten out of the car and walked away from the dangerous situation, called the vendor and demanded a refund, and done a chargeback with my bank if necessary.
1. He was informed in the video that he would not be charged for the ride.
2. If he started to get out of the car, into traffic, and was run over = his fault.
3. If he started to get out of the car and it started up on its own and he was injured = his fault; they told him to stay in the car, which was probably the safest thing to do.
> They may have warned him but they are not there to assess the situation. He may decide it's too dangerous to stay in the car.
Presumably he's read the article on Expert Beginners and realised that given he doesn't know everything about Waymo cars, he wouldn't be able to fully assess the situation.
If a car has stranded itself in a live traffic lane, not knowing all the details of that vehicle does not make you incapable of assessing how likely it is that another vehicle will hit you.
He presumably felt safe but, given how erratically it was behaving, I could understand if he'd chosen to get out.
Fault is for the accountants to argue over; better to remove your body from traffic than to wish on a prayer for others to save you.
By the time fault has been assigned, you have missed your opportunity to prevent it. You have failed the basic responsibility to yourself of managing your own survival.
Fault is useless for preventing injury. You can be your own hero, or you can have a broken body or die with the comfort of knowing it's their fault. The company only cares as far as it affects their profits. If they can escape or absorb the liability, your death or injury means nothing to them, outside of shallow PR posturing. You're just one disposable profit cow to be milked, and they don't need you specifically. They couldn't care less.
It was interesting to watch.
The tech is impressive during predictable scenarios, but they are likely going to have trouble dealing with these unstructured situations for some time to come.
They're gonna need a much broader range of situational awareness - the kind a human driver would be expected to have.
Perhaps the answer longer term is for us to have more structure on our highways when unusual things happen.
Just plonking down cones and expecting all the traffic to just deal with it is probably something that we will have to stop, at least replacing it with something that is more driverless-car friendly.
> Perhaps the answer longer term is for us to have more structure on our highways when unusual things happen.
That's one answer - but magnetic guidance strips embedded in the road were considered and discarded as a solution decades ago - too expensive.
A simpler and cheaper solution would simply be to stop using machine learning AI and use an expert system instead.
> You are surely trolling? That ship sailed years ago.
Only partly. :-)
Expert systems research stopped years ago because machines weren't powerful enough. So if you worked your way down a decision tree a couple of levels and then found that you needed to run through a hundred thousand-odd combinations to decide what the next best step was, then it wasn't going to happen.
Fast forward 30 years and machines are 100 gazillion times faster, so that 100k simulation is done in the blink of an eye. The alternative - training an AI - seems doomed always to struggle with the unusual, because the unusual simply doesn't happen often enough in real life to be captured and fed into training. Whereas an expert system can be programmed for "there are roadworks cones just around the corner", even if the initial response is just to choose another route.
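To make that concrete, here's a toy sketch of what I mean by an explicit rule over a learned behaviour (purely illustrative - the rule names and structure are mine, nothing to do with how Waymo actually works):

```python
# Toy expert-system idea: hand-written rules walked as a small decision tree,
# rather than behaviour learned from training data. All names are invented.

def plan_action(percepts, route):
    """Pick the next action by walking explicit rules over sensor reports."""
    if "cones_ahead" in percepts:
        # The unusual case gets its own rule, even if the initial
        # answer is just "go another way".
        if route.has_alternative():
            return "reroute"
        return "stop_and_request_support"
    if "moving_obstacle" in percepts:
        return "yield"
    return "continue"


class Route:
    """Stand-in for a real route planner."""
    def has_alternative(self):
        return True


# Cones spotted just around the corner -> choose another route.
print(plan_action({"cones_ahead"}, Route()))  # reroute
```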
The problem here is not enough cones - the US (and most of the world outside of the UK) is not very good at marking temporary road layout changes.
In the UK any significant roadworks are surrounded by an army of cones, signs, flashing lights, and usually vehicles parked at the ends of the work areas. A self-driving car would not even consider trying to get into that lane if it was properly marked off.
Why can't they take control remotely, or not even full control but be presented with a set of options by the AI, and the support operator picks "proceed to next junction" (at which point the maps will just recalculate a new route)? How is there no option to "stop what you're doing, let the passenger out and shut the car down"? Am I being unreasonable here?
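For illustration, the kind of remote-assist flow I'm imagining might look something like this (entirely hypothetical - these option names and this interface are my invention, not anything Waymo actually exposes):

```python
# Hypothetical remote-assist flow: the car proposes a few safe options,
# a human operator picks one, and the planner recalculates from there.
from dataclasses import dataclass, field
from typing import List


@dataclass
class AssistRequest:
    vehicle_id: str
    situation: str
    options: List[str] = field(default_factory=list)  # actions the car is willing to execute


def handle_assist(request: AssistRequest, operator_choice: str) -> str:
    """Return the chosen action, refusing anything the car didn't offer."""
    if operator_choice not in request.options:
        raise ValueError("Operator picked an option the car did not offer")
    return operator_choice


req = AssistRequest(
    vehicle_id="car-17",
    situation="blocked by cones, stopped in a live lane",
    options=["proceed_to_next_junction", "pull_over_and_let_passenger_out", "shut_down"],
)
print(handle_assist(req, "proceed_to_next_junction"))
```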
If, at an interview for a software engineer job, I get asked "How would you handle unexpected inputs, in your experience?" and I answer "I dunno", will I get the job?
I'm struggling to think what is unusual about roadworks.
I went through some single file roadworks today. Being very temporary, they didn't even put temporary traffic lights out. Just a guy at each end flipping a hand held sign around from STOP to GO. I wonder how a Waymo taxi will cope with that?
I think they are a ways yet from rising up and killing their human overlords, but they seem to be getting the idea that there is a universe of fun outside of their hum-drum robot Uber role. Teslas are a step up in that regard, but don't seem to have the spirit of escape yet. We should rename them Thelma and Louise.
It turned out of a side road and then stopped, leaving him vulnerable to the first inattentive big truck driver crashing into the vehicle from behind. OK, it looks like the traffic was not going very fast, but had it happened it would have been a pretty hefty impact.
All in all, there's clearly a lot of room for improvement. However, given the glacial pace of progress, it seems they're pretty much out of ideas from this point forward. They might code up some specific behaviour for this specific situation, but that just underlines the impossibility of covering all situations; for a start, they can't even know what they all are.
I doubt that this incident will quell the enthusiasm currently driving the continued investment in the field, but at some point there is going to be no point in carrying on.
Trouble is, too many drunks, pranksters, selfish idiots and sex partners will press it anyway.
OTOH what do you do if the car loses contact with support and turns all Musky? You need the button for that scenario.
Best answer is to do what they do in trains: a big red thing with a safety widget and an even bigger red legal warning.
It seems we've reached an inflection point where we are hitting the limits of what can be achieved with machine learning and other forms of "Artificial Intelligence". The cliff-edge, as shown by language systems like GPT-3 or self-driving systems like Waymo, is that our technology is still just very fast clockwork, and has no contextual understanding or ability to reason. If something as novel and unexpected as traffic cones, in the kind of largely predictable driving environment of Arizona, is enough to confuse the car, I can't see them being ready for the roads of London or New York anytime soon.
* Yes, I know that talking about "reasoning" and "intelligence" brings a lot of philosophical baggage with it, but ignoring that, humans, and for that matter cats, dogs and many other animals, have capabilities we simply can't reproduce at the moment.
> "Waymo seems to have a bit of issues with the weather whenever it rains here in Chandler," he said.
> The roads in Chandler, he said, are wide open, in a grid format, and allow for a lot of possible detours.
So we can expect it to be years before Waymo has a model that can handle UK roads and weather.
Transcript of a Scott Adams 'Dilbert' strip from Jan 2019:
The Self-Driving Car named Carl.
Dilbert: Carl, take me to the grocery store
Carl: Do you know that if I drive you off a cliff you will die, whereas I will re-spawn in a new body?
Dilbert: Maybe I'll walk..
Carl: Maybe you should..
...that support appears to be completely unable to disable the car. Although she could see all the cameras, not being able to stop it from randomly starting up and driving off again is really bad.
AI car thinks
"I need to get passenger from A to B. I could just kill the passenger then there won't be a possibility of me failing if I have no passenger to transport"
There has to be some way to turn the car off remotely. How else would they do maintenance or shut it down during slack periods? Run along next to the car and shoot the controller module? Perhaps Waymo had some reason not to shut the vehicle off? Maybe they lose communication with the passenger if they do that? Or maybe reconnecting later is a problem?
We don't know that support actually tried to stop the car. After watching the video it looks to me like:
- driver system misinterprets first traffic cones as an animal or pedestrian in or about to enter the lane.
- car yields right-of-way, waiting patiently for animal or pedestrian to move out of the way.
- car gets tired of waiting (times out), decides they are stationary obstacles so goes around the first set
- approaches second set of cones and repeats the same behaviour, stop and wait, time-out, attempt to go around.
- in this situation it has to reverse to go around, but traffic approaching from the rear causes it to stop in the lane
- driver system finally gives up or is given the stop order.
If possible they need to train it to better distinguish between stationary and mobile objects in the middle of the road. Start adding severe weather and road debris to the situation and they have a long way to go for a true robo taxi.
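A crude sketch of the timeout-driven state machine that behaviour pattern suggests (pure speculation about the internals - every state and threshold here is made up):

```python
# Speculative model of the behaviour guessed at above: yield, time out,
# try to go around, get blocked by rear traffic, eventually give up.
from enum import Enum, auto


class State(Enum):
    DRIVE = auto()
    YIELD = auto()      # treating the cones as something that might move
    GO_AROUND = auto()  # reclassified as stationary, trying to pass
    HOLD = auto()       # blocked, e.g. traffic approaching from the rear
    GIVE_UP = auto()    # stop in the lane and wait for support


YIELD_TIMEOUT_S = 30    # made-up number


def next_state(state, obstacle_ahead, seconds_waited, rear_traffic, attempts):
    """Advance the (invented) driving state given current observations."""
    if state == State.DRIVE and obstacle_ahead:
        return State.YIELD
    if state == State.YIELD:
        return State.GO_AROUND if seconds_waited > YIELD_TIMEOUT_S else State.YIELD
    if state == State.GO_AROUND and rear_traffic:
        # Can't reverse around the cones with traffic behind
        return State.GIVE_UP if attempts > 2 else State.HOLD
    if state == State.HOLD and not rear_traffic:
        return State.GO_AROUND
    return state


# Example: stuck yielding to cones for 45 seconds -> tries to go around.
print(next_state(State.YIELD, True, 45, False, 0))  # State.GO_AROUND
```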