"...a [Lidar] blind spot low to the ground all around the car."
Near the ground? Like where the pedestrians are?
This isn't making much sense.
The death of a pedestrian in Arizona by an Uber self-driving car may have been the result of a blind spot caused by the use of a single LIDAR sensor on the roof. In 2016, Uber decided to shift from using Ford Fusion cars to Volvo XC90s for its self-driving car program. When it did so, it made big changes to its sensor design …
If there is a blind spot it is right in front of the bonnet. The lidar should still have seen the cyclist from far out. There is no info on whether the bike was carbon framed; I suspect not. If it was metal it would have been visible on the radar for miles. Similarly, most vis sensors are not actually vis-only, they stretch into near IR, so there should have been no issue with the pedestrian having dark clothing and the car being driven dangerously for a human driver (too fast to stop within the zone covered by headlights).
The only issue I can see is Uber. No other.
Similarly, most vis sensors are not actually vis, they stretch into near IR so there should have been no issue with the pedestrian having dark clothing
1. A lot of cameras have an IR filter to keep infrared from reaching the sensor. Since the focus point for IR is different from that for visible light, getting a clear image is enhanced by blocking the out-of-focus IR.
2. A visible light sensor may pick up near IR, it will not pick up far IR - generally such sensors, used in missiles, need to be cooled significantly to function, often with liquid nitrogen. Far IR is from 8 to 100 microns, and the peak emissions from the human body are at 9.3 microns. Unless the car has IR floodlights, near IR is no better than visible light with no lights.
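For what it's worth, the 9.3 micron figure checks out against Wien's displacement law (a quick sketch; the 310 K figure is an assumed round number for skin temperature):

```python
# Wien's displacement law: lambda_peak = b / T
WIEN_B_UM_K = 2897.8  # Wien's constant, in micron-kelvins

def peak_wavelength_um(temp_k: float) -> float:
    """Wavelength (microns) of peak blackbody emission at temp_k."""
    return WIEN_B_UM_K / temp_k

body = peak_wavelength_um(310.0)  # human skin temperature, ~310 K
print(round(body, 1))  # -> 9.3, squarely in the thermal/far IR band
```

So a near-IR-capable camera, sensitive out to maybe 1 micron, is nowhere near the wavelengths a warm body actually emits at.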
.... A lot of cameras have an IR filter to keep infrared from reaching the sensor. Since the focus point for IR is different than for visible light, getting a clear image is enhanced by blocking the out of focus IR,...
You are talking about a camera optimised for visible light. Of course that has filters cutting out extraneous light that it's not meant to respond to.
But why on earth should a sensor INTENDED to pick up IR have an IR filter? That makes no sense. I am pretty sure that the design engineers would have specified technology which was designed to do its job. And that its job was to detect obstacles around the car in varying lighting conditions - which includes pedestrians in a night-time street. That there was a failure there is obvious. But I don't think it was as simple as using a technology which was not designed for the task....
All these sensors, and yet humans seem able to do this kind of thing quite a lot of the time with just a pair of mark 1 eyeballs and a g-sensor in the seat of the pants... it seems we can do remarkably complicated things with only a handful of sensors.
Though on a more serious note: I have wondered what happens when a lidar system, spitting out a presumably rather bright light (as far as its sensors are concerned) meets another lidar system. Oncoming headlights incorrectly adjusted or left on main beam are bad enough to cope with for a human driver; I'd love to know what an electronic system thinks of them. The same comment applies whether visible, IR, radar or indeed any other active lighting system is used.
I have wondered what happens when a lidar system, spitting out a presumably rather bright light (as far as its sensors are concerned) meets another lidar system
I think this is done by using specific frequencies for each car.
IR Filters..... most web cams are sensitive to IR, and an astronomer buddy of mine attached one to a telescope for imaging, after removing the IR filter. Photons are precious in astronomy, and IR is welcome. So yeah, you'd use as wide a spectrum as possible, you wouldn't filter anything out.
Not really, no. The whole sensor package looks mounted to the front section on the roof. There isn't any roof in front of it to block anything. I would expect the whole "blind spot" to be on the order of "something shorter than a foot, within two or three feet of the front bumper".
LIDAR maker CEO waxing about the indisputable need for more of its wares... laughable. That single one should have been perfectly capable of detecting the pedestrian, and I'm pretty sure it did too - at least at LIDAR reflection level. Why that never resulted in the car braking is the actual million-dollar question.
Even a carbon framed bike carries a large amount of metal.
Lidar <> radio frequency radar, so the metal or carbon fiber content is unlikely to be relevant. The question is: did the bike and pedestrian reflect laser light?
The question is: did the bike and pedestrian reflect laser light?
My question is more "what the fuck was the supervising meat-sack doing whilst this 'testing' was going on?". I'd have thought that in any such test the human in the vehicle is still ultimately in charge, else why be there at all? Not sure whether it was media bias and selective edits or not, but the video I saw of the inside of the vehicle showed them paying zero attention to the road ahead when the accident occurred. Not really how a supervised test is supposed to work.
Radar should have seen the pedestrian and bike.
Lidar would have picked up the bike and woman faster and easier than the radar.
Even still the car should have picked up the cyclist well within range of the sensor on the top of the car.
Drunk crawling. Victim of previous accident. Someone who tripped. Recumbent bicycle. Kid sitting on a skateboard.
Speed bump. Log in the road. Bricks, blocks, other obstructions.
An autonomous car has to scan for any and all obstructions ahead, and also be aware of hazards approaching from the sides and rear.
There is no excuse for not detecting someone wheeling a bike. Far smaller targets should be detected and acted on.
"There is no excuse for not detecting someone wheeling a bike. Far smaller targets should be detected and acted on."
And that's the bit that's really, really hard. Is it a paper bag? Is it a brick? Is it a plastic or glass bottle? Is it something that fell off the back of a lorry that might cause damage, or an old newspaper just blowing in the wind?
"And that's the bit that's really, really hard. Is it a paper bag? Is it a brick? Is it a plastic or glass bottle? Is it something that fell off the back of a lorry that might cause damage, or an old newspaper just blowing in the wind?"
...and in the weird and wonderful real world of coincidences, I went out in the car a few hours after posting that and while driving had to avoid a claw hammer lying in the road right where the wheels would normally be passing over. I've never seen one of those in the road before, and it could have done serious damage either to the tyres or, if hit "right", the underside of the car, eg brake lines. Would an AV have seen it, recognised it for the danger it posed and avoided it?
Low is a relative term. Keep in mind an XC90 is 1766 mm, or a couple of inches shy of 6 feet, tall, and the LIDAR is roof mounted, so it's likely easily over 6 feet up. Depending on the vertical scan angle of the LIDAR, there will be a blind spot created by the roof of the car occluding the beam: a largely rectangular cone whose size is determined by how high the LIDAR unit is mounted and by the lowest angle it can scan.
If we assume the LIDAR is at 2 meters high and is mounted 1 meter from the edge of the roof (as the car is ~2 meters wide and I'm modelling it as a brick) this gives a maximum angle of 12.6 degrees below horizontal before the LIDAR beam is blocked by the roof outline. That puts the closest point on the ground the LIDAR unit can see at 8.9 meters (29.3 ft) away directly ahead and to the sides and 12.6 meters (41.4 ft) over the corner. If we move the LIDAR to 2.5 meters high these numbers drop to 3.5 meters (11.3 ft) and 4.9 meters (16 ft) respectively. The distance will be considerably greater toward the rear as the assumption here is that it is mounted 1 meter from the front edge of the roof and that would potentially place the rear edge considerably farther away decreasing the downward angle visible.
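The figures above can be reproduced with a few lines of trigonometry (a sketch only: the car is modelled as a flat-topped brick, and the 1.776 m roof height used here is the XC90's listed figure, which matches the quoted 12.6 degree angle):

```python
import math

def blind_spot_m(lidar_h, setback, roof_h=1.776):
    """Ground distance (m) from the LIDAR at which a beam grazing the
    roof edge first reaches the road surface."""
    drop = lidar_h - roof_h            # LIDAR height above the roof line
    angle = math.atan2(drop, setback)  # steepest unblocked downward angle
    return lidar_h / math.tan(angle)

# LIDAR 2 m up, 1 m in from the roof edge
print(round(blind_spot_m(2.0, 1.0), 1))               # straight ahead: 8.9 m
print(round(blind_spot_m(2.0, math.hypot(1, 1)), 1))  # over a corner: 12.6 m
# Raising the LIDAR to 2.5 m shrinks the shadow dramatically
print(round(blind_spot_m(2.5, 1.0), 1))               # straight ahead: 3.5 m
print(round(blind_spot_m(2.5, math.hypot(1, 1)), 1))  # over a corner: 4.9 m
```

Note the key variable is the LIDAR's height above the roof line, not above the ground: an extra half metre of mast cuts the forward blind spot from ~9 m to ~3.5 m.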
Yeah, this. Looking at the car it's mounted well above the roof line and should have a better view than a human driver. Assuming it's a 360 degree sensor as the story says it shouldn't have any issues picking out a pedestrian.
Indeed... People are taller than cars.
Even push bikes come up a good few feet, and from the footage, on that nice clear, uncluttered, and almost straight road, the entire bicycle and person were completely visible to the video camera.
To have a blind spot extend that far, the LIDAR would have to be at the rear of the roof, and as low down onto it as possible, which would be the stupidest location ever devised for a vehicle that will spend 99.99% of its time going forwards.
... and for some reason has never had to learn about consequences?
The only thing that makes me sad about Uber's ongoing fall from grace is that Kalanick and his bros have probably already cashed out, and they don't seem to be the sort of people to be overly concerned by the blood now on their hands.
diverting some of the investment into bank accounts as a share of the company is acquired is within the ability of most of these people.
Investors, yes but not employees who receive shares in lieu: their shares are untouchable and untradeable. Yet another way the VCs manage to shaft people. Didn't Kieren cover this a while back?
Hedgehogs. Why those haven't yet evolved titanium skin with tungsten spikes is one of Evolution's Great Mysteries.
A hedgehog is strolling the edge of a road, and comes across two rabbits. "Hey guys," he speaks up, "how is it that I see so many of my fellows being flattened while trying to cross, and only rarely a rabbit or a hare?". "Geometry", one of the rabbits answers. "If you're trying to cross at night, and you see headlights approaching, then either move back to the verge, or sit right in the middle of those lights. And by day you can use those headlights as markers too, even when they're off. Let me show you.". He jumps into the road, positions himself (looking over his outstretched front legs towards the approaching car's headlights), and indeed, the car passes right over him and he hops back to his mate and the hedgehog. "Aha" says the hedgehog, "Thanks, I'll try that". And as he starts crossing the road, another car approaches. The hedgehog shuffles around a bit, positioning himself, and finds the right spot. "Oh look", the other rabbit remarks, "you don't see those Reliants much any more".
"Kittens!!!! You forgot kittens!!! Thank deity this isn't FB or you'd be toast."
For a more pragmatic approach (and a significant statistic): Toddlers playing on driveways
Not a problem at the moment, but it will be when robocars are the norm.
The lidar in question should easily have seen a person walking a bicycle in the middle of the road. There was some failure that wasn't caused by the lidar manufacturer's "not enough lidars" excuse. Maybe a dog right next to the car might be obscured by a fender, but not a five foot tall person walking a five foot long bicycle on a flat road sixty feet away.
There is also confusion about sensors being turned off. There are two sets of sensors: Uber's self drive sensors which are being tested and Volvo's proximity and braking sensors which are not. You can't test Uber's system if the Volvo system constantly interferes, so the Volvo system is shut off. The supplier of a Volvo system component is just pointing this out, so that their product does not get tarnished by Uber's mistakes.
You can't test Uber's system if the Volvo system constantly interferes, so the Volvo system is shut off.
This sounds like Uber is pretty bad at coding and debugging for real-life situations.
In a testing lab, they could flip on their system and turn off the Volvo system, where the worst case is you witness the car crash into stuff. However, on real roads it should be an OR case: if either system triggers the car's brakes, it brakes, and a debug log is kept to verify whether Uber's braking system was working.
It may be harder than that to set up an OR system. Volvo's XC90 has a partial autonomous driving system of its own, not just automated emergency brakes. It steers the car (in lane), recognises speed signs, maintains following distance, etc. Separating out just the one part could cause bugs of its own too....
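The OR arrangement described above could be sketched something like this (purely hypothetical names and interface - neither Uber's nor Volvo's actual stacks are public):

```python
from dataclasses import dataclass

@dataclass
class BrakeDecision:
    brake: bool
    source: str  # which system requested braking, for the debug log

def combined_brake(uber_wants_brake: bool, volvo_wants_brake: bool) -> BrakeDecision:
    """Brake if EITHER system requests it; record who asked, so the log
    can show whether the system under test would have acted on its own."""
    if uber_wants_brake and volvo_wants_brake:
        return BrakeDecision(True, "both")
    if uber_wants_brake:
        return BrakeDecision(True, "uber")
    if volvo_wants_brake:
        return BrakeDecision(True, "volvo-fallback")  # system under test missed it
    return BrakeDecision(False, "none")

# Volvo's AEB fires but the system under test stays silent: the car
# still brakes, and the log flags the miss for the developers.
print(combined_brake(False, True))
```

Every "volvo-fallback" entry in the log is exactly the data point the test programme needs - a situation the system under test failed to handle - without anyone having to die to collect it.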
"The lidar in question should easily have seen a person walking a bicycle in the middle of the road. There was some failure that wasn't caused by the lidar manufacturer's "not enough lidars" excuse. "
You are assuming that the approaching human in question was in the 'cone of surveillance' for the lidar. With the reduction in sensors, that zone may have been relegated to another type of sensor, leaving the lidar for long range detection in the vehicle's planned path.
In that case the human would be invisible to the lidar until stepping in front of the car. Take a few fractions of a second to analyze and recognize, and you've run out of time.
You are obviously unfamiliar with the Velodyne unit used by Uber. Here is a somewhat dated link that will still help you visualize what that unit should have seen. Note that shadows appear behind large objects and very close to the vehicle, but that coverage is 360deg from a couple of meters to beyond the area depicted. The concentric rings are from the Velodyne.
The question being asked has fsck-all to do with why she chose to cross there, or it being night time. The key point is the car totally failed to see her and make any attempt to stop.
Uber execs should be facing jail time for this: they have shown the sort of negligence in design and system testing that led to a death. Having some low-paid meatbag sit there in the hope of taking over in the event of a fault is something already ridiculed in tech circles such as El Reg, and yet that seems to be their approach to checking the sensor system could detect all reasonable risks.
"Proper road crossings were two miles either side of where she crossed, if only she had a bike.
Bad shots aside and sad as it is, why choose to cross there?"
One thing about reality is that people sometimes do things that they're not allowed to do, or might be dangerous, and so on and so forth. So sometimes, people will try to cross roads when it's not safe to do so, in places where you might not expect them to cross. Sometimes they're drunk. Sometimes they're on the phone. And so on.
But "the idiot crossing the road wasn't paying proper attention" is no excuse for running the supposed idiot down and killing them.
Cars driven by people are moderately okay - most of the time - at not running into errant pedestrians. Whether one counts as cheap labour or not, the typical taxi driver really does try to avoid driving into actual people.
It seems obvious to me that self-driving cars need to be at least as good as human driven cars at avoiding killing people who have put themselves in places where they're not supposed to be.
Every school day, I see lunatic high school lads on bicycles on their way home going in every direction across road and pavement, often pulling wheelies and seemingly having no care for their own safety or anyone else's. The main reason I've never seen one of them crushed by two tonnes of petrol powered motor car (or twenty tonnes of diesel powered bus) is that the drivers - many of them obviously irritated - react appropriately and, erm, slow down in time. In fact, almost all of them are travelling pre-emptively slowly since the drivers have noticed it's school chucking out time and the place is crawling with high school kids wandering about all over the place.
I reckon if a self-driving car can't exceed the performance of an average driver in this regard, it shouldn't be on the road.
Thank you both for the replies, I was unaware of that, which is why it didn't make sense. I still think at some point Uber might try to sabotage self-driving cars if it looks like it's going to kill its business. I mean, realistically, if you could use your own car to drop you off when going out or pick you up, then Uber is mostly redundant - and then whoever does it isn't going to license the software when they can use it to make a lot of money themselves.
I still think at some point Uber might try to sabotage self-driving cars if it looks like it's going to kill its business
With the second or third such 'accident' they will get their license to test self-driving cars revoked by the relevant federal agency, so there's no getting out from underneath that by moving the testing to a more lenient state. The other companies might get a couple of restrictions at most, if any, when they can show their accident rate being way lower - especially if they have zero fatalities thanks to their setup.
Which would leave Uber at a severe disadvantage, still having to use badly-paid wetware to provide their service, when the others can reach full vehicle autonomy much sooner and leave the scraps to Uber.
"if you could use your own car to drop you when going out or pick you up then Uber is mostly redundant"
Except that once self-driving cars become the norm, the paradigm shift is that taxis become so cheap that it's not worth having your own car. In addition, a slightly larger taxi (6-8 seats) displaces buses (entrained for peak services), breaking away to shuttle or taxi operation (or simply parking) outside of peaks - which removes the vast inefficiency of bus routes outside peak periods.
Uber thinks it's going up against taxi companies when in fact it's going up against bus operators, taxi operators, car makers and other consortia. Self-driving vehicles is an inflection point that turns the entire transportation market on its head.
The advent of the affordable robocar means that personal vehicles will once again be something that only 10% of the population own. The difference is that they'll be something that 90% of the population USE.
The problem at the moment is companies like Uber whose aim is to "break taxi monopolies" and make a quick buck - mostly from investors - are focussed on profit instead of R&D. We're heading into Railway Mania territory.
There's still a very good reason to own your own car, which you will smell or sit in on a regular basis if you call for a ride late at night or the morning after. For some reason people think having a few cameras in the car will prevent this, but ask any cab driver how often someone gets sick in his car and that's with an actual person there. There's a lot less social pressure if there is no human in the car.
Sure, they can say "if you make a mess we'll charge you for cleanup", but how is the CAR going to know cleanup is required? Even if it can, how is the cleanup going to happen? Are the cars going to include a souped-up Roomba to clean vomit off the seats and floor, and what will it be able to do about general stank from when someone with nasty BO sits in it on a humid day? Or will it simply drive itself to a place where some poor minimum-wage slob has to clean it?
Thanks, but I'll stick with owning my own car even if I could save a little money otherwise.
"and with no light but the cars "
As other people have demonstrated on Youtube, Uber have cranked the gamma of their video HARD to make it appear like an unlit rural road when in fact it's a relatively well lit urban one.
I was fooled too, as what I saw was on par with what my own smartphone camera DOES show on unlit rural roads, however when there's even a single streetlight around it's far more like the other youtuber videos than the Uber one.
They've done everything they can to try and blame the victim on this one and it's backfired. Despite the stupid american anti-pedestrian laws the fact remains that a human will react to and avoid a pedestrian being where they shouldn't. Uber fucked up on an epic scale and instead of putting up their hands to the fact, tried to weasel out of it by doctoring the video and failing to release the sensor data (which would have shown the cycle being detected at least 150-200 feet out)
I'm not victim blaming at all.
Uber are a despicable company, who loses out if self driving cars become a reality? Uber's business model disappears immediately, why would you need to call a cab when you can just use a self driving car? There is your motive. The facts of this case feel off. Disagree if you wish but there is something not right with this.
and what happens if someone else gets there first? Bye bye Uber, they won't be able to compete with someone that doesn't have to employ drivers. Therefore it is in Ubers interest to stifle self driving cars or do you think they will just take a chance that they will win given their track record of abuse.
Let me ask a couple of questions to back this up.
Why release the video? What purpose does it serve other than to push public opinion against self driving cars?
Who do you think will get self driving cars first if ever? Who has the most money and best r&d to make it happen? (hint: google)
Like I said this is my opinion based on the fact that something feels off with this whole incident.
Why release the video? What purpose does it serve other than to push public opinion against self driving cars?
The police fairly quickly said Uber did not seem to be to blame, suggesting that no human driver could have done any better. We were all left with the impression that 'some idiot' had jumped out in front of the car and kablam!
It's a fear we all have as drivers, something we know can happen, have seen happen, and, for a few, have experienced.
I think the video may have been released because it was felt it would only confirm a 'these things happen, there could have been no better outcome' perspective.
And it might have been to help shift blame from Uber, their set-up and technology, on to the driver.
"Homeless Junkie Killed Jumping In Front of Robot Car Carrying Trans Armed Robber" appears to have been a genuine newspaper story trying to lead people that way.
I don't think they realised that not everyone had been drinking the Kool Aid.
If you don't own your own car, you will have to hire one. Someone has to own them, and Uber has (at least for now) deep pockets thanks to their inflated valuation and will have the advantage of already having their app installed on many people's phones.
Self-driving cars are a dream come true for companies like Uber and Lyft - they don't have to pay drivers or worry about when they want to come to work. Uber would control the cars and have them "work" when and where they want. If there's a football game in town next weekend, it can send cars from several nearby cities to ensure there is enough capacity to meet the increased demand... they can't order their drivers to do that today, so they have to use surge pricing instead (they might make more profit per ride from that, but they serve fewer customers, so they'd probably make more overall by serving them all at normal rates).
"Proper road crossings were two miles either side of where she crossed, if only she had a bike."
At which point, AC demonstrates the car-centric american point of view of the world, with rules that "thou shalt ONLY cross at designated points and thou SHALT give way to vehicles" that gave rise to the programming mindset which killed this pedestrian.
A "self driving" vehicle programmed like this would last less than 30 seconds in test mode in any european city before being ordered off the road.
Scratch that - It would never pass the initial hazard testing required to be allowed ON the road in the first place.
People when sitting for hours doing nothing, as was this safety passenger (NOT a driver), become bored and disengaged from their surroundings. Add to that the fact that the passenger is not directly involved in driving, so that when they see something ahead that could possibly go wrong there is a tendency to wait a moment or two to see if the car is going to correct itself; a delay that could very easily leave it too late to change the situation. This, apparently, is what humans do. All of us. Having a human safety passenger is deceptively comforting. In a real emergency they are very likely to fail, as did this car's systems.
"People when sitting for hours doing nothing, as was this safety passenger (NOT a driver), become bored and disengaged from their surroundings. "
The US Government contracted Disney to produce a bunch of road safety and Driver Ed films in the 1950s and 1960s - they may be 50+ years old, but they show this problem has been around for at least as long as freeways have existed. (it used to be called highway hypnosis and distracted driving)
You can find them on Youtube without much difficulty
It would be interesting to know what action the human driver took to avoid hitting the cyclist.
None. Which is normal. If you have any degree of assist beyond basic lane following you start getting bored off your tits and doing other stuff. As recent crashes with Tesla demonstrate as well.
But someone in control of a multi-ton piece of machinery - especially one that, on average, needs to be "managed" more often than every 13 miles, and that is still under test because it isn't proven technology - as their job?
They need to be paying attention at all times, no matter if it gets boring.
They are there to be the lifeline in the system.
There is no excuse for being inattentive in this situation, and I would hold the driver - yes, driver, because they are sitting in front of controls and are responsible for the vehicle - as responsible for this as Uber itself.
"If you have any degree of assist beyond basic lane following you start getting bored off your tits and doing other stuff. As recent crashes with Tesla demonstrate as well."
At least the Tesla (and the unmodified Volvo) will attempt to avoid pedestrians.
In the case of Joshua Brown (and the Chinese crash into a road-cleaning vehicle a week or so later), it's worth noting that Tesla manuals explicitly warn that the driver aids WILL NOT protect against stationary objects in the roadway ahead when travelling in excess of 50mph.
Yes I know that Joshua's Tesla interpreted the side of the trailer as an overhead gantry sign. What's of more interest to me is that he never went anywhere without his dashcam and that device has never been found. There's a lot that isn't explained about the crash and its aftermath.
It always confused me too. You watch Columbo or Starsky and Hutch or NCIS or anything like that, and the police protagonist is forever nipping across the road to have a word with the officers or agents sat in a sedan on surveillance opposite an apartment block. Surely that's jaywalking! Or does it only apply to some designation of road above a certain threshold like the UK's A road, B road and unclassified?
No they don't. Maybe they do where you live. Sure as fuck not where I do. Strangely enough, that's the sole point of the existence of marked pedestrian crossing points. No idea how it works in the US, but considering the mere existence of the term "jaywalking" your odds don't look good.
UK: As soon as a pedestrian has set so much as a toe in the road, they have right of way.
If you are a driver and you seem someone stepping into the road in front of you, you stop for them.
It doesn't matter whether that person is young, old, sober, drunk, suicidal or even Boris Johnson.
To the phantom thumb downers... a "right of way" is a legal term meaning [from Black's] "The right of passage or of way is a servitude imposed by law or by convention, and by virtue of which one has a right to pass on foot, or horseback, or in a vehicle, to drive beasts of burden or carts, through the estate of another." and "'Right of way', in its strict meaning, is the right of passage over another’s ground; and in its legal and generally accepted meaning, in reference to a railway, it is a mere easement in the lands of others, obtained by lawful condemnation to public use or by purchase."
That is to say that for a public road, vehicles and pedestrians both have rights of way.
The term you are thinking of is "priority", which applies only at junctions. If pedestrians simply had priority by virtue of putting their foot on the road, then we wouldn't need zebra crossings, would we?
It wasn't a point of view, it was a fact of law. You can check it out in The Zebra, Pelican and Puffin Pedestrian Crossings Regulations and General Directions 1997 if you want. Or look in the Road Traffic Act 1991.
If you want to cross the road and you use a couple of parked cars as a shield, then you're stood in the road, but you still have to wait until the road is clear if you want to cross safely. Drivers do NOT have to stop to let you cross in that situation.
Precedence and priority are the terms used in the applicable law, not "right of way". I wasn't condoning knocking people over, just correcting a misunderstanding. Pedestrians do not have automatic priority just because they step into the road. If anything people who think that is the case, as pedestrians, are a danger to themselves and other road users.
"I'm not saying right or wrong that a pedestrian has to yield outside a protected zone, but it is Arizona law."
Does the law also make it legal for a driver to run a pedestrian down even when it is possible to stop or go around them? Or shoot them dead and then drive over the body?
The other videos which have been uploaded make it clear that Uber doctored their video to make it appear that the pedestrian was not visible until less than 2 seconds before impact when it's clear the area is relatively well lit and she would have been visible for at least 5-6 seconds.
Under such circumstances _regardless of traffic laws regarding who must yield_ a human driver would usually be charged with vehicular homicide or reckless driving causing death. This might still happen.
With regard to 'Jaywalking' and the other pedestrian-hostile laws of the USA, I suggest you look at the following URLs: https://en.wikipedia.org/wiki/General_Motors_streetcar_conspiracy and https://www.youtube.com/watch?v=p-I8GDklsN4
I've tried really hard not to jump on the Hate_Uber bandwagon, but am finding it increasingly difficult. The more I see and read of this case, the more I think they are trying to hide something. Other videos taken by local people on the same road tell me that this is a well lit road, wide and open. So why was the released video so dark? Either they are using the shittiest cameras available, and possibly other equipment also, or they doctored that video in an attempt to control the story by creating a first impression that makes us think she suddenly appeared out of the darkness and there was no way that either human or machine could have avoided her, thus passing blame onto the victim herself. The footage of the safety passenger confirms this impression - calm, calm, then sudden OOOH, shock. Damage control. Judging by comments seen on these forums they were somewhat successful. First impressions tend to stick, even when discredited.
I'm finding it very difficult to avoid the conclusion that the way this was initially presented was a cynical exercise in audience manipulation and guilt avoidance. Shame on them, local politicians included.
I'm thinking of creating a service, via an app, where one can quickly and simply call on a less regulated, sort of community-based, bandwagon. It will be cheaper and easier than existing bandwagon boarding, providing a challenge to the established norms. And you won't be obliged to pay for hidden extras like pitchforks and burning torches either, unless that's a service that you particularly want, where other micro-operators will step into the market gap as complementary service providers.
Uber may be guilty of cutting corners, but this may not have been caused by that so much as by our exaggerated expectations of how these vehicles should behave. Most telling in this case is that the car did have a backup driver, a human, and that human also failed to react to the cyclist. This suggests that there was going to be a collision no matter who or what was driving.
I don't know this road, but I do live in the US and have driven extensively in both the UK and the US. Driving at night in the US is far more difficult than in the UK because the road markings, signs and lighting in the US are markedly inferior to the UK's. I live in fear of pedestrian crossings here, for example, because they have a habit of appearing out of nowhere -- you really need to know where they are -- and attempts to highlight them with street lighting invariably make things worse with glare.
Ultimately, though, it's a see-and-be-seen situation. You'd be amazed at the number of people who get hit by trains in the US, both pedestrians and drivers. Our trains are 12 feet tall, have numerous bright lights at the front and a horn that can be heard miles away (which they use a lot more than in the UK). People still don't notice.
...Uber self-driving car death riddle: Was LIDAR blind spot to blame?...
We assume that the system is designed to detect objects and avoid hitting them.
So it has to see an obstacle, correctly identify it, and take some avoiding action, like braking.
We know that the car took no avoiding action. So, assuming that the brakes were working, there is a problem in either:
1 - the detection
2 - the identification
3 - passing the message to the avoidance process.
I assume that 'a blind spot' refers to a failure of item 1). And without knowing anything else I have a 1/3 chance of being right...
It shouldn't be hard to reproduce the accident scene, and find what went wrong.
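The three failure points enumerated above amount to a simple pipeline: detect, identify, act. A minimal sketch (all names and numbers are hypothetical, purely illustrative; this is not Uber's actual architecture) shows how a fault at any one stage silently produces "no braking" at the end:

```python
# Minimal sketch of the detect -> identify -> avoid chain described above.
# Hypothetical names and thresholds; not Uber's real system.

def detect(sensor_frame):
    """Step 1 (detection): return raw obstacle candidates from sensor data."""
    return [obj for obj in sensor_frame if obj.get("range_m") is not None]

def identify(candidates):
    """Step 2 (identification): keep candidates classified as things to avoid."""
    return [c for c in candidates if c.get("kind") in {"pedestrian", "cyclist", "vehicle"}]

def avoid(threats, speed_mps):
    """Step 3 (avoidance): brake if any threat is inside stopping distance."""
    stopping_m = speed_mps ** 2 / (2 * 7.0)  # ~7 m/s^2 assumed hard braking
    for t in threats:
        if t["range_m"] < stopping_m + 10:   # +10 m safety margin
            return "BRAKE"
    return "CONTINUE"

frame = [{"range_m": 25.0, "kind": "pedestrian"}]
print(avoid(identify(detect(frame)), speed_mps=17.0))  # 17 m/s ~ 38 mph -> BRAKE
```

The point of the sketch is that a bug in any one of the three stages (a dropped detection, a misclassification, or a message that never reaches the brake controller) gives the same outward behaviour: the car simply drives on.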
What I wonder is, even if we admit that the lidar didn't see her, at some point she was in the headlights, and the cameras should have seen her. For almost a second. Too late to avoid the accident, but never too late for emergency braking, reducing the damage. If it takes more time for a computer to recognise the situation and react than for a human, we have another big problem.
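Even one second of hard braking matters, because impact energy scales with the square of speed. A rough illustration with assumed numbers (not crash data):

```python
# Back-of-envelope: effect of braking for the final second before impact.
# All figures are assumptions for illustration, not reconstruction data.
v0 = 17.0     # initial speed, m/s (~38 mph, roughly the reported speed)
a = 7.0       # assumed hard-braking deceleration on dry asphalt, m/s^2
t_brake = 1.0 # braking applied for the final second before impact

v_impact = max(0.0, v0 - a * t_brake)
print(f"impact at {v_impact:.1f} m/s instead of {v0:.1f} m/s")
print(f"kinetic energy cut to {100 * (v_impact / v0) ** 2:.0f}% of the unbraked impact")
```

Under these assumptions the impact speed drops from 17 to 10 m/s, about a third of the original kinetic energy, which is often the difference between a fatal and a survivable collision.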
This strikes me as another weakness of the human monitoring. *If* we take the video on trust, then the car was being driven with dipped headlights which made it difficult for the human driver to spot the pedestrian.
But I just cannot imagine the human driver switching the headlights between dipped and full beam all the time if the car's automation doesn't depend on it.
But there's so much about that video that makes you think WTF. And indeed the whole situation.
*If* we take the video on trust, then the car was being driven with dipped headlights which made it difficult for the human driver to spot the pedestrian.
Don't take the video on trust, it gives a wholly misleading impression of the lighting conditions.
See here for a more realistic view:
Using dipped headlights was entirely appropriate in the circumstances: the area where the accident happened is well lit with street lights and the human driver would have been able to see the pedestrian for at least 400 yards, probably more.
"This strikes me as another weakness of the human monitoring. *If* we take the video on trust, then the car was being driven with dipped headlights which made it difficult for the human driver to spot the pedestrian."
Someone posted dashcam footage, at night, of the place where the incident happened. It's very well lit. Even driving with no headlights, a human driver would have seen the lady and her bike in the road.
Too late to avoid the accident, but never too late for an emergency break,
If the parts the car is supposed to break into are small enough, and dispersing sideways, the pedestrian about to be hit will have a much greater chance of survival, maybe even just suffer a few scratches and bruises.
(I've long fantasised about red-light cameras being able to send out a disintegrating beam, turning the offending vehicle into gooey snot. Subjecting the driver, not physically hurt very much, to howls of derisive laughter from all onlookers).
Who killed the driverless car? We did, with our Technology, that millennial data doppelganger of mass personalization and uniquely meaningful alienation. Make no mistake: neither the car's technology nor its failure played any part in its own demise. The most autonomous among us will feign futility, saying it was an impossible mechanized folly anyhow, but the culprit was of flesh and blood - a real human error. The human pedestrian failed to use the crosswalk, which was just 200 feet away, and then forgot to look before crossing a dimly lit street at night, because only a human could be so lazy and forgetful in such matters. The human driver also failed, although in a significantly more culpable and meaningful way. The vigilantly inattentive non-driver-driver. How ironic that a human should fail at such a trivially important task as a result of his utter reliance on and obsession with Technology. Although the failed Technology would have you believe otherwise, in reality there is no such thing as a self-driving car, and now perhaps there never will be. Because in the end the malfunctioning software, scanners and safety systems are of no consequence - this tragic death was caused by human failure.
To answer the question in the title of the article - no, for at least two reasons.
1. The pedestrian and her bike cannot possibly have been in a blind spot for the LIDAR. They were large, tall objects exactly in locations where the LIDAR should be designed to look, with nothing obstructing the view. If the LIDAR could not look in these locations there would be no point to it.
2. If the LIDAR is critical to safety then it should not be the sole sensor used. I have no idea how the vehicle is designed, but the article mentions multiple radars.
It is much more likely that something in the algorithms, or the implementation of the algorithms, was wrong, and that despite having input from sensors that picked up the woman and bike, the vehicle simply failed to brake or slow down in any way.
What is shocking about this incident is that it happened in almost ideal conditions: a long straight road without other vehicles, and a clear, unobstructed view for hundreds of yards of a pedestrian who was moving slowly but consistently. Something that hits an object under these conditions has serious flaws, and you would expect multiple independent systems to be in place to prevent such a collision. The only thing worse would be to hit a stationary object. Some combination of difficult conditions would make this understandable: a winding road with multiple vehicles, fog or smoke, a pedestrian who moved out from a location where they were obscured, a small child without a bike who was low to the ground, a slippery surface that was difficult to brake on, etc. None of this was present.
There should be a detailed investigation with a focus on risk management, because on the face of it the accident suggests negligence.
They hardly jumped out right in front of it. As the post directly above yours points out, they were crossing the road in the open long before the car even got close.
As other posts have made clear, the "dashcam" footage of the incident is very underexposed. That makes the scene appear a lot darker than it was, and makes the pedestrian become visible in the footage well after the point at which she should have been spotted, even by a purely optical system. Dashcam footage taken by other people on the same road shows that it is actually well lit.
... And this is what lack of regulation looks like. It's a shame that an ideologically driven Governor would overlook the lapses in candor, morality, and safety of a company, and its CEO just to try to score ideological points against a state that actually innovates and regulates.
Hint to honorable Arizona governor - there is no such thing as a free lunch, as you have now discovered. Time to pay the bill.
These links put the incident in a very different light to the dark dashcam footage:
The road widens at this point, although you can't see that in the original dark dashcam footage.
The lady with the bike was crossing her *fourth* lane by the time she came to grief.
- From the pedestrian's perspective, it's a lot harder to time/judge how fast a car is coming when she left the kerb so much earlier. It was not a wise place to cross (and is forbidden by signs), but her error in judging the timing is much finer, and more forgivable, than other reports or a view of the dashcam alone would suggest. As pedestrians, we've all made similar misjudgements a handful of times in our lives. The car would only have had to slow by 5-10 miles per hour a few tens of yards earlier, and the lady would have made it across the road; it would have been close, but she'd have been OK.
- As far as the car's navigation cameras / systems are concerned she most definitely did not "step off the curb into the path of the car" - she'd already crossed 3 lanes, and should have clearly been identified, and identified as being on a collision course.
The car definitely should have been able to prevent this tragedy.
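The "slow by 5-10 mph a few tens of yards earlier" claim is easy to sanity-check with some rough numbers (all assumed, not from the crash reconstruction):

```python
# Back-of-envelope check of the timing claim above. All figures are
# illustrative assumptions, not measurements from the incident.
dist_m = 40.0      # car's distance when a decision could have been made
walk_mps = 1.4     # typical speed of a pedestrian walking a bike
lane_left_m = 3.5  # roughly one lane width still to cross

t_to_clear = lane_left_m / walk_mps  # time the pedestrian needs
t_car_fast = dist_m / 17.0           # car holding ~38 mph
t_car_slow = dist_m / 13.0           # car slowed to ~29 mph

print(f"pedestrian needs {t_to_clear:.1f} s to clear the lane")
print(f"car arrives in {t_car_fast:.1f} s at 38 mph, {t_car_slow:.1f} s at 29 mph")
```

With these assumed figures the unslowed car arrives about 0.2 s before she clears the lane, while the slowed car arrives about half a second after, which is consistent with the comment: a modest early speed reduction turns a collision into a near miss.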
Found this from August 2016.
In summary, driver assist systems (collision avoidance) for (non-autonomous) vehicles aren't nearly as reliable at detecting pedestrians and cyclists as some people might like to believe.
At the moment there's no objective standard for how good the perception software of a self-driving car should be, so we are in danger of expecting just too much from the technology -- it has to be 100% all-knowing, 100% all-seeing and 100% capable of reacting to any situation without any error. This is a bit unreasonable -- it's an ideal to strive for, certainly, but it's just not attainable.
An unfortunate fact of life is that if you appear out of nowhere to any driver - human or machine - then there's a high likelihood that you're going to get run over. Motorcyclists are only too aware of this problem because their machines can move faster than humans can perceive their presence (put simply, the car that gets you will have a driver that says "I didn't see him, he came out of nowhere" which, unfortunately, is true quite a lot of the time). So, if you plan to cross the road, especially at night (and doubly so in the US where street lighting is still very much in the Stone Age) you need to be aware of who's coming and make sure that they see you and you both avoid each other.
Alternatively, we could just build in precognition into the software.....
Biting the hand that feeds IT © 1998–2020