Clear cut...
...reckless driving by the actual meatsack behind the wheel, no?
A woman has died after she was hit by one of Uber's autonomous cars in the US. The taxi app maker said it is cooperating with the cops in the wake of the death. According to police, Uber's vehicle was driving itself, although it had a human pilot behind the wheel, when it hit a woman crossing the street in Tempe, Arizona. …
From the San Francisco Chronicle:
"Pushing a bicycle laden with plastic shopping bags, a woman abruptly walked from a center median into a lane of traffic and was struck by a self-driving Uber operating in autonomous mode."
and
"From viewing the videos, “it’s very clear it would have been difficult to avoid this collision in any kind of mode (autonomous or human-driven) based on how she came from the shadows right into the roadway,” Moir said. The police have not released the videos." (Moir is the local chief of police)
This is about what I expected.
Unlike humans, machines do not typically go off task and forget to pay attention. They'll never be perfect, but nothing is. The practical question becomes: 'on average, are they at least as safe as human drivers?'
It sounds like the vehicle was operating in a lane where the probability of pedestrians or bicyclists was lower than, for example, curb lanes... a good choice for limiting the chance of the sudden incursion of a pedestrian into the lane. Of course, probabilities are not certainties.
Autonomous vehicles have been doing road testing for years, and it seems like they have a reasonable grasp on the basics, at least for easy conditions (no snow, fog, ice, cold, massive crowds, parades, demonstrations, etc).
"It sounds like the vehicle was operating in a lane where the probability of pedestrians or bicyclists was lower than, for example, curb lanes."
In a country where pedestrians are second class citizens, and where road rules against crossing like this are generally enforced vigorously as an income earner for the local authority.
All of which adds up to programmers making assumptions that turn into clusterfucks.
Alan, you seem to have more information than the rest of us. How did you know the Uber car didn't slow down for the pedestrian?
Also, it was news to me that Arizona was like the Eastern US with regards to pedestrian right-of-way. I always assumed they were like the other Western states, which have very different customs and laws from the East. But then, from my limited experience, I would have thought that Europe in general and the UK in particular were more like our East coast: Walker beware. Certainly the attitude of a London cabbie toward my walking behavior on my first visit to that area told me to assume cars there were out to get me. I was young and grew up in the West. It was a learning experience.
"Alan, you seem to have more information than the rest of us. How did you know the Uber car didn't slow down for the pedestrian?"
TBH, I don't, but the statements being made make it clear that the pedestrian was in a "non-allowed" position and made an "unexpected" move.
That, coupled with what I've seen of autonomous vehicle behaviour and the assumptions coded into it, leads me to believe that it simply wasn't coded to handle the situation, and that the minder wasn't paying attention until the victim had already bounced off the front of the vehicle. Those coded assumptions are limited by the cultural assumptions of humans and "road rules", without regard to the fact that we unconsciously react to a bunch of other stimuli that amount to common sense, such as "that pedestrian shouldn't be there, switch to high alert mode". Google specifically modified their supervisory roles because they found the humans who were supposed to be paying attention were spending almost all of their time with eyes inside the cabin.
You can _never_ "assume" when driving and one of the most important assets about driving automation is that a robot should be paying 100% attention for anomalous items and potentially hazardous behaviour 100% of the time. Pedestrians shouldn't be on the freeway either, but it happens, as do deer. I don't want my robocar tangling with a 7 point stag because it doesn't recognise it as a hazard.
You've never been to AZ
1) generally flat terrain
2) economically driven, substantial pedestrian traffic
3) wearing of dark colours - the need to not stand out [economic and personal safety issue]
4) http://ktar.com/story/443238/tempe-police-step-up-efforts-to-enforce-pedestrian-laws/? [mistaken about definition of crosswalk, which most of AZ seems to be]
Saying this, have no idea of the circumstances.
Some folks in SoCal prefer the train.
Sister-in-law was a Jane Doe for about two weeks, and I prefer to think she was confused and wandered into traffic.
Do miss Uber testing in the neighborhood where dust storms and wet microbursts are common. Set up E-Band propagation range kitty-corner across intersection. Wanted to see them earn their 9's without mm-Waves.
Mock not. I have seen almost the exact same accident.
Elderly and preoccupied and deaf person steps out straight in front of car. No chance whatsoever.
All this self righteous guff about 'stopping in the distance you can see' is meaningless if somewhere closer becomes occupied unexpectedly.
The only safe place for a car is in a garage.
Some risk is unavoidable, elsewhere.
"somewhere closer becomes occupied unexpectedly"
Somewhere closer never becomes occupied unexpectedly. It always becomes occupied because something a little further away moved there (time travel excepted). Just as traffic lights never go red unexpectedly.
So if you're prepared to go slow enough and are perfectly attentive you can avoid all collisions.
Sure, that means you have to go really slow on narrow streets with cars parked either side and you end up being limited to around 20mph in built up areas, but is that really so much hardship for a world with no pedestrian road traffic injuries?
Avoiding fast moving objects like deer and other vehicles is a much harder problem.
The bit you cut out did mention the parked cars...
You don't end up limited to 5mph. That's only because of the "thinking time" needed for humans. An autonomous car tracking all the potential hazards shouldn't need that. And you don't need to come to a complete stop, just slow enough that the injury is the same as running into a brick wall. There are formulae for stopping distance. Run them and you get 16mph to 23mph depending on the street and visibility.
No, at most 5mph, as that is 2m braking distance... and that assumes instant determination of collision risk and instant full braking power. Both fail to happen.
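The back-of-the-envelope arithmetic both sides are arguing about here is easy to sketch. This assumes a hard-stop deceleration of about 6.5 m/s² on dry tarmac and a 1.5 s human reaction time; those are my illustrative figures, not anything established in the thread:

```python
MPH_TO_MS = 0.44704  # miles per hour -> metres per second

def stopping_distance(speed_mph, t_react_s, decel_ms2=6.5):
    """Reaction distance plus braking distance v^2 / (2a), in metres."""
    v = speed_mph * MPH_TO_MS
    return v * t_react_s + v * v / (2 * decel_ms2)

for mph in (5, 10, 20, 30):
    human = stopping_distance(mph, t_react_s=1.5)
    ideal = stopping_distance(mph, t_react_s=0.0)
    print(f"{mph:2d} mph: human ~{human:4.1f} m, zero-think-time ~{ideal:4.1f} m")
```

On these numbers the gap between a human and an idealised instant-braking machine is almost entirely reaction distance, which is exactly the "thinking time" point being disputed.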
Someone can jump out between two cars to cross the street and avoid a puddle. This is not theoretical, as it has happened to me!! From behind a van it is impossible to see, and there you have it: a flying pedestrian who cannot stop and appears right in front of you. It happened in a 32mph zone... and only because I was looking for a parking place was I able to stop, barely touching the pedestrian.
"a flying pedestrian who cannot stop and appears right in front of you. It happened in a 32mph zone"
Was the speed limit appropriate for the street?
Of course not.
30mph was an entirely arbitrary speed chosen in the early 1930s when a period of no speed limits resulted in a rapidly increasing number of crashes and pedestrian deaths. An urban limit had to be set and 30mph was chosen as a compromise between those wanting to return to the old 20mph limit and politically-connected advocates for 40-50mph on urban roads.
There's a more basic fail in the assumption being shown by motorists that they have more right or priority to use the road than anyone else. In law, they don't and in most countries, motor is required to give way to non-motor which is required to give way to animal or foot traffic - and using vehicle size to intimidate is a serious criminal offence.
"in most countries, motor is required to give way to non-motor which is required to give way to animal or foot traffic - and using vehicle size to intimidate is a serious criminal offence"
You should see the People's Republic of China. The only country I've been to where drivers, pedestrians and bicyclists have absolutely zero regard for others, the laws of the road and the laws of physics. Especially the laws of physics.
From the article: "it’s very clear it would have been difficult to avoid this collision in any kind of mode [autonomous or human-driven] based on how [the victim] came from the shadows right into the roadway."
One of the selling points of autonomous vehicles is that they are not limited by human senses. They can see under vehicles and around corners and they can certainly detect that someone was lurking in the shadows even if it was between two parked cars on the side of the road. If I as a driver notice someone or something lurking there, I pay attention, slow down or otherwise take precautions for just this eventuality.
Two implications of the above quote are that police are simply unfamiliar with the capabilities of self-driving vehicles or that they are and those abilities are being overstated.
"Two implications of the above quote are that police are simply unfamiliar with the capabilities of self-driving vehicles or that they are and those abilities are being overstated."
Not really.
Like a human driver, an AV cannot assume everyone standing at the side of the road is going to wait until the vehicle is almost there, and then step in front of it. If you did that, you would have to ban pedestrians from roadsides in order to make a road functional.
Looking at pictures in the various reports, the road looks like a six or seven lane arterial road, and one report put the speed limit at 56 kph (35 mph).
A stopping distance calculator for this gives a stopping distance of 29m, of which 10m is 'thinking time'.
While a machine may calculate faster than a human, given recognition algorithms, etc, it may be significantly slower than 'zero think time'.
In any case, if the pedestrian steps out 10m in front of the vehicle, they will be hit at or close to full speed. There is no reasonable way to prevent this.
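The 29m figure quoted above is consistent with the UK Highway Code rule of thumb, which is easy to check: thinking distance in feet roughly equals speed in mph, and braking distance in feet roughly equals speed squared divided by twenty. A quick sketch (my framing of the rule, not from the calculator the commenter used):

```python
FT_TO_M = 0.3048

def highway_code_stopping(speed_mph):
    """UK Highway Code rule of thumb, returned as (thinking_m, braking_m)."""
    thinking_ft = speed_mph            # corresponds to roughly a 0.67 s reaction
    braking_ft = speed_mph ** 2 / 20
    return thinking_ft * FT_TO_M, braking_ft * FT_TO_M

think_m, brake_m = highway_code_stopping(35)
print(f"35 mph: thinking ~{think_m:.1f} m + braking ~{brake_m:.1f} m "
      f"= ~{think_m + brake_m:.1f} m total")
```

The same rule gives the familiar 23m total at 30mph, so the 29m/10m split at 35mph checks out.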
In parts of the UK at the moment they have a thing for lowering residential speed limits from 30mph to 20mph. But research in other areas of the country that have had 20 zones in place for a while has found that the risk of serious injury between car and pedestrian in a 30 zone vs a 20 zone is almost the same... it's no safer... and there are more incidents, as both pedestrians and drivers consider the risks of a 20 zone to be much lower, so no one is really paying attention any more. And the vehicles that didn't drive at 30 certainly don't drive at 20!
Council says it's revolutionary... yet it's weird... in the late 1980s I lived in France, where the speed limits in most residential areas were 30km/h, which is about 18mph.
Anyway, that's a kinda side note.... back to the conversation....
Somewhere closer never becomes occupied unexpectedly. It always becomes occupied because something a little further away moved there (time travel excepted). Just as traffic lights never go red unexpectedly.
So if you're prepared to go slow enough and are perfectly attentive you can avoid all collisions.
------------------------------------------------------------------------------------------
Absolutely and totally wrong.
1) Sometimes something happens that cannot be anticipated as a probable event ... such as an oncoming vehicle swerving into your lane and hitting you. The only way to avoid this is to never approach an oncoming vehicle. And yes, this happened to me. It was a while before I was comfortable when seeing oncoming headlights. There are a thousand variations of the unavoidable, unpredictable obstacle, including cars running red lights at speed when their approach is masked by buildings, people who have been standing on the curb for a minute as you approach, then suddenly starting to move by stepping into the lane, and so on.
2) Traffic lights do go red unexpectedly. It takes a failure rather than normal operation, but it does happen - if you drive enough, eventually you will see this. It has happened to me while approaching a light on two occasions.
3) Large vehicles can completely obscure pedestrians a metre from the lane you are using. In busy areas you can't avoid it, and driving 30 in a 60 or 70 kph zone is not safe.
4) The only way to achieve zero pedestrian road traffic injuries is to ban at least one of (a) roads, (b) all vehicles and transport animals including bicycles, or (c) pedestrians. And yes, pedestrians have been killed by collisions with bicycles and horses.
5) Humans cannot be perfectly attentive. Electronic driving systems, maybe. Humans - never. And if you want to achieve good reliability levels for attentiveness, you have to ban drinking, fatigue, lack of sleep, many medical drugs, cell phones, radios, music players, cigarettes, emotional stress, and passengers. Good luck with that.
"Somewhere closer never becomes occupied unexpectedly. It always becomes occupied because something a little further away moved there (time travel excepted). Just as traffic lights never go red unexpectedly."
I am surprised Adam52 has so many downvotes. He is completely accurate in this observation.
There must be a lot of absolutely bloody awful drivers on this forum if they don't agree with his statement and all I can say is that the sooner insurance companies notice that robocars have lower crash/claim rates and start making human drivers pay much higher premiums unless they pass advanced tests, with mandatory retesting to _keep_ those lower premiums, the better.
There's an old joke that most cars know the laws of physics better than most drivers and it predates the use of computerised driver aids.
Ok, so now hold that thought and apply it to a city like Paris, Madrid or London. In London you very much drive with the expectation that someone is about to jump in front of the car, walk across your blind spot, run past the back of the car even though they can see you reversing or (as has happened to me once) actually jump onto the bonnet (hood) and run over the car.
And that's pedestrians: now factor in London cabbies, bloody-minded bus drivers, cyclists, Deliveroo mopeds, your average White Van Man. Oh, and Über drivers pulling out without signalling or notice.
If they can kill a lady on a bicycle in a nice flat, open place like AZ then the only people who are going to benefit from Über Autonomous Cars are BÜPA, Axa PPP and a lot of funeral directors.
Whooa, let's wait up a bit to get the facts first, right? Uber or not Uber.
First, we don't know if the pedestrian did something that made avoidance really hard.
Second, I've always had my doubts about systems-good-enough-to-drive-almost-all-the-time-but-human-as-failsafe. If the human then fails, throw the book at him.
That model works well with autopilot systems on aircraft. But the crucial bit is that pilots have plenty of time to take over at cruising altitude, and they are most definitely in the loop, if not outright controlling, at crucial points like takeoff and landing. Commercial pilots are also top-level professionals in their field, and they benefit from decades of massive investment in researching failure root causes in aviation.
Expecting a trained operator, who is a passive bystander almost all the time, to react instantly to avert an accident each and every time one is needed is based on an unrealistic psychology of how humans work. Yes, an attentive backup driver may get it right 99% of the time, but it won't be like someone who is already in control of the vehicle. This is true for test drivers, but it will be 10x as true if regular Joes and Janes are expected to instantly correct bungles by their autonomous vehicle.
Even with a backup driver, the AI really needs to be very, very good at avoiding accidents. This is going to require some rethinking of test protocols, even if AIs surpass regular human drivers in safety. Maybe we also need to mandate some failure-analysis collaboration between competing AI companies - we don't want the same mistakes made over and over due to commercial secrecy.
Thoughts to the family of the killed pedestrian. And, yes, to the driver and engineers too. This is a sad moment, no need for gloating and finger pointing. And, yes, that remark extends to the headline, despite my dislike of Uber.
"Even with a backup driver, the AI really needs to be very, very good at avoiding accidents. This is going to require some rethinking of test protocols, even if AIs surpass regular human drivers in safety"
Many of us have been expecting something like this to happen sooner or later if automated vehicles were allowed to be programmed by american drivers, due to the uniquely pedestrian-hostile environments and laws in that part of the world.
Humans don't like running things over (even animals), but if you program your vehicle with an assumption that legal road rules are really the way things are then you'll get robotic killing machines if people don't do as 'expected' and only cross at "crosswalks", or with lights.
Programmers from other countries know that pedestrians have priority on the road at all times and are legally allowed to step onto the road from anywhere, so will make sure the machines are setup to react accordingly - and ensure that if a pedestrian 40 metres ahead looks to be about to step onto the road, that the vehicle is already slowing down.
Arizona is one of the shittier states for pedestrian safety, which makes it a lousy place for testing robot cars - sure, you can test how they go when things are 'normal', but you have very few exception conditions appearing to exercise the "non-normal" testing space, and that means the supervising hoomun gets complacent, sleepy and too slow to react when things go pear-shaped.
"Maybe cities will start installing "safety" fences to protect autonomous vehicles from random human behaviour."
Like that works well....
https://www.gov.uk/government/publications/pedestrian-guardrailing-ltn-209
http://content.tfl.gov.uk/guidance-on-assessment-of-pedestrian-guardrail.pdf
http://content.tfl.gov.uk/Item09-Guardrail-Removal-Programme.pdf
"why was the pedestrian-cyclist crossing the road that that point?"
Unlike the USA, in most of the rest of the world a motorist is required to avoid running over pedestrians on the road no matter whether they're meant to be there or not.
If the car could have predicted the collision and pulled up then it should have. Even if it couldn't have stopped in time, it should have attempted an emergency stop.
38mph impact = 95% chance of death
35mph = 50%
30mph = 5%
20mph = 1%
At 38mph, most cars can stop in around 2-3 car lengths once the brakes are applied on dry road. It takes a significantly longer distance for a human to react and get foot from accelerator to brake.
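A rough sketch of that claim, using the standard constant-deceleration model. The figures here are my assumptions, not the article's: 8 m/s² emergency braking, and a reaction time of 1.5 s for a human versus 0.1 s for automated braking:

```python
import math

MPH_TO_MS = 0.44704  # miles per hour -> metres per second

def impact_speed_mph(speed_mph, gap_m, t_react_s, decel_ms2=8.0):
    """Speed remaining when the car reaches a pedestrian 'gap_m' metres ahead."""
    v = speed_mph * MPH_TO_MS
    braking_gap = gap_m - v * t_react_s   # road left once the brakes bite
    if braking_gap <= 0:
        return float(speed_mph)           # brakes never applied in time
    v_sq = v * v - 2 * decel_ms2 * braking_gap
    return math.sqrt(v_sq) / MPH_TO_MS if v_sq > 0 else 0.0

for gap in (10, 20, 30, 40):
    human = impact_speed_mph(38, gap, t_react_s=1.5)
    auto = impact_speed_mph(38, gap, t_react_s=0.1)
    print(f"pedestrian at {gap:2d} m: human hits at ~{human:4.1f} mph, "
          f"instant braking ~{auto:4.1f} mph")
```

On these assumptions a pedestrian appearing 20m ahead is hit at essentially full speed by the human but not at all by the fast-reacting car - which is the point the comment is making about reaction distance dominating.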
Laws giving cars priority, 'jaywalking' and general pedestrian hostility on US roads are a direct result of decades of lobbying by the US car industry. The USA used to have the best public transport system in the world until it was systematically targeted and destroyed by General Motors between the late 1920s and 1955. They collected a couple of antitrust convictions for it, but that didn't stop the activity.
Unlike the USA, in most of the rest of the world a motorist is required to avoid running over pedestrians on the road no matter whether they're meant to be there or not.
Yes, obviously, but that doesn't mean it's your fault if you're on a motorway and you hit a pedestrian crossing it. Don't exclude the middle.
Having now seen the in-car video of the accident (it's a challenging watch), I find myself asking the same question I asked at the start. It really doesn't look like the human was engaged in any sort of meaningful supervisory role at all.
Assuming that the camera from the footage isn't anywhere near as light-sensitive as a human, it looks to me like there would have been enough time for an attentive driver to make the collision survivable or even avoid it altogether.
I wonder why the tests mandated any human in the car? It might give a veneer of safety, but they must know that the amount of attention a human is able to pay to the road when they've got nothing to do tends towards zero.
Here's a somewhat better camera making the same journey at night. The video of the incident is a complete con.
And that's where humans are superior. You see someone on the sidewalk that's walking erratically or towards the street and you cover the brake. Do self-driving cars even scan for behaviour of bodies on the sidewalk along the way? Do they recognise when little kids are playing ball? I thought not.
"Er, people driving cars hit cyclists and pedestrians all the time!"
You're quite right. And perhaps the overall accident rate will indeed go down. My point is that humans are superior at some things. I'm thinking that despite a lowering of accidents, the ones we will see may be of a different kind.
OK, to add another angle to the debate: AI systems will struggle to identify every pedestrian or cyclist likely to do something spontaneous that could cause an accident. In the UK, although not official policy, the driver of a vehicle is assumed to be the person responsible for an accident until proved otherwise. In considering the ability of AI-driven vehicles to be safe, perhaps those in government promoting self-driving vehicles should do more to promote road safety education for kids (the old "Tufty Club" and "Green Cross Man" of the 60s and 70s have never been fully replaced), and in particular rethink age limits and training for cyclists before they are allowed out on roads. Autonomous driving or not, that should help reduce accidents anyway.
I'm not a cyclist but I regularly visit the Netherlands. There are far more cyclists and far fewer cars there, and, as a result, their towns are far more pleasant places than our car-congested British towns.
So maybe the answer is actually to have fewer cars and more cyclists.
"There are far more cyclists and far fewer cars there, and, as a result, their towns are far more pleasant places than our car-congested British towns."
Until the 1970s, the Netherlands' roads were just as cycle-unfriendly and child-killing as the UK's.
They decided they'd had enough of it and that they wanted to change it. So they did.
"Taking cyclists off the roads would save a lot more accidents. Cycling is a Victorian mode of transport – incredibly dangerous on modern roads"
I have a feeling that taking cars off the roads would make them much safer!
If you look at the stats - the DfT publish them - then cycling has about the same injury rate per km as walking. Motorcycling on the other hand...
"I have a feeling that taking cars off the roads would make them much safer!"
I suppose we could all switch to trucks.
If you got rid of trucks and cars, then the roads would be pretty much useless and a lot of people would die from starvation, lack of medical attention, boredom, poverty, and so on.
As someone living in a rural area completely unserviced by public transport, with no car, no license and no disposable income (and therefore no means to either pay for removal services or buy a car), perhaps you'd care to explain how exactly I'm supposed to travel the 10 miles to my nearest town each day, for work, doctor appointments and everything else, if not by bicycle?
Anti-bicycle bigots are not only the lowest scum on Earth, they also seem to live in a privileged little bubble, completely disconnected from reality.
Very true. Still, no matter what you do, you can never fully eliminate all accidents. I read somewhere she decided to cross the road suddenly at a non-crossing place, giving neither the human nor the robotic operator time to react. Doesn't matter how fast your reaction time is or how good the electronic safeties are, you still can't beat the laws of physics: a 1.5 ton car moving at normal speed cannot come to a full stop instantly.
Anyway, we don't know the details of what exactly happened yet.
"I read somewhere she decided to cross the road suddenly at a non-crossing place"
Which shows the implicit assumption that people are only allowed to cross at designated locations, which translates to the programmed assumption that people WILL only cross at designated locations.
"giving neither the human nor the robotic operator time to react."
The robotic failure was the fault of the programmer(s) fucking up.
The human supervisor failing to notice and react to someone standing on the median shows what google found - human supervisors simply "switch off" after a while. I'll bet he or she wasn't even looking out the window.
"It is more likely that it was the cyclist that made the mistake."
Virtually all road crashes require AT LEAST 2 (usually 3-4) or more serious errors to be committed before they can occur.
In this case:
1: The pedestrian crossed at an unsafe position
2: The pedestrian crossed without paying attention to oncoming traffic
3: The vehicle driver failed to notice the pedestrian on the median strip and prepare to take action if necessary or move into another lane
4: The vehicle driver failed to take emergency avoiding action when the pedestrian showed intent to move into its path
A human might have noticed the lady some distance off and decided to move to another lane. This is a fairly common preemptive move that the robot programmers hadn't considered.
Emergent Tech? Not Rise Of The Machines?
Seriously though, if it turns out the "driver" was having a light snooze or something, there will be hell to pay.
Either that or it approached the intersection: "it's gonna brake... it's gonna brake... it didn't brake." - I'm wondering if the window between "oops shit isn't working" and "she's dead" was too small to actually do anything useful. By the time you realize it's not going to brake, it's too late.
@YAAC it was a cyclist, so probably entitled to not be on a crosswalk.
Though one of the tweets said it was at night, so possibly she had no working lights on her bike, or perhaps cyclists have not been included in the collision avoidance software.
Sad to be remembered for being the first to die in an autonomous vehicle accident.
I'm pretty sure that cyclists were included, but the dataset for AI training must have been rather small. I am also pretty sure there are several other scenarios for which the dataset for AI training is rather small.
The investment required to train the AI software properly is huge, and investors are such an impatient bunch...
Posted in response to: "...or perhaps cyclists have not been included in the collision avoidance software"
Are there countries where it's legal to run down pedestrians* on a normal road? There may be mitigation, but in civilised countries, it's the driver's responsibility to not drive into pedestrians (and many other things).
Certainly here in the UK the law has offences like "Driving without due care and attention", e.g. driving too fast past a line of parked cars when a kid runs out.
All the moral dilemma debates about autonomous vehicles having to choose who to crash into are missing the point. If the self-driving car has got itself into a position where such a choice has to be made, then the programmer / manufacturer has failed and someone probably ought to go to jail.
Hopefully this accident will cause a review into the reasonableness of testing on the public highway. It's now harder to justify it as being safe.
You assume the circumstances would have to be created entirely by the self-driving car, but that needn't be the case.
I can think of many driving scenarios, and indeed I've witnessed a few, where non-AI (a.k.a. human) drivers found themselves caught up in other people's stupidity, or in some cases mechanical failure, or even unforeseen weather conditions, or unexpected obstacles suddenly appearing (typically falling), from which there is no escape route.
In fact, it's happened to me.
I was cycling on a long straight with a sheer drop on one (my) side that would certainly have killed me, and came face to face with an idiot driver who decided that a cyclist wasn't worth worrying about as he overtook another car.
I could have gone left and plunged to certain death, gone right and into the path of the oncoming vehicle he was overtaking, or splattered straight into the overtaking vehicle's windscreen like a bug.
Another remotely possible alternative would have been for me to make a 90 degree turn directly into the grass verge on the right hand side, and just hope I could make it before the other car hit me.
Bear in mind that both cars were maybe 3 seconds away, and travelling at high speed on a country road.
The rational solution would have been for the overtaking driver to slam on his brakes, but then had he been thinking rationally then he wouldn't have been overtaking with an oncoming road user in front of him in the first place, so clearly that was never going to be an option, especially when the collateral damage was "only some shitty cyclist who shouldn't be on the fucking road in the first place".
I had plenty of time to think about this in hospital, including the viability of a cyclist doing a 180 and trying to outrun a lunatic driver going full throttle on the wrong side of the road.
Ultimately I had to concede that this was a checkmate scenario, and there was no way to win.
The question is, how does an AI handle no-win situations?
"If the self driving car has got itself into a position where such a choice has to be made, then the programmer / manufacturer has failed and someone probably ought to go to jail."
Not necessarily. The situation could be thrust upon them outside their control (a "Crap Happens" moment), like a tree suddenly falling onto the road. That's why the Trolley Problem exists: because Crap Happens. You can't account for Crap Happens moments because they can happen even when standing still, meaning the only alternative is to not do anything.
"If it helps, think of a noted scientist. A small child may grow up to be a serial killer."
The vast majority of noted scientists will be taller than the parked cars are high, so will be seen before stepping into the road. The vast majority of small children are...er...small, so you don't see them step between the cars, only when they appear on the road in front of you.
> The vast majority of noted scientists will be taller than the parked cars
Nonsense. Not unless you exclude both female noted scientists and large cars. (And in the US, most cars are large.)
Here’s a random website with car dimensions: https://www.automobiledimension.com/ford-car-dimensions.html I’m shorter than 11 of the 21 cars on that page. I’m male, but about average height for a female.
it's the driver's responsibility to not drive into pedestrians
But sometimes it's not the driver's fault.
UK official stopping distance at 30mph is 23m; if a small child runs out into the road 1m in front of my car then it may be my responsibility to avoid them, but the laws of physics disagree.
Ironically we may end up with new stopping distance rules that ONLY robot cars can comply with - e.g. thinking time is now <1ms and the stopping distance requires 4-wheel-drive DC motors + ABS.
Is that similar to the historical Official (Maximum permissible) Stopping Distance of "240 feet" from "60mph" as so often ridiculed by the famous but loveable orangutan on Top Gear?
There's a list somewhere of the 24 cars that can pull-up to a dead stop from 60 mph in less than 100 feet. A few of the cars were semi-normal, but most were high performance. There's another list for motorcycles where the very best is about 109 feet, a +20-odd foot delta that should be kept in mind.
I'm just saying that there is a practical limit to the "driver should always be able to stop".
If with perfect reflexes +perfect brakes I can stop from 30mph in 6 car lengths, then the kid is safe if they step out 6 cars ahead of me but not 4. So speed limit in school zones is 20mph = 4 car lengths. But what if they step out 2 cars ahead of me? Should I have been going 10mph? What about 1 car length?
You soon end up in Zeno's commute, where I get asymptotically slower as I approach the front bumper of a parked car
"You know, stopping is ideal, but slowing down and avoiding is an option. So you are saying if the kid jumps out 2 car lengths ahead, you wouldn't react and just run him/her down?"
If a child jumps out 2 car lengths ahead, there is no time to stop, or even slow down.
For a human being, the time to reach the kid and the time to see and recognize the situation are the same.
"You soon end up in Zeno's commute, where I get asymptotically slower as I approach the front bumper of a parked car"
No you don't. Most children will survive an impact at 20 miles an hour. Few will be fatally injured at 10. The idea that anyone would consider it OK to drive at 30 - or even 20 - anywhere where there is a possibility that a child 1m in front of you could enter the road is just incredible. Slow the hell down and pay attention to what is going on at the sides of the roads. Unlike an AI, a meatsack driver has no excuse for not anticipating what groups of children may do.
"The idea that anyone would consider it OK to drive at 30 - or even 20 - anywhere where there is a possibility that a child 1m in front of you could enter the road is just incredible."
---------------------------------------------
That would be pretty much any road except for a controlled access highway with tall fences on either side.
Which makes your suggestion ridiculously counter-productive... Although you could achieve the same thing by banning powered vehicles, animal transport faster than oxen, and any human-powered vehicle capable of more than 2 km/h. I wouldn't want to try to live in your world.
If someone steps out 1m ahead, most people couldn't avoid a collision while walking fast.
The biggest risk for a small child isn't impact but getting caught under and dragged, which can be fatal at any speed.
Hence the universal switch to massive SUVs and pickup trucks outside schools that a small child can walk under. They are just thinking of the children
> If with perfect reflexes +perfect brakes I can stop from 30mph in 6 car lengths, then the kid is safe if they step out 6 cars ahead of me but not 4. So speed limit in school zones is 20mph = 4 car lengths. But what if they step out 2 cars ahead of me?
That's not how physics works.
d=(v^2)/(2*μ*g)
v is your velocity
μ is your coefficient of friction
g is 9.8ish here on earth
In your example, the only variable is v. So if it takes 6 car lengths under some condition to stop from 30, then
Assuming a 5m car length and otherwise just doing SI unit conversions:
μ=(13.4^2)/(2*9.8*(6*5))=0.305
Plugging those back in for the 20mph case
d=(8.9^2)/(2*0.305*9.8) = 13.2m
That's a pinch over 2.5 car lengths. Not 4. So you would still hit, but at a *much* slower speed. Maybe they'd even survive.
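The arithmetic above can be checked in a few lines. The 5 m car length is the same assumption the calculation already makes:

```python
# Sketch of the calculation above: back out the friction coefficient from the
# "30 mph stops in 6 car lengths" claim, then reuse it for the 20 mph case.
G = 9.8      # m/s^2
CAR = 5.0    # assumed car length in metres

def mph_to_ms(mph):
    return mph * 0.44704

mu = mph_to_ms(30) ** 2 / (2 * G * 6 * CAR)   # ~0.31
d20 = mph_to_ms(20) ** 2 / (2 * mu * G)       # stopping distance at 20 mph
# d20 / CAR is roughly 2.7 car lengths, not 4: distance scales with v^2, not v.
```

The point stands: halving the speed ratio does much better than halving the distance, because braking distance goes with the square of speed.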
But if people undertake an activity that requires them to assess the safe speed for a certain visibility distance, like driving, say, they owe it to society to get a basic understanding of how speed and conditions affect stopping distances.
If with perfect reflexes +perfect brakes I can stop from 30mph in 6 car lengths, then the kid is safe if they step out 6 cars ahead of me but not 4. So speed limit in school zones is 20mph = 4 car lengths.
At 20mph rather than 30mph, some collisions become avoidable, but of at least equal benefit is the greater survivability of unavoidable ones.
In a UK school zone (around pupil arrival / exit times) you will probably be doing less than walking pace, as the road is chocker with polluting parked Chelsea tractors of parents driving their offspring to the school gates.
I used to live near a school - you would go nowhere near that road at peak pupil times as you would be gridlocked
"you will probably be doing less than walking pace, as the road is chocker with polluting parked Chelsea tractors of parents driving their offspring to the school gates."
Which gives rise to the other problem in the UK.
Speed humps are installed to protect children walking to school from being run over by cars containing those being driven to school.
"Speed humps are installed to protect children walking to school from being run over by cars containing those being driven to school."
Even worse:
1. Speed bumps distract drivers from looking for hazards, diverting them to watching for and dealing with bump after bump, taking attention and vision away from other areas and narrowing their focus while increasing task loading. As others have noted here and in many, many accident reports (aircraft, diving, and maritime ones are often very complete and informative), task loading can be a major contributor to operator error.
2. The less you mitigate the effect of the bumps, the more incremental stress and damage on steering, braking, tires, and suspension components, gradually reducing the reliability and emergency capabilities of the vehicle. In most cases this won't cause an accident, but most trips don't result in one either. By the time you talk accident you are already way down in the tail of the probability distribution, and you don't need any more issues that will make things worse, or change a not-accident into an accident.
This post has been deleted by its author
"I didn’t realize that those signs in the UK (who else uses miles and meters?) were minimums."
You're right, of course; the signs are a limit, not a target. Stopping distances and small kids stepping out from between parked cars mean that although the limit may be 30mph, the driver ought to be using their judgement as to what a safe speed is considering the obstructed view. In residential areas, depending on daylight, school hours, the width of the road, and other conditions, the driver should probably consider doing less than 30mph, possibly much less.
@mevets;
I'm pretty sure that is how the speed limit is applied and intended in the UK, it would be covered by either 'driving without due care and attention' or 'driving without due care and consideration for other road users'.
I was taught that you must drive according to road and traffic conditions so the limit is only the maximum when conditions allow.
"UK official stopping distance at 30mph is 23m, if a small child runs out into the road 1m in front of my car then it may be my responsibility to avoid them but the laws of physics disagree."
It was your responsibility to see them at the side of the road, or to recognise that you couldn't guarantee no one was there, and adjust your speed accordingly.
That turns the 23m stopping distance into something much more reasonable, and starts you at a speed which is far less likely to kill said small child if you still fail to avoid the collision.
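The effect of starting speed on outcome can be sketched with the same kinematics. The reaction time and deceleration here are illustrative assumptions:

```python
# Hypothetical sketch: impact speed if a hazard appears d metres ahead.
# v_impact^2 = v0^2 - 2*a*(braking room); numbers are illustrative only.
import math

def impact_speed_mph(v0_mph, hazard_d, reaction_s=1.0, decel=6.5):
    v0 = v0_mph * 0.44704                      # mph -> m/s
    braking_room = hazard_d - v0 * reaction_s  # distance left once brakes bite
    if braking_room <= 0:
        return v0_mph                          # hit at full speed
    v2 = v0 * v0 - 2 * decel * braking_room
    return 0.0 if v2 <= 0 else math.sqrt(v2) / 0.44704
```

Under these assumptions, a hazard 23 m ahead is still hit at roughly 17 mph when starting from 30 mph, but is avoided entirely from 20 mph, which is the poster's point about adjusting speed to sight distance.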
This post has been deleted by its author
This post has been deleted by its author
but the braking distance of a car is not reduced at all by having motors, as the limit is already defined by the tyre/road interaction
Is that necessarily true ? Why do supercars have huge actively cooled carbon fibre discs?
I was thinking that if a Tesla can do 0-60 in <3s with the torque of the motors, then assuming the battery can absorb electricity at the same rate, it should be able to actively brake at the same rate (a little better, since aerodynamics are working for you).
Why do supercars have huge actively cooled carbon fibre discs?
Because brakes convert kinetic energy to heat. A heavier vehicle, one braking from higher speeds (especially this), or braking more often needs to be designed to dissipate greater power in its brakes to prevent them overheating.
> Is that necessarily true ? Why do supercars have huge actively cooled carbon fibre discs?
Ah, the thing you are missing is the second corner. Even regular stock brakes can do an emergency stop from any speed your car can travel. The issue is when you want to do the same thing again 30 seconds later at the next corner. And again. And again. You cannot do that with stock brakes.
Your braking limit is the maximum deceleration at each tyre before it loses traction. That depends primarily on the road surface and the contact patch of the tyre. That is why wide-profile tyres and racing slicks improve stopping distances (in the dry). You also need to take into account that under heavy braking, your car's centre of gravity will shift forward (blame Newton), so you have higher traction on the front tyres but lower on the rear. Modern ABS braking systems continuously monitor each wheel speed (plus steering angle) to make sure each brake is doing the most that it possibly can beneath this limit to wash off the speed. There is some serious boffinry in these systems.
"Why do supercars have huge actively cooled carbon fibre discs?"
Because kinetic energy is proportional to the _square_ of velocity, and you dissipate 4 times as much energy going from 200km/h to 100km/h as from 100km/h to 50km/h.
If they weren't well cooled the brakes would stop being effective partway through slowing down from maximum speed (which used to happen in 1970s muscle cars and wasn't pleasant to experience)
Below 100km/h they make very little difference at all.
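The four-times figure falls straight out of the v-squared law. A quick sketch, assuming an illustrative 1500 kg car:

```python
# Sketch of the v-squared point above: kinetic energy shed in each braking band,
# for an assumed 1500 kg car (illustrative mass; the ratio is mass-independent).
def kinetic_energy_kj(mass_kg, kmh):
    v = kmh / 3.6                      # km/h -> m/s
    return 0.5 * mass_kg * v * v / 1000.0

m = 1500
high = kinetic_energy_kj(m, 200) - kinetic_energy_kj(m, 100)  # 200 -> 100 km/h
low = kinetic_energy_kj(m, 100) - kinetic_energy_kj(m, 50)    # 100 -> 50 km/h
# high / low is exactly 4: the 200->100 stop dumps four times the heat
# into the brakes that the 100->50 stop does.
```

All of that energy ends up as heat in the discs, which is why high-speed braking, not low-speed braking, dictates brake sizing and cooling.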
Yes, in Switzerland.
I've heard that the rules governing jaywalking in Switzerland, and its traffic regulations in general, are quite draconian. Apparently if you get hit by a tram then neither the tram driver nor the company is ever liable, under any circumstances, period.
It could be that my ex-pat friend is overstating things somewhat, but I'm sure there's at least a kernel of truth to it.
Years ago when I was in Turkey (1991 and again in 94), I was told cars have the right of way.
If you get hit it's your fault.
---------------------------------
That was my impression of New York City, at least where the taxis were concerned. If you were lucky, they slowed down while going through red lights.
I lived in NYC 20 years (moved to CA two years ago) and I can tell you the cabs know their **** will be pushed in if they hit somebody - it's not those guys I had an issue with.
It was the wankers in nice cars that would try to bust through the crosswalk when the pedestrian sign says WALK. Those bastards would try to barge past, though the true NYCers dared the mongrels to try it, whereby the car owner's **** would also be pushed in.
Amusingly the last time I saw a cab in an accident in NYC (non-injury is why it is amusing) was when a courier bicyclist blew through a red light and plowed into the side of a yellow cab, getting a bent front wheel for his effort.
That is one of the extra benefits of the taxi licence system: careless drivers/owners can lose their licence and hence their livelihoods. They are rude, but they try hard as hell to avoid actual accidents that can cost them money in more ways than one.
Probably Turkish cab drivers then? From what I've seen, the cab drivers are all immigrants from other countries. It's been a few years since I was there, but back then I had great cab drivers and drivers from hell, and for the most part they were all immigrants.
This post has been deleted by its author
"That was my impression of New York City, at least where the taxis were concerned."
You'll be happy to know that NYC has been installing cameras specifically to nail drivers who barge their way through pedestrians on crossings and they've taken at least 100 taxi drivers off the road amongst others.
It's not true in Turkey.
In a lot of countries the driver is fully responsible for ALL your medical costs, and as those are very high, they will try to ensure that if they hit you, they kill you, as this is much cheaper to pay off - to the point that many drivers will reverse back over someone they ran over to ensure that they finish them off.
In many "less developed" countries a driver who is silly enough to stop and try to help someone he's run over will be beaten to death by bystanders.
"Are there countries where it's legal to run down pedestrians"
'run down pedestrians' is a needlessly biased way of putting it, which slyly implies malevolence or carelessness.
Better to ask 'are there countries where vehicle/pedestrian collisions are not always assumed to be the fault of the driver'?
The answer is yes. Some of them highly civilized.
Are there countries where it's legal to run down pedestrians* on a normal road? There may be mitigation, but in civilised countries, it's the driver's responsibility to not drive into pedestrians (and many other things).
*If it helps, think of small children irresponsible parents and legal loopholes.
Fixed that for you, because you are only looking at the green grass and disregarding other problems. First, the law is not your parent. Parents are not excused for letting or leaving kids on their own on the road. Second, there are actual people who jump in front of cars to get hit, just to abuse this type of loophole for money.
So, yes there are civilised countries where you do not get punished if it is the pedestrians' fault for the accident.
Also, in countries with overall higher intelligence, people tend to put personal responsibility ahead of any legal responsibility, meaning they actually avoid jaywalking when they see a fast car approaching. Because when they or their kids die, they are dead regardless of what happens afterward.
What a stupid comment!
Of course there is nowhere where it is legal but, depending on circumstances, there are many countries where a driver can be held not to be at fault for running over a pedestrian.
Not being at fault (and therefore receiving no penalty) is NOT THE SAME as 'legal'.....
The TV news chyron in that image says "bicyclist" and there is a crumpled bike on the scene. Further, that spot is right where a new right turn lane starts, by crossing over the bicycle lane.
This post has been deleted by its author
"How many people are killed every day by human drivers???"
Globally about 3,500. But that includes a huge 25-fold variation in per capita rates per country. The question you should be asking is "how many miles per fatality have been driven by AVs at normal speeds* on US roads, and how does that compare with the figures for human drivers?"
* Because a lot of testing has been done at unrealistically slow speeds.
The USA typically has about 2x higher auto fatality rate compared to other developed countries - mostly because of low seatbelt use (although it is improving)
The rate for pedestrian injuries is lower - mostly because there are fewer pedestrians/cyclists in US cities and more commute miles are driven on freeways with no pedestrians.
How many people are killed every day by human drivers???
16 in the USA (and rising)
https://www.usatoday.com/story/money/cars/2018/02/28/pedestrian-fatalities/376802002/
"Although reasons for the recent rise have not been scientifically determined, experts suspect that smartphones and marijuana use are key factors in the deadly trend."
An initial report said it was a cyclist, but then the story was updated to say pedestrian.
The photos on (eg) USA Today show a crumpled bike, and name the victim Elaine Herzberg. Could we at least give her the dignity of a name? I'm saddened for the fact that she died whilst a money grubbing corporation experimented on the public, for her family, friends and colleagues, for the fact that this happened so early in AV history.
But I'm unsurprised Uber were involved. What the fuck will it take for this verminous, incompetent, predatory organisation to be shut down and rubbed out from history?
"Could we at least give her the dignity of a name?"
Is it REALLY necessary to attach a name to the victim? In the US at least, it's considered very bad form (not to mention an invasion of privacy) to prematurely identify victims. At the very least, time needs to be allowed to inform next of kin.
Early reports suppressed her name because her next of kin had not yet been informed. This is so people don't find out about the death of loved ones from a news report. They get told by a human in a more sympathetic way. Maybe it took longer to identify her next of kin because she was homeless.
Before this became common practice, there were the ambulance chasing press. Many people heard the doorbell ring, answered it, and some jackass would push a microphone into their face with a camera running and say something like: "Hi Mrs. XXX, we just came from an accident. How does feel to now be a widow?" Ok...maybe not quite that harsh but a lot of reporters were over the top shall we say.
Everywhere in the US, pedestrians have the right of way.
Everywhere.
Even if they are not in a crosswalk.
So yes, it is the drivers fault.
----------------------------------------------------------------------------------
No, it is not the driver's fault, until a court decides that it is.
It is the "driver's" responsibility to show that they had no reasonable chance to avoid the collision.
There is no driver, human or electronic, that can always prevent a pedestrian from causing an accident.
The question is, or should be, is an electronic 'driver' more likely to be involved in fatal accidents over the range of circumstances that occur in a large, diverse, and representative sample of accidents.
I suspect the answer is, or very soon will be, no. Given the human failings of distraction, addiction, emotion, fatigue, inattention, incompetence, and illness it is very likely that electronic replacements will be significantly safer for everyone. They will kill a few people, but far fewer than humans would under similar circumstances.
As always, policies based on statistically insignificant anecdotes are usually a bad idea.... however popular they may be with the twitter consuming masses.
"Everywhere in the US, pedestrians have the right of way.
Everywhere.
Even if they are not in a crosswalk."
Citation needed? Counter example (Vermont Statutes Annotated):
23 VSA 1052
(a) Every pedestrian crossing a roadway at any point other than within a marked crosswalk shall yield the right of way to all vehicles upon the roadway.
>"Everywhere in the US, pedestrians have the right of way.
> Everywhere. Even if they are not in a crosswalk."
>>Citation needed? Counter example (Vermont Statutes Annotated)...
+1
There are many ignoramuses in the US who assume because "they do it this way in my state" that it must be the same in every other state. That is often not true. Only a small minority of US states give pedestrians the right-of-way outside a crosswalk. A US state-by-state overview is at:
http://www.ncsl.org/research/transportation/pedestrian-crossing-50-state-summary.aspx
Not true, though they often behave that way.
"Vehicles must yield the right of way to pedestrians at plainly marked crosswalks and at intersections where stop signs or flashing red signals are in place. Pedestrians must yield the right-of-way to vehicles when crossing outside of a marked crosswalk or an unmarked crosswalk at an intersection." ncsl.org
Self-driving cars seem to be touted as the be-all and end-all for the future. In which case, surely the beast should have had enough electronic know-how to realise what was happening and either brake or avoid? Isn't this the very thing that is supposed to be their salient feature? "Safer than a human driver", I have seen quoted. Doesn't appear to be so in this case. I reckon they have many years to go before they can be trusted (if at all) and, speaking personally, I don't think I would EVER feel at ease in one.
Not that clever. Why the hell was an AI car driving at 38 in a 35 zone? That is totally and utterly inexcusable. I don't care that it is "only 3mph over"; the whole point of autonomous cars is exactly so that this sort of thing cannot happen. That in itself indicates that the software is of poor quality. Why should we trust any other function if it is incapable of doing something as simple as not exceeding the speed limit? This also brings in the other point people have commented on: the speed limit is the maximum, and speed should be adjusted to suit the conditions or situation.
It appears to have failed significantly on this.
Not that clever. Why the hell was an AI car driving at 38 in a 35 zone? That is totally and utterly inexcusable. I don't care that it is "only 3mph over"; the whole point of autonomous cars is exactly so that this sort of thing cannot happen. That in itself indicates that the software is of poor quality. Why should we trust any other function if it is incapable of doing something as simple as not exceeding the speed limit? This also brings in the other point people have commented on: the speed limit is the maximum, and speed should be adjusted to suit the conditions or situation.
It appears to have failed significantly on this.
--------------------------------------------------------------------------------------------------------------
You are touting legalisms to the detriment of safe driving.
Safety experts are quite clear that as long as road and weather conditions permit, for a good driver and a vehicle in good condition, the safe speed is the one that meshes with the flow of traffic.
On a good road in good weather, this is almost invariably reasonably higher than the speed limit, and comfortably below the actual design speed of the road in question.
When all the other traffic is moving faster, a vehicle refusing to exceed the speed limit is an egregious safety hazard, unless that speed is justified by a compromised vehicle or driver - in which case it should be looking to get off the road.
"In which case, surely the beast should have had enough electronic know-how to realise what was happening and either brake or avoid? Isn't this the very thing that is supposed to be their salient feature? Safer than a human driver I have seen quoted. Doesn't appear to be so in this case."
---------------------------------------------------------------------------------------------------------------------
How on earth could you possibly know that?
So far I have not seen any information on what the pedestrian did, or what other things may have been happening at the time. There is absolutely no reason at this time to presume that a human driver would have been any less likely to hit the pedestrian.
You are jumping to the conclusion you want, not reasoning logically.
"You are jumping to the conclusion you want, not reasoning logically."
It'll also be interesting to see, from all the recorded data, how quickly, if at all, the human supervisor reacted and what s/he did. This will have a real bearing on non-fully-autonomous cars and how alert a "driver" may be when the car is self-driving. It's already been speculated on at great length here, people's instinctive feeling being that drivers will not be able to go from supervising a car (probably not very well, if at all) to taking over in an emergency.
"It's already been speculated on at great length here, people's instinctive feeling being that drivers will not be able to go from supervising a car (probably not very well, if at all) to taking over in an emergency."
There have been some experiments with drivers and car simulators which indicate that drivers do not perform well when taking over from an automated system unexpectedly. Among other things, they have an initial tendency to over-control. More experimentation needed.
This is very much like the issue with airline pilots and autopilots. The pilots aren't nearly as good and reliable as the autopilots, so letting the machine drive is safer... but that means that pilots don't get much practice actually flying the plane, rather than just inputting parameters into the autopilot.
Generally, the feeling seems to be that having autopilots is safer than not having them, despite the pilot tendency to screw up when given control, as has been demonstrated by a number of airliner total losses. If you want to read a really horrific case, look up the accident report from that French Airbus crash in the South Atlantic several years back. Three pilots, but when they did silly things, and the autopilot handed them control, one of them managed to fly the plane into the ocean, before the other two managed to sort out what he was doing.
Still, airliners are really safe, and on some flights now use autopilots for everything, including landings.
As is usually the case, there is no solution which is the optimal solution for all possible circumstances, and anyone who insists that a solution must be universally optimal is probably just trying to derail the solution, for whatever reason.
I understand your point, but I think there is a very large difference between aircraft autopilot and driving in traffic.
How often do you imagine an aircraft has to evade an object 10 meters ahead? In flight, how often is following distance to another aircraft less than 35 meters? How often does an aircraft need to merge into a stream of other aircraft, or avoid pedestrians? How often does the pilot need to negotiate a banking turn while maintaining +- 1 meter tolerances to avoid a fatal collision with oncoming aircraft?
For perspective, the FAA mandates 1000 vertical feet clearance between aircraft, or 3 miles horizontal clearance.
And how often is highway traffic controlled via radio instructions from a central traffic control tower?
My personal feeling is that driving a car is a very different kettle of eels from piloting an aircraft. (As per the Pythons, a hovercraft full of eels is another matter.)
I work with industrial automation. Millisecond control loops are common. Very fast responses. Very accurate control, in the right circs. (But watch the oscillation, mate, 'cos your actuators may not be that fast. Integrator windup.) However, the challenge lies in programming for those rare events, unexpected perturbations, and unanticipated failure conditions.
A container ship on the open sea may take 6 kilometers and 20 minutes to turn through 90 degrees, but the driver of a Honda Civic has no such latitude when the motorcycle in front of him skids out. (If a porpoise skids out in front of a container ship... well, sorry, Flipper.) An airliner traveling at 500 km/hr is in desperate peril if it comes within 50 meters of anything of substantial mass, but that's following distance on the motorway at 110 km/hr. In plain words, drivers of automobiles face much more tightly constrained and unpredictable conditions.
Again, my personal opinion, as a programmer of rather simpleminded and -- erm -- often inelegant industrial automation routines: programmers of self-driving automobiles face a challenge probably two orders of magnitude greater than programmers of aviation or nautical autopilot devices.
It needs a lot of proving. AI is nice too, but when human lives are at stake, it too needs a lot of proving.
"programmers of self-driving automobiles face a challenge probably two orders of magnitude greater than programmers of aviation or nautical autopilot devices."
Agreed, completely. That's probably why we've had decent autopilots for 30 years or more, and are still refining and testing various AV technologies.
Also, besides complexity and proximity, the sensor situation is much more difficult in traffic.
On the other hand, our digital electronics is many orders of magnitude more capable, and we seem to be sorting out the algorithms, both directly, and as a result of progress in various other fields.
There will be some failures in software, even after we figure out how to do AV. We still get the occasional airplane falling out of the sky due to an implementation bug or unanticipated edge case that causes the software to choke... but overall software makes airplanes safer, and cars safer.
Daniel, I suspect we agree very closely.
I do think there's a good chance that, as you write, software will -- eventually -- make cars as well as airplanes safer.
My only caveat is that, because street-level driving is so much more complex than aeronautical or nautical travel, street-level autopilot needs more proving-out.
I like automation. It rocks the industrial world I work in. But -- eh, well, you already know the but. Maturity. The algorithms must mature. In my rather humble opinion (IMRHO) auto-driving auto-mobiles have not matured yet.
"I like automation. It rocks the industrial world I work in. But -- eh, well, you already know the but. Maturity. The algorithms must mature. In my rather humble opinion (IMRHO) auto-driving auto-mobiles have not matured yet."
Agreed.
One of the catch-22s here is that at some point further improvement will require realistic real world driving.
... and that is not going to occur at the same time for all the projects, so forming general or arbitrary rules (politicians at work?) will be a bit less than ideal.
Not sure I have the answer for that one, unless we can base it on statistical accident experience, but unless you have a good way of weighting for circumstances, that can be a bit fraught, as well.
A whole new bunch of issues will arise when we try to run these vehicles at -10, in snowstorms, or icing conditions, or with gusty winter winds bringing occasional flying snow.
I keep reminding myself that there is a short term risk with new tech, for major long term gains that persist almost forever, past a certain point. How do you balance those things as well - the 'safety opportunity cost' of waiting for 'right now' improvements?
"So you're okay with self-drivers killing children who run into the street after a ball?"
If they do so when the physics precludes the vehicle stopping, then yes, absolutely.
We don't get to rewrite the rules of the universe to make it one vast expanse of cotton wool for the terminally incautious.
Why don't we call this 'natural selection'? It's worked reasonably well for 3.5 billion years.
Your position is absurd. As others have pointed out; if I am driving down a street and see kids playing near the road I, as an adult, keep an eye on them, slow down, and prepare for them to do something stupid. They are, after all, children. I have no idea if an autonomous car can or would do the same. If they cannot, then they are not ready for real world driving. Although, from reading your post, you may not be either.
"If they do so when the physics precludes the vehicle stopping, then yes, absolutely."
Fuck you.
If a ball bounces onto the road, you KNOW a small child is likely to run out after it. I learned that by observation when I was ten years old (and at the same time I also learned it was a bad idea to run after balls if they went onto the road)
As such, slowing down IMMEDIATELY, is the logical course of action (which is what my mother did), rather than waiting for the child to appear (which is what the driver coming the other way did)
Failure to do so and then hitting the child at speed will get you a "careless driving causing injury" charge at the very least in most countries, if not vehicular manslaughter (The child survived, the driver lost his license for 2 years)
"If they do so when the physics precludes the vehicle stopping, then yes, absolutely."
Fuck you.
If a ball bounces onto the road, you KNOW a small child is likely to run out after it
-----------------------------------------------------------------------------------------------------
What part of 'If they do so when the physics precludes the vehicle stopping' did you not understand?
Changing the warning time changes the physics.
Assuming a ball changes the warning time.
Changing the question changes the problem.
If you want me to answer a different question, then ask that question.
Don't complain when I give you the logical answer to the question that was asked.
I am not responsible for your failures of logic.
may see, for example, a small child careening down a driveway
Good point. You would expect a human to be aware that small humans are more likely to behave and move erratically, and to focus more attention on them as an unpredictable, or harder to predict, hazard than on other moving objects and stationary hazards, at least up until the point where they no longer have time to intersect with the vehicle from their current location.
Basically, a driver would pay special attention to a child's movement as a potential hazard, with a different span and level of focus than for a potential adult hazard.
Would an AI or trained system be able to understand or achieve the same?
Doubt it.
"Good point. You would expect a human to be aware "
That's all you need to say.
Humans are terrible at focussing on more than 2 hazards at a time. Most people develop tunnel vision and completely fail to notice the kids playing on the footpath UNTIL they step out onto the road, by which point it's far too late.
With all these comments about small children, I want to know who all these parents are that (A) fail to teach their kids basic road safety; and (B) allow their kids anywhere remotely near a road without holding their hands until they're old enough to have learned?
I'm frankly pretty fed up with the calls to wrap the whole world in cotton wool because some parents are too stupid to actually parent.
This post has been deleted by its author
Or the kid was never in your sight until the very last instant, when he pops out from a seemingly-too-narrow gap just two feet in front of you (possibly deaf, so unable to hear your approach, and blind to you due to position) with cars hemming you in on both sides. Even at 10kph you're still likely to hit and possibly drag the kid under. Heck, not even a computer would probably be able to save the situation. As they say, sometimes Crap Happens.
@AC
Don't be absurd. No one is "okay with" people dying in accidents.
What people are saying is that there are plenty of instances where the accident cannot be avoided, and we don't know this wasn't one of them. You only look daft trying to prove a point by stretching an argument to such an illogical conclusion.
If the unfortunate person involved stepped out just a meter or two in front of the car, then there's not much most human drivers could have done, nor would the software do much better. Obviously that might not be the case too, but no one is taking the position you're suggesting.
" I'm wondering if the window between "oops shit isn't working" and "she's dead" was too small to actually do anything useful. By the time you realize it's not going to brake, it's too late."
You've hit on a big problem with (semi-) automated cars where the driver has no chance to do anything after realizing that the car's systems failed to react properly. Some Tesla drivers that live in snowy areas have discovered that regen braking is much different in winter conditions. Sometimes the regen is reduced due to the battery being cold and drivers fail to brake soon enough or the heavy car fishtails on slippery roads.
I expect that there will be many more deaths and serious accidents in the US as politicians fall over themselves to pass laws allowing testing on public roads. The DARPA Urban Challenge was run at a closed military base, around its residential blocks. Perhaps there should be a national/regional testing center set up at a similar place, where automated vehicles have to pass a comprehensive suite of challenges before they are allowed to drive on public roads, with or without a driver.
Classy. Gonna send an e-card to the funeral too?
In a just world he'll soon be taken to a courtroom offering many, many millions of dollars in compensation too, not that that can ever compensate for the loss of someone's life. This was always going to be a question of when rather than if. And given that the "when" has happened rather quickly, in the limited time and with the limited number of cars since they were allowed onto public roads, it really should be halted rather quickly before the excuses and apologists for it cost more lives. A good long think should then be had about the wisdom of continuing this way, with considerations of potential profit-making and strategic advantage for private companies put well at the bottom of the list of priorities. I for one do not wish to take part in this beta test every time I go out.
We don't know yet. If we step out of this sad event and consider the hypothetical (how could both an autonomous car *and* the human 'backup' fail to spot and stop for a pedestrian?), there appear to be two major scenarios:
a, The human backup driver was distracted because he'd effectively been a passive passenger for X number of incident-free miles.
b, the pedestrian moved at such speed prior to the collision that the car was incapable of stopping.
Now, writing as a fallible driver and wary pedestrian, it is the first scenario that interests me as one that could be addressed with engineering solutions.
Or the human driver could never react quickly enough to an error by the car? If the person was pushing a bicycle, it could very easily be a similar accident to the Tesla "Autopilot" one, where the sensors missed the obvious because the object had gaps in it.
Perhaps the car never hit the pedestrian, but the bike they were pushing?
Similar accidents used to happen here in the UK with large trucks/trailers. So they both added signs on trucks to warn cyclists/pedestrians "don't get this close", and fitted barriers/mirrors to protect pedestrians/cyclists from being crushed.
Nearly anything will have a blind spot in its search space.
"a, The human backup driver was distracted because he'd effectively been a passive passenger for X number of incident-free miles."
Not necessarily distracted, but perhaps unready to instantaneously assume control.
There are a number of aspects to this.
1. A human driver controlling a vehicle does not have to decide to assume control... that's already done. This adds an extra evaluation and decision step to the process: not only 'what is happening?' but also 'should I take over control, or let the systems currently trying to deal with the issue do so?'
2. Tests have shown that humans taking over control have initial problems actually steering or controlling the vehicle... their performance as drivers is significantly poorer than if they had been driving for a while. This may be an argument for leaving the driving to the computer in an emergency. That, and the fact that the computer effectively has 360-degree vision.
3. Humans stubbornly insist on learning from experience. After the first 250,000 km with no incidents, at some level they will tend to expect that to continue, and to treat an apparent problem as a mistake, not an actual problem. This may be related to the issue with baggage x-ray inspectors. After a million bags full of every imaginable safe thing, a bomb is likely to be interpreted by the brain as one of those familiar safe things that you have already seen 5,000 times.
None of these things are distraction, as such, but they will impair response time and response accuracy.
There are a large number of variables and we don't know all the facts. I don't advocate for Uber; in fact I rather detest them. But right now we need more information before we can judge who was to blame. Otherwise all we have is conjecture over this unfortunate incident.
One of the big arguments for automated vehicles is that they are supposed to be able to react faster and with better choices than wetware.
If this was a bicyclist and I'm in a car coming up on them, I am cautious about watching to see whether they are going to stay alongside the curb or carry straight on in the continuing lane. This is even more important if I am going to turn right into the turn lane and cross their path. AI is going to have to advance an incredible amount to take in the subtle cues about what a pedestrian, bicyclist or motorbike rider may do. That can even apply to what another motorist in a manually driven car could do, based on how they are driving. It's much better to stay out of an accident than to be found in the right afterwards.
This post has been deleted by its author
Being on the road requires a minimum set of abilities, and for good reason.
Besides, the victim was a woman; it is highly unlikely that the situation you describe has any resemblance to reality in this particular case.
In this case I'd bet that the uber was cutting corners, not the cyclist.
In this case I'd bet that the uber was cutting corners, not the cyclist.
If the AI was "trained" by a lot of drivers I see then that is a distinct possibility. At the same time a lot of cyclists aren't entirely blameless for cutting corners.
When I am on the road (exclusively 4 wheels now, but occasionally 2 motorised wheels in an earlier life) I am ever conscious when approaching junctions that a cornering driver may well turn up in "my" bit of road, because they were never taught to drive properly beyond passing the test, or because they either don't know the risk of cutting corners or simply don't care.
Either is possible but they are a bloody hazard regardless.
I used to cycle in London. My wife hated it, and to be honest I was the one always on guard.
Then along came Boris bikes and a whole generation of idiot cyclists without helmets scared most London drivers to the point where they are paranoid about ensuring they don't mow down a two-wheeler. Now I like cycling in London, as most drivers take excessive care about cyclists (NB I observe the highway code, and get annoyed at all the idiots that ignore traffic lights etc...) it is generally other cyclists that behave like idiots.
It has always concerned me that as long as an autonomous vehicle shares the same space as anything on two wheels (from a spotty adolescent caning a 250cc motorbike then emergency-braking in front of a 'convoy' of autonomous vehicles on the motorway, to a drunk exec deciding that a Boris bike is the best way to get to the last train out of Waterloo at 11:30), let alone two feet, we will always have problems.
It is one of my genuine objections to autonomous vehicles in a European city (rather than LA or Arizona).
If the car has to brake every time a child wanders near the kerb edge of the pavement, or if a delivery vehicle is illegally parked and it refuses to cross a solid white line to get round it - then the whole of London is going to be gridlocked by autonomous vehicles obeying the rules.
"First glance my emotions wants to blame uber, but I need more data. Iv'e seen bikes cut across a major road( ie not crossing at the intersection) I've seen bikes cut people off an run red lights."
Here, in a major city, I see more bikes out at night without lights than with them. I have yet to complete a long trip on a major road at night without counting more unlit bikes than lit ones, even allowing a light at either end to count as 'lit'. Not once.
Similarly, in areas with one way streets, at night, I have yet to see any street with more than two bicycles on it where the majority of the bikes are not going the wrong way. I cannot explain why.
And it is quite rare for bikes to stop at stop signs and fairly often they run red lights.
Given that a lot of riders wear pavement/building/tree coloured clothing, all of this is quite risky.
I came very close to killing a cyclist one rainy night just after turning into a one way street, only to encounter an unlit bike with no reflectors going the wrong way down the *middle* of the street... but if I had, it would have counted against my insurance.
"And since when did a monitoring meatbag escape responsibility for the actions of the bot they are monitoring?"
I see people every day who seem to think they can alternate between acting as a pedestrian (while still riding their bike) and acting as a vehicle (on the road) to suit their personal whims, and whether they feel the need to stop or not at junctions, traffic lights and pedestrian crossings of various types. Fortunately, those suicide cyclists are not the majority.
so why aren't they already being sued for a few BEELLIONS?
I hope they lose and go out of business. Anyone driving for them must know that it won't last, and that their slavedrivers want to replace them with their infallible AI system.
If it was a choice between using an Uber or walking, then I'll choose the latter.
@Charles 9
No.
they'd use one of the many alternatives. I've managed perfectly well without ever using Uber; so have far more people than have ever used them.
I can't imagine a situation in which one would be left with *only* uber as a transport choice. It's a fictional scenario that doesn't really help you make your point.
This post has been deleted by its author
When driving, I periodically have to make decisions based on unexpected and unpredictable circumstances. Often these decisions must be made very quickly, and therefore the decision is made intuitively -- using a human brain with something over 40 years of accumulated on-road experience.
I'm not exceptional. Most of you commentards are equally skilled and safe on the road.
Obviously, when automation is handling controls, the human involved will allow his attention to relax. That's a major reason for automation of tasks: to remove the need for a human's continual, concentrated attention. Talking or texting on cell phones while driving is banned in some places for exactly that reason: it impacts driver concentration.
To me, the salient question is not whether the pedestrian or bicyclist was hard to see, or did something unpredictable, or disobeyed the rules of the road. To me, the question is whether a human driver with hands on the wheel, feet on the pedals, and eyes on the road would have saved a life.
Did it even brake?
Remember the Tesla Autopilot that failed to see the big truck, and DIDN'T EVEN BRAKE. Major FAIL.
In this case, if the Uber Artificial Imbecile system immediately saw the pedestrian and tried to brake, then perhaps it's understandable (perhaps bad timing). But if it didn't even brake, then Major FAIL.
An autonomous vehicle has killed someone in an accident. No doubt, this is a tragedy, but let's put it into context here.
How many other people were killed on the roads by vehicles driven by humans?
As horrible as it might sound, one death does not necessarily mean there's an issue with autonomous cars. I do appreciate that there are millions more "regular" vehicles on the road than there are autonomous vehicles, and thus millions more journeys that don't result in death, but we simply don't know the facts concerning this accident as yet, so it is far too early to draw conclusions about the relative safety of autonomous vehicles.
This is a sad loss of life, but let's not forget that a human driver might not have been able to avoid the collision either.
The important lesson here is to ensure that the people responsible are held to account, but this should be balanced against other issues, such as drivers using vehicles when tired/full of cold/etc, which cause many fatalities every single day, yet DWT (driving while tired) isn't yet illegal.
If anything this reiterates that the technology needs improvement and manufacturers should consider making the cars smarter, simple AI without self-awareness may be inadequate.
Presumably one of the reasons Uber selected the Volvo XC90 is because of its underlying auto-braking system. It uses radar and a vision system to automatically brake to avoid or minimize a collision (along with adaptive cruise control and lane-keeping, which are probably not relevant here).
The advertising for the system explicitly calls out pedestrians and bicycles, which covers someone walking a bicycle across the street (the present situation as reported thus far).
and all that. But the data in the article is insufficient to determine who was to blame here.
Those cars are experimental, and have a shedload of telemetry and cameras built in. No doubt the local plod will be quite adamant in nosing through all that to see what exactly has happened here.
Speculation is all good and nice, but until some more actual facts get released to the public, that speculation is pretty much pointless.
That may not help you much. An ambulance got into a fatal accident with a pedestrian just a few days ago not far from here. And it wasn't even self-driving. The pedestrian is dead regardless. And while I couldn't swear I remember the details right I'm fairly sure it was his fault - for some folks even noticing a large vehicle with strobe lights and glaring sirens is too much work when they decide to cross "right now"...
We need driverless cars for the people who cannot drive themselves... very important for the safety and independence of the old, disabled, young, etc., and also to extend services to such people at their home through affordable transport for goods and medicines.
We should have driverless cars because they will enable a whole bunch of good things, such as much less expensive public transport, inexpensive delivery, increased road safety, increased vehicle-use flexibility, more efficient use of roads, optimization of resource use, reclamation of usable human time, and probably a lot of things we haven't yet thought of.
People railed against the replacement of elevator operators on the grounds that automatic elevators would be dangerous in all sorts of ways... yet they are the safest form of transportation on a per km basis that we have ever developed. Certainly safer than walking, and much safer than other vehicles or animals.
This post has been deleted by its author
One thing that has not been mentioned is that a safe autonomous vehicle has to be able to run over a pedestrian at need, as a human driver can, or there will be a huge number of gunpoint carjackings by people who step in front of a car... at the very least a 'run them over' button under a safety cover beside the hazard light switch.
Giving the car its own gun is probably too extreme.
"Why would a carjacker jack an autonomous car? They run the risk of (1) being identified by the car and (2) not being in control of the destination, which can be locked."
Identified by the car? Not if they don't want to be. Loads of ways to prevent that... or has crime ceased in areas where there are CCTV cameras? I must have missed that.
Presumably the owner/driver has control of the vehicle, possibly via an authentication token. Getting that token is one of the reasons for carjacking rather than just stealing unattended cars. Carjacking wasn't a 'thing' until the introduction of RFID chips in the keys that were needed to run the car. This anti-theft feature directly led to the counter-tactic of carjacking. If you have the driver and the token, you should have control of the car.
Drive it into a truck that acts as a Faraday cage to prevent tracking, and reprogram at leisure.
You get the car, the driver's watch and wallet, and if you take the time, all the money you can pull from ATMs using their bank and credit cards.
Assuming your victim is young and attractive, and you have the right connections (surprising how many stolen luxury vehicles end up in containers going to other continents) you can sell them for additional profit.
Hence a "go, don't stop" button is mandatory.
Then there's nothing you can do. The next step after carjacking will be driveway/parking lot carjacking: jacking the car before they get started. There will ALWAYS be vulnerable points where a carjacker can jack the driver's identity and the car would have no way to know the difference (Perfect Impersonator Problem).
"Then there's nothing you can do. The next step after carjacking will be driveway/parking lot carjacking: jacking the car before they get started."
What do you mean, 'will be'? It already is.
That's one of the reasons for not stopping at your car if there is a suspicious person hanging about close enough to accost you at gunpoint, or knifepoint - a piece of basic urban safety awareness.
Just keep going so they don't know that a car and the matching token are in proximity.
"That's one of the reasons for not stopping at your car if there is a suspicious person hanging about close enough to accost you at gunpoint, or knifepoint - a piece of basic urban safety awareness."
Thing is, if they REALLY want you, they have ways of FORCING the car to stop like a pre-planned roadblock, a confirmed tactic of certain organized criminal organizations.
About 40000 people are killed on the roads in the USA every year (about 109 per day).
For the UK, the present figure is around 3000, maybe 2800. It's been falling for decades, but I believe it's increasing again in the past couple of years with increasingly distracted drivers.
"About 40000 people are killed on the roads in the USA every year (about 109 per day).
For the UK, the present figure is around 3000, maybe 2800. It's been falling for decades, but I believe it's increasing again in the past couple of years with increasingly distracted drivers."
The small size of the UK may also play into this, as well as its having cities built before cars existed, resulting in optimization for other forms of transport.
I bet the per capita commuter miles per day in the US is much higher than in the UK.
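The arithmetic above is easy to sanity-check, and normalising by population makes the two countries roughly comparable. A quick sketch, where the population figures are my own approximate assumptions (circa 2017), not numbers from this thread:

```python
# Road-fatality figures quoted above; population figures are rough assumptions.
us_deaths = 40_000    # per year, quoted above
uk_deaths = 2_900     # midpoint of the 2800-3000 range quoted above
us_pop = 325_000_000  # assumed, ~2017
uk_pop = 66_000_000   # assumed, ~2017

print(f"US per day:  {us_deaths / 365:.1f}")               # ~109.6
print(f"US per 100k: {us_deaths / us_pop * 100_000:.1f}")  # ~12.3
print(f"UK per 100k: {uk_deaths / uk_pop * 100_000:.1f}")  # ~4.4
```

On those assumed populations the US rate comes out at roughly three times the UK rate per head, so fleet size alone doesn't explain the gap; testing the per-mile point made above would need mileage figures neither comment supplies.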
In a time before seatbelts, that long ago.
I was driving through town when a child darted out in front of me, less than three feet (less than a meter) away.
I think I heard the front bumper hit the pavement, and the child ran on, never noticing.
OK, I have fast reflexes, but this has kept my speed way down through occupied areas.
(I might have missed the little fellow by six inches)
Based on this photo.
https://www.google.com/maps/@33.4370667,-111.9430321,3a,75y,125.63h,59.39t/data=!3m6!1e1!3m4!1ss_YmQlzA3ace0P2LVk4wwA!2e0!7i13312!8i6656
I think what might have happened is that the cyclist was passing where the lane turns right and was alongside the car. The car, being so close to the side of the cyclist, either did not detect the cyclist or did not react in time to a sudden change in the cyclist's direction. If the former, it may have turned into the right-hand lane, thus running over the cyclist. If the latter, then the AI may not have been able to react quickly enough. That could happen with a human driver too. It's just the laws of nature and physics.
A car hit a pedestrian.
There is insufficient evidence to support the theory that the self-driving software or mechanics failed in some way.
If there was a human behind the wheel, and no automation, unless witnesses and/or irrefutable proof is supplied that the driver made an error, or drove recklessly, then the driver MUST be presumed innocent. If the car is found to have suffered a fault, and the driver could not have known about it before the accident, such as a sudden loss in brakes or wheel traction, the driver again cannot be blamed.
As much as I detest Uber, and the concept of self-driving vehicles given today's technical limitations, there is absolutely no proof 'it' was responsible. Remember, it isn't the CAR that we're questioning - it is the SOFTWARE that's driving the car. And as mentioned above, if it was the CAR that failed, then it would have failed in the same way for a human driver.
If, however, there is irrefutable proof that the SOFTWARE was clearly to blame, then I would suggest ALL self-driving vehicles be removed from the roads immediately until it can be proven to as safe a point as possible, that these errors have been erased.
And in all cases of software error, insurance companies should be able to claim from the company that wrote the software in the first place.
"There is insufficient evidence to support the theory that the self-driving software or mechanics failed in some way."
According to the Guardian (https://www.theguardian.com/technology/2018/mar/19/uber-self-driving-car-kills-woman-arizona-tempe): "The 2017 Volvo SUV was traveling at roughly 40 miles an hour, and it did not appear that the car slowed down as it approached the woman, said Tempe sergeant Ronald Elcock."
If true then the software didn't detect the woman so there is a failure somewhere. If the vehicle slowed down then you could argue it just didn't have enough time, but if it didn't then something failed on the vehicle, either in its detection algorithms or avoidance mechanisms. As for the argument about 'car' systems failing: agreed where human drivers are involved, but I thought one of the advantages of self-driving cars was all of the self-diagnostics, so the vehicle should be aware of things such as brake failure and take account of them accordingly.
"If true then the software didn't detect the woman so there is a failure somewhere. If the vehicle slowed down then you could argue it just didn't have enough time, but if it didn't then something failed on the vehicle, either in its detection algorithms or avoidance mechanisms. "
There is no reason to expect an AV to assume that every person standing near a road will suddenly walk into live traffic. Human drivers don't, or vehicles would be unusable anywhere there were numbers of pedestrians.
As I predicted several years ago the rush-to-market mentality to reap windfall profits from AVs will result in many deaths. Computer control does not mean accident free. It means the occupants have no control over the vehicle and when the computers malfunction, people die. While there is a lot of potential good from AVs, there needs to be mandated safety, security, design, engineering and maintenance minimum standards just like in aircraft with redundant, hardened control systems because there is no pilot or driver to take control when an AV's management system BSODs.
Many more will die before federal governments worldwide impose basic safety standards for AVs. Governments have abdicated their responsibilities and allowed a "wild west" mentality to prevail. The legal community and insurance companies haven't even agreed on who will be responsible in an unavoidable accident which results in death. Programmers will be the ones who decide who dies, as a vehicle's computers will respond according to their data input and programming.
I partly agree. Autonomous control systems for road vehicles need to be overseen with the same rigour as those for planes or trains. That said, the standards have to be imposed before AVs are made available to the public but not before development is complete. Too many bad laws have been made by politicians under pressure "to do something now".
"Autonomous control systems for road vehicles need to be overseen with the same rigour as those for planes or trains."
No, they don't.
Issues like this need to be subjected to cost/benefit analysis.
Trying to get road transport to airline levels of safety will kill more people than it will save, because of the distortion of resource allocation, and loss of functionality and capability due to cost.
Trying to match airlines 'zero fatalities in a year' record would make ground transportation too expensive to use... with horrifying economic, social, health, safety, and other damage.
Yes, safety too. When fire trucks, ambulances, police cars, as well as vehicles delivering food, medicine, fuel, etc. become too expensive to use, what do you think would happen to the victims of such nonsense?
AVs need to be as safe as human drivers, and preferably safer - which they will become as they evolve. They will not be perfect; even airliners crash, regardless of costs that sometimes stagger the imagination (aircraft parts can cost a hundred times their automotive equivalents or more, and servicing is required in some cases every hundred hours of operation, with major rebuilds fairly frequent) - but they will be good enough.
Insisting on perfection is just a sneaky way to sabotage the advances AVs represent.
Nothing I said implies that we need to apply airline service standards to AVs. Motor vehicles as they stand are extremely safe (because of, surprise, rigorous standards) - it's the roads and drivers that kill. The danger is that those rigorous standards won't be applied to the design of the software (because, hey, it's just software and it can be fixed later).
Without rigorous standards we'll get consumer-grade software running cars, which would be a disaster. With them we won't get anything like perfection, because roads are inherently unsafe, but we can make road travel much safer. And why shouldn't we? Something like one in ten thousand people die on the roads every year in developed countries. If those people were dying in a war there would be mass demonstrations. Yet we just accept it. I don't consider that remotely rational given how risk-averse we are in other areas of life.
"Trying to match airlines 'zero fatalities in a year' record would make ground transportation too expensive to use... with horrifying economic, social, health, safety, and other damage."
But it'll be demanded once more people get killed because they'll demand "What price a life?" The reason the airliner industry gets the scrutiny it does is because although aircraft disasters are rare, they tend to bite HARD: low incidence, but high consequence. Car fatalities are higher per capita, but the numbers and environment tend to run against them (more and less predictable obstacles). It's basically a case of comparing birds to bulls: not much in common between them.
"Many more will die before federal governments worldwide impose basic safety standards for AVs. Governments have abdicated their responsibilities and allowed a "wild west" mentality to prevail."
Because governments are so good at developing and implementing optimal policies?
If that were true, this would be paradise - we've got enough government to do it, if they could only get it right instead of wrong, mostly wrong, often wrong, and oops, missed the mark a bit.
I totally agree. To ban the technology now would be foolish, but a possible workaround might be to fit the module to driving instructors' cars and train it with real data, to see what it would do without any actual risk.
In this way it would also be able to learn a safe driving style that doesn't annoy other road users and so cause more problems.
"as we work with local law enforcement to understand what happened".
Well, i'll tell ya what happened.
A human variable got mixed up in the car's software and the car, due to shit software, didn't know which way to run, so it ran over her.
Fucking self driving cars.... Don't make me laugh. That technology is 50 years away...
So if the car was in full autonomous mode and was travelling at 38MPH in a 35MPH zone, i.e. it was technically speeding, who decided to allow the vehicle to speed? Is a maximum speed set by the 'operator', with the vehicle able to travel up to that speed, or does the programmer decide that the vehicle should select a speed up to the posted limit (plus ~10% in this case?), with the on-board operator having no control over maximum speed (except by hitting the brake pedal)?
I am more curious about how they measured the speed.
Reminds me of one of the stories as told by one of the engineers working on Saab Viggen (the car, not the fighter jet, although that engineer had also worked on the jet back in the day). They were racing against a similarly specced Volvo. When the Volvo's speedometer was showing 220kph, the Viggen was showing 200kph...
So take that 38MPH with a grain of salt.
And finally: At least when driving here where I am, if you drive 50 kph in a 50-zone, cars will start queuing up behind you. 54 kph is a safer speed IMO. The AI could have picked up on similar patterns.
" When the Volvo's speedometer was showing 220kph, the Viggen was showing 200kph..."
My car _speedo_ shows about 10% higher than actual speed (reported by GPS)
The OBD-reported speed matches the GPS speed.
IE: what's on the meter deliberately reads high to encourage the hooman to stay below the speed limit.
It's also a legal get-out for manufacturers.
"But your honour, my speedo read exactly 40MPH but the calibrated radar, LIDAR, whatever registered 44mph".
So the manufacturer then cops SOME of the blame. By making speedos read high, that excuse is trashed.
"I was fined for doing 45mph your honour" But your speedometer would have read ABOVE 45mph so you have no excuse.
I always use my GPS and OBDII for accurate speed readings.
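If a speedometer really does over-read by about 10% (the figure assumed in the comments above; actual calibration varies by car and tyre size), converting between indicated and true speed is a single division or multiplication:

```python
# Indicated vs true speed, assuming the ~10% speedo over-read described above.
OVERREAD = 1.10  # indicated = true * OVERREAD (assumed figure, not measured)

def true_speed(indicated_mph: float) -> float:
    """True road speed for a given speedo reading."""
    return indicated_mph / OVERREAD

def indicated_speed(true_mph: float) -> float:
    """Speedo reading for a given true road speed."""
    return true_mph * OVERREAD

print(f"{true_speed(40):.1f} mph")       # indicated 40 -> true ~36.4
print(f"{indicated_speed(38):.1f} mph")  # true 38 -> speedo shows ~41.8
```

On such a speedometer, the 38MPH the Uber car was reportedly doing (presumably a true, telemetry-derived speed) would have shown as nearly 42 on the dial, while a driver holding an indicated 35 would actually have been well under the limit.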
If an object manages to intercept the vehicle at the right moment and close enough proximity, the vehicle will not be able to stop or manoeuvre sufficiently to avoid a collision. This is physics, not AI or meatsack fallibility.
The only slight surprise I have out of this is that I (unaware of the facts yet) thought these vehicles used radar or lidar making shadows irrelevant (at least to the AI).
In this case, if she ran like a cat from the sidewalk right across the path of the car, then it is likely that no additional controls could have prevented a collision.
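The physics point is easy to make concrete: total stopping distance is reaction distance plus braking distance, d = v·t_r + v²/(2μg). A rough sketch with assumed values (1.5 s reaction time, μ ≈ 0.7 for dry asphalt; both are illustrative assumptions, not figures from the incident):

```python
G = 9.81  # gravitational acceleration, m/s^2

def stopping_distance_m(speed_mph, reaction_s=1.5, mu=0.7):
    """Reaction distance plus braking distance, in metres.
    Assumes constant deceleration of mu * g on a flat, dry road."""
    v = speed_mph * 0.44704            # mph -> m/s
    reaction = v * reaction_s          # distance covered before braking starts
    braking = v ** 2 / (2 * mu * G)    # v^2 / (2 * mu * g)
    return reaction + braking

for mph in (35, 38):
    print(mph, round(stopping_distance_m(mph), 1))
```

At 38mph the vehicle needs on the order of 45 m to stop even with a prompt reaction; anything entering the lane closer than that cannot be avoided by braking alone, regardless of who or what is driving.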
"The only slight surprise I have out of this is that I (unaware of the facts yet) thought these vehicles used radar or lidar making shadows irrelevant (at least to the AI)."
The Uber car is supposed to have both lidar and radar with 360° coverage. It appears that the system didn't register the person crossing the road. The driver was looking down at something and didn't look up until just before, or right at, the impact. Did he catch a glimpse out of the corner of his eye? The dashcam video is rather dark, but that could be down to a cheap sensor; one of the latest Sony sensors, with much better dynamic range, might have revealed the victim in the video.
Autonomous cars are supposed to be much better than a human in exactly this situation. There wasn't a bunch of other vehicles, peds, bicyclists and other moving targets around the car at the time. I would think that the computer wouldn't have been maxed out dealing with too many targets at the time.
The victim shouldn't have been crossing where they did, and a car at night with its headlights on is easy to see from a fair distance. The Uber corpsicle should have been paying attention. The US government shouldn't be allowing the beta testing of highly dangerous technology on public streets. This isn't going to be the last accident of this type. I expect that in the future some accidents will be avoided where drivers would have been inattentive, but other accidents will happen where the automated car, and AI in general, does very poorly in assessing a situation and people do much better. It's hard to say if the balance will be biased one way or the other. I plan to avoid autonomous automobile transport in the same way I studiously avoid Uber/Lyft today.
If the powers that be don't like you, they can switch off your vehicle remotely during a journey.
This is relinquishing control of your personal movement, restricted to the parameters of the 'AI grid'.
This is different from a public bus or train, even for a driverless light rail train, where the time schedules, routes and stops are properly established.
"I'm pretty sure the "powers that be" can find easier ways to switch you off."
Sure, they can just come up from behind and shoot you in the head, but that tends to be reported in the media. It would be so much more convenient to just have your car drive you to someplace where they can make it look like a random accident or a heart attack.
"You surrender your sovereignty with self-driving cars "
This is a matter of implementation, not an intrinsic characteristic.
I agree that there is a LOT of work to do on AV design to make them not only autonomous in terms of self-driving, but independent of outside control, and largely anonymous to remote or retrospective tracking.
I think that can be done, and have some ideas about how that could work, but it has to become a priority in the design and implementation of AV vehicles and systems.
"it’s very clear it would have been difficult to avoid this collision in any kind of mode [autonomous or human-driven] based on how [the victim] came from the shadows right into the roadway."
No. The whole point of autonomous vehicles is they are supposed to be super safe and better than human drivers. The whole point is they are supposed to have the tech so they can "see" in the shadows. Clearly Uber's tech isn't up to par so should be fully suspended for at least a year or so. Not to mention it was speeding. 38 in a 35mph zone. Only 3mph over you say. It's not the point. It's autonomous so should be able to keep to 35mph.
"Not to mention it was speeding. 38 in a 35mph zone. Only 3mph over you say. It's not the point. It's autonomous so should be able to keep to 35mph."
It should be able to keep to the speed limit. It should also 'know better' than to do so all the time. Speeding is sometimes safer than not.
"Only 3mph over you say. It's not the point. It's autonomous so should be able to keep to 35mph."
There is always a bit of hysteresis with any type of cruise control. If they tighten the range, the car would be constantly jabbing the brakes and jumping on the accelerator. The police don't write tickets for going 3mph faster than the posted limit to allow for speedometer errors, wandering right feet, etc. Do a little test with your cruise control on and a graph of your speed via your phone's GPS.
I have, actually, and maintaining a tight speed is pretty easy for your average cruise control since it can react more quickly and thus not need to use the brakes: just light adjustments to the throttle is all. It only gets tricky when the angle changes (curves and inclines), but in my firsthand experience the cruise control (which isn't exactly state of the art, it's an early 00's car) tends to correct itself pretty smoothly.
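That squares with how a basic cruise control behaves: a proportional controller trims the throttle continuously, so speed holds in a narrow band without touching the brakes. A toy simulation (the gain and vehicle model are invented purely for illustration, not any real car's control law):

```python
def simulate_cruise(target_ms, seconds=60, dt=0.1):
    """Toy cruise control: proportional throttle correction each tick.
    The gain and drag model are made up for illustration only."""
    v = target_ms * 0.9      # start slightly below the set speed
    kp = 0.5                 # proportional gain (m/s^2 per m/s of error)
    drag = 0.05              # crude speed-proportional drag, 1/s
    speeds = []
    for _ in range(int(seconds / dt)):
        error = target_ms - v
        accel = max(min(kp * error, 2.0), -2.0)  # throttle/engine braking only
        v += (accel - drag * v) * dt
        speeds.append(v)
    return speeds

trace = simulate_cruise(22.35)  # ~50 mph set speed
# Width of the speed band after the first 30 s: effectively zero once settled
print(round(max(trace[300:]) - min(trace[300:]), 3))
```

Note the P-only controller settles slightly below the set speed because of the steady-state offset against drag; real cruise controls add an integral term to remove it, which is why they hold the set speed exactly without brake jabbing.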
"The self-driving vehicle was doing 38MPH in a 35MPH zone, we're told."
Huh? If the AI can exceed the speed limit, then there is something wrong with it. It either can't measure its speed accurately, doesn't know what the speed limit is everywhere, or has a tolerance programmed in.
I wonder what the actual limits are...
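One way a tolerance could end up programmed in is if the planner follows prevailing traffic up to a fixed percentage over the posted limit, as an earlier commenter speculated. A purely speculative sketch; nothing here reflects Uber's actual software:

```python
def target_speed_mph(posted_limit, traffic_flow=None, tolerance_pct=10.0):
    """Hypothetical target-speed selection: cruise at the posted limit,
    or match prevailing traffic up to a programmed tolerance over it.
    All names and values are illustrative assumptions."""
    ceiling = posted_limit * (1 + tolerance_pct / 100.0)
    if traffic_flow is None:
        return float(posted_limit)
    return min(float(traffic_flow), ceiling)

print(target_speed_mph(35, traffic_flow=40))  # capped at 38.5 with a 10% tolerance
```

Under those assumed rules, 38 in a 35 zone would be within the programmed ceiling rather than a measurement error.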
She shouldn't have been crossing the road, she was riding a bike illegally, she wasn't riding a bike (therefore we can't know the facts), bikes are illegal on public roads, it's the pedestrian's responsibility to be safe, she was wearing dark clothes at night, the sun was in the driver's eyes, didn't pay road tax, innocent motorist blamed for tragic unavoidable accident.
Has that covered everything?
Based on what I've read to this point, the victim was crossing the road with a bike, which you would ordinarily walk across, as one is only supposed to ride along a road. Second, there wasn't a crosswalk there, so the victim was likely jaywalking. Third, the car didn't even attempt to stop, meaning it basically never saw the victim. This is where I'm not too clear, because I don't know enough of the context to be sure the victim was visible for long enough to expect a reaction.
"it’s very clear it would have been difficult to avoid this collision in any kind of mode [autonomous or human-driven] based on how [the victim] came from the shadows right into the roadway."
Well, if you cannot easily see the dark roadsides, then maybe you drive more slowly just in case?* Especially given the current crop of phone- and headphone-distracted people. I get several "unexpected" step-outs into the road a year in town driving, except they are not wholly "unexpected": I assume stupidity from people and drive carefully on that premise, e.g. under the speed limit (sometimes considerably so in badly lit areas or past parked cars), and so have (fortunately) not hit any of them. Even taking care, I was only able to avoid hitting one by swerving partially into the opposite lane, which was only possible because there was no traffic in that lane.
As a UK driver I expect people to cross anywhere - we have no jaywalking rules here, so in towns you can expect "unexpected" crossing anywhere, often irritatingly within 10 - 20 metres of a designated crossing. (I'm also a pedestrian, cyclist, bus and train user, as an aside.)
Surely "shadows" not that much of a get out if the car had LIDAR? An advantage human drivers do not have.
Are the Uber AIs "trained" on "king of the road" style drivers instead of considerate ones?
* Or am I just an old has-been advocating "defensive" driving instead of the me, me, me, my-journey-as-fast-as-possible-is-all-that-matters approach?
It seems to me we have metal vehicle bodywork to protect those inside from harm, mostly from impact with other metal clad vehicles. With autonomous vehicles it seems those outside the vehicle are more in need of protection.
So bodywork which is soft and squidgy but doesn't bounce impactees across the road may be where we need to invest next.
"... But am I the only one that is reading that the car was speeding? It was actually breaking the law?"
Who cares?
A good driver knows when to speed and when not to. There is no reason to assume an AV can't become similarly discerning. The issue should be safety, not slavishly following arbitrary regulations.
Human drivers have an error rate. Accordingly, they run down an expected number of pedestrians or bicyclists annually (among other adverse events) and they kill some of them. According to the Pedestrian and Bicycle Information Center, the 2015 deaths ran to about 5400 pedestrians and 800 cyclists; nonfatal injuries seem to have been around 70000 pedestrians and 45000 cyclists.
Controlling software for autonomous vehicles also will have an error rate, and such vehicles will run down an expected number of pedestrians and cyclists and kill some of them. In their history to date, the known number of such errors seems to be 1, far too low for statistical analysis.
The fact that both humans and the sensor/software autonomous vehicles operate in a physical environment with a certain amount of unpredictability guarantees that an error rate of zero never will be more than a goal that can be approached but not attained. As they are an alternative to human controlled vehicles, autonomous vehicles should be judged against the fairly well known results attained by human control, although it is proper now, while they are being evaluated before general use, to subject each incident to elevated scrutiny directed to improving outcomes.
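The comparison the commenter describes can be framed per mile travelled. A back-of-envelope sketch using the 2015 figures quoted above; the total-mileage and AV-mileage numbers below are assumptions for illustration, since no reliable public AV figure existed at the time:

```python
HUMAN_PED_DEATHS_2015 = 5400   # from the Pedestrian and Bicycle Information Center figure above
HUMAN_MILES_2015 = 3.1e12      # assumed total US vehicle-miles travelled in 2015
AV_PED_DEATHS = 1              # the single known incident
AV_MILES = 1e7                 # placeholder: public AV test mileage, not a real figure

human_rate = HUMAN_PED_DEATHS_2015 / HUMAN_MILES_2015
av_rate = AV_PED_DEATHS / AV_MILES

print(f"human: {human_rate:.2e} pedestrian deaths per vehicle-mile")
print(f"AV:    {av_rate:.2e} per mile (not meaningful with n = 1)")
```

With a single event, the uncertainty on the AV rate spans orders of magnitude, which is the commenter's point: one incident is far too few for any statistical comparison against the human baseline.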
When you die, how would you like it if it says "possibly homeless" on your death notice???
That person was someone's child, and had parents and friends. Saying such a thing serves no purpose other than to try to marginalize the woman's death. Even if the person was without a home, it's still a belittling thing to say and has NOTHING to do with being in a car accident. Did Uber pay you to make her death seem less important? Pisses me off that people can be so mean to the dead.
Yeah, I was homeless 35 years ago for 3 years, and in AZ. I worked my ass off and got things going fine. I met a lot of good people at the shelter who, through no fault of their own, were screwed. People who lived pay check to pay check until their apartment building burned. A woman married 30 years whose husband kicked her out for a young girl; she had nothing. Kids who were dumped on the street because their parents were too messed up to care for them. Don't marginalize someone's death to make Uber look less bad. That is disgusting. Maybe when you end up out of work from brain cancer and have your home taken when you run out of money to pay the taxes, you will see we are all people - not worth more or less than anyone else. GRRRRR, that pissed me off.
Why did the algorithm break the law?
The latest on the Uber death seems to be that the woman suddenly came out of the shadows which may be a stupid thing to do at any time. The car was speeding while travelling at 38 instead of 35 mph* and appears to have made no attempt to stop before killing her.
The human monitor is guilty of failing to monitor the car on two counts: allowing the car to speed, and making no effort to prevent harm to other road users.** Was the monitor even watching the road, or a screen? In flying there is a well-known concept that the cockpit is for looking out of.
* I know 3 mph is not much but the driver was breaking the law.
** In the UK this is called driving without due care and attention.
Having seen the video from the car of what happened, it's very cut and dried that the cyclist was the one at fault: crossing the road in the pitch-black dark, no hi-vis at all, no retro-reflectors on the bicycle.
The only way any car was going to avoid that incident is if the car has an infra-red night-vision system to see in the dark, which obviously Uber didn't spring for in the car's vision system. Given that there are ways for cars to be able to see in the dark, they need to be made mandatory for autonomous cars.
She (particularly her jeans) looked more than visible enough to me in the video from the moment the headlights were on her, and even before that it seemed clear that something was in the road. An average adult driver would typically have slowed down upon seeing that some kind of large object was in the road, turned on their high-beams, then changed lanes and/or braked hard to avoid actually hitting it.
"The only way any car was going to avoid that incident is if the car has an infra-red night-vision system to see in the dark, which obviously Uber didn't spring for in the car's vision system."
They sprung for something even better: Lidar and RADAR. Although, neither seemed to work.
Agree with Oneman2Many on this - based on the dashcam footage this looks like an instance where a self driving car should have had a massive advantage over a human driver - that of being able to detect objects moving into the path of the car that could not be detected with the human eye, in this case due to darkness.
That said, the person crossing the road appears to have done so without observing the oncoming car.
In the video the pedestrian/cyclist-walker certainly does "appear from nowhere". That said, the video does seem to crush the blacks a bit. While in general the high contrast of night driving makes for difficult imaging (and seeing), I wonder, had you been there in person, whether the shadows would have seemed quite as dark as they do on the video?
While I suspect the camera gamma-curve could have been more appropriate, I also wonder whether the headlamp beams were overly-dipped, or could have had a better profile.
In the UK our headlamps spill a lot to the left (because we drive on the left) and would have picked out the pedestrian at the side of the road before they entered the line of traffic. In the States of course, their lights would be configured the other way.
It seems fairly clear that the woman with the bike made an error of judgement; it feels like the car should have been able to do some more though...
These links put the incident in a very different light to the dark dashcam footage:
https://www.google.com/maps/@33.4363673,-111.9425082,3a,75y,324.7h,86.2t/data=!3m6!1e1!3m4!1sseIHdIkV5FyYzACTGokvBg!2e0!7i13312!8i6656
https://imgur.com/gallery/XQrAB
The road widens at this point, although you can't see that in the original dark dashcam footage.
The lady with the bike was crossing her fourth lane by the time she came to grief.
- From the pedestrian's perspective, it's a lot harder to time/judge how fast a car is coming when she left the kerb so much earlier. It was not a wise place to cross (and is forbidden by signs), but her error in judging the timing is much finer / more forgivable than other reports, or a view of the dashcam alone, would suggest. As pedestrians, we've all made similar misjudgements a handful of times in our lives. The car would only have had to slow by 5-10 miles per hour a few tens of yards earlier, and the lady would have made it across the road; it would have been close, but she'd have been OK.
- As far as the car's navigation cameras / systems are concerned she most definitely did not "step off the curb into the path of the car" - she'd already crossed 3 lanes, and should have clearly been identified, and identified as being on a collision course.
The car definitely should have been able to prevent this tragedy.
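The "slow by 5-10 mph a few tens of yards earlier" claim is easy to check with rough numbers. Assuming 12 ft lanes and a ~3 mph walking pace while pushing a bike (all figures are assumptions for illustration, not measurements from the scene):

```python
FT_PER_LANE = 12.0   # assumed lane width in feet
WALK_MPH = 3.0       # assumed walking pace pushing a bike

def seconds_to_cross(lanes, walk_mph=WALK_MPH):
    """Time for a pedestrian to cross the given number of lanes."""
    feet = lanes * FT_PER_LANE
    return feet / (walk_mph * 1.46667)   # mph -> ft/s

def car_travel_time(feet, speed_mph):
    """Time for a car to cover a given distance at constant speed."""
    return feet / (speed_mph * 1.46667)

# Crossing four lanes takes ~11 s; a car 150 ft away covers that
# distance in ~2.7 s at 38 mph versus ~3.7 s at 28 mph.
print(round(seconds_to_cross(4), 1))
print(round(car_travel_time(150, 38), 1), round(car_travel_time(150, 28), 1))
```

That extra second or so of margin at the lower speed is roughly the point being made above: the pedestrian crosses the final 12 ft lane in under three seconds, so a modest, early slowdown would have let her clear the car's path.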
One self driving autonomous tin can kills one person, 20 million other cars controlled by humans kill 20 people. And I doubt the relatives of the deceased would consider it insignificant.
Go back to basic maths class.
Self-driving cars are just a stupid idea that won't be ready for 20 years. Expect more innocent road users to get wiped out until they realise that a wetware component is a necessity, or the public finally pipe up and say no more beta-testing deaths.
Traveling by aircraft is statistically MUCH safer than by car, so where are the pilotless aircraft?
Would YOU get one?
It was her fault for jaywalking (there's a reason that's illegal), not the AV's. Hell, she probably would have been hit anyway even with a human behind the wheel, since those tend to be texting, getting a BJ, or otherwise not paying attention while driving. AVs will ultimately be better than human drivers; pedestrians just need to pay attention near roads if they don't want to be killed by robot cars. Same reason it's illegal to walk on railroad tracks: a train won't and can't stop just because you are standing in its way.