I don’t trust it, not at all. Nothing this side of Star Trek is actually capable of driving a car as well as a human.
Tesla driver blames full-self-driving software for eight-car Thanksgiving Day pile up
A serious accident on San Francisco's Bay Bridge has been blamed on Tesla's "full-self-driving" software by the driver, and the US government is investigating. The smash-up occurred around lunchtime on Thanksgiving Day on the Bay Area's second most famous bridge when the Tesla Model S reportedly slowed suddenly from the …
COMMENTS
-
-
Friday 23rd December 2022 05:28 GMT Duncan10101
Hmmmmmmm
I'm not so sure. Not because autonomous vehicles are especially good, but because human drivers set a very low standard for safety.
If you look at your car, it's smothered with safety devices, each of which is there because we learned (the hard way) that people really aren't that good at driving. Seat-belts, air-bags, anti-lock brakes, traction control, crumple zones, bumpers, alerting devices that stop you from wandering across lanes when you fall asleep at the wheel, the list goes on. That's just on-car ... what about all the surrounding stuff, like lane markings, traffic-control-lights, "Pedestrian Crossing" signs, flashing beacons, and barriers that stop you driving into the path of a train (not that drivers don't go around those!)
People crash their cars all day, every day. The difference is that you don't get news articles like "Man blamed for crashing his own car". But with an autonomous vehicle crash, there are stories aplenty ... even if (as in this case) the full facts aren't known yet. Really this is a story about a driver blaming his autonomous car. "Er ... it wasn't me, the computer did it". Indeed, if you watch the accompanying video, it seems likely that the driver was doing something unsafe, that the software wouldn't be expected to handle. I'd rather see the outcome of the investigation, but by that time the incident will be forgotten and all Joe Public will remember is "Nasty unsafe Tesla car crashes itself and injures innocent people."
One positive of all this human-inability-to-drive is that the safety devices work very well for computer-caused crashes too. An 8-car pile-up in a tunnel only resulted in one person going to hospital, so I'd call that a win for standard safety devices.
Ultimately, we need more data. And I'd like to see a follow-up story once the facts are known.
-
Friday 23rd December 2022 08:15 GMT Anonymous Coward
Re: Hmmmmmmm
but because human drivers set a very low standard for safety.
It is this facile statement that is at the root of Tesla's poor driving, and their failure to make it do something valuable (full self parking anyone?)
Humans are pretty good at driving. If Teslas were _actually_ better, then Tesla would be making real raw data available to academic studies that would have proved just that, in the 4 years since Tesla made claims of superiority.
Not doing what they are actually doing, which is keeping all the raw data secret and having "Honest Elon" simply spout that they are bigly better than humans, with no independent support at all.
Tesla behaves far more like a pharma company with a totally marginal drug, than people who really have something 5x better to offer.
-
Friday 23rd December 2022 09:25 GMT myxiplx2
Re: Hmmmmmmm
The fact that the Tesla apparently slowed suddenly but didn't crash yet 7 humans managed to pile into it and other cars rather disproves your argument.
At least 7 cars here were driving too close, or not paying enough attention, and ploughed into other vehicles. The initial trigger here may have been the Tesla, but poor human decision making caused the crashes.
-
-
Wednesday 28th December 2022 18:38 GMT Si 1
Re: Hmmmmmmm myxiplx2
I don’t know about the US, but in the UK, if you crash into the back of the car in front of you it’s your fault, not the fault of the driver in front. There could be any number of reasons he performed an emergency braking manoeuvre, it’s your responsibility to leave a large enough gap in front that you have enough time to react and stop before hitting the car in front.
That all these drivers failed to do that shows that humans really aren’t very good at driving safely. It should also be noted that the Tesla didn’t hit anything, the only ones causing accidents here are the human drivers.
While I’m sure the Tesla self-driving mode has plenty of shortcomings, (MKBHD did a great video of recording his drive to work with his Tesla in full autonomous mode), I think this incident says more about other drivers than the Tesla.
-
Friday 23rd December 2022 12:42 GMT Version 1.0
Re: Hmmmmmmm
I drive in the USA, but I always drive at the speed limit so virtually all the other cars, including all the police cars, just go fast past me all the time. If I need to slow down I always look in the mirror quickly to make sure this kind of problem doesn't happen. I wonder if that was the Tesla problem, the software was only looking in front of the car in detail, not behind it? If so that might be a fixable issue in a new Tesla update.
I have a red sticker on the back of my car that says, "If this sticker is Blue then you are driving too fast"
-
Friday 23rd December 2022 13:19 GMT Anonymous Coward
Re: Hmmmmmmm
Given that the Tesla was at the root of the accident, it makes the rest of your argument superfluous. The Tesla did what it often does in self-driving mode (i.e. fail disastrously) and thus caused accidents. A human driver would NOT have caused this pile-up, on account of not failing a task which is actually not that hard for humans, given the volume of traffic versus the number of accidents.
What you're also forgetting is that a human picks up a LOT more situational awareness than a Tesla. It's easy to forget because we do this automatically and subconsciously (which takes time to develop, hence the initial information overload for a learner driver). We can diminish that by things like loud radios but that is NOT the default. We pick up sensory data of road conditions and grip on the road via the steering wheel, we pick up visual data via eyes and mirrors, we hear the traffic - there's a lot of data there that your brain processes without you even being aware of it. More importantly, the brain of a good driver also runs continuous analytics on that data and assesses a whole matrix of consequences. You spot cars behind you which gives you an idea how hard you can hit the brakes without getting it in the neck, you see how people drive around you and can thus draw conclusions as to what they're about to do (which, I must add, could be wrong but you re-assess the situation accordingly).
It is utter BS that we drive worse than a computer, as Tesla's autopilot keeps proving if you correlate the raw data rather than Tesla's selected analysis. Yes, we react slower, but you should look at that in light of the fact that we (a) take that into account and (b) get it right most of the time, so we do not need split-second corrections - so it's actually a lot more efficient.
Yes, we're fallible as human beings. That doesn't mean computers are really better. Unless, of course, you're selling something...
-
Friday 23rd December 2022 13:26 GMT John Robson
Re: Hmmmmmmm
No - the cause of the accident was people behind the tesla not leaving braking distance.
The cause of that is probably pushy idiots cutting into people's braking distances in order to advance an additional 18" along the road, meaning that people don't want to leave enough distance to stop in, for fear of someone else cutting them off.
So no - the cause is in no way "vehicle braked", it's "drivers failed to leave sufficient space to be able to stop in the distance they could see to be clear".
-
-
Friday 23rd December 2022 14:26 GMT John Robson
Re: Hmmmmmmm
Well I manage it consistently. Drove past five separate crashes on the motorways yesterday, had at least four people just cut directly in front of me, despite there being no particular reason to cut across my bonnet rather than at least a few tens of yards down the road.
Also sat and watched one car (take a wild guess at the manufacturer) switch between lanes 2 and 3 a dozen times inside two minutes, behind the same two vehicles, both of whom were driving at the posted (and camera enforced) 50mph limit.
Also watched a van try to push in front of a vehicle in lane 4 by undertaking, run out of space so they went to lane 2,1,2,3,4... and were stuck in the traffic jam precisely one vehicle further ahead.
Besides which... I seem to recall writing a whole paragraph about why driving standards have slipped so far. Just because you don't keep a braking distance doesn't mean it's suddenly someone else's fault if you run into the back of them.
If someone had dropped a concrete block off the roof at the tunnel exit then I would anticipate a couple of vehicles being scuppered by that, because on a motorway you do assume that your braking distance includes some of the tarmac past the vehicle ahead, since that vehicle will also have a braking distance. But that's not the case here, the Tesla may well have slowed quite quickly, but that's what the human drivers behind failed to deal with.
The irony here is that if they'd been using even a basic adaptive cruise control mode they'd probably have been fine.
-
Saturday 24th December 2022 08:43 GMT Anonymous Coward
Re: Hmmmmmmm
Well John, I suggest that you try braking hard in a tunnel for no reason, and see if the beak lets you off the dangerous driving charge because "the cars following me shouldn't have hit me"
Humans get charged and convicted for doing this.
That said, my money is on the driver being fast asleep while Mr T drove, and had one of those dog twitching dreams and swerved his car across the tunnel creating mayhem, only waking up amongst the wreckage in time to blame the car.
And if on the off chance that wasn't what happened, it will be, after Tesla techs have extracted the "data"
-
Saturday 24th December 2022 14:54 GMT John Brown (no body)
Re: Hmmmmmmm
Well John, I suggest that you try braking hard in a tunnel for no reason, and see if the beak lets you off the dangerous driving charge because "the cars following me shouldn't have hit me"
At this stage, you are assuming the Tesla braked "for no reason". Maybe it did, maybe it didn't. If it was for no real reason, then the Tesla driver bears some responsibility of course, but the following drivers who then hit him and each other are ALSO due charges for dangerous driving because, as stated many times above, they were driving too close to stop in time. Maybe the Tesla braked hard because something fell off a vehicle in front, tyre tread tearing off a truck, badly loaded items, wear and tear causing something to fall off, some twat throwing rubbish out the window. We don't know yet, but you are assuming the Tesla or the Tesla driver did something wrong.
And yes, some of us DO drive with a safe stopping distance in front of us. Those of us driving 1000+ miles per week are fully aware of written stopping distances and actual stopping distances, ie it's very, very rare for the vehicle(s) in front to come to a sudden and abrupt stop so take that and conditions into account. IME, most drivers are sensible and careful, but there are so many out there that it only takes a small percentage to make it seem worse than it is. And we always remember the idiots who do stupid things because they are the exceptions. We rarely take note of or remember the 1000's of other drivers we saw that day who caused no issues.
-
Friday 6th January 2023 03:54 GMT Anonymous Coward
Re: Hmmmmmmm
Ok JB, what is a sufficient distance? 3 second rule? Kind of depends on what you're driving and what the person in front is driving. If the car in front is a GT3 with carbon ceramic brakes and you're just driving a run of the mill sedan then chances are if they stamp on the brakes you WILL be hitting them in the arse on account of their vastly superior braking ability.
The old adage that if you hit the car in front then you are at fault is a complete myth. It's a starting point for the investigation, but if you stamp on your brakes for no reason and the car behind you hits you then you can be found at fault - and I have that on advice of a senior and very experienced traffic cop. If they were driving reasonably and you unreasonably then you are to blame.
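The point about differing braking ability can be made concrete with the standard stopping-distance formula: total distance is reaction distance plus braking distance. A minimal Python sketch, using illustrative reaction-time and deceleration figures rather than any official ones:

```python
# Rough stopping-distance sketch (illustrative numbers, not official figures).
# Total stopping distance = reaction distance + braking distance:
#   d = v * t_react + v^2 / (2 * a)
# where v is speed in m/s, t_react the driver's reaction time in seconds,
# and a the braking deceleration in m/s^2 (higher for better brakes).

def stopping_distance(speed_mph: float, reaction_s: float, decel_ms2: float) -> float:
    """Return total stopping distance in metres."""
    v = speed_mph * 0.44704  # convert mph to m/s
    return v * reaction_s + v ** 2 / (2 * decel_ms2)

# An ordinary saloon (~7 m/s^2) vs a sports car with much stronger brakes
# (~11 m/s^2), both at 70 mph with a 1.5 s reaction time:
saloon = stopping_distance(70, 1.5, 7.0)
sports = stopping_distance(70, 1.5, 11.0)
print(f"saloon: {saloon:.0f} m, sports car: {sports:.0f} m")
```

Even with identical reaction times, the car with the stronger brakes stops tens of metres shorter, which is why a fixed gap rule behind an unknown vehicle can still leave you short.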
-
-
Sunday 25th December 2022 19:31 GMT ChrisC
Re: Hmmmmmmm
Yes, the vehicle/driver in front might be found liable depending why they suddenly slowed, but that doesn't give the following vehicles/drivers a get out of jail free card - regardless of why the first vehicle slowed, if another vehicle rams it up the arse then that's entirely on the driver of that second vehicle for not leaving enough space to respond safely.
-
Tuesday 27th December 2022 21:23 GMT ChrisC
Re: Hmmmmmmm
And the one time that reason ISN'T visible to you will be the time you're thanking whichever deity you choose that you left enough space to react without any damage being incurred to your vehicle or to yourself...
Doesn't matter how many times you get away with stuff whilst driving due to other vehicles normally behaving in ways you can predict, because it only takes one slip of judgment, one completely out of the blue event, one other driver doing something fully off the wall, to change your life in an instant.
-
Saturday 24th December 2022 17:10 GMT genghis_uk
Re: Hmmmmmmm
Let's forget the self driving element for a second and consider that it was a normal car and the engine blew - this would cause a similar rapid deceleration and the same 7 cars would have ploughed into the back of each other. Would you say the other drivers were too close or not paying attention then?
If you crash into the back of someone it is almost always your fault - even if the car you crashed into was a Tesla!
-
Sunday 25th December 2022 16:52 GMT Anonymous Coward
Re: Hmmmmmmm
There's a difference between a valid cause and one that isn't.
If you suddenly go full on the brakes on a motorway you're acting outside parameters considered normal, and as such you CAN end up with the blame for the consequences, especially if you do so in sight of a camera.
There have been a number of court cases about such incidents. They're not easy, but if independent witnesses or camera evidence confirms you suddenly stopped your insurance will have to pay up. If the results include injury or lethality you could even face criminal charges.
-
Friday 23rd December 2022 10:39 GMT Boris the Cockroach
Re: Hmmmmmmm
Quote:
Humans are pretty good at driving.
No they bloody aren't.
Think how bad the average driver is, then make a note that 50% of drivers are worse than that.
We're talking the people who use 2 cable ties to defeat the 'hands on wheel' device, then climb into the back seat of the car and then blame the Tesla for driving into a tree (I call this Darwin in action)
But even without a Tesla or its drive assist (or any other drive assist), people are almost completely useless at driving their cars..... eg last weekend, pissing with rain, heavy spray on the motorway.... nearly everyone is driving at the same distance apart at the same speed as they would on a bright sunny day even though you can barely see 100 yards... and there were even a couple of cars with no lights on.......
All I see when tesla mention the drive assist is just another device like the seatbelt, airbags, and anti-lock brakes that protect the occupant of the car from the actions of the stupid around them.....and the stupid in the driving seat.
-
Friday 23rd December 2022 13:27 GMT Anonymous Coward
Re: Hmmmmmmm
Given the volume of traffic without autopilots-that-really-aren't I don't think you should write off human drivers that easily. Yes, there are some spectacularly bad drivers out there and that's usually because they drive tactically, not strategically (i.e. they don't plan ahead well) but given that most decent drivers can step into any car and get moving safely without more than a few minutes of adaptation time I think humans are still well ahead - exactly because we can plan and adopt strategy on the fly, even if we've never been to the location at hand before.
I think we're at least a decade away from any computing resource that can do the same, Tesla sales BS notwithstanding. By the way, there are other companies engaged in this, but they're a lot more careful by, for instance, not enlisting every road user without their permission in a public alpha test that can (and does) get them killed.
-
Friday 23rd December 2022 14:35 GMT John Robson
Re: Hmmmmmmm
"I think we're at least a decade away from any computing resource that can do the same"
Which is why it's still at the supervision required stage.
And I'm looking forward to it being not at that stage, even if only on limited roads (i.e. motorways, where there are limits on the types of vehicles allowed, and physical separation of vehicles in different directions, and all free flowing junctions).
Because even that "minor" aspect of driving being automated would be a significant benefit in terms of driver fatigue/alertness.
Watching FSD real world tests is actually very interesting - it does a remarkably good job in a wide range of mixed traffic conditions. Still seems to struggle a little bit in road works, but I'm generally very impressed. No, that doesn't mean I think we can start surrendering licenses and rely on computer driving yet.
-
-
Wednesday 28th December 2022 13:13 GMT vogon00
Re: Hmmmmmmm
Humans are pretty good at driving.
Some more than others. It depends how much situational awareness you can maintain, and how much you trust your fellow road users to keep to the generally accepted norms.....whatever they are. Me, I assume that all road users, including myself, are human and fallible and allow plenty of space (actually thinking-and-action time).
A good friend still suggests that the best safety device would be a large, very obvious driver-facing spike mounted in the middle of the steering wheel. That should keep people focussed:-)
-
Friday 23rd December 2022 08:41 GMT Jou (Mxyzptlk)
Re: Hmmmmmmm
> anti-lock brakes, traction control,
Those two are independent of driving skill. No matter how good you are at braking or accelerating: the force is never evenly split among the tires. First the mechanics of the car, then different wear on the tires, and different road conditions under the four tires. All that stuff can never be in perfect balance. Only a machine can handle that.
As for the rest: Yep, human stupidity problem.
-
-
Saturday 24th December 2022 12:46 GMT Anonymous Coward
Re: Hmmmmmmm
You must be trolling.
Traction control has zero impact on speed other than when grip conditions become marginal, and then you really should slow down anyway.
The only situation during normal road use in which traction control should be switched off is if you're driving in fresh snow. Oh, and you must kill it when you're stuck in snow or mud and want to see-saw your way out of the rut, otherwise your power will be cut at the exact moment you need it most (happened to me in a Ford rental, and the manual only showed how to disable that on page 197 or something which is really fun when its erroneous actions have just caused you to slide into a snow bank at midnight).
-
-
Saturday 24th December 2022 18:59 GMT Anonymous Coward
Re: Hmmmmmmm
In my experience they run them in every country. Absolutely worth attending, even just to brush up on your reflexes.
One of the most important driving courses is skid pad training. I am in two minds about the idea of making that part of driving license training as I think it's more effective if you have a few months actual driving under your belt so it tunes reactions you already have.
That said, some driving courses are just fun to do, but they're a tad more expensive than average, also because they often involve specially equipped vehicles. And J turns are kinda hard on your tires if you later train with your own vehicle :).
-
Friday 23rd December 2022 09:24 GMT Andy 73
Re: Hmmmmmmm
In the middle of that comment there's the phrase "I'd rather see the outcome of the investigation"... yet it's surrounded by a long list of reasons why it's definitely the driver's fault.
It's reasonable for people to be interested in the root cause of this accident, because we're on the cusp of introducing a radically new technology to a vast number of drivers. Great claims have been made, and it's being used as a marketing tool for auto makers, so there are high expectations which may turn out to be false. Because accidents are relatively infrequent compared to the number of individual miles driven, it's very hard to determine statistics from individual events - so every event is going to be newsworthy until trustworthy statistics can be produced.
In short, whilst we should be cautious reaching early conclusions, we should not dismiss the possibility that this complex new technology introduces new risks to driving. We can certainly make claims that from a hardware point of view it can react faster, steer more accurately or whatever, but driving is not a purely mechanical process - as shown here, it's a co-operative activity and still relies on the behaviours of humans to account for the limitations (both software and physical) of trying to make a few tons of metal move swiftly through a complex environment.
In fact, it would frankly be amazing if such a significant change in the way cars were controlled *didn't* change the risk profile. It may be more or less safe than a human driver, but we can be pretty sure that the type of accident a machine driven car gets involved in is likely to be different from a human driven one. We may need to pick our poison at some point.
-
Friday 23rd December 2022 10:24 GMT Neil Barnes
Re: Hmmmmmmm
I think also that there is scope for locale comparison: the way people drive in the US is not the same way people drive in the UK or the same way people drive in DE... and neither do people drive the same way on an autobahn/freeway/motorway as they do in the streets of their hometown or the cross country roads in the north of Scotland.
A lot of this may be down to how people are expected to behave; I don't know how much of this finds its way into the software, though I am sure they have considered it ... but the same logic has to cope with the DE habit of indicating and pulling out in front of a car doing 200kph in the outside lane, vs the UK habit of overtaking in the middle lane because the outside lane is full - practical observations, not necessarily rules of the road, just what happens.
Of course, 90% of drivers consider themselves better than average...
(p.s. on a Tesla, can you set the cruise control to a speed which is illegal on a given road? If so, why? And come to think of it, the same question applies to any other vehicle with inbuilt GPS speed maps.)
-
-
Monday 26th December 2022 11:43 GMT Anonymous Coward
Re: Hmmmmmmm..low Bay Area driving standards??
Compared with where? Italy? France? The M25? Delhi?
LA drivers are a lot better than Bay Area ones, because, well, it's pay attention or suffer the consequences. I've seen a pickup truck forced onto the shoulder on the Ventura Freeway because he was being an idiot. And no one would let him back on. I've seen big rigs scare the living crap out of idiots doing really stupid lane changes. Plus the local LAPD / CHP do LOTS of traffic stops, because there is usually plenty of space on the shoulder for pull-overs. The rest of the time everything mostly flows at 70mph plus, at traffic densities that would reduce a Bay Area freeway to stop-and-go.
While in the Bay Area you have a large immigrant / out-of-state population and low / no traffic stops, due to the lack of hard shoulders on the busiest freeways (101/80/580/880). Makes a big difference. So drivers don't pay attention. Then you have the sky-high medication rates for certain demographics, which accounts for most bad Marin County drivers. Move away from the urban cores and the driving gets better very quickly.
The out-of-state plate drivers fall into two groups: the mostly very good drivers from the Pacific Northwest, and the god-awful drivers from Back East and Florida. See a NY/NJ/PA/MA or FL plate and expect some really terrible driving. As bad as the Russians but not as suicidal. Southwest drivers tend to be OK, although AZ plates often mean old and random.
-
-
Friday 23rd December 2022 13:37 GMT Helcat
Re: Hmmmmmmm
"on a Tesla, can you set the cruise control to a speed which is illegal on a given road? If so, why? And come to think of it, the same question applies on any other vehicle with inbuild GPS speed maps."
How do we know the maps are correct? Or that the positioning is correct?
I know of two spots around where I live (UK) where GPS can get your location wrong and assume you are on an adjacent, lower speed, road. One location is for a 40mph road, the other is a 70mph limit - the parallel road is 30mph. Do you really want a car to decide you're speeding and drop your speed from 70 to 30? That's... kind of a problem.
Then there's the question as to how accurate the data is. My current car has speed sign recognition, and some system to know what the speed limit is when there are no speed signs - and it displays these as a value and a mark on the speedometer. It happens to be the most inaccurate system in the car, and notably keeps reporting roads as 20mph if it can't get a decent signal... which sounds suspiciously like what happened with the Tesla. It also reports speed limits of 45mph and 90mph, neither of which are valid in the UK, so there's something distinctly wrong with the system, making it useless.
So the question comes back to: How does the car know what the speed limit is on that specific road, and is it reliable enough to use to limit the car's speed? And that means the answer to your question is: The system isn't reliable enough.
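One obvious mitigation for the unreliability described above is to sanity-check detected limits before acting on them. A hypothetical Python sketch - the function name and fusion rules are illustrative assumptions, not how any real car works:

```python
# Hypothetical sanity check on a detected speed limit: discard readings that
# are not legal limits in the jurisdiction (e.g. 45 or 90 mph in the UK), and
# fall back to the last trusted value when map and camera disagree.

UK_LIMITS_MPH = {20, 30, 40, 50, 60, 70}

def plausible_limit(camera_mph, map_mph, last_good_mph):
    """Return a speed limit worth trusting, else the last known-good value."""
    camera_ok = camera_mph in UK_LIMITS_MPH
    map_ok = map_mph in UK_LIMITS_MPH
    if camera_ok and map_ok and camera_mph == map_mph:
        return camera_mph      # both sources agree on a legal value
    if camera_ok and not map_ok:
        return camera_mph      # trust the only plausible source
    if map_ok and not camera_ok:
        return map_mph
    return last_good_mph       # conflict or garbage: keep what we had
```

So a camera reading of 45 mph against a map value of 70 mph would be discarded in favour of the map, and two plausible-but-conflicting readings would change nothing - which is the conservative behaviour the comment is arguing for.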
-
Friday 23rd December 2022 13:46 GMT Chloe Cresswell
Re: Hmmmmmmm
I have a Q30 with sign recognition. The weirdest one I've seen yet has to be a national speed limit sign, overlaid with KM.
And that's before you drive on the motorway, where it doesn't "see" signs on the gantry, and then tells you the limit is 50, because it's read the sticker on the back of the lorry to your left...
-
Saturday 24th December 2022 06:27 GMT Neil Barnes
Re: Hmmmmmmm
I feel you may be applying more to the concept of 'cruise control' than I do. I refer to something merely that maintains a speed, and nothing to do with lane following or warning or with automatic speed adjustment; simply speed control as a mechanism to save the driver's right ankle from strain. I expect the driver to be aware of his surroundings and alert to changes; if the speed limit changes, then it is the driver's responsibility to change his speed. As you point out, the last thing wanted is a device that suddenly slams the anchors on if it decides the speed limit is different.
If the system thinks for whatever reason that the speed you're trying to set is over the local limit - either because it actually is, or because of faulty deduction or observation or location information or whatever, all it needs to do is politely say sorry, you appear to be going too fast, try again later. My point is that I know of none that do this, and I wondered why not.
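That "politely refuse rather than intervene" behaviour could be sketched as below; the function name and return shape are illustrative assumptions, not a real vehicle API:

```python
# Sketch of the behaviour suggested above: a plain speed-holding cruise
# control that declines a set-point above the believed local limit, rather
# than ever braking on its own once a speed is set.

def request_cruise_speed(requested_mph, believed_limit_mph):
    """Return (accepted_speed_or_None, message). Never intervenes after setting."""
    if believed_limit_mph is not None and requested_mph > believed_limit_mph:
        return None, (f"Sorry, {requested_mph} mph appears to exceed the "
                      f"{believed_limit_mph} mph limit here - not set.")
    return requested_mph, f"Cruise set to {requested_mph} mph."
```

With no believed limit available, the request is simply accepted - the driver remains responsible, exactly as the comment proposes.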
-
Saturday 24th December 2022 18:54 GMT skswales
Re: Hmmmmmmm
I've had one of the 'speed sign recognition' cars as a courtesy car. Wrong far too much of the time. And GPS maps can't know what's right. I updated my satnav just before the car went out of warranty and quite a lot of the speed limits were wrong even then.
Up here a load of 20 mph signs were dropped seemingly permanently in place early on during COVID as the local council wanted to encourage people to walk locally on the roads without pavements. Of course the temporary speed order for these signs lapsed over six months ago but the council can't be bothered to remove them (I checked with them)...
-
-
Tuesday 27th December 2022 21:47 GMT david 12
can you set the cruise control to a speed which is illegal on a given road?
"illegal" is a big elephant in the middle of that sentence.
There is a sense in which something is 'illegal' even if the courts and police do not enforce it, but there is another sense in which 'legality' is what the courts decide it is.
Is the 'right to bear arms' conditional on a 'well regulated militia'? Does it include the right to operate an anti-aircraft battery? Who decides what the law means, and after it is decided, do the words "legal" and "illegal" apply to what the courts think, or to what you think the legislation "clearly means"?
-
Sunday 1st January 2023 04:09 GMT Ghostman
Re: can you set the cruise control to a speed which is illegal on a given road?
Such as the marijuana (non)enforcement in many areas even though it is still illegal by Federal Law.
Right to bear arms Is Not conditional for a well regulated militia. The right to bear arms is given to THE PEOPLE, not the militia.
The armed populace is part of the militia.
-
Saturday 24th December 2022 15:04 GMT John Brown (no body)
Re: Hmmmmmmm
"its a co-operative activity and still relies on the behaviours of humans "
In particular at some junctions where eye contact, politeness and sometimes self-preservation are what decides who goes first. We've probably all seen the various videos showing self-driving cars get completely stumped by some junctions, the locations and movement of other vehicles and, of course, unexpected roadworks. Self-driving cars, as they are now, could really only work if all the other cars are also self-driving AND they ALL communicate with each other, whatever the brand, with standard and open protocols, not all using proprietary and patented systems and protocols. On that note, do Teslas within range of each other talk to each other and cooperate, or are they all acting independently? Tesla like to talk up how one car's experience and "learning" is passed on to all other Teslas. I wonder how true that is, and how/when that "knowledge" is aggregated and passed on?
-
-
Friday 23rd December 2022 14:26 GMT NerryTutkins
Re: Hmmmmmmm
Full self-driving is total nonsense. You're supposed to sit there doing nothing but watch the car, ready to step in and avoid an accident if it misfires in some way. What is the point of FSD if you just free yourself up to have to watch the car driving? You might just as well drive!
Clearly, the people driving these things are watching movies, fiddling with their phones, etc. like they probably did before, but feeling way safer because the car is "driving" for them. Until it isn't.
If they cared about safety, teach people to drive properly (driver ed in the US is a joke), start people with manual transmissions like in Europe so you have to learn to drive properly and don't have a hand free to text etc. make driving tests difficult so driving as a skill is valued and respected.
Then have the hi-tech in the car watching the human and stepping in in case of emergency, and not the other way around. Because robots don't get bored, they're able to step in at a moment's notice. If humans aren't engaged in driving, there is no way they will be fast enough to intervene in the likely event the car messes up.
Electric cars are great, but FSD is pointless until we can really trust it. All the while the driver is supposed to be there paying attention, what does it free him/her up to do?
-
Friday 23rd December 2022 22:06 GMT Anonymous Coward
Re: Hmmmmmmm
If they cared about safety, teach people to drive properly (driver ed in the US is a joke), start people with manual transmissions like in Europe so you have to learn to drive properly and don't have a hand free to text etc. make driving tests difficult so driving as a skill is valued and respected.
Interestingly, there appears to exist a parallel observation in the flying business. Apparently, commercial pilots that have progressed from private planes have a far better instinctive understanding of flight dynamics* than those who started on big simulators to get their commercial license. I'm not sure where I picked that up, but I think it was a report on a plane that crashed exactly because of a combination of factors that only someone with a feel for flight dynamics would have immediately understood - as nobody on that plane had that feel, they took decisions in an emergency that condemned the plane and its passengers to irrevocably crash, leading to the loss of life of all on board.
Back to cars: I agree that understanding the dynamics of what's under the hood is a good idea, but it is becoming increasingly less possible to do so. EVs simply don't have gears, or have at most two, and those are operated automatically. It's really only ICEs that may still have them, and those too are more and more supplied with multi-speed automatic gearboxes (which I don't like, because the super-economic 1l engines that car rental companies are now forced to buy tend to shift up so late that you kick the box into semi-manual just to get rid of the screaming noise - and that's just the passengers).
As with everything (and that includes IT), some knowledge of fundamentals does actually make you better at it. Having an active interest in them even more so.
* The expression "flying by the seat of your pants" actually has its origin there
-
Saturday 24th December 2022 10:25 GMT Alan J. Wylie
Re: Hmmmmmmm
Apparently, commercial pilots that have progressed from private planes have a far better instinctive understanding of flight dynamics than those who started on big simulators to get their commercial license. I'm not sure where I picked that up, but I think it was a report on a plane that crashed exactly because of a combination of factors that only someone with a feel for flight dynamics would have immediately understood
Physical, or traditional, piloting skills are typically developed through extensive experience flying small aircraft which have little or no automation. These aircraft force a pilot to develop an intuitive understanding of how airplanes behave in various regimes of flight, and anyone who cannot develop these skills will wash out at an early stage. Captain Marc Dubois no doubt had these skills: between 1977 and 1987, he obtained type ratings on no less than 17 different light aircraft and accrued thousands of hours flying them. If he had been in the pilot’s seat when the airspeed indicators failed on flight 447, there is little doubt that he would have reacted correctly: he surely had an intuitive understanding that, in the absence of any configuration changes, the plane will continue to fly on its previously established trajectory, even if all the instruments are lost — a sort of airman’s object permanence.
-
-
Friday 23rd December 2022 22:11 GMT Anonymous Coward
Re: Hmmmmmmm
The most dangerous issue is this illusion Tesla sells that you can somehow take over from the car after you have just been encouraged by the manufacturer via games and video not to pay attention to the road.
Building up and maintaining situational awareness takes time and continuous effort - far more time than the car gives you to become aware of what is happening around you. If the car's self-drive is giving up, your problem is imminent, which translates into making YOU responsible for the dangerous situation you find yourself in, while the car manufacturer has a nice log entry saying that they handed off, which absolves them from the consequences.
That's the prime reason I don't trust Tesla's statistics.
-
Sunday 25th December 2022 14:37 GMT Fruit and Nutcase
Re: Hmmmmmmm
that you can somehow take over from the car
There is a very simple solution.
Strap Musk into a Tesla with the latest iteration of their self-driving software on a test track. Get the vehicle up to speed and send it on a course where it will encounter a situation which it cannot handle - Tesla hands over to the human (Musk), who tries to correct/react, but the situation is one which has been engineered such that the result is fatal unless acted on within the time that Tesla has mandated as being possible.
-
Sunday 25th December 2022 16:06 GMT Anonymous Coward
Re: Hmmmmmmm
Musk is probably subject to Key Person insurance, which covers people critical to an organisation. That comes with conditions that dictate you should not willingly get yourself into dangerous situations, and I would love to hear what that insurer thinks about him using a Tesla running FSD.
It could well be that he's not even allowed to use it or risk voiding his cover, but you can bet on it that that would never be made public.
(and no, you don't need to be a gazillionaire to be so insured - in some areas of business it's standard risk management to cover the people with skills/expertise/IP critical to a project until the project is ready and documented)
-
-
Tuesday 27th December 2022 20:08 GMT gnasher729
Re: Hmmmmmmm
I would actually like the opposite - the car taking over in rare situations. I had a situation where I was driving along happily on the motorway and suddenly a car in front took a 90-degree left turn, straight from the middle lane across the outer lane into a field.
I was so gobsmacked I didn't react. Luckily I was far enough away that there was actually nothing to do, but otherwise I would have driven straight into the mess. A self-driving car with good programming and lots of processing power would have known the situation around my car and plotted a path avoiding any trouble.
-
-
-
Friday 23rd December 2022 20:05 GMT Ian Mason
Re: Hmmmmmmm
It isn't "a story about a driver blaming his autonomous car", it's a story about a whole bunch of drivers, who were driving too close for the speed they were doing, blaming the car at the front.
The reason for the lead car's speed reduction is largely immaterial, whether that be Tesla software responding wrongly or correctly, or the driver stopping for a good or even a poor reason. If the following drivers had kept safe distances for their speed, the whole thing would have been a non-event.
-
Sunday 25th December 2022 14:10 GMT Camilla Smythe
Re: Hmmmmmmm. Yeah but....
Isn't the tell in the article, in as much as Tesla say the software had an early release to people with a good driving record, and now the tards (Idiocracy reference) are going to get it? They started out by giving it to what looked, in statistical terms, like sensible people and, because statistics, eventually someone identified as sensible chose to drive full auto in an environment where they should have had a heightened sense of awareness, but they were having a wank... It seems to me that Tesla, and AI in general, may not be all that, in as much as their offerings may work higher up the IQ curve but, when they move down that curve, things rapidly start going wrong.
-
-
-
Friday 23rd December 2022 05:06 GMT Bebu
Capable of driving a car as well as a human?
"Nothing this side of Star Trek is actually capable of driving a car as well as a human."
Not saying much - clearly humans are a bit rubbish too. What happened to the one car length per 10mph rule of thumb for a safe stopping distance? Even one car length per 20km/h would do - brakes and tyres have improved since I learnt to drive, although attention spans have deteriorated.
Actually I recall the Enterprise in a write-off bingle with a planet in one of the sillier movie sequels.
-
Friday 23rd December 2022 09:46 GMT Anonymous Coward
Re: Capable of driving a car as well as a human?
I try to follow the 2 second rule - I find it quicker to judge. That’s 2 seconds minimum - a bit more if traffic allows, and a lot more in poor conditions or I think the driver in front is too close to the one in front of them. Not perfect (nothing is) but has served me well in >40 years of driving.
Mainly U.K. and continental Europe, though. I once drove on the Houston ring road where there was a minimum as well as a maximum speed limit - traffic was bunched up close and trying to keep what seemed a safe distance for me led to other cars cutting in front of me to fill the gap (and cars behind honking to show their annoyance at my leaving a gap). ISTR the road had 6 lanes for much of the length I drove it, and no rules on only passing on the offside. I was glad when I was able to get off that road.
With that in mind, it doesn’t surprise me that a car slowing causes a pileup. The US is the only place I’ve encountered minimum speed limits to maximise traffic density.
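For what it's worth, the two gap rules mentioned in this thread (one car length per 10mph, and the two-second rule) can be compared with a quick back-of-the-envelope sketch - the 4.5 m "car length" is my assumption, not a figure from the thread:

```python
# Rough comparison of two following-distance rules of thumb.
# The 4.5 m average car length is an assumed figure.
CAR_LENGTH_M = 4.5
MPH_TO_MS = 0.44704  # exact mph -> m/s conversion factor

def car_length_rule_gap(speed_mph: float) -> float:
    """One car length per 10 mph, expressed in metres."""
    return (speed_mph / 10.0) * CAR_LENGTH_M

def two_second_rule_gap(speed_mph: float) -> float:
    """Distance travelled in two seconds at the given speed, in metres."""
    return 2.0 * speed_mph * MPH_TO_MS

for mph in (30, 50, 70):
    print(f"{mph} mph: car-length rule {car_length_rule_gap(mph):4.1f} m, "
          f"two-second rule {two_second_rule_gap(mph):4.1f} m")
```

At 70mph the car-length rule gives about 31 m while the two-second rule gives about 63 m, so the two-second rule is roughly twice as conservative at motorway speeds - which may be why it has "served well" for 40 years.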
-
Friday 23rd December 2022 17:18 GMT Antron Argaiv
Re: Capable of driving a car as well as a human?
Slowing down to 20 in the high speed lane, immediately after merging (which is what I understand the Tesla did) is almost a guarantee of a pileup.
Sure, the humans shouldn't have been following so closely, but I rarely see that kind of caution. And you ARE at fault if you do something like that (merge, then hit the brakes). Of course Tesla will probably claim the data isn't recoverable for some reason...
-
Friday 23rd December 2022 18:11 GMT nobody who matters
Re: Capable of driving a car as well as a human?
"........The US is the only place I’ve encountered minimum speed limits to maximise traffic density............."
I am not sure whether it is to maximise traffic density, but there are places in the UK where there is a mandatory minimum speed limit - probably most notably in the Mersey tunnels - minimum in the Kingsway Tunnel is 10mph, whilst the minimum for the Queensway Tunnel is 10mph in the left hand lane and 20mph in the right hand one.
At the same time, there is also a bylaw which requires drivers to leave a minimum 23 metre gap between vehicles when moving, and 2.75 metres when stationary!
-
-
Friday 23rd December 2022 22:24 GMT Anonymous Coward
Re: Capable of driving a car as well as a human?
I follow the "look at least 3 cars ahead" rule. I adjust my distance according to how far ahead I can see, and if it's very busy I will often avoid what is known as the "fast" lane, because in my experience it makes very little difference to how fast I get from A to B.
By staying in the lane with HGVs (or, on a multi-lane road, at least one lane away from the "fastest") I can keep a good distance without people piling in because they see a gap, and as a bonus it places me where I can generally see far ahead, spot jams and slowdowns well before they become an issue, and roll off early. It makes for much smoother and more relaxed driving, which prevents dangerous fatigue and accidents.
Especially when I'm driving long distance, I sometimes just bail out of the mess at a petrol station and have a cup of coffee and something to eat, allowing the peak to pass. Risk management is simply using your brain, and I say that as someone who had training to drive *very* fast - just because you can doesn't mean you have to, and it also saves a ton of fuel or battery power. That doesn't mean I'm a slow driver, but I don't see the point of harsh stop-start manoeuvres when there's actually nowhere to go; that's just silly.
-
Saturday 24th December 2022 11:35 GMT I could be a dog really
Re: Capable of driving a car as well as a human?
I follow the "look at least 3 cars ahead" rule.
Exactly. And that's something the "the cars behind were just too close" crowd are ignoring. Any half-decent driver will be looking ahead and, even if they don't realise it, thinking about what the other drivers are likely to do. So if you are all comfortably driving along at the same speed, and you can see several cars ahead, and there are no obstructions, then you are mentally evaluating that "the other drivers are going to continue more or less as they are". As someone wrote above, part of your own stopping distance will be tarmac in front of the car in front. If one car then brakes hard for no valid reason, that's the driver doing something totally unexpected - and I suspect we could debate for hours, days, weeks how to split responsibility for the ensuing pile-up.
As an aside, not far from where I live there is a two lane dual carriageway with a petrol station on it. One day, from a good distance back so I was not involved, I saw a car in the outside lane just brake hard and pull across the other lane to go into the petrol station. I imagine the driver was oblivious to the sea of brake lights and swerving cars as the other drivers had to react quickly to the totally unreasonable and unexpected actions of the idiot in that car. Luckily for him there were no collisions - if there had been, I'd have been quite willing to be a witness to his culpability in whatever happened.
As another aside ... Just the other day I was driving into town, and the safe gap I had left between me and the car in front was clearly an invitation to the driver of a road sweeper to just pull out in front of me - forcing me to come to a complete stop before he got out of my way and completed the turn he was making. So, on the basis that you should be able to stop before hitting anything, do those in the "if you run into the back of someone who brakes hard for no reason then it's your fault" camp also say that if someone pulls out in front of you (in clear contravention of the rules of the road) then it's your fault if you had not slowed to a speed where you could stop in (say) 10 feet ?
-
Saturday 24th December 2022 22:23 GMT Terry 6
Re: Capable of driving a car as well as a human?
No. There's a total difference between stopping ahead of a car and pulling in front of it. An issue that gets dodgy if the car driver that pulled in ahead of you claims that he was already there/had been for a certain amount of time first. It's then that the front facing camera is invaluable.
The UK application is that you have to leave enough time and space ahead of you to be able to stop if the car ahead of you stops. For whatever reason. That just doesn't apply if the car ahead cuts in front of you.
-
Sunday 25th December 2022 21:14 GMT ChrisC
Re: Capable of driving a car as well as a human?
"Exactly. And that's something the "the cars behind we're just too close" crowd are ignoring"
Not really. Keeping an eye on what traffic is doing further up the road can give you early warning of an emerging scenario, but that doesn't mean you can just assume the vehicle immediately ahead of you won't still end up needing to do something that isn't predictable based on that advance observation.
Ultimately, what's happening further up the road may have some bearing on what the traffic immediately around you ends up doing, but seeing a free flowing road all the way off to the horizon doesn't give you a free pass to get right up the arse of the vehicle in front just because you can't see anything which ought to cause them to suddenly throw out the anchors.
I still remember vividly driving home on the motorway, in good visibility and relatively light traffic level. Everyone was making good steady progress with nothing visible ahead to suggest any need to even start easing off a little, let alone the need to go full on emergency stop.
And then the car ahead of me suddenly, literally in the blink of an eye, disappeared in a thick cloud of oil smoke as its engine decided to nuke itself. I genuinely have no idea how quickly that car slowed down, because for the first few seconds I couldn't see anything. I do however know how quickly my car slowed down, because outside of my driving test that was the first time I'd ever done a full-on, pedal to the floor and keep pushing, emergency stop.
Did the vehicles behind me have any inkling I was about to do that, based on either the manner of my own driving or that of the vehicles ahead of me in the time up to the point where the other car's engine let go? Nope. And yet I came out of that incident unscathed, which is entirely due to the vehicle behind me keeping a safe distance to *my* car regardless of how good things looked further up the road immediately prior to that other car's engine letting go.
"then you are mentally evaluating that "the other drivers are going to continue more or less as they are""
Are *probably* going to, but still always be prepared to deal with them doing something completely different...
-
Tuesday 27th December 2022 10:56 GMT Anonymous Coward
Re: Capable of driving a car as well as a human?
I know that keeping distance is mandated, but I prefer to also keep a route free to evade where possible. It is not always possible to leave the full distance, because in queuing traffic you actually *increase* the risk - morons see the gap as an invite to ram in front of you, forcing exactly the hard braking manoeuvre you were so diligently trying to avoid.
In addition, you can bet your last cent that if there's an accident as a result, the aforementioned moron will blame you, ironically, for not keeping distance, to protect his insurance cover.
It's one of the reasons vehicle cameras are becoming more common - driving standards aren't as bad as on the Russian videos you see on YouTube, but it seems we're getting there...
-
Wednesday 28th December 2022 14:00 GMT I could be a dog really
Re: Capable of driving a car as well as a human?
I do however know how quickly my car slowed down, because outside of my driving test that was the first time I'd ever done a full on, pedal to the floor and keep pushing, emergency stop.
But you still used some road distance to stop - so the car behind also got to use some of that road distance in which to stop. That was the point I was making. Some argue that you need to be able to stop before hitting a car that magically comes to a complete standstill, as though a brick wall had just appeared in the road in front of you.
I recall driving in the Netherlands a few years back - and finding that the locals assumed that the gap I was leaving as a safe stopping distance was merely wasted space that should hold another few cars. I don't advocate that lack of spacing ...
As with most things, there are trade-offs. We could pretty well eliminate road accidents and the corresponding injuries and deaths by the simple process of eliminating all roads and road traffic. But the social and economic costs of that would be massive and would, without trying to devalue the cost to those involved in accidents, massively outweigh the benefits. Just think about the health and other benefits we gain from (to pick one example) the ability to access a wide range of wholesome and fresh foods all the time, rather than only those that are within walking distance and in season. Though the effects of Russia's invasion of Ukraine have shown that perhaps we've gone a bit too far down the specialisation and just-in-time roads.
There is a trade-off: leaving lots and lots of road space reduces the capacity of the road in terms of vehicles/hour (or whatever), while bunching up so as to maximise capacity leaves no margin for the unexpected.
And all the time it's a matter of probabilities. I don't suppose many people drive all the time as though a commercial aircraft is suddenly going to crash in front of them - but it's a possibility. So we have to drive to what is reasonably likely - your example of the car in front suffering a dramatic engine failure is quite an infrequent occurrence.
-
-
-
-
Friday 23rd December 2022 05:34 GMT First Light
FSD ≠ Autonomous
According to the Graun, "Tesla has repeatedly said its advanced self-driving technology requires 'active driver supervision' and its vehicles 'are not autonomous'."
So WTF does Full Self Driving actually mean then? Because it sure sounds like it means you can bingewatch that endless Turkish Netflix show Diriliş: Ertuğrul (448 episodes!) while not paying any attention to the road whatsoever.
And after all the drama at Twitter, do investors and regulators still trust Musk's judgement? If I had a Tesla I would continue to let other drivers have the Fully Self Delusional experience for some time before using it myself.
https://www.theguardian.com/technology/2022/dec/22/tesla-crash-full-self-driving-mode-san-francisco
-
-
Friday 23rd December 2022 09:30 GMT werdsmith
Re: FSD ≠ Autonomous
This sudden phantom braking is not unique to Tesla. It happens on other makes of car with driver aids. I've kind of learned what conditions cause it and when to expect it. But your hands should be on the steering wheel, with the button to engage/disengage the driving aid directly under your thumb. I've had to use it a couple of times. Not on a Tesla though.
-
Friday 23rd December 2022 10:08 GMT Steve Button
Re: FSD ≠ Autonomous
Well, isn't that just the worst combination? You are mostly not needing to focus on the driving, but every now and again you need to take over at short notice, without warning? That means you are far less likely to be able to focus on the road, as your attention will naturally drift after many tens of miles of not having to do anything, compared to if you were actually driving the thing anyway.
When you say "a couple of times", how many miles of driving are we talking about roughly?
-
Friday 23rd December 2022 20:10 GMT werdsmith
Re: FSD ≠ Autonomous
well isn't that just the worst combination? You are mostly not needing to focus on the driving, but every now and again you need to take over at short notice, without warning?
Errm, that's how these systems are supposed to work. It's in the law.
And it really is no big deal in practice.
-
-
-
Saturday 24th December 2022 11:55 GMT I could be a dog really
Re: FSD ≠ Autonomous
One can react in a fraction of a second
Perhaps - if you are alert and have up-to-date situational awareness. But if you have that, the self-driving isn't doing anything useful (like allowing you to relax and let it do the boring stuff). It's a well-known issue that wherever there is automation (it's definitely a known problem in commercial air transport), the humans who are supposed to be keeping an eye on things do not stay alert. So there you are, letting the automation do its job - and suddenly you are faced with an unexpected situation, the system beeps and hands you back control, and it will take "a while" for you to scan around and work out what the current situation is. So yes, you may react in under a second, but it'll take you a good while longer than that to work out what is going on and what you need to do.
One example that comes to mind is a scene from the film Sully: Miracle on the Hudson. During the hearings, other pilots are put in a sim and subjected to the same double engine failure - some interested parties* are keen to show that it was pilot error to ditch the aircraft when it could have landed on a runway at an airport. Sully points out that those pilots have the advantage of knowing the engine failure is going to happen, so they have already determined what they are going to do - while in reality you just get a load of alarms and have to work out what's happened before you can work out how to deal with it. Once some "thinking time" was introduced into the sim sessions, all the other pilots ended up crashing the plane into the city, having failed to reach the airport.
* For example, the insurers would be keen to be able to lay the blame on any other party for the total loss of the aircraft by ditching vs the alternative of landing it in a repairable fashion on a runway.
And not all that long ago, a commercial aircraft overshot its destination because it's thought the pilots had fallen asleep while the autopilot was flying. It really is a known and serious problem keeping the crew alert when "there's little to do".
And of course, if you are flying on autopilot and it beeps to tell you it's handing back control, then not only are you probably behind on situational awareness, but the autopilot has put you in a position where it can't fly the aircraft. So basically it's a case of "oh shit, I can't fly this, your problem now to retrieve this possibly terminal situation" :-(
-
Saturday 24th December 2022 12:30 GMT Anonymous Coward
Re: FSD ≠ Autonomous
One can react in a fraction of a second
Only with situational awareness that is up to date. Otherwise you have no idea whether you should brake, accelerate (which is sometimes also an option), steer the car in a certain direction, come up with a damage-mitigating strategy to protect the human who has suddenly shown up in your headlights, or quickly switch on the screenwipers instead - none of these decision points will be known to you the moment you have to take a decision that can literally mean the difference between life and death for someone. In other words, the likelihood of you taking a WRONG quick decision is so high as to render the ability meaningless.
This is not just a Tesla thing, it is becoming standard on new cars, but if you prefer more accidents, by all means fight it.
But the "fight it" is exactly the problem. You try and navigate through some of the more interesting junctions in London without that auto brake thing panicking and trying to throw out the anchor. Unless you kill it in advance (read: you plan your journey and actively monitor the current situation, i.e. are far away from autonomous driving) it means you actually have an EXTRA thing to worry about during driving.
I am 100% for technology that makes my life easier as a driver. I love HUDs because they mean I can keep my eyes even more on the road, I hate touch screens because I have to take my eyes off the road to operate them, and I very much hate the idiotic idea that I have to drive with my head at a 45º angle to keep an eye on my speed, so no Tesla 3 for me (though, if I recall correctly, I have seen this insane idea of putting the speedo out of your line of sight in another car as well).
However, if you start putting tech in a car that you have to fight with to keep driving, that cannot be justified, even if some EU morons have decided it should now be mandatory (and whoever mandated that it should re-enable every time you start the car should receive percussive education about freedom of choice). You should not have to battle with your car because that's pretty much the antithesis of safety.
-
Tuesday 27th December 2022 19:08 GMT Richard 12
Re: FSD ≠ Autonomous
I guarantee that you cannot react appropriately in time under these circumstances.
If you're actively driving, then most drivers can respond in time because the event is a plausible future for something you're already aware of. Your subconscious can therefore take an appropriate action.
If you are not, then it takes several seconds to react. Your conscious mind is running about three seconds behind reality, so that's the very fastest you could do so.
Slamming on the brakes is plausibly an action your subconscious might take, regardless of whether or not that's appropriate.
For example, it's quite plausible that the Tesla computer did something surprising, which caused the driver to slam on the brakes despite that being a really bad idea.
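The cost of those extra seconds of reaction time is easy to put numbers on with the standard stopping-distance formula (reaction distance plus braking distance); the 7 m/s² deceleration below is an assumed hard-braking figure for dry tarmac, not from the thread:

```python
# Total stopping distance = distance covered during the reaction time
# plus the kinematic braking distance v^2 / (2a).
# The 7 m/s^2 deceleration is an assumed dry-tarmac hard-braking figure.
def stopping_distance(speed_ms: float, reaction_s: float,
                      decel_ms2: float = 7.0) -> float:
    return speed_ms * reaction_s + speed_ms ** 2 / (2.0 * decel_ms2)

v = 30.0  # m/s, roughly 67 mph
print(f"alert driver (1 s reaction): {stopping_distance(v, 1.0):.0f} m")
print(f"distracted driver (3 s):     {stopping_distance(v, 3.0):.0f} m")
```

At motorway speed, going from a one-second to a three-second reaction adds 60 m of travel before braking even starts - which is the whole problem with being handed back control cold.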
-
-
-
-
-
Friday 23rd December 2022 12:00 GMT Will Godfrey
Re: FSD ≠ Autonomous
However, on my Skoda Fabia cancelling sudden braking isn't possible - unless you go through a complex menu system to disable it... every time you start the car.
It is most likely to happen where a car in the right-hand lane is about to turn and just touches the brakes at the point of turning - even if it stopped dead, I would still just pass it on the left.
On two occasions I was nearly rear-ended, and received much honking, gesticulating and swearing.
-
Friday 23rd December 2022 13:54 GMT Helcat
Re: FSD ≠ Autonomous
I've had similar with my previous Skoda (Karoq) - only it also got confused by rain and slammed the brakes on, nearly causing the motorbike behind to go into the back of me. I know of one person whose brake assist saved them from a crash - and they admitted they weren't paying attention at the time. Everyone else I've spoken to about brake assist has made similar comments to yours - the system engages when not needed, or when the driver was already responding to a situation where evasion or braking was necessary. Oh, and I had someone turning left from the left-hand lane trigger the brake assist - I think it was because the corner of their car moves as if it's going to cut across the front of my/your car, which causes the brakes to engage.
My new Skoda has a button on the steering wheel to turn the system off without going through the menu, so it seems they've learned something at least.
-
Saturday 24th December 2022 15:27 GMT John Brown (no body)
Re: FSD ≠ Autonomous
As I am in the market for a new(er) car in the next 12 months, I'm very grateful for this and similar comments on what to watch out for, what to ask about and what to check on any vehicle I may be interested in. To date, my only gripe with "automation" I've experienced has been fucking stupid automatic headlights that come on far too early, never automate sidelights (which would be more appropriate) and always default back to auto when you start the engine. I wasn't aware of auto-braking and lane-keeping being things that might default to on.
-
Saturday 24th December 2022 22:45 GMT Terry 6
Re: FSD ≠ Autonomous
The lane-keeping one is a bit of a bugger. We keep ours on, but there are a number of local places where we have to fight with it a bit because, for example, you need to infringe a wide white line slightly to avoid colliding with a long row of parked cars that pretty much reduces the road to less than the width needed.
-
Tuesday 27th December 2022 14:06 GMT Anonymous Coward
Re: FSD ≠ Autonomous
Try avoiding anything with urinating indicators.
There's nothing that says "fashion over safety" more than delaying the visibility of the second most important external indicator lights on a vehicle. I have noticed that some car manufacturers are now only fitting them at the rear of the vehicle, no longer at the front so maybe some sanity is returning but if I had an option I would very much stick with normal on/off indicators.
Safety first, gimmicks much, much later.
-
-
Tuesday 27th December 2022 18:41 GMT John Brown (no body)
Re: FSD ≠ Autonomous
I think he's referring to the "rope/chaser light" or "Knight Rider" style indicators, where instead of a light flashing on/off it's a bar-like arrangement that lights in sequence from inside to outside to indicate which direction the car is turning. Pointless bling, of course, because the location of the flashing light is legally mandated such that it already indicates the direction of turn. They are probably also hugely expensive to replace, with OEM-only parts too.
-
Tuesday 27th December 2022 20:43 GMT Anonymous Coward
Re: FSD ≠ Autonomous
It's a description of the current fashion for animated left/right indicators on cars:
1 - they're yellow
2 - in most cases they point downwards
3 - as the intensity takes time to build, it takes the piss with safety by delaying the provision of critical information to other road users.
I'm now waiting for some designer idiot to add a soft glow to brake lights and so undo the extra milliseconds we have gained with LEDs.
I would like to note that it will be pretty much impossible to NOT see them as urinating now you've got that idea in your head, an excellent BOFH start for 2023..
:)
-
-
-
-
-
-
Friday 23rd December 2022 21:02 GMT Timop
Re: FSD ≠ Autonomous
I think automatic braking has been mandatory on all new cars sold in the European Union from 2020 onward, or something like that.
I haven't got my system to trigger even once - just a couple of warnings that bought me time to brake by myself. Not sure if that's because I've set the warning trigger to pop up as early as possible.
I think at least in Toyotas a swerving steering motion should cancel the automatic emergency braking. But the radars are absolutely not foolproof.
-
Saturday 24th December 2022 00:32 GMT Terry 6
Re: FSD ≠ Autonomous
The automatic brake stopped our car a couple of times. A car cut across in front of me trying to get to a petrol pump that was free ahead of me. And my wife was driving past some bushes on a very narrow curving lane and the sensor disliked a bush that was encroaching into the road.
-
-
Monday 26th December 2022 22:47 GMT Justthefacts
Re: FSD ≠ Autonomous
I’ve had similar issues that make me think again, on a Suzuki Swift….No self drive at all, just an emergency braking system.
But here’s the problem: you come up behind someone who’s stopped and turning right at the lights, and there’s simultaneously someone in your left lane. So *you* know you’re going to do a smart little jink manoeuvre around the stopped car. But your car doesn’t. So it slams on the emergency brake, which is the worst thing possible because there’s someone behind you who is even more surprised than you are.
I avoided being rear-ended by literal millimeters, as he skidded towards me. Actually both I and the car behind thought we touched, but we couldn’t see any damage.
My manoeuvre on its own was fine, if “smart”, I’ve done similar without thinking thousands of times. I can’t see how to make the cars response “better”. But suddenly I have to second-guess the car, and drive every junction as if I have a jittery old lady learner driver with a boat anchor in the passenger seat.
Turns out, there’s an answer. You can switch the system off… but it switches itself on again every time you turn the engine off. So, every time you start the car, you *must* remember to turn the system off, otherwise you have a lethal Boeing MCAS "helping". Forget just one single time, and be rear-ended, suffering the pain of chronic whiplash for the rest of your life. Nice.
-
-
Friday 23rd December 2022 15:23 GMT hoola
Re: FSD ≠ Autonomous
And more to the point the liability has to be with Tesla.
If you cause an accident by braking for no reason and there is clear evidence then the person at the front is deemed to be at fault.
In this case that means Tesla, as the people responsible for the software. Liability has to be sorted out so that the companies peddling this shite are liable - and not just liable, but liable to the point that it hurts, with real people in that chain of liability who can be prosecuted.
Until we sort that out, this functionality should not be allowed on the road. This is being driven by what are essentially tech companies that have already shown a complete lack of attention to quality control, responsibility and accountability.
As it stands, if you buy a piece of software you have a right to use it, but no guarantee that it will function as intended, no comeback if it causes you loss, and no expectation that if it breaks it will be fixed.
-
Friday 23rd December 2022 18:26 GMT nobody who matters
Re: FSD ≠ Autonomous
".........If you cause an accident by braking for no reason and there is clear evidence then the person at the front is deemed to be at fault........."
May be so where you are, but definitely not the case in the UK. Rule 126 of The Highway Code specifically states that a driver must be able to stop in the distance that they can see to be clear (i.e. before the next blind corner, or before hitting any vehicle in front of them) and specifically refers to a situation where the front vehicle suddenly slows or stops. You would be very hard pushed (and almost certainly unsuccessful) trying to prove the driver in front to be liable, even if he were to make a sudden stop.
This has in the past led to crooks deliberately causing accidents to make spurious insurance claims by waiting for someone to get too close behind and then slamming on their brakes. Fraudulent or not, the law still regards the following driver to be at fault for not allowing a sufficient braking distance behind the vehicle they are following.
-
Saturday 24th December 2022 00:25 GMT Terry 6
Re: FSD ≠ Autonomous
Even, indeed if the car ahead rolls back a few inches, like when stopped at a junction on the top of a hill, and it bumps into you, the driver behind is deemed responsible because he was too close to the vehicle ahead, even while stationary. I know, because such an occurrence was the only blemish on my late father's driving record.
-
Saturday 24th December 2022 12:03 GMT I could be a dog really
Re: FSD ≠ Autonomous
Again, not universally true. In the days before dash cams were common then it would be hard to defend such a claim (hence why your late father would have the blemish) - but no matter how close you have stopped, if the car in front "reverses" into you then it is that driver at fault. It's an issue of evidence, and the difficulty of proving it - hence why many people believe what you do.
-
Saturday 24th December 2022 22:30 GMT Terry 6
Re: FSD ≠ Autonomous
The difference there is between rolling back, in which case (unless the driver failed to act on the roll back) the car behind is culpable, and moving backwards under the driver's action, e.g. reversing. If you reverse into a vehicle behind you it's your fault. If you roll back more than a short, reasonable distance it's also your fault, i.e. for not acting to stop the roll back within a reasonable time.
-
-
Saturday 24th December 2022 18:58 GMT Giles C
Re: FSD ≠ Autonomous
Well, I sat in a queue of traffic, and the lorry (7.5t) in front of me, which was the cause of the stop, realised they were too far forward of the turning, so put it into reverse and went into my car, which then went backwards into another car behind it.
I hit the horn as soon as the reversing lamps came on but it still needed a new bonnet and rear bumper.
So most of the time I would agree with you, but when the idiot in front engages reverse and drives into you…
-
Saturday 24th December 2022 23:00 GMT Terry 6
Re: FSD ≠ Autonomous
That's different. A vehicle reversing is, in effect, driving towards you backwards. A driver who reverses back to pull round a vehicle ahead and hits something behind him is at fault, just as one who reverses to get into a slip road is. Even with a roll back - as caught my late father out - it's only a short roll back that leaves the driver behind at fault. If the driver ahead fails to halt his vehicle rolling backwards in a timely manner and infringes a reasonable gap left by the car behind, then he's likely at fault just as if he'd deliberately reversed. My father got caught because there was a short, curved and steep two-car-length slope at the end of their street where it met the main road, so he was partly round the curve and too close to the vehicle waiting to turn into the main road. If he'd stayed off the curve the car might not have hit him - or if it had rolled back far enough to do so, the other driver would have shared, or taken all of, the culpability.
-
-
-
Saturday 24th December 2022 10:33 GMT OhForF'
Rear end equals fault of car in back?
The second driver should have regulated his speed and distance to be able to stop in time but the driver in the front should not have created a dangerous situation by slamming on the brakes. Local law here will share the liability between both drivers - is the blame really solely assigned to the car in the back in the UK?
Would be interesting to see what our courts decide if the reason for the sudden speed reduction was not the driver but the FSD assistant.
In my opinion Tesla should not be allowed to call it "Full self driving capable" unless they take liability for everything that happens while the vehicle is in FSD mode, and for at least 3 seconds after it signals to the driver that they have to take back control.
-
Saturday 24th December 2022 18:15 GMT John Brown (no body)
Re: Rear end equals fault of car in back?
"is the blame really solely assigned to the car in the back in the UK?"
Absent any other evidence or offences, generally yes. Although if it's a no injury bump, odds are the Police will take no further action and write it up as "insurance companies to deal with it".
-
Saturday 24th December 2022 12:00 GMT I could be a dog really
Re: FSD ≠ Autonomous
Rule 126 of The Highway Code
The highway code is not the law. And in the UK, while there is an assumption that the person rear-ending another car is at fault - it is not enshrined in law. And yes, people can, and do, "get done" for causing collisions by sudden and unnecessary braking.
-
Saturday 24th December 2022 18:20 GMT John Brown (no body)
Re: FSD ≠ Autonomous
"The highway code is not the law."
Sort of true. It's a plain English interpretation of the law, and generally accepted as "the law". If you have a dispute over it you can go to the statute books for the definitive wording and, if necessary, get lawyers and a judge involved if you feel really strongly about it. But the odds are that a judge will point to The Highway Code and suggest you go read it again.
"Many of the rules in The Highway Code are legal requirements, and if you disobey these rules you are committing a criminal offence. You may be fined, given penalty points on your licence or be disqualified from driving. In the most serious cases you may be sent to prison. Such rules are identified by the use of the words ‘MUST/MUST NOT’. In addition, the rule includes an abbreviated reference to the legislation which creates the offence.
Although failure to comply with the other rules of The Highway Code will not, in itself, cause a person to be prosecuted, The Highway Code may be used in evidence in any court proceedings under the Traffic Acts to establish liability. This includes rules which use advisory wording such as ‘should/should not’ or ‘do/do not’.”
-
Saturday 24th December 2022 22:36 GMT Terry 6
Re: FSD ≠ Autonomous
From a statement quoted off the RAC about the highway code and the law.....
“But the advice it offers can be used as evidence in any court, to establish liability."
Can you be fined for breaking the Highway Code?
Yes, you can be fined for breaking the Highway Code.
My italics.
So, it's not the law, but you can still be fined for breaching it. Which in effect means it is the law - but with a bit of flexibility to allow for circumstances.
-
Wednesday 28th December 2022 13:40 GMT I could be a dog really
Re: FSD ≠ Autonomous
Err, no. You cannot be fined for breaking the highway code as the highway code is not the law. But what you are actually saying is that you can be fined for a real offence and failure to follow the highway code used to support the conviction. There is a significant legal difference there.
Example. You are driving very close to the car in front, such that a collision occurs. You would not be charged for failing to leave a suitable gap according to rule blah of the highway code - you would be charged with something like careless driving, with the failure to follow the highway code being used to support the charge. It may look the same to an inattentive mind - but it's really not a case of being fined for breaking the highway code.
-
Saturday 31st December 2022 23:41 GMT nobody who matters
Re: FSD ≠ Autonomous
"""........Err, no. You cannot be fined for breaking the highway code as the highway code is not the law........"""
Errr, actually, yes you can.
I think perhaps you misunderstand the meanings assigned to the 'Law' and a 'Code of Practice'.
The Law is what you are required to do;
a Code of Practice is what you should do.
The law lays down the offence and the penalties that apply, and infringement of the code of practice (in this instance The Highway Code) will support conviction of an infringement of the law. In the case of rear-ending a vehicle in front of you, failing to follow the advice in the Highway Code (by following too closely and not allowing yourself room to stop safely in an emergency) will support you being found legally at fault in the event of you hitting the car in front.
I think that your line of argument is at best, splitting hairs.
-
-
-
-
-
-
-
Friday 23rd December 2022 05:50 GMT martinusher
Happens all the time on CA freeways
People get too close (because if you don't some clown will dive into the gap between you and the car in front) and then they can't see far enough ahead to stop safely when the traffic stops. What happened here sounds like "Full Self Driving Mode" disengaged and the driver didn't take over. So the car slowed and the people behind him piled into each other.
But, hey, it's a Tesla so it's got to be Musk's fault.
-
Friday 23rd December 2022 06:54 GMT Anonymous Coward
Re: Happens all the time on CA freeways
Previous owners who have slagged off the dear leader... aka Elon the Almighty Twittler have found it very difficult to buy another Tesla from the company. With their direct sales model, it can happen. Big Brother Tesla will send you to Room 101 until you are repentant and only then will they let you back into the cult.
The above post is meant as sarcasm.
-
-
Friday 23rd December 2022 11:47 GMT Mike 137
Re: But, hey, it's a Tesla so it's got to be Musk's fault.
"confident in the knowledge that those words mean a thing his product doesn't do"
Clearly, fully aware.
"Previously, use of the software had been restricted to select customers with good driving records" - presumably those most likely to feel able to take over in an emergency (not that they would be able to, in reality, in most cases).
-
-
-
Friday 23rd December 2022 06:35 GMT PhilipN
Whose fault...
...if the front car is rear-ended by a dickhead following too close for the speed and/or not paying attention? In the world of human driving, the front car may have had to stop suddenly for a good reason, so why should it be anyone's fault other than the dickhead's? If a motorcyclist in front of me drops his bike - completely accidentally; it's happened - am I supposed to run him over?
If not a "good reason " - no different, software or not. We are supposed to be vigilant at all times so it is wrong to assume the guy in front will never brake suddenly.
The only exception I can think of is if for example the SuperCyberCar slips into reverse and shunts all the trailing vehicles backwards.
-
Friday 23rd December 2022 07:35 GMT Alan J. Wylie
Re: Whose fault...
The video mentions an "unsafe lane change" just prior to the incident. If someone swerves in just in front of you and slams their brakes on, it's not your fault for not having left a large enough gap. Exactly this happened to a friend of mine driving an HGV on a country road. Some impatient idiot stuck behind him finally got a chance to overtake, did so, then slammed his brakes on, presumably to "teach my friend a lesson". It was the last thing he and his friends did.
Mind you, that only applies to the first car. The rest seem to have been too close behind that one.
-
Friday 23rd December 2022 19:53 GMT Androgynous Cupboard
Re: Whose fault...
Funny story. I was walking home one night (technically, I ran out of cash and got out of the taxi early) when I saw an altercation between a bus driver and a car driver at the lights - strong words. The car driver took off ahead of the bus then slammed on the brakes - the bus hit him, not hard, but enough for the car driver to think he'd just got the other guy fired, and a payout.
Over stumbles me, almost but not completely blind drunk, to inform the bus driver that I'd seen the whole thing; I gave him my number and toddled off into the night. A week or so later I spoke to someone from the bus company - sure enough, the car driver was claiming all sorts of damages. I gave a statement - I did explain my overall condition, but perhaps this wasn't relayed to the other party, what a shame. Apparently he pushed it all the way before pulling out at the last minute, just before the court date. I presume it cost him an absolute fortune and made him uninsurable in the future, all of which pleased me greatly.
So there are exceptions :-)
-
Tuesday 27th December 2022 20:24 GMT gnasher729
Re: Whose fault...
You can be too drunk to drive a car, but still absolutely fine to observe things around you. And in my psychology course I was told the first thing alcohol does is turn off a little bit in your brain that tells you not to do stupid things. Which wouldn’t make you a bad witness at all.
-
-
Friday 23rd December 2022 07:39 GMT Sceptic Tank
Teslas (or actually any EVs) aren't common in Southern Africa [1] - I've never seen one. But I think I will treat them the way I do Cash-in-Transit vehicles [2]: stay as far away as possible and head for an alternative route as quickly as possible.
[1] EVs will never work here. Distances are huge, charging stations are few, charge cables are likely to be stolen before you've had a chance to lock your car, and the power is off for 6 to 12 hours per day because the state utility company has gone to the dogs.
[2] There's a good chance that the cash van will be robbed, and the criminals will absolutely not hesitate to shoot anybody in the vicinity.
-
Monday 26th December 2022 17:10 GMT Fruit and Nutcase
Given that Musk hails from SA, and he's produced flame throwers, maybe he'll licence "Blaster" and offer it as an option on Teslas for the SA market.
-
Friday 23rd December 2022 08:28 GMT Jou (Mxyzptlk)
"unsafe lane change" vs "unexpected baking" - that will be interesting!
If the driver changed lane into a way too small gap, it's the Tesla driver's fault.
If the system detected tome phantom and hit the brake it is the FSD-system's fault.
As for the "have to keep a safe distance" issue of the drivers behind the Tesla:
If a driver changes into my lane, directly in front of me, and then brakes, it is the fault of the driver who made that unsafe lane change.
But if there was enough space and I simply reacted too late, it is my fault.
Solutions: DASHCAMS! Lots of them, front and back - like I have. There will always be some other side which may lie profoundly.
-
-
-
Monday 26th December 2022 18:12 GMT Jou (Mxyzptlk)
Re: "unsafe lane change" vs "unexpected baking" - that will be interesting!
Those tomes must be stored safe and secure. The library, due to digitizing, is changing its role from a place where people can read books to a place to store books once they are digitized. And there is nothing wrong with that. For quite a while comics and magazines have been scanned with a minimum width of 2560 pixels. Recently you can find a lot at 4000 pixels width and 6000 pixels height, which is somewhat equal to or above what the eye can see if you hold the sheet in front of you + being short-sighted (i.e. built-in loupe). With an additional layer of surprisingly good OCR on top of it, so you can mark text just as you are used to. No need for paper any more. Preferably stored in an unrestricted format. I prefer .epub since it is just a glorified zip file, which allows you to correct typos or OCR errors annoying you. Same reason for liking .CBZ .CBR...
Once you've experienced reading a book on a large screen, i.e. above 40 inch 4K, sitting straight up, instead of on a small book or reader, which requires an uncomfortable sitting position, you don't switch back unless you have to.
-
Tuesday 27th December 2022 20:58 GMT Anonymous Coward
Re: "unsafe lane change" vs "unexpected baking" - that will be interesting!
Once you've experienced reading a book on a large screen, i.e. above 40 inch 4K, sitting straight up, instead of on a small book or reader, which requires an uncomfortable sitting position, you don't switch back unless you have to.
Nope, I use a 43" LG here (4K) and if I want to read a book I still prefer my Kindle, or my iPad but that gets too heavy.
I also have physical books, but they don't travel well (and neither does the 43" :) ), generally the Kindle travels with me.
-
-
-
-
Friday 23rd December 2022 08:37 GMT Nigel Sedgwick
Human Versus AI Drivers
I was taught, as a UK driver, to be able to stop in time if the nearest thing ahead (or the limit of your visibility) suddenly turned into a brick wall.
My family has years of experience of driving in The Netherlands, where (seemingly like CA) nearly all drivers drive too close to the vehicle in front. On one unfortunate occasion there was a sudden slowing; everyone for several cars in front shunted the next car - ours did not, having stopped in time; but the car behind (and several behind that) carried on the shunting game. The police attending wondered why our car was only damaged at the back!
On the reported incident with a Tesla in CA, it is quite possible that the automatic emergency braking function had a false alarm - as do drivers from time to time. However the following vehicle contributes to the cause if driving too close.
Personal additional view. A good driver on good form can identify another vehicle (even cars away) that is just looking for something to crash into - and decide to get further away from it. Self driving vehicles are extremely unlikely (ever) to be able to match that skill, but they can contribute positively sometimes - by for example never falling asleep at the wheel (or otherwise becoming too inattentive).
Different circumstances favour different 'drivers'. The game of this aspect of car design safety is to get closer to all the advantages while getting further from all the disadvantages.
Best regards
-
Friday 23rd December 2022 09:41 GMT David M
Re: Human Versus AI Drivers
I was also taught not just to allow enough stopping distance for me, but also for the vehicle behind. So if something is following too close, I try to allow extra space ahead of me so that I can give the idiot behind more chance to stop. A rear-end crash in those circumstances wouldn't be my fault, but it would still be damned inconvenient.
This seems like a precaution that 'full self-driving' software could handle quite easily.
-
Friday 23rd December 2022 13:10 GMT TitterYeNot
Re: Human Versus AI Drivers
I was also taught not just to allow enough stopping distance for me, but also for the vehicle behind
Yes, it's even in the UK highway code, which is why instructors teach people to look in the rear view mirror before slowing down or stopping.
- use your mirrors frequently so that you always know what is behind and to each side of you
- use them in good time before you signal or change direction or speed
-
Friday 23rd December 2022 14:08 GMT Unoriginal Handle
Re: Human Versus AI Drivers
"it's even in the UK highway code" but most drivers don't read it after they pass their driving test and couldn't tell you accurately what's in it now.
There are about 43,000 road deaths in the US annually. In the UK, 1,700. 330 million people versus 67 million, give or take. If I have my maths right, if the UK figure translated to a US population size then that would be about 8,300 US road deaths - one fifth of the current toll. The standard of driving has to have an impact on that number.
Driving isn't brilliant in the UK - I've taken my continued driving education well beyond what most people would think is rational, so I have a rough idea - but five times fewer deaths, adjusted for population?
As for "Full Self Driving" - human drivers, even given all their frailties, have a much higher intuitive awareness of their surroundings. All they now need to do is to concentrate fully and properly on the task of managing a significant chunk of metal driving down the road, plan quite a bit more ahead, and the road death toll would come down in the UK.
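For what it's worth, the back-of-envelope arithmetic above checks out. A quick sketch in Python, using the same round population and fatality figures quoted in the comment:

```python
# Scale the UK road-death figure to the US population size.
uk_deaths = 1_700          # annual UK road deaths, approx.
uk_pop = 67_000_000        # UK population, approx.
us_pop = 330_000_000       # US population, approx.
us_deaths_actual = 43_000  # annual US road deaths, approx.

# US deaths if the US had the UK's per-capita rate
us_deaths_at_uk_rate = uk_deaths * us_pop / uk_pop
print(round(us_deaths_at_uk_rate))                        # 8373, i.e. "about 8,300"
print(round(us_deaths_actual / us_deaths_at_uk_rate, 1))  # 5.1 - the actual toll is ~5x higher
```

So the "one fifth of the current toll" claim holds, give or take the rounding.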
-
Saturday 24th December 2022 12:31 GMT I could be a dog really
Re: Human Versus AI Drivers
and the road death toll would come down in the UK
Which unfortunately it isn't IIRC.
My personal view is that government initiatives over the last decade or two have actively lowered the standard of driving - in a way that was entirely predictable, and for some is a "I told you so" situation. As a result, while the statistics were improving steadily, the improvement more or less stopped, and in some ways started to get worse.
When I learned to drive, much emphasis was put on "assess an appropriate speed for the location and conditions", and speed limit enforcement had (mostly) quite a degree of judgement from experienced coppers (drive like a total tool while just a little over - ripe for being pulled; drive in a safe manner while not "taking the p**s" fast - either get ignored or perhaps given a few words of advice). But then some idiots decided that teaching people that "you're safe if you stick below some arbitrary* number on a sign" would improve things, and replace experienced traffic police with "pass/fail" automated enforcement** - so now we have a generation of drivers who were taught "compliance" rather than judgement, as if setting a speed limit can account for changes in conditions (50 still OK if it's snowing or freezing rain ?). And the same in a number of other aspects. A friend is a driving instructor, and he told me a few years ago that they'd had to change their teaching from (OK, not exactly, but I think you know what I mean) "don't be a dick and crash" to "comply with these rules".
* Yes, many speed limits are arbitrary. I think we all know roads that were (for example) once considered fine for the national speed limit, but then suddenly became unsafe for anything above (say) 30. Round my way, a lot of villages went from no signage at all (so only the "30 in a built up area" rule) to blanket 20mph limits - simply because there was a load of central government cash being thrown around, conditional on road safety schemes which are only allowed where the speed limit is 20. I also know of roads where the speed limit was lowered for no reason other than to push traffic to other roads (e.g. one where a Trunk A road was downgraded and traffic pushed to the new bypass), or where it's for noise reasons, or ... So no, speed limits are not all "for safety".
** And some of the stats used to prove the efficacy of speed cameras are little more than numerical sleight of hand.
-
Saturday 24th December 2022 13:16 GMT Anonymous Coward
Re: Human Versus AI Drivers
I picked up somewhere that the study that concluded that speed cameras always reduce fatalities arrived at the opposite conclusion when in draft form, after which the authors were instructed by their government customer to change it so it ended with the conclusion the government wanted. That's why the original report was at least 50% advanced mathematics - that was camouflage. In other words, a lot of policy decisions which used that document to justify the measures were based on a falsehood.
I think it's been updated since so the original may not even be online anymore, but it was interesting to learn.
-
Wednesday 28th December 2022 13:34 GMT I could be a dog really
Re: Human Versus AI Drivers
I could believe that.
For anyone who didn't pick up on what the numerical sleight of hand is ...
Speed cameras were (at least to start with) placed at locations with an elevated accident rate - basically hit a high rate for a couple of years, get a speed camera. In simple terms, roll a dice and if you roll a 6 twice then you get a speed camera. You can try this yourself - a spreadsheet will make it quicker and easier.
Roll a dice for many locations; one roll represents the accident rate for a year. Work out the average rate for each site over (say) the last 3 rolls. If you rolled a 6 twice, assume a speed camera is installed. Then work out the accident rate for the years AFTER the camera is installed. You'll find that the high average rate (it includes two sixes, remember) for each camera site will generally revert to the mean - which for a standard dice is 3.5.
Worked example. Suppose you roll 3, 6, 6. The average over the 3 years is 5. So when the rolls turn out closer to 3.5, after installing the speed camera you can show that the accident rate reduced from 5 to 3.whatever and say it's the speed camera that caused it.
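The dice experiment described above is easy to run for real - regression to the mean falls straight out. A quick Python sketch (the site count and the "two sixes" threshold are just illustrative choices):

```python
import random

random.seed(1)  # reproducible

N_SITES = 100_000
# Three years of "accident rates" (dice rolls) per site.
sites = [[random.randint(1, 6) for _ in range(3)] for _ in range(N_SITES)]

# "Install a camera" wherever the three years include at least two sixes.
camera_sites = [s for s in sites if s.count(6) >= 2]

# Average yearly rate at camera sites before installation...
before = sum(sum(s) for s in camera_sites) / (3 * len(camera_sites))
# ...and the rate the following year, with no change to the sites at all.
after = sum(random.randint(1, 6) for _ in camera_sites) / len(camera_sites)

print(f"rate before camera: {before:.2f}")  # well above the dice mean of 3.5
print(f"rate after camera:  {after:.2f}")   # back near 3.5
```

No camera has any effect here - the "improvement" is pure selection bias from picking sites after an unlucky streak.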
But as for the sites that got a speed camera on a brand new road - rather blew the arguments out of the water for anyone with an analytical mind.
And the massive irony is that the "gatso" was invented by Maus Gatsonides to help him drive faster in competition.
-
-
-
-
-
-
Friday 23rd December 2022 17:35 GMT martinusher
Re: Human Versus AI Drivers
The thing the Tesla can't do - along with a whole bunch of human drivers - is to anticipate likely problems from a combination of experience and looking for clues in the greater environment. It's what's called in aviation 'situational awareness', and unfortunately it's not something that's easily taught, because if it was then people would know instinctively when it was safe to use automatic driving features and when it is not. GM is trying to hedge its bets with its version by combining it with GPS so that it can only be used on known roads where it's safe for the system to be used (this carries its own issues, though).
The fundamental problem that Tesla has is that the likely buyer and user of their advanced features will include people who are both successful -- they can afford the vehicle -- and demanding -- they expect stuff to always work the way they want it to. They're also not likely to be easily taught anything. So you end up with a perfect storm of imperfect features and arrogant users -- what could possibly go wrong?
-
-
Friday 23rd December 2022 09:04 GMT Phil O'Sophical
Not limited to Tesla
My ordinary car has an auto cruise control that will adjust to the speed of the car in front. There's a junction near home where the road forks on a curve. Cars turning off across the oncoming traffic have to slow or stop in the turn lane, those on the main road can continue round the curve. Every time I do that, as I pass on the inside, the cruise control brakes unnecessarily. It clearly gets confused by the slow traffic and the curve, and can't work out what 'lane' it's in. I now know to cancel the cruise before I get there, but it was a surprise the first time.
-
Friday 23rd December 2022 09:53 GMT SonofRojBlake
I've done three different defensive driving courses when working for companies that expected me to drive company/hire cars. One of them, delivered by a couple of former traffic cops and another advanced driving instructor, was pretty hot on "controlling" the driver behind you. By this they meant making sure that not only did you leave a safe distance in front, but also made sure the car behind was far enough back. If necessary, tap the brakes to wake them up - it's important they see the brake light. If they continue tailgating, slow down a little, then speed up a little to establish a gap. If they come back up behind you, slow down again, to the point that they overtake. The thinking was that if they're that much of a twat it's safer to have them in front of you than behind. Also it was repeated several times that on any given journey, risky manoeuvres will save you seconds, only, and are therefore simply not worth it. I've used the "tap brakes, slow down, let them pass" thing several times in the past 30 years. The vast majority of the time, the person behind gets the message and leaves a safe gap. Only once or twice have people reached the point of angrily overtaking, and only once have they then slammed on when they pulled back in. Because I'd covered the brake, everything was fine, and I hope they felt better. I was directly behind them in the queue at the next junction and followed them out, so they'd risked a crash to save literally less than five seconds off their journey time.
What I concluded from these courses (among other things) was that you shouldn't have to be that vigilant when you're driving, and most of the time you can get away with not being. Most of the time.
There are certain things that make me hyper-vigilant, certain things you can do that will make me watch you like a fucking hawk if you're driving near me because I actually expect you to be shit at driving without warning. Wearing a hat in a car? Tick. Pram or dog visible in the boot? Tick. Just recently, I've added "driving a Tesla" to the list.
-
Friday 23rd December 2022 12:11 GMT Eclectic Man
What is the scope of FSD?
I just wonder what the scope of the 'Full Self Driving' mode is? Does it take account of how close the vehicle behind is driving? In the UK, the Royal Society for the Prevention of Accidents advises slowing down gently if the person driving behind you is too close for the speed. This can cause considerable aggravation to said driver, but does help to reduce the possibility of a multi-vehicle pile-up.
Can anyone enlighten me, please?
-
Friday 23rd December 2022 14:12 GMT Unoriginal Handle
Re: What is the scope of FSD?
I'd be interested in the source of that comment. As a current RoSPA ROADAR tutor I've not heard it said.
I would advocate, as another poster has said, increasing the gap in front of you (significantly) to cater for the vehicle behind if they're showing signs of impatience or being too close or some other thing. And if they want to overtake, let them.
-
Friday 23rd December 2022 16:02 GMT Eclectic Man
Re: What is the scope of FSD?
I heard it on the radio (BBC) some years ago when a member of RoSPA was being interviewed. I certainly do try to let irritatingly close behind drivers overtake, the problem arises when everyone behind expects to overtake me and I end up driving at 45 mph on a motorway instead of my preferred 56 mph due to being cut up by so many *^*&^&*s.
Happy Christmas driving everyone!
-
-
Sunday 25th December 2022 08:10 GMT Unoriginal Handle
Re: What is the scope of FSD?
"Surely (assuming more or less constant speed) the only way of increasing the gap in front of you is by slowing down gently, making the distinction between the two purely semantic?"
Yes, temporarily, and it doesn't need to be by a huge amount - so the gap between you and the car in front gets bigger. Then you can add the 2mph or whatever it takes to match the vehicle in front.
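For the arithmetically minded, the gap-building trick above is just speed differential times time. A minimal sketch (my own figures, not RoSPA's or any poster's):

```python
# Rough sketch: how much extra gap you gain by easing off slightly
# and then matching the lead car's speed again.
MPH_TO_MS = 0.44704  # metres per second in one mph

def extra_gap_m(speed_drop_mph: float, duration_s: float) -> float:
    """Gap gained while travelling speed_drop_mph slower for duration_s seconds."""
    return speed_drop_mph * MPH_TO_MS * duration_s

# Easing off by just 2 mph for 10 seconds opens up roughly 9 metres:
print(round(extra_gap_m(2, 10), 1))  # 8.9
```

So "it doesn't need to be by a huge amount" checks out: a barely perceptible speed drop, held briefly, buys a couple of car lengths.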
-
-
-
-
Friday 23rd December 2022 13:59 GMT Lil Endian
Musk: "Tesla Full Self-Driving Beta is now available to anyone in North America who requests it from the car screen..."
Unbelievably remiss of Musk/Tesla!
So it's deemed acceptable to beta test software (and 2,000 kg +/- of hardware), on public highways, by anyone, without them being trained for such?
As it's a driver's responsibility to ensure their vehicle is roadworthy, it's also remiss of them, by default, to use these systems in beta, as they cannot claim to ensure they're viable.
IMO drivers wanting to use FSD/DA should require a licence as such. It's a different skill set, especially with regards to cognitive load. Give a single engine Cessna pilot a Super Guppy, see how it goes!
-
Saturday 24th December 2022 13:27 GMT Anonymous Coward
You made me realise there's an interesting unknown in all of this: what does the insurance think?
Anyone with FSD installed should in principle be paying more as they're actively using a system with no external formal validation that it does what it says on the tin... er, tablet - also because an accident with this activated will result in a lot more moolah for lawyers in case of dispute.
Interesting question, and as yet thankfully not my problem. It could be my problem if I ever get into an accident with a Tesla - I'm pretty sure my insurance lawyers would have fun with that one.
-
Saturday 24th December 2022 20:13 GMT Lil Endian
Yep. Although I think the issue of an appropriate licence is a factor, ie. you are licensed to operate an FSD vehicle. No licence, no insurance.
[NB. By FSD below, I mean really FSD, not Musky stinking hype.]
Whether the driver's premium on a conventional car is more or less than on FSD would be dependent on the proportion of each type of vehicle on the road. If we assume that all vehicles on the road are FSD, we would hope that they'd be more capable of "working better with each other", ie. safer[1] - otherwise, what's the point? Until (approaching) 100% saturation of FSD, the ratio of non- to FSD premiums will depend on the (comparative) efficacy proven IRL. Which, as we know, insurance companies love to calculate, and base their profits on.
It's the driver that's insured, not the technology. Until FSD licences are a thing, or RL efficacy of FSD is proven beyond any doubt, I cannot see that FSD premiums should be anything other than humongous. Live testing on public roads?! Fuck off.
[1] The AI problems that need overcoming here are ridiculously complicated to consider, let alone solve. It's not just other vehicles, it's pedestrians, falling debris, cross winds... It's not happening any time soon.
-
-
Tuesday 27th December 2022 21:15 GMT Lil Endian
Yep, in a perfect world.
What we're getting into here is: vehicle unworthy, premium upped. Punter pays?! (Read "vehicle" as driver/supplier/law.)
The red flag waving geezer was nonsense in 1865. The car wouldn't crush a turnip back then (the turnip moved faster). Now it's two tonnes of mass bearing down on you under a marketurds concept of AI.
Stick that red flag somewhere now.
-
-
-
-
Friday 23rd December 2022 14:10 GMT TaabuTheCat
Appropriate use?
Having owned a Tesla for three years and been through many versions of FSD (and in all fairness, usually improving with each release), there is zero chance I'd ever be using FSD in a tunnel. Your options if something goes wrong are less than zero. I also encountered the phantom braking many times over those three years and learned to immediately hit the accelerator when it happens to tell FSD to get its head out of its ass. Still, it's very easy for someone behind you to think you're brake-checking them and I'm kind of surprised there haven't been more road rage incidents reported based on this behavior. I lived in Texas when I owned my Tesla. You can guess the popularity of being brake-checked.
-
Saturday 24th December 2022 18:50 GMT John Brown (no body)
Re: Appropriate use?
"there is zero chance I'd ever be using FSD in a tunnel. "
Yeah, that struck me as odd too. Also the suggestion the Tesla made one or more "unsafe" lane changes, which I'd not expect a Tesla on FSD to do (although I have no experience with the system). Part of Tesla's automation includes GPS, which tends not to work in tunnels, and AIUI this was a big, long tunnel.
-
Monday 26th December 2022 09:38 GMT Lil Endian
Re: Appropriate use?
I'd like to see how the Teslas fare in Craeybeckx Tunnel, on the E19 going in to Antwerp from Brussels.
I've heard tell it's one of the most dangerous sections of road in Europe (can't find stats to back that up though, sorry). Regardless, it certainly is a nasty bit of 1600m tunnel, with sweeping curves first to one side and then the other. Not the forte of Teslas, I understand.
You don't even need to involve other vehicles to kill yourself.
As I don't condone testing FSD software on public highways, and I doubt the authorities, or road users, of Antwerpen would appreciate shutting the tunnel for a few years for Tesla testing, perhaps the Musky one could splash out on a replica closer to home...
[Icon: "..." said the Model S.]
-
-
Friday 23rd December 2022 14:45 GMT Christopher O'Neill
Tesla autopilot sudden braking happens all the time
Even without full self-driving, sudden braking has been a 'feature' of Tesla's standard traffic-aware cruise control for years. You can quickly press the accelerator before the vehicle loses too much speed, and regular Tesla drivers are probably used to hovering their foot close to the accelerator when passing HGVs, going through tunnels etc.
I've not noticed this issue on other cars I've driven with TACC.
-
Monday 26th December 2022 22:23 GMT Anonymous Coward
Re: Tesla autopilot sudden braking happens all the time
I have, in a Hyundai Kona rental. Scary rubbish.
Not as scary as the lane assist, which will happily run you into a load sticking out of a truck, or into drivers who don't keep to their lanes, as it looks only at the lane markings, not at traffic. But still, if you leave it enabled you not only have to deal with complex traffic but have to plan to fight the car to boot. As soon as you get near any complex junction you're better off disabling it (which, of course, is not just a simple button but requires navigating a menu - did I mention you were doing all this in complex traffic?).
Thanks to the loving care of EU nitwits pretending they have a clue, that shambolic rubbish gets re-enabled every time you start the car so by default this rubbish is adding to a driver's cognitive load instead of making life easier.
Trust me, after a 600km drive you're more than eager to subject the mandating morons to percussive education. It would be the most satisfying way of knocking in a cricket bat ever.
-
-
Friday 23rd December 2022 22:58 GMT Sleep deprived
Is a crash-inducing behavior acceptable to others?
Even if Autopilot/FSD software were functioning perfectly - and not in beta testing by paying consumers - is it acceptable to deploy it on public roads among mostly human-driven cars unable to react to its unexpected manoeuvres, thus inducing accidents behind it? My father used to tell me that when walking in rows, if you think you're the only one with the right pace, you should question yourself.
-
Friday 23rd December 2022 23:32 GMT Throatwarbler Mangrove
In fairness to Tesla . . .
I recall driving down I-280, a major Bay Area freeway leading from San Francisco to San Jose, and encountering a sudden slowdown in traffic in the vicinity of Palo Alto. The first indication I see of the cause is a BMW overturned on the side of the road. I then notice a second BMW facing backwards up the hill of the median strip, followed by two other BMWs at the side of the road with collision damage, and then a final BMW off the side embankment. That's right, it was a five-BMW accident. I will leave drawing conclusions, hasty or otherwise, about the nature of BMWs and their drivers to the other commentards.
-
Sunday 25th December 2022 17:29 GMT Anonymous Coward
Re: In fairness to Tesla . . .
BMW have an R&D facility in Palo Alto. It's on Hamilton.
I have seen lots of research prototype cars on 280 south of 92 over the last 15 years. You will see them during non-commute hours, usually between 11 a.m. and 3 p.m. on weekdays when traffic on 280 is very light. I've seen them occasionally over on 101, but as traffic is much heavier at all times on 101, a single prototype car will cause as much traffic flow disruption there as a fully laden big rig hauling a very heavy load.
Over the years I have seen Tesla, Google, Apple, GM (Cruise), VW and BMW research vehicles on 280. Probably a few I missed but I knew what vehicle type / test hardware config those guys were using at the time.
As for the BMW pile-up: as someone who has driven / commuted 280 over the decades in BMWs (E23s, E34s etc), the only time you might possibly see five BMWs together at the same time is as part of a single convoy from a single location - dealer or R&D facility. Because I have never seen such a grouping. Even with the ubiquitous Prius you never see more than three or four together in a clump on 280. And you will see far more Mercedes than BMWs on 280, and it's been that way since the 1980s. The Peninsula / South Bay has always been Mercedes territory. And now very much Audi territory too.
-
Saturday 24th December 2022 00:32 GMT TDog
UK law
You have to remain sufficiently behind the vehicle in front to stop safely. As a frequent motorway user I can assure everyone that this is followed religiously*
*Def: religiously; not seen as terribly important since the glorious revolution but makes bods feel self justified so long as it doesn't have to change their behaviour
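For the record, the "stop safely" distance the UK Highway Code publishes follows a simple rule of thumb, sketched below (these are the Highway Code's typical figures, not a physical model, and nothing the poster above committed to):

```python
# Highway Code rule of thumb for typical stopping distances:
# thinking distance in feet = speed in mph,
# braking distance in feet = speed squared / 20.
def stopping_distance_ft(speed_mph: float) -> float:
    thinking = speed_mph            # reaction time, in feet
    braking = speed_mph ** 2 / 20   # braking, in feet
    return thinking + braking

# At the 70 mph motorway limit that's 315 feet, roughly 96 metres:
print(stopping_distance_ft(70))  # 315.0
```

Which is rather more than the gap most motorway "religious observers" actually leave.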
-
Sunday 25th December 2022 17:08 GMT Anonymous Coward
Only dangerous idiots change lane in the TI tunnel. Or use Tesla "Self Drive".
That's the bottom level, west bound, Treasure Island tunnel. Not many accidents on that part of the Bay Bridge over the decades. Accidents are usually out on the decks. Or at the 580/880 interchange. Because anyone who drives it knows you do not change lane in the tunnel. Ever.
Except to move out of the far right lane for traffic getting on at the on-ramp at the tunnel exit, as you exit on the SF side. Same on the top level, heading east: right-lane on-ramp by the tunnel exit. The off-ramps for TI are by the tunnel entrance, leaving from the far left lane.
I always add extra space between myself and whoever is in front as we turn the curve to enter the tunnel, because on a fairly regular basis someone not paying attention will hit the brakes when they enter the tunnel, startled by the huge change in light levels. The much sharper curve than on the old Cantilever Section doesn't help either.
Anyone using self-drive on any part of the Bay Bridge should be up on a felony CVC charge. Pure criminal negligence. Due to the change in levels in and out of the tunnel, the TI on-ramps, and the totally atypical lane traffic flows: center lanes slowest, outside lanes fastest, both left and right, east- and westbound. This is not the San Mateo Bridge or San Rafael Bridge, where traffic lane speed follows the usual right-to-left pattern of slowest to fastest and the only disturbance is the upgrade middle-section slowdown.
Product liability law moves slowly in California but it's a given that sooner or later Tesla and other self drive companies will be hauled into court for negligent manslaughter. And will be found guilty. Because the technology has already killed a whole bunch of people and will kill a lot more until it is made illegal.
The only reason it exists is because it has proved so far to be a very lucrative play for a bunch of second tier VC's. No other reason. "Self-drive" has been a VC play pure and simple. But none of those worthless scum on Sand Hill Rd will ever see the inside of the court to answer for all the people they have indirectly killed and maimed.
Just like with Theranos.
-
Monday 26th December 2022 18:03 GMT Anonymous Coward
I've said it before and I will say it again. We have hardly any self driving trains within the confines of a much more controlled environment than a road for a variety of reasons.
If trains can't do it, cars should not be anywhere near this problem at all.
I don't dispute the autopilot's fault for doing the wrong thing, but California traffic, poor on/off-ramp design and USian driving standards have culpability too. Posted limits are basically ignored and spacing between vehicles is an alien concept, even at speed.
-
Monday 26th December 2022 22:17 GMT NITS
tinted glass
It's a bit difficult to see what's going on 3 cars ahead, when so many cars have glass tinted so dark that you can't see through them.
And don't get me started on the eejits who black out their tail and stop lights. These are typically the ones that also have too-bright headlights (or, even worse, off-road LED floodlights that don't even control the pattern). I just slow down, or even pull off, and let the bullies pass.
I've heard it advocated that vehicle headlights, and window glass, should have polarizers. If everyone's lights, and windshields, were polarized 45 degrees upward and to the right, then you'd see objects from your headlights, and from vehicles headed in your direction, just fine. But oncoming vehicles' headlights would be attenuated, and thus not as dazzling. Thoughts?
Is tinted glass a problem outside the US?
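The polarizer idea rests on Malus's law: transmitted intensity falls off as cos² of the angle between the light's polarization and the analyser. A minimal sketch of the claim (assuming ideal polarizers, and noting that the 45-degree angle is mirrored between oncoming cars so the filters end up crossed at 90 degrees):

```python
import math

def malus_transmission(theta_deg: float) -> float:
    """Fraction of polarized light passing an analyser at angle theta (Malus's law)."""
    return math.cos(math.radians(theta_deg)) ** 2

# Your own headlight glow reflected off the road is largely depolarized, so it
# passes regardless. An oncoming car's 45-degree tilt is mirrored relative to
# yours, so its direct beam meets your windscreen filter crossed at 90 degrees:
print(round(malus_transmission(0), 2))   # 1.0  (aligned: full transmission)
print(round(malus_transmission(90), 2))  # 0.0  (crossed: blocked)
```

The catch, of course, is that real polarizers also throw away half of the (unpolarized) light you do want to see, which is one reason the scheme has never caught on.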
-
Sunday 1st January 2023 04:30 GMT Ghostman
You can drive as careful as you can--
and use all the "manufacturer installed safety features" and can still have situations where some other idiot tries their best to get you killed.
My Camry has the adaptive cruise control. If I'm on an Interstate highway, I turn it off, unless there is very little traffic. I've had the thing on and kept the distance the radar thought was right. Someone wanting to pass the car in front of them decided to swerve into my lane and pass. The "adaptive" monstrosity quickly applied the brakes and almost stopped the car in 70+ mph traffic. After several times of this happening, I shut it off and drove normally, and closed to a space where others wouldn't normally think of trying to get in. BUT, being in Atlanta, Ga traffic, those folks think over four feet is a merge lane.
I also did away with the "lane error" gizmo. Driving on a curved bridge 100 feet above the Chesapeake Bay, with a low retaining wall, I wanted to stay as close to the inside line as I could. The damn thing kept trying to make me steer right as I would sometimes get close to the line. By trying, I mean it actually was making the steering wheel pull to the right as I was pulling to the left. When I could get off the highway, I went through the manual and disabled it through the dash controls.
Next car will be a used car for about 2K, spend about 15K to get it where I want it, and not deal with all the so called "safety features".