Me? A cynic?
A cynical person might believe Tesla are full of crap and Musk is a snake oil salesman who lurches from crisis to crisis with ever more outlandish, hyperbolic claims.
The US National Highway Traffic Safety Administration (NHTSA) has written to Tesla as the automaker's electric cars keep crashing despite a recall to fix problems with the Autopilot software. The NHTSA opened a recall query to investigate the remedy's effectiveness for the earlier recall in December, which affected more than …
Teslas are objectively the safest cars on the planet, as measured by safety agencies around the world, including Euro-NCAP and IIHS. Now even the safest cars are going to have some percentage of accidents, and as there are a lot of Teslas on the road, that very small percentage still adds up to a noticeable absolute number. Of course those accidents need to be investigated, and even the safest car on the planet can still be improved, but the existence of that handful of accidents doesn't invalidate the objective facts of how safe Teslas are.
Teslas are objectively the safest cars on the planet...
A similar view has been expressed in the case of the "Smart Motorways" in England...
National Highways - the agency in charge of smart motorways - says the latest data shows that "smart motorways are our safest roads".
https://www.bbc.co.uk/news/av/uk-68865188
They opened up the hard shoulder/breakdown/emergency lane of what were the safest roads in the country and put technology in to alert drivers of obstructed lanes - yep, that's going to work
If you make something sufficiently and obviously dangerous, people actually take more care.
There was an episode of Top Gear a very long time ago where Clarkson said words to the effect of: the way to get people to drive slowly was to put a great big spike on the steering wheel where the airbag is - which would make people drive very slowly indeed to avoid impaling their heads on it.
The issue at hand seems not to be the safety record, but the failure mode. It's great that your car doesn't often hit people, but if when it does hit people it tends to kill them, then that's still a problem.
It's even more of a problem if these are actually avoidable accidents and there is a critical flaw in the safety systems that should be able to prevent them.
It's even more of a problem if your CEO has been claiming that those cars can drive themselves autonomously without harm or damage. This is a (fairly crucial) test of his ability to meet the requirements of responsible corporate behaviour when flaws clearly still exist in the vehicles.
One way to avoid this would be to stop making enormous cars that weigh upwards of a ton.
Every car these days looks as if it has been inflated like a balloon compared to previous models.
There's a kind of 'arms race' mentality in car safety - you, the driver, are safer if your car is bigger and heavier than the car/person you collide with.
And in America there's an even bigger issue with the legislation: Apparently most SUVs are considered to be "light trucks", and therefore get around much of the safety legislation designed to protect pedestrians from cars.
I heard that cars are getting wider, on average, at a rate of about half an inch per year. Which is why they'll no longer fit in the garages or parking spaces of last century.
Someone needs to put a hard limit on that, unless we want to be condemned to widen everything every couple of decades, forever.
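A quick back-of-envelope sketch of where that half-inch-per-year trend leads. The growth rate is the figure from the comment above; the 1.80 m starting width and 2.40 m parking bay are my own assumptions for illustration, not standards:

```python
# Where ~0.5 inch/year of width growth leads.
# Assumed numbers: a 1.80 m wide car in 2000 and a 2.40 m parking bay
# (a common UK bay width, but not a universal standard).
GROWTH_PER_YEAR_M = 0.0127  # 0.5 inch in metres
START_WIDTH_M = 1.80
BAY_WIDTH_M = 2.40

for year in (2000, 2025, 2050, 2075):
    width = START_WIDTH_M + GROWTH_PER_YEAR_M * (year - 2000)
    clearance_cm = (BAY_WIDTH_M - width) / 2 * 100
    print(f"{year}: {width:.2f} m wide, {clearance_cm:.0f} cm clearance per side")
```

On those (admittedly crude) assumptions, the clearance goes negative around mid-century - the car simply stops fitting in the bay, which is exactly the "widen everything every couple of decades" problem.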
"Teslas are objectively the safest cars on the planet"
Downvote because you're conflating two entirely different things.
Quote from elsewhere on the web: The Model 3 is rated with the best possible crash rating from NHTSA in the USA - and the IIHS said it was the safest car in the world.
So what this is saying is that the model 3 is the car you're most likely to be able to walk away from if you are unfortunate enough to be involved in a crash. In that way, perhaps it is indeed a very safe vehicle, for the occupants.
However driving assistance with a tendency to go awry and kill other people... does not in any way make it a safe car. Quite the opposite. Usually, with cars, accidents that lead to fatalities are due to mechanical problems or something the driver messed up (or both, if they neglected servicing). In this particular case, if the intelligence goes wrong then the fault lies with the car itself, and it doesn't matter one bit how wonderfully safe the car is for the people inside if it ends up inadvertently killing people outside.
The reference to Musk being a snake oil salesman is because while all this autopilot stuff is impressive, it's maybe not yet ready for public roads and it needs to be hammered into people's heads that it is a driver aid, not a driver replacement. Which means, for the love of Christ, stop calling it stuff like "Autopilot" and "Full Self Driving", because - at the current state of technology - that's still an outlandish claim.
On an "otherwise empty road".
Who should I indicate to? The squirrels in the trees?
Regardless, lane keeping assist is a menace on narrow country roads where one routinely needs to cross the centre line... and turning it off involves pissing about with a touch-screen menu every time you start the car ('cos it won't stay switched off).
Tesla drivers are the worst. They bought a Tesla because they can't/won't drive safely, and are counting on Autopilot to do all the work. Musky says it works and all the advertising says it's amazing. These people get on the road, engage Autopilot, and stop paying attention to driving because Autopilot doesn't require them to. Now you have Teslas crashing into everything because Autopilot is a sham.
No he's not, he said "the existence of that handful of accidents doesn't invalidate the objective facts of how safe Teslas are"
Now sure, since Teslas like most EVs are heavier than an equivalent ICE vehicle they are relatively safer for the occupant - same reason moms like to buy oversized SUVs in the US to protect their children at the expense of the children in the sedan they run into.
Having the highest accident rate is almost certainly because of Musk's marketing around Autopilot/FSD making owners believe it is more capable than it really is. The article from the link I posted made that clear with the large number of accidents resulting from "autopilot disengaging" (i.e. it was being misused by someone not paying attention who was not ready to take control when it encountered a situation it could not handle)
He's a rather mediocre techy guy acting as the absolute worst salesman in the world, masquerading as his idea of a "cool guy", propped up only by bankrolling otherwise-failing companies (Tesla was near bankruptcy several times) based on a MERGER with PayPal over two decades ago that made him an accidental billionaire.
I have never found any redeeming feature in the man.
How about just removing the certifications from the cars so they cannot be sold in the normal way?
They become "kit cars" (Q Plates in the UK)
Or just disable the entire steaming pile of shite that is "Autopilot".
There really does come a point when these tech companies (and Tesla is a tech company) need to understand that responsibility actually does exist.
If these issues were with an established car manufacturer this would not be happening.
I would also go one step further and mandate that any critical control has to be a manual switch with tactile feedback. I am sick of all this touch screen shite where you cannot adjust where the blower is blowing because you have to press a touch screen that requires LOOKING AT it.
Maybe I am just a grumpy old fart......
Touchscreen controls are the worst. I've had several rentals where the heating/cooling controls were only accessible by touch screen, which is not where you want them when you are hurtling down the autobahn and realise that you need more heating. They're a nasty cost-saving measure and objectively less safe than old-fashioned knobs, switches and levers which you can work by touch alone.
None of which stopped another Tesla fanboy on a social media thread I was in from insisting that anyone who objected to them was a luddite who didn't understand technology.
I've got Tesla's new marketing slogan:
I came for the car, but I stayed for the cult.
A recent article I read in a magazine had this little snippet:
"Euro NCAP has finally recognised that confusing and complicated touchscreens can cause drivers to take their eyes off the road for too long so, if car makers want to earn a 5-star rating, then from 2026 some key functions will have to be controlled by classic switches and knobs instead."
Sounds like the Tesla will need a redesign.
"Who knew that getting rid of LiDAR in favor of much less reliable optical scanning would have such a deleterious effect on safety?"
I don't think that Tesla vehicles ever had LiDAR. They did have RADAR, but that was discontinued a while ago and was even removed from the car in some cases when people brought their car in for service.
Since the Wizard never gave Elon a brain, he doesn't understand how they work and why humans rely on vision primarily and can get away with it. We still use our other senses to feel out our environment. If somebody enters a room we are in but isn't in our line of sight, we can sense their presence and infer who they are through hearing and even smell. Even if we don't hear them, we might feel them if they block air movement. There was a study on hearing where people put on headphones and blindfolds and a "hole" was moved through a sonic environment. It was very creepy and I have to wonder why that hasn't been tried in a scary movie. It also points up that we don't understand precisely all of the ways we comprehend our environment.
Since the Wizard never gave Elon a brain - I reckon the wiz should hand in his wand, but then I recall he was actually just a castaway shonky snake oil merchant. Anyway, his line in brains was probably the charcutier's rejects. As the Scarecrow, Musk must be the ultimate straw man, and as such you would think he might desist from playing with fire.
he [Musk] doesn't understand how they [brains] work and why humans rely on vision primarily and can get away with it.
Musk and the legions of his betters that aren't familiar with neuro- or cognitive science are not alone in this. That includes most of the current AI circus.
The image the retina captures and sends to various parts of the brain is pretty rough - the difference between that image and what we "see" verges on the miraculous and is terrifying at the same time.
I don't know about anyone else, but driving a vehicle is easily the task that requires the greatest concentration and thought for me, and I have over 50 years of practice.
Are any of Musk's FSD wet dreams ever going to detect a child's ball rolling onto the road from between parked cars and then slow down, preparing for emergency braking?
Think of what is involved here - construction of a plausible model of the immediate future based on the unconscious or intuitive understanding of the human world - in a word imagination.
You've absolutely put your finger on the problem with self-driving software; it can't imagine or predict. A good many years ago on a country road I saw what looked like a couple of fireflies up ahead (it was a pitch-black night, no street-lighting or moonlight). The weird cyclic motion confused me until I came up with the accurate hypothesis that I could see the reflectors on the pedals of a bicycle being ridden with no lights. I slowed down and was able to pass said cyclist safely. Presumably an FSD Tesla would have ploughed into him at 50 mph.
And then legions of fan boys would have said the cyclist deserved it for not having any lights. Well, doubtless the cyclist has some responsibility for his own safety, but equally car owners have a responsibility not to trust control of their death machine to a half-witted piece of software demonstrably not up to the task.
A good many years ago on a country road I saw what looked like a couple of fireflies up ahead
I've had a similar experience riding my motorbike at night - saw what looked like a motorbike coming towards me, so I shifted left to ensure that we would pass safely (small country road, no streetlamps). Realised at the last minute that what I had assumed was a bike was in fact an older car (round headlights!) with only the nearside headlight working.
Fortunately I still had enough time (closing speed was only about 30mph) to get right to the side of the road so that the mono-lamp car could go past. It didn't even move aside so I suspect that, despite my headlamp working properly, the driver hadn't even seen me. Given the time of night, I suspect the driver was on the way home from the pub.
The problem is that none of these enhancements to driver attention monitoring solves the real problem with this level of driver assistance (not actual "auto pilot", despite what Musk is trying to sell). The problem is that it takes time for the driver to react, and on roads and highways, that time just isn't there.
In aircraft, autopilot works very well because there is much more time to react to an autopilot disengagement. You are typically not 1-2 seconds from colliding with something or someone when autopilot disengages. If you are, then you have big problems. Also, with aircraft, pilots are trained in human factors, and we understand how to handle the handover from automation to human control. Normal drivers are not trained for any of this, and reading a Tesla owner's manual is not proper training.
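To put that 1-2 seconds in concrete numbers, here's a rough back-of-envelope sketch. The speeds and handover times are illustrative assumptions, not measured figures from any study:

```python
# How far a car travels while a disengaged driver rebuilds context.
# Illustrative numbers only - not from any study.
MPH_TO_MS = 0.44704  # miles per hour -> metres per second

for speed_mph in (30, 50, 70):
    speed_ms = speed_mph * MPH_TO_MS
    for handover_s in (1.0, 2.0, 5.0):
        distance = speed_ms * handover_s
        print(f"{speed_mph} mph, {handover_s:.0f} s handover: "
              f"{distance:.0f} m travelled before the human is back in the loop")
```

At motorway speeds, even a two-second handover means the car has covered sixty-odd metres before the driver has done anything at all.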
Air travel also has a dedicated ground crew working to coordinate traffic and will definitely be in touch if you start veering too close to anyone else.
I sometimes wonder if we're doing autonomous driving wrong. Instead of every individual car trying to make independent decisions, maybe have a central system that acts similarly to air traffic control for airlines. Sensors in the road and car tell it where other cars are, how fast they're going, all that yummy telemetry, and then it can coordinate all the traffic on the roads. Then you don't have a case where cars with a SoC specced for having LiDAR sensors are suddenly expected to take on the additional processing required for optical only scanning, on top of everything else they manage. You just have one massive cluster where you can always add additional nodes if you need more processing power.
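For what it's worth, here's a toy sketch of that idea - a central coordinator ingesting telemetry and handing back advisories. Every name in it (Telemetry, CentralCoordinator, the 50 m rule) is hypothetical; this is not any real V2X or telemetry protocol:

```python
# A toy sketch of the "air traffic control for cars" idea above.
# All names and thresholds are made up for illustration.
from __future__ import annotations
from dataclasses import dataclass

@dataclass
class Telemetry:
    car_id: str
    position_m: float  # position along a single lane, 1-D for simplicity
    speed_ms: float    # speed in metres per second

class CentralCoordinator:
    MIN_GAP_M = 50.0  # arbitrary illustrative safe following distance

    def __init__(self) -> None:
        self.cars: dict[str, Telemetry] = {}

    def report(self, t: Telemetry) -> str | None:
        """Ingest one car's telemetry and return an advisory, if any."""
        self.cars[t.car_id] = t
        for other in self.cars.values():
            if other.car_id == t.car_id:
                continue
            gap = other.position_m - t.position_m
            # Advise the reporting car if it is closing on someone ahead.
            if 0 < gap < self.MIN_GAP_M and t.speed_ms >= other.speed_ms:
                return f"{t.car_id}: slow down, {other.car_id} is only {gap:.0f} m ahead"
        return None

coordinator = CentralCoordinator()
coordinator.report(Telemetry("A", position_m=100.0, speed_ms=30.0))
print(coordinator.report(Telemetry("B", position_m=60.0, speed_ms=32.0)))
# -> "B: slow down, A is only 40 m ahead"
```

The appeal is that all the expensive decision-making lives in one place you can scale up; the hard part, as the replies below point out, is everything outside the happy path - unmaintained sensors, hardware failures, and road users that aren't cars.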
"Sensors in the road and car tell it where other cars are, how fast they're going,"
Like that isn't going to go wrong. The Smart Motorways' sensors are often offline for days/weeks/months at a time, even though many don't require digging up the road to repair. If they can't be bothered to fix non-functional CCTV cameras watching the roads, how can it be expected that in-road sensors will be maintained?
What you described is part of the spec for PRT (Personal Rapid Transit). The vehicles take care of operating themselves and keeping on the guideway while a central computer tells them how to dispatch, where to turn and how fast to go, but not exactly how to do those tasks. In a closed system, that can work nicely. Even a human-driven service vehicle can be accommodated in the system. Before UltraGlobal wound down, they had plans that would allow private ownership of vehicles so somebody could drive to the city with the system and log into the computer for autonomous control. Parking, charging and pickup would all be taken care of.

It made a lot of sense for dense downtowns. Not only could people be moved around, so could cargo. Vehicles could go into buildings, so a load of dairy products being sent to a grocery store could stop in the warehouse area of the store, be unloaded, and the vehicle would be on its way while those items get moved quickly into the refrigerators/freezers. Passenger vehicles available to the public would not have access to those stops, and portals could be secured. Employees of the store could be given a pass so they could be routed to the back room where they could get on and off in safety.
And even if they are, there are Really Bad Failure Modes if there's an equipment failure. That failure might be in the V2V communication, or it might be something purely mechanical like a part falling off or an axle shearing.
Cars currently are not, broadly speaking, well-maintained. Many jurisdictions have periodic inspections to try to ensure a minimum level of upkeep, but that's not a perfect system, and other jurisdictions lack it. There's nothing currently stopping me from driving any old piece of junk from Michigan or New Mexico (where there are no inspections) to, say, Massachusetts (where there are).
The problem with this suggestion is that it assumes that all the self-driving vehicles only have to cope with other cars. But in the real world they need to cope with pedestrians, stray dogs, runaway prams etc. And the other problem with this suggestion is that lobbyists will argue that people, not cars, are the problem and we should have strict laws and segregation keeping them away from motor traffic. Which would make our cities even worse hellscapes than they already are.
If that seems improbable, here is your reminder that the US has jay-walking laws, legislating against people being able to cross roads, because of lobbying by motor manufacturers.
An aviation expert will correct me if I'm wrong, but didn't some of the early Airbus planes lean too heavily into automation, and they had to dial it back because it caused the same problems as we see with Tesla software - it did too much, causing pilots to disengage so that they then had problems resuming control when needed.
Software like Tesla autopilot is guaranteed to be dangerous. It works most of the time so drivers disengage. And then when it says "hey, I don't know what to do, help me out human", the human hasn't been maintaining context and hasn't got time to rebuild it.
Air France 447 is one tragic example. The captain was sleeping after a long night; the other two crew members were inexperienced; the plane came out of autopilot after the computer lost airspeed information, and they kept trying to climb while the plane was stalling.
Musk stands in the path of a Tesla on "Autopilot" - let's start at 5 mph and then repeat at 5 mph increments. If and when the Tesla fails to detect that Musk is in its path and runs him over, then that is the threshold speed the "Autopilot" can be certified for. Oh, and best have an ambulance/medevac helicopter on standby during the exercise.
And he has to do it with a randomly chosen example for every hardware revision of every model, under varying lighting conditions including with the car heading towards the sun when it's low.
And then redo it with every software update that affects the system
I like the idea of the snake oil salesman proving he trusts it, but I don't trust him to actually do it under conditions we know the Teslas have issues with*, and with a vehicle that hasn't been specifically chosen for the task and checked to make sure it's working correctly.
*I seem to remember the Tesla cameras have issues with the sun being "wrong" and the Tesla fans saying "well humans get dazzled as well", ignoring the fact that humans can adjust the position of their head/eyes and do things like drop the sunshade, and will typically slow down if dazzled.
*I seem to remember the Tesla cameras have issues with the sun being "wrong" and the Tesla fans saying "well humans get dazzled as well", ignoring the fact that humans can adjust the position of their head/eyes and do things like drop the sunshade, and will typically slow down if dazzled.
That's also part of the beauty of something like LiDAR. It doesn't get dazzled by the sun, so regardless of time of day, or other weather conditions*, if you're coming up on some other solid object, it can trigger a warning.
* IIRC, LiDAR doesn't work so well in fog, but then neither do optical cameras or human eyeballs, and we're still talking about one use case where things are basically the same while all the other use cases are vastly improved.
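The underlying reason glare doesn't matter much to LiDAR is that it times the echo of its own laser pulse rather than interpreting ambient light. The principle in a few lines (a simplified single-pulse illustration):

```python
# Time-of-flight ranging, the principle LiDAR is built on:
# distance = (speed of light * round-trip time) / 2
C = 299_792_458  # speed of light in a vacuum, m/s

def tof_range_m(round_trip_s: float) -> float:
    """Distance to whatever reflected the pulse."""
    return C * round_trip_s / 2

# A pulse echoing back after ~333 nanoseconds means a solid object
# roughly 50 m ahead, regardless of where the sun is.
print(f"{tof_range_m(333e-9):.1f} m")
```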
Forbes, Dec 2023: Tesla drivers had 23.54 accidents per 1,000 drivers. Ram (22.76) and Subaru (20.90) were the only other brands with more than 20 accidents per 1,000 drivers.
Interesting link, thanks.
If you click through that link to the site where they got the data from there are some other interesting bits in there.
RAM really not coming out looking very good - topping the charts in a lot of those results - more incidents (not accidents) than any other brand.
BMW has the most drivers caught driving under the influence of alcohol.
And, as you mentioned, Tesla, which are the most accident prone.
Also, a small point but important, your subject line says it's electric cars, but that study covers both EV and ICE - which makes it a much more interesting dataset.
"RAM really not coming out looking very good - topping the charts in a lot of those results"
With a big heavy vehicle it does seem like you would see more accidents, as they are not nearly as maneuverable as a small compact car, take longer to stop, and have a higher center of gravity.
I'm not surprised about the incidence of drink driving with BMW. The person bought the car thinking it would make them more desirable and interesting, and instead it just drained their bank account and the depreciation has driven them to drink. If I meet somebody, I really hope their dating me doesn't have anything to do with my car other than having one I can use to take them on dates. Have you seen what it costs for a pair of Rammstein concert tickets? If they would rather go see Wishbone Ash at the local club in the BMW... Not that WA isn't a kickin' show.