
"a recall of 830,000 Autopilot-equipped Teslas"
That's going to be a whole bunch of smug rich people pissed off.
First-of-its-kind research on advanced driver assist systems (ADAS) involved in accidents found that one company dominated with nearly 70 percent of reported incidents: Tesla. The data was presented by the US National Highway Traffic Safety Administration (NHTSA), the conclusion of the first round of data it began gathering last …
The only hardware that's at fault is the sales literature that started with it being called "Full Self Drive" and "Autopilot".
While terms like "autopilot" are being used, people will treat it like an autopilot, i.e. not pay sufficient attention.
"Doubt it would be a hardware recall..."
It might be if fixing the issues will require more than tweaking some code. Elon is dead set against using any sort of Lidar. Didn't he also want to nix radar too and just use cameras to sense the environment? Besides sensors, new code may need more CPU horsepower to operate.
Their 'Dear Leader' will soon be running for Congress. Then he'll follow the path of No 45 and ban all comments that are critical of Tesla, Space X and of Emperor Elon the merciful.
Now that he's joined the GQP, he has lost a lot of cred with my Tesla owning friends.
Yes, I am not a fan of his badly made cars and yes, I owned one, a 2019 Model S.
He cannot be President or Vice President because he is not a natural-born citizen of the USA. Other political offices lack that requirement and some offer effective protection from charges of securities fraud so it would not surprise me if he ran for some other office.
Several days have passed without him making a stupid tweet, so perhaps his lawyers have convinced him to STFU before he digs himself into an even deeper hole with Twitter. (More likely the PEA for Boca Chica has got his focus back on rockets.)
But that would be unpleasant for the company and its customers. Customers who don't know what it is will be unhappy they're having something taken away from them. The company would be unhappy with the headlines that they have removed a feature with an update. Much better for the company if they can do something to hide that they've disabled something by implying they fixed a part, just like every car manufacturer does sometimes.
The 'recall' simply involves an over-the-air update. Do you really think that NHTSA will force Tesla to disable Autopilot on all its cars? Besides, GM's system is no better. They just have fewer crashes because they have sold fewer cars. Their engineers are interchangeable with Tesla's.
On pure numbers alone, any comparison needs to be based upon *Miles Driven* not per vehicle. A vehicle that's seldom driven with an ADAS system is not comparable to one that is driven long distances under varying conditions.
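To make that concrete, here's a back-of-the-envelope sketch in Python; the brands, crash counts and mileages are entirely invented, and it's only meant to show how the per-vehicle and per-mile rankings can disagree:

```python
# Entirely made-up figures, just to show why "per vehicle" and
# "per mile driven" can point in opposite directions.
fleets = {
    "Brand A": {"crashes": 500, "vehicles": 500_000, "miles_driven": 5_000_000_000},
    "Brand B": {"crashes": 50,  "vehicles": 100_000, "miles_driven":   100_000_000},
}

for name, f in fleets.items():
    per_1k_vehicles = 1_000 * f["crashes"] / f["vehicles"]
    per_million_miles = 1_000_000 * f["crashes"] / f["miles_driven"]
    print(f"{name}: {per_1k_vehicles:.2f} crashes per 1,000 vehicles, "
          f"{per_million_miles:.2f} per million miles")

# Brand A: 1.00 per 1,000 vehicles, 0.10 per million miles
# Brand B: 0.50 per 1,000 vehicles, 0.50 per million miles
# Brand A looks twice as bad per vehicle, yet five times safer per mile driven.
```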
Only Tesla has automatic reporting of crash data; for everyone else it's voluntary and anecdotal.
An analysis of data is a good thing and should be done, I applaud that.
A bad analysis with faulty data is nothing more than a hit piece.
as well as graduated by driving environment and conditions.
It's easier to not prang into other objects if those are by and large moving at the same speed and in the same direction compared to getting around in a city, and bright sunny days take less effort to deal with than dark rainy/foggy nights.
It would also be interesting to know how many are multiple accidents with the same human 'driver', or if any accidents are with two or more Teslas with engaged 'autopilot'. And, indeed whether any of the collisions involved drivers using the software correctly (i.e., keeping alert, watching the road and keeping both hands on the steering wheel) rather than, in one case, sitting on the rear seat.
https://www.independent.co.uk/news/world/americas/tesla-backseat-param-sharma-san-francisco-b1847445.html
And it would be equally useful/interesting to see how many accidents were avoided due to "autopilot" being more aware of the surroundings than the "driver".
I'm sure Tesla have those numbers too. Or at least numbers on how many times the car took over before the meatbag behind the wheel could react.
"Accidents Avoided" is literally impossible to measure, thus any such attempted statistics must be filed under "unsubstantiated marketing"
I avoided thousands of accidents today by leaving the car parked.
The only useful measures are based on miles driven per "road type" and preferably speed range - freeway driving is safer per mile by design, as there's fewer hazards.
The emergency braking and lane keeping features can tell when there was a lane incursion or stoppage that was avoided (car detected lane hazard, car applied brakes instead of airbags = avoided accident). The cameras can catch a lot too, but would need a ton of post-processing to tease stuff out of it. In some of the systems these incidents were uploaded back to the manufacturer to provide feedback. Not sure to what degree that is still the case with non-fleet vehicles now that the features are not considered a private beta. Privacy issues with grabbing info from privately owned cars are a dumpster fire in their own right.
So while it's a messy data set, it's not an empty one. You can also look at the differences in rate and severity of accidents per road mile vs human drivers to get a different estimate by an alternate measurement. That would require a lot of clean data though, and currently there isn't enough of it to get an unbiased comparison between manufacturers.
""Accidents Avoided" is literally impossible to measure, thus any such attempted statistics must be filed under "unsubstantiated marketing""
Not only that, but accidents that might not have happened at all, had the driver not been relying on automation, get added to the statistics.
Better driver training would be more of a help. Let's say somebody is turning across your path at an intersection, if you watch accident videos on YouTube, you see most people will try to avoid the smash by trying to maneuver around the front of the other car rather than behind it. Some work on a simulator with a good instructor would get people to make a better choice. The driver assist systems might wind up putting more people in situations they could have avoided if they were paying attention.
Not only that, but accidents that might not have happened at all, had the driver not been relying on automation, get added to the statistics.
Also, this NHTSA report is about ADAS level 2, where running into an obstacle ahead tends to be well within the capabilities of the already widely used common collision avoidance systems that (try to) keep you from running into the vehicle in front, a bicyclist crossing the road, an elk (or Anne Elk) (EeeeecchhhUm) blocking your path, and such. So just lumping those systems together would skew the numbers rather a lot.
> The only useful measures are based on miles driven per "road type" and preferably speed range
You would also need to include driving conditions as well, e.g. dry, wet, good visibility, poor visibility (fog and rain levels), day time, night time, dusk, dawn, etc.
Or at least numbers on how many times the car took over before the meatbag behind the wheel could react.
That meatbag might already have reached the conclusion that it didn't have to react as the car had already done so.
So, a meaningless data point.
And this was covered in TFA. Car makers aren't coughing up that info, so they can't do an analysis on data they don't have. However, TFA also notes how a spokeshole was actually pointing out these shortcomings in the data and telling people to be careful drawing conclusions.
Yeah, at least the Reg's version of this includes more of those caveats and details. When the person giving the presentation calls out how people shouldn't jump to conclusions based on the data, the press should probably listen to them. This report has had the wider media in full arm-flapping mode, and most versions of this story cut out all of those details.
None of them mentioned the broader human-driver accident rate either. How many of those not-quite million Teslas would have crashed if the Autopilot system didn't exist, and how many had it and crashed under manual drive anyway?
Seems like too many of the people covering this went out of their way to avoid giving people the information to make a reasonable comparison.
"...because companies required to log it are only reporting accidents, not the total number of vehicles produced or on the road, nor the mileage driven by each vehicle. In addition, the research said that some crashes can be reported multiple times due to submission requirements, and that data may be incomplete or unverified."
This basically means the study is completely useless. So of course el reg turns it into a hit job against Tesla.
and that's about it
100 Teslas driven 100 miles per day will have more crashes than 5 Fords driven 10 miles once a week. It's the law of averages.
We need better figures... say a Tesla will crash every 10,000 miles driven and a Ford will crash every 100 miles driven (don't get your panties in a twist, this is an EXAMPLE).
However, both will crash if the dumbass 'driving' it engages the drive assist, then climbs into the passenger seat to read a book (after cable-tying the safety switches on the steering wheel.....)
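Working through those (deliberately silly) example numbers, a quick sketch of the exposure arithmetic:

```python
# The commenter's own made-up numbers, just to show the exposure effect.
tesla_miles_per_week = 100 * 100 * 7   # 100 cars x 100 miles/day x 7 days = 70,000 miles
ford_miles_per_week = 5 * 10           # 5 cars x 10 miles, once a week    = 50 miles

# Hypothetical rates from the comment: one crash per N miles driven.
tesla_crashes = tesla_miles_per_week / 10_000   # 7.0 expected crashes per week
ford_crashes = ford_miles_per_week / 100        # 0.5 expected crashes per week

print(tesla_crashes, ford_crashes)
# Raw counts make the Teslas look 14x worse, even though per mile
# the example Tesla is 100x safer than the example Ford.
```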
Found On Road Dead.
Heard the Spanish version first, actually, in Colombia: Fabricacion Ordinario, Reparacion Diario ("ordinary manufacture, daily repair").
Which was from the driver of a 20-plus-year-old 1950s Chevy, the engine of which did a most convincing impression of a xylophone when going uphill or trying to overtake a truck. As it was a taxi it must have had several odometer overflows during those years, and probably more of those than engine overhauls.
Tesla's most recent crash report gives a little insight:
In the 4th quarter, we recorded one crash for every 4.31 million miles driven in which drivers were using Autopilot technology (Autosteer and active safety features). For drivers who were not using Autopilot technology (no Autosteer and active safety features), we recorded one crash for every 1.59 million miles driven. By comparison, NHTSA’s most recent data shows that in the United States there is an automobile crash every 484,000 miles.
So is the conclusion that rich people are simply better drivers? Or are people with new cars more careful? Or do more Tesla owners live outside the US and are just better drivers than their American counterparts?
Either way, the numbers suggest Teslas are simply safer with autopilot switched on.
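Put on a common denominator (taking the quoted figures at face value, with all the caveats about where and how Autopilot miles tend to be driven), those rates look like this:

```python
# Figures quoted from the Tesla Q4 safety report and NHTSA above,
# converted to crashes per million miles for comparison.
miles_per_crash = {
    "Autopilot engaged": 4_310_000,
    "Autopilot not engaged": 1_590_000,
    "US average (NHTSA)": 484_000,
}

for label, miles in miles_per_crash.items():
    print(f"{label}: {1_000_000 / miles:.2f} crashes per million miles")

# Autopilot engaged:     0.23
# Autopilot not engaged: 0.63
# US average (NHTSA):    2.07
# These denominators cover very different mixes of roads, weather and
# drivers, so they are not like-for-like.
```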
The Tesla report seems subject to the spin effect of the now-reported "autopilot turns itself off seconds before a collision" information.
If the Autopilot had done that, and it gets reported as a "non-autopilot" collision, that's going to significantly skew Tesla's reported numbers.
We'd have to have data that was based on "autopilot engaged for N seconds before the crash". The NHTSA data seems to be using 30 seconds. We don't know what the Tesla report is based on.
You mention spin in reference to this, but not what you think is being spun.
All the companies' driver assists will warn the driver to take control if the vehicle detects a problem or hazard that is outside of the parameters it can handle. People have started making an unthinking talking point out of this 1-second vs 30-second thing. 30 seconds at highway speeds is more than half a mile, so in terms of accidents, which you rarely see coming half a mile away, that window isn't very relevant. 1 second is only about 100 feet, which is pretty short in terms of both distance and reaction time. But that 1-second window isn't the whole picture either, as the self-drive system may have already started warning the driver before it kicked out, as well as decelerating or taking other actions.
That said I agree the devil is in the details. The NHTSA has more of that data than they released here, and most of what they have on the Teslas came from Tesla. So while they warned us (as Hill, whom the article mentions, did) not to jump to conclusions based on this report, it looks like they have the data and may address these questions in the future.
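For anyone who wants to check the distance arithmetic above, a quick sketch (assuming roughly 65 mph as "highway speed"):

```python
# Distance covered during a disengagement window at an assumed highway speed.
speed_mph = 65
feet_per_second = speed_mph * 5280 / 3600   # ~95 ft/s

for window_s in (1, 30):
    feet = window_s * feet_per_second
    print(f"{window_s} s ~= {feet:,.0f} ft ({feet / 5280:.2f} miles)")

# 1 s  ~= 95 ft
# 30 s ~= 2,860 ft, a bit over half a mile
```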
"All the companies driver assists will warn the driver to take control if the vehicle detects a problem or hazard that is outside of the parameters it can handle."
I don't see how those systems can evaluate those situations in a way that leaves enough time for the driver to realize it's all on them to sort out, all of a sudden.
If I see traffic backing up ahead, I can slow down, move over or take some other action to give myself choices. How soon before the smash will an automated system figure out that traffic has come to a standstill? A few meters or enough in advance to safely brake the car?
Tesla loves to promote those numbers and try to claim its autopilot is safer than human driving. The problem is that people are most likely to enable autopilot in the simplest driving situations, like on an expressway. They are least likely to enable it (if it is even possible to do so, hopefully not) on a winding two-lane highway at night during rutting season while it is raining. So of course those numbers for autopilot vs human accidents are apples vs oranges to the extreme.
There is one situation where all automakers' level 2 driving aids such as autopilot may (and I have no data to back this up either way, this is just a guess) do better than humans. Stop and go driving, like during rush hour! It is easy for people who become distracted for a half second to not notice the car in front has slammed on their brakes, or someone from an adjoining lane has cut their way into the two car lengths you had in front of you, until it is too late. Presumably Tesla and the other level 2 driving systems would fare better, as even if they aren't entirely bug free, at least there is no risk of them becoming distracted by a noisy kid in the back seat or taking a quick look at a phone to see who is calling/texting.
Either way, the numbers suggest Teslas are simply safer with autopilot switched on.
These numbers suggest that Tesla Marketing can make up unqualified numbers just like any other corporation, when no one independent has raw data to be able to verify it, because they have worked for years to keep it as secret as they can.
If Teslas were really so safe, the company would pay academics to analyse the data and publish credible papers proving it, and trumpet those results from the rooftops.
That they don't is clear proof that the numbers won't stand up to even paid analysis by shills.
Some people have pointed out that "miles driven per driver-assisted crash" would be a useful comparison for different brands of driver-assist systems. However, there is another comparison that should also be made: miles driven per crash with driver assistance versus miles driven per crash without driver assistance.
As a hypothetical example, if it turns out that Tesla's driver-assist system is more crash-prone than other brands of driver-assist systems, then that would be embarrassing for Tesla, but if it also turns out that Tesla's driver-assist system is less crash-prone than non-driver-assist vehicles, then Tesla's system should be applauded for the increase in safety it provides (and the other brands of driver-assist systems should be applauded even more).
It is frustrating that the NHTSA has released a meaningless claim. This frustrating behaviour is also to be expected, not necessarily because the NHTSA is conducting a smear campaign against Tesla, but because such meaningless claims are commonplace among humans (and the NHTSA is staffed by humans).
Tesla have so far refused to publish any meaningful data, so all there is to go on are the anecdotes of "Autopilot" making dangerous moves - it has literally steered drivers into danger or even a crash more than once.
How often that occurs is unknown.
To be fair, nobody else publishes meaningful data either.
Both Tesla and the government are sitting on that and a ton of other relevant data. Ditto for Waymo, and the other majors. And the evidence of autopilot misbehavior is sparse, not non-existent. Video and police reports aren't anecdotes.
Plenty of well documented cases of all of them lurching like a spooked horse, numerous papers on adversarial interference with the cars' sensors that was reproducible, and all the road trials data that has been made public. Also an endless litany of human-caused stupidity, with similarly pervasive documentation.
That said I suspect we both would like to see more data transparency, clearer standards, etc. A centralized place to get at the public info wouldn't hurt either.
You would need to match like for like. If you enable autopilot for 200 miles on an expressway on a clear day, and then drive yourself for 200 miles at night in the rain on a country road, those are not comparable in any way.
Nor would 200 miles of rural expressway vs 200 miles of expressway passing through/around large cities, with tons of road construction lane closures.
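A minimal sketch of what "matching like for like" might look like: compute rates only within a road/condition bucket, never across them (all figures here are invented):

```python
from collections import defaultdict

# Invented records: (system, road/condition bucket, miles driven, crashes).
records = [
    ("autopilot", "expressway/clear/day",     2_000_000, 1),
    ("human",     "expressway/clear/day",     2_000_000, 3),
    ("autopilot", "country road/rain/night",     50_000, 1),
    ("human",     "country road/rain/night", 1_000_000, 4),
]

rates = defaultdict(dict)
for system, bucket, miles, crashes in records:
    rates[bucket][system] = round(1_000_000 * crashes / miles, 2)

for bucket, by_system in rates.items():
    print(bucket, by_system)

# Pooling everything together would hide that the autopilot miles are
# mostly the easy miles.
```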
So often, the wrong statistics are used for headlines. Obviously the only relevant figure for comparison is the number of autopilot incidents per unit of distance (or possibly hours driven). It is the same with the annual death toll ‘shock’ headlines that the number of people killed in road accidents is xx higher than last year, which by itself gives no meaning at all.
Unless I am missing something, there are no links to the data released in the reg article.
Without the data, how can we check that this article and the articles it links to are quoting the right figures, or judge the relevance of those figures?
Please link to the source of the data.
While the NHTSA data is useful, the Federal Trade Commission needs to step in and declare that the term "Autopilot" is deceptive advertising and that all references to it in advertising or consumer-facing materials (e.g. brochures, owners' manuals, etc.) must be changed to Semiautopilot.
to Boltactionpilot and eventually Muzzleloadingpilot before returning to Corkonastringpilot?
Seriously though, what these systems need is some actual mandated training. People's mistaken ideas about how these systems work aren't going to be fixed by a name change. Don't let a Tesla owner unlock the self-drive till they've passed a simulator run that puts them through realistic use of the driver assists and tests their responses.
Bonus points for Tesla to release it on Steam as a marketing tool. Call it drive your dream car or something.
To be fair the "cruise control" on my motorcycle is a glorified spring clamp, but I know not to trust it at any level and only break it out on long straightaways to alleviate the wrist cramps.
"Still, based on data it had from automakers that isn't included in the NHTSA research, the AP said Tesla's crash rate per 1,000 vehicles was still substantially higher."
That's the only relevant figure. Not sure why the headline says 70%, as that's utterly irrelevant. What's the ACTUAL figure per 1000 vehicles?
'First-of-its-kind research on advanced driver assist systems (ADAS) involved in accidents found that one company dominated with nearly 70 percent of reported incidents: Tesla.' Read that statement carefully. Since Tesla 'dominates' in sales of cars equipped with ADAS technology, why would it be surprising that it would also make up a large percentage of cars involved in such accidents?
Fucking insane to call it an autopilot. No normal, low-IQ person will understand what that actually means.
Of course fuckwits won't pay attention, and in any emergency situation will not be ready, human fuckwits act like that all the time.
might as well have called it "suicide pilot"
What might be more interesting is an analysis of crashes involving fatalities. Based on purely anecdotal accounts, a lot of those have involved pedestrians. Car and driver relatively okay, dead pedestrian. I know I've narrowly avoided pedestrians doing stupid things on several occasions. Autopilot might be less good at avoiding a guy in a wheelchair yelling obscenities in the middle of an intersection (an actual recent incident I had to avoid).
Of course, there is the infamous accident only a few miles from Tesla's headquarters, where the autopilot was confused by a complex intersection involving a left-exiting carpool lane going onto an overpass, diverging from the lane continuing straight. Unable to choose which lane was needed, the Tesla instead plowed full-tilt into the sand barrels, rather than possibly making an erroneous choice of lane.