And if the accident statistics are really bad...
vision-and-decision neural networks might be replaced by backseat drivers.
America's National Highway Traffic Safety Administration will now require automakers to report details of any and all crashes involving self-driving cars within one day of learning of the accident. “NHTSA’s core mission is safety,” Steven Cliff, acting administrator of the transportation watchdog, said this week. “By …
Such reports on minor incidents have been required for some time - I used to collect them. However, I noticed that around half of them were so vaguely worded as to prevent any meaningful analysis. Certain well-known chief offenders' obfuscation percentage was quite a bit higher.
"It will be significantly more difficult to obfuscate the fact that car needed to be towed, or that someone ended up in the hospital"
Indeed so, Pascal, but if we look at the NTSB report on the 2018 Tesla crash into a "crash attenuator", even the NTSB stresses as many other factors as possible over and above the basic flaw: the "autopilot" doesn't (or didn't then) consider a bloody great square plate covered with yellow and black chevrons to be an obstacle, and misread the diverging carriageway border lines at the approach to the off-ramp as a "lane". It steered into the fallacious lane, accelerated toward the chevroned plate and hit it at 70 mph. The driver was killed, the car was completely demolished, and the battery caught fire (twice: first at the scene and later in storage awaiting the NTSB inspection).
The point is that everyone (apparently including the NTSB) is keen to exonerate the technology wherever the primary blame can be assigned elsewhere.
"The point is that everyone (apparently including the NTSB) is keen to exonerate the technology wherever the primary blame can be assigned elsewhere."
Erm, what are you smoking? Their summary of Probable Cause is this:
"The National Transportation Safety Board determines that the probable cause of the Mountain View, California, crash was the Tesla Autopilot system steering the sport utility vehicle into a highway gore area due to system limitations, and the driver’s lack of response due to distraction likely from a cell phone game application and overreliance on the Autopilot partial driving automation system. Contributing to the crash was the Tesla vehicle’s ineffective monitoring of driver engagement, which facilitated the driver’s complacency and inattentiveness. Contributing to the severity of the driver’s injuries was the vehicle’s impact with a crash attenuator barrier that was damaged and nonoperational at the time of the collision due to the California Highway Patrol’s failure to report the damage following a previous crash, and systemic problems with the California Department of Transportation’s maintenance division in repairing traffic safety hardware in a timely manner."
The *first* thing they blame is a limitation in AP. They also blame the driver for not paying attention (quite correctly), then blame AP again for not having sufficient awareness of driver attention.
Then they point out that the death might still have been avoided had the barrier that was hit been intact; its damage hadn't been correctly reported, and they note that such damage is routinely not fixed promptly anyway.
That's all the blame for the incident laid at the hands of AP and the driver (with some additional words for AP). Then something that should really get the relevant authorities to report and fix safety-critical infrastructure promptly - because we would have had even more information about the incident if they had done their basic job.
Yours is indeed one (but not the only) possible interpretation. But the summary notwithstanding, if you read the entire report you find a consistent and repeated concentration on all the other factors, so the balance of the mindset is clear.
BTW I don't smoke (not unless I get very hot indeed).
Looking further I still don't see anything that suggests any other interpretation...
"One month of Tesla Carlog data was reviewed for the time when the driver was making his morning trip to work. Although GPS information was not available in the Carlog data, the NTSB―using information about the driver’s daily patterns and routes―identified two similar incidents that occurred on March 19,
2018, and on February 27, 2018. The data showed that during both incidents, Autosteer induced a steering action to the left, which appeared to be in the vicinity of the gore. The Autosteer action was followed within 2 seconds by a driver-induced steering correction to the right, overriding Autopilot functionality. During these two previous incidents, the driver’s hands were detected on the steering wheel."
So the driver knew that AP struggled at that location.
Furthermore:
"Crash and maintenance records show that the attenuator at the crash location was damaged or repaired more frequently than any other left-exit crash attenuator in Caltrans District 4―it had more than double the repairs of any other location. Traffic collision data show that in the 3 years before the fatal March 23, 2018, crash, the attenuator was struck at least five times, including one collision that resulted in fatal injuries.
The crash attenuator at this location was struck again on May 20, 2018, about 2 months after the crash involving the Tesla."
That suggests that many human drivers crash at this location - so we should ban all human drivers, right?
One thing that bothers me a bit about incidents like this. Conceptually, the driver is supposed to monitor the car and override it if it is about to do something dangerous. But who among us is actually capable of discerning that their vehicle is going to try to kill them, then working out and implementing an alternative course of action -- all in a few hundred milliseconds?
It's not like the car computer fades the music on the entertainment system and announces: "Hey Dave, you see that bridge abutment up ahead? I'm going to crash into it ... three ... two ... one ... adios amigo."
"Pulling into a lane that doesn't exist should be pretty easy to spot if your eyes are on the road,"
I think that's more than a bit naive. Ever had a front tire come apart while traveling at speed? It takes a second or so to realize there is a problem and a bit longer to figure out what it is. Fortunately, the first thing that comes to mind -- hitting the brakes and trying to maintain directional control -- is probably the optimum approach. But everything happens really fast and you are not going to be doing much analysis during the critical time interval.
Yes, if you suspect that Elon's monstrosity is likely to attempt self destruction, you MIGHT have an appropriate response keyed up to implement. And it MIGHT save your life. But then if you mistrust Autopilot, why would you be driving a Tesla?
Erm - he had already corrected this particular error twice in the two months leading up to this incident - so it shouldn't have been a surprise at all...
It also started moving out of lane five seconds before the collision, and was completely out of lane four seconds before the collision - that's significantly more than "a few hundred milliseconds".
That's plenty of time to assess the situation and take corrective action.
The more I look into this, the more I see the primary cause of the incident as driver distraction, matching the other six collisions into that barrier over three years. But this one, unlike the others, is being assessed properly, as a collision that should never have happened. Can you imagine the response if the same degree of rigour were applied to all road collisions?
But then if you mistrust Autopilot, why would you be driving a Tesla?
Surely this isn't the only possible reason for driving a Tesla.
I have no interest in Tesla's products, which don't suit my use case and which I find gratingly annoying. But if I did, I still wouldn't use Autopilot. I don't feel compelled to use every feature in other cars.
The initial report is bound to be vague. Basically "We've been informed that a Belchfire 500 autonomous vehicle has been involved in a serious accident. We are investigating."
After all, most accidents will occur outside of normal working hours (40 to 50 hours in a 168-hour week -- before allowing for holidays). What exactly do we expect the janitorial/security staff to do when the police call them at 0300 on Christmas morning to inform them of an accident? They'll call their boss ... who will call his/her boss ... who will attempt to call someone in the company who actually cares -- if such a person can be found at a time when half the country is visiting relatives. A well-run company will presumably have some mechanism in place to meet the statutory 24-hour requirement. It might even work. (In my experience not all that many companies are actually that well run.)
It's surely the ten day report that matters. It will presumably contain actual information.
> Did they leave a gap there for scooters?
No.
Appendix C, clause 1C (p.13), emphasis mine :
C. the crash results in any individual being transported to a hospital for medical treatment, a fatality, a vehicle tow-away, or an air bag deployment or involves a vulnerable road user; and
Definitions, 19 (p.9):
19. “Vulnerable Road User” means and includes any person who is not an occupant of a motor vehicle with more than three wheels. This definition includes, but is not limited to, pedestrians, persons traveling in wheelchairs, bicyclists, motorcyclists, and riders or occupants of other transport vehicles that are not motor vehicles, such as all-terrain vehicles and tractors.
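The interaction between clause 1C and definition 19 is easy to get backwards, so here's a minimal sketch of the logic in Python. The field names and crash-record structure are my own illustrative assumptions, not anything from the order itself:

```python
# Sketch of Appendix C clause 1C plus the "Vulnerable Road User"
# definition (19). All dict keys are invented for illustration.

def is_vulnerable_road_user(person):
    """Per definition 19, anyone who is NOT an occupant of a motor
    vehicle with more than three wheels is a VRU. A tractor has four
    wheels but is not a "motor vehicle" under the order, so its
    operator still counts."""
    return not (person.get("in_motor_vehicle") and person.get("wheels", 0) > 3)

def clause_1c_triggered(crash):
    """Clause 1C: hospital transport, a fatality, a tow-away, an air
    bag deployment, or involvement of a vulnerable road user."""
    return (
        crash.get("hospital_transport", False)
        or crash.get("fatality", False)
        or crash.get("tow_away", False)
        or crash.get("airbag_deployed", False)
        or any(is_vulnerable_road_user(p) for p in crash.get("people", []))
    )

# A crash involving a tractor operator triggers reporting even with
# no injury, tow, or air bag, because the operator is a VRU.
tractor_crash = {"people": [{"in_motor_vehicle": False, "wheels": 4}]}
print(clause_1c_triggered(tractor_crash))  # True
```

This is why tractors end up in the list: the gate is "motor vehicle", not wheel count or vehicle size.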
> That's an odd category to include as vulnerable...
Tractors travel at a crawl, relative to normal road speeds, when forced to use roads - for example, when moving from one field, or one work site, to another. They are not designed to travel on roads as their normal environment, so they lack many of the safety features other vehicles have. They are not very maneuverable, have tires designed for working in soft fields rather than properly treaded for road use, have poor visibility, and could be extra-wide, etc. Of course different models have different sets of capabilities, but we are probably talking about the lowest-common-denominator situation.
Indeed. And that's why in rural parts of the US it's not uncommon to see signs warning drivers that there may be farm vehicles on the road. I encounter such vehicles frequently.
And while tractors are large, a car ramming into one at high speed would be very dangerous for the tractor operator. Dynamics are dynamics, and tractors aren't so massive that such a collision wouldn't transfer enough energy to cause significant acceleration. And that doesn't take into account the possibility of subsystem failure, such as an accident rupturing a tractor's tire and causing it to overturn.
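A quick momentum check bears this out. The following is a back-of-envelope sketch; the masses, speed, and impact duration are all my own illustrative assumptions, not figures from the thread or from any report:

```python
# Perfectly inelastic head-on collision: car hits a stationary
# tractor. All numbers below are assumed for illustration.

car_mass = 2000.0      # kg (typical mid-size car, assumed)
tractor_mass = 7000.0  # kg (large farm tractor, assumed)
car_speed = 31.3       # m/s, about 70 mph

# Conservation of momentum gives the shared post-impact velocity.
v_final = car_mass * car_speed / (car_mass + tractor_mass)
print(f"post-impact velocity: {v_final:.1f} m/s")  # ~7 m/s

# If the impact lasts ~0.1 s (assumed), the average acceleration
# imparted to the tractor, in units of g:
impulse_duration = 0.1  # s, assumed
accel_g = v_final / impulse_duration / 9.81
print(f"average acceleration on tractor: {accel_g:.1f} g")  # ~7 g
```

Even with a 3.5:1 mass advantage, the tractor is jolted from rest to roughly 7 m/s in a fraction of a second - easily enough to throw an unrestrained operator from an elevated seat.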
Tractors are very large - typically around seven tonnes (the first typical mass suggested by Google) - and frequently towing something significantly heavy as well.
Their operators are usually in an elevated position, away from any collision.
Compare that with a pedestrian, or a cyclist (motor or otherwise), or an equestrian...
They don't fit any usual definition of a vulnerable road user.
Good. I've seen one Tesla on my parents' street. It turned onto the end of the street, kept turning, drove over the curb onto the grass (still turning); about the time it would have hit the sidewalk, the driver realized things were going sideways and jerked it back onto the road. Nothing there to confuse the system - no snow, and the road and grass were nice distinct colors. I do not put any trust in these systems!