Re: OlaM
"...IMHO if you manually tell it to do a dangerous thing, it stops being an autopilot at that point. Aircraft autopilot follows routes, with set safe altitudes, and terrain-following radar to avoid collisions..."
Autopilot is pilot automation - but only REDUCING the pilot workload. If someone or something is watching your heading and altitude, it frees up brain cycles to look into why your #3 engine is running a little warm. It helps you follow your flight plan more easily too, getting you from one waypoint to the next and lining you up for a nice landing on the runway you punch into the flight computer.
Now realize that they still load navigation data onto those things with 8" floppies on some older variants, and you'll start to get an idea about the technical limitations of what an autopilot can do. Your average 737 doesn't have the latest Nvidia Volta GPU in its telemetry rack - it's a step up from vacuum tubes and loading programs by tape or punch card.
Aviation autopilot also has the benefit of mandatory air traffic control in Class A airspace, so it comes with an external nanny in the event something goes wrong. You may have also noticed the lack of erratic turns and road maintenance issues on your last flight (although the turbulent-air potholes can be a real pain).
Getting a self-driving car to work without human oversight is a HUGE effort, and it has never even been attempted at commercial scale in the aviation or marine markets (there are recent attempts to get such systems working, but nothing like Waymo is being deployed on the water or in the air).
"...Tesla's tech shot off into a barrier..."
Sure, that's one damning way to look at it, but as an engineer I also look pragmatically at the longer phrase that was used to describe this situation: "it was following a car and then the car went through an interchange and then the Tesla drove into a barrier". I also noted in the NTSB statement that the following distance was set to minimum...
So with those two data points, I immediately apply my expertise in driving in the S.F. Bay Area, where our roads are crap, and so are our drivers (on balance).
I can imagine a scenario where the Tesla was following a car that decided late to take the interchange a different way and made a moronic move across the gore point, which was poorly maintained (we're lucky if Caltrans fixes a crash barrier or guard rail within a month, let alone a week - now take a guess how bad the paint markings are...). In my imagination I can see the Tesla following closely behind that idiot who jumped across the gore, with the lane markings partially obscured by the closely followed car in front (the idiot's car). In that case the Tesla was probably simply following the car in front into the poorly marked gore, and once the idiot completed his late freeway change to the other direction, the Tesla, realizing he was crossing a line, decided not to follow him. From there, what would have been the left solid line may have come back into view, so the car thought it was in a lane and tried to center itself (remember, it's narrow at the tip of the gore).

Meanwhile, another part of the code, facing south and brightly lit head-on by the sun (because that's the direction the interchange faces), no longer saw a vehicle in front. Many reasons are possible - a Tesla has previously failed to see a truck trailer across the lanes because of poor lighting, and I'd speculate that the camera-based vision system Tesla chose still can't see in adversarial lighting conditions. Then, with no one in front, the car attempted to return to the unobstructed cruise control set point (even though the speed limit is 65, people will do 90 because they feel like it - Tesla drivers out here love their electric torque and drive like jerks).
So the steering would be trying to center the car between the two lines it detected, and the cruise control, not seeing a car in front, would speed back up to its set-point.
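To make that interaction concrete, here's a minimal Python sketch of how a naive lane-centering routine and an adaptive cruise routine could combine to aim a car at a gore point and accelerate into it. Every name, threshold, and structure here is invented for illustration - this is my speculation about the general failure mode, not Tesla's actual code.

```python
# Hypothetical sketch of lane-centering + adaptive cruise interacting badly
# at a gore point. All names and numbers are made up for illustration.

from dataclasses import dataclass
from typing import Optional

@dataclass
class LaneLines:
    left_offset_m: float   # lateral distance to detected left line (negative = left of nose)
    right_offset_m: float  # lateral distance to detected right line

@dataclass
class LeadVehicle:
    gap_m: float           # distance to the car in front

def steering_target(lines: Optional[LaneLines]) -> float:
    """Aim for the midpoint of whatever pair of lines was detected.

    If the detected 'lane' is actually the gore area between two diverging
    solid lines, the midpoint points straight at the crash barrier.
    """
    if lines is None:
        return 0.0  # no lines seen: hold current heading
    return (lines.left_offset_m + lines.right_offset_m) / 2.0

def speed_target(lead: Optional[LeadVehicle], set_point_mps: float,
                 current_mps: float) -> float:
    """Follow the lead car if one is seen, otherwise return to the set-point.

    A camera blinded by low sun may simply report 'no lead vehicle',
    so the car accelerates back toward the driver's cruise setting.
    """
    if lead is None:
        return set_point_mps                # nothing ahead: speed back up
    return min(set_point_mps, current_mps)  # crude: never out-run the lead car

# Example: diverging solid lines 1.0 m either side of the nose, lead car
# lost in glare, cruise set well above the posted limit.
print(steering_target(LaneLines(-1.0, 1.0)))                     # 0.0 -> "centered" in the gore
print(speed_target(None, set_point_mps=40.0, current_mps=31.0))  # 40.0 m/s -> accelerating
```

The point of the sketch is just that neither routine is individually crazy; it's the combination, fed bad lane and lead-vehicle detections, that produces the dangerous behavior.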
To me, this looks like a failure of localization (GPS-only can easily be off by 3 meters, or about a full lane). Without a high-resolution map and LIDAR to compare your position on the road against known fixed objects like concrete barriers, bridges, and road signs - and given that relying on radar within a few miles of Moffett Federal Airfield (NASA Ames), which hosts the civil aviation surveillance radar for the S.F. Bay Area, isn't a good idea - you're pretty much left with visual navigation to keep you in lane.
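For a sense of scale, here's the back-of-the-envelope arithmetic on why GPS alone can't pick a lane. The figures are typical published numbers for consumer GPS and US freeway lanes, not measurements from this incident.

```python
# Why GPS-only localization can't keep you in a lane (rough numbers).

us_freeway_lane_width_m = 3.6   # typical US interstate lane width
consumer_gps_error_m = 3.0      # easily achievable error for consumer GPS

# An error close to a full lane width means GPS alone can't tell you
# which lane you're in, let alone keep you centered in it.
print(consumer_gps_error_m / us_freeway_lane_width_m)  # ~0.83 of a lane
```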
See my previous comment about which direction the forward-facing camera was pointed relative to the sun, our terrible road maintenance practices in California, and the obscuring of upcoming obstructions by the car in front. If anything, I'd be surprised if Caltrans doesn't get a bit of a spanking on this for road conditions, with Tesla then being dragged over the coals for not having good enough sensory perception and geo-localization.