Re: Don't be naive
> Is anyone naive enough to think that these car companies are going to be able to create self driving cars that work reliably?
I can't comment (and am not commenting) on the rest of your post.
Regarding "is anyone naive enough" - yes: I am.
I'm a software / hardware engineer, I studied robotics and computer vision (and various AI including natural language engineering) at university - and in the space of a few months we had things working remarkably well in a very academic environment.
As the old saying goes, the first 90% of the project takes 90% of the effort and duration. The last 10% of the project takes the *other* 90%.
I'm putting aside concerns (which I share) about the software getting hacked or having severe bugs (like suddenly overflowing an integer and the car taking an immediate left turn) - I'm just talking about the capability of software to drive. I can see it getting there. If Google can identify an image of a dog as a dog nine times out of ten, and if my £20-a-month phone handset can overlay a 3D image of a frog's anatomy on the desk in front of me, I sure as heck expect a car to be able to safely navigate a road - with or without lane markings.
Three things shake me up about this story:
1. The car's sensors (or the interpretation of their data) didn't recognise a stationary block of concrete directly ahead. That's a catastrophic failure, and they *must* resolve it. I don't care if it's called Autopilot or "Smart Brake Support" (I believe that's Mazda's branding) or anything else - it terrifies me that the systems didn't see it. I am, however, willing to give them the benefit of the doubt and say that's just part of the elusive last 10%. Which brings me to point 2:
2. We're living in the most dangerous period of autonomous cars (IMHO). It's the period where cars are *almost* capable of doing something interesting (like driving me up the road as well as I can) but not capable of doing it without a human overseeing them and being ready to take control immediately. For this, I cite Uber's recent news. Perhaps the problem is made worse by the naming of the technology (Autopilot in this case), but I'll readily admit I've had a near miss on the M11 using standard cruise control in my Mazda 3. A driver eight cars ahead tapped his brakes; I had relaxed my concentration just that little too much, from looking eight cars ahead to gazing at the car immediately in front, and found myself with half the time to react.
3. That crash barrier/crumple zone should have been repaired or replaced 10 days earlier, or a temporary speed limit (say 30 MPH) should have been put in place. Seriously, whoever is responsible for ensuring that a car hitting that barrier at the legal speed doesn't result in a death has to bear some of the responsibility here (IMHO).
Anyway, rant over. I'm impressed by Tesla, I'm fascinated by SpaceX, and I thoroughly enjoy Elon's enthusiasm. I'm not trying to be a fanboi - I hope I've been sufficiently objective in my words above.