
"A.I. is hard."
That is all.
Evidence continues to mount that Tesla's Autopilot system probably isn't anywhere near as bad as breathless reporting would have it, with the third reported accident in the US since the feature was launched in October 2015. Given the accident rate human drivers experience – the US road toll for 2014, for example, was more …
Apparently grammar is hard as well, looking at the guy's posts in the forum. Half the replies are trying to figure out just what the hell he's trying to say.
Also they make the point that the owner's manual clearly says the autopilot does not detect stationary objects in the road when driving over 50mph (which sounds like a pretty bogus limitation to me...)
You mean a stationary object like a broken-down car, or one stopped at traffic lights?
I expect the guidance system to have a hard time detecting narrow, non-metallic objects in the dark (no radar signature, little contrast from the background), and the restriction is based on that.
And if you're doing 60mph (100km/h) in an area with traffic lights, running into a stationary object will quite likely be the last thing you do. Ever.
"Autopilot did not detect a wood stake on the road, hit more than 20 wood stakes, tire on front passenger side and lights FLYED away..." (my emphasis)
It's not very different to what we hear in the UK every day. For example:
I was sat. They was sat. I was stood. They was stood.
A few years ago a Dyson technician came to our home to repair a grinding noise coming from the machine. A few minutes later he replaced some parts and switched the machine on. Everything was tickety-boo. He switched the machine off and said "That's put an end to them racket noise."
20 wood stakes? On highway 55? I can't see how unless he means the fence posts set back some 30-50 feet from the edge of the road? If I remember right about the only things on the edges of 55 are occasional traffic signs and a short guardrail that goes over a wash but it's maybe 50 feet long. Nope, I don't see how it could be 55 especially since at 60 mph it wouldn't take much more than ten or fifteen minutes to traverse the entire length of it and he should be able to stay awake that long.
I-90 is a different matter but again the stakes are still typically some 50 or more feet from the roadway edge. My guess is that he simply took a wrong turn and wasn't where he thought he was. I also notice that the picture showing the location is in simplified Chinese so perhaps the grammar slip is a bit understandable.
> I was sat. They was sat. I was stood. They was stood.
> "That's put an end to them racket noise."
Actually, those two are not examples of bad grammar but simply non-standard but quite correct regionalisms. The former is suggestive of Estuary English, while the latter, in which "them" functions as a determiner rather than a pronoun, is a feature of a large number of dialects and even idiolects (mine included).
Just because your car can hold its lane and not run into traffic doesn't lessen your responsibilities on the road. If I was that officer I would write the driver a ticket for driving without due care and attention.
I have a car with adaptive cruise control; that doesn't mean I don't have to be ready to use the brakes, it just means I don't have to use them myself as often. Most of the time the cruise control uses my brakes, it's because a car has changed into my lane in front of me.
If I had a Tesla I would probably use the autopilot a lot, I would still pay attention to the road though.
To be fair, it's pretty much only the name which they have got wrong. It's misleading.
Basically, Tesla's "Autopilot" is an intelligent, highly advanced cruise control system. They need to rename it as such to stop people completely relying on it and turning off their own brains.
Do Tesla claim that it would be OK for the driver to watch a Harry Potter video at 60MPH? No? Didn't think so. It's "Traffic-Aware Cruise Control".
Warning: Do not depend on Traffic-Aware Cruise Control to adequately and appropriately slow down Model S. Always watch the road in front of you and stay prepared to brake at all times. Traffic-Aware Cruise Control does not eliminate the need to apply the brakes as needed, even at slow speeds.
Warning: Traffic-Aware Cruise Control can not detect all objects and may not detect a stationary vehicle or other object in the lane of travel. There may be situations in which Traffic-Aware Cruise Control does not detect a vehicle, bicycle, or pedestrian. Depending on Traffic-Aware Cruise Control to avoid a collision can result in serious injury or death.
Warning: Traffic-Aware Cruise Control may react to vehicles or objects that either do not exist or are not in the lane of travel, causing Model S to slow down unnecessarily or inappropriately.
Warning: Traffic-Aware Cruise Control may misjudge the distance from a vehicle ahead. Always watch the road in front of you. It is the driver's responsibility to maintain a safe distance from a vehicle ahead of you.
Warning: When you enable Traffic-Aware Cruise Control in a situation where you are closely following the vehicle in front of you, Model S may apply the brakes to maintain the selected distance.
Warning: Traffic-Aware Cruise Control has limited deceleration ability and may be unable to apply enough braking to avoid a collision if a vehicle in front slows suddenly, or if a vehicle enters your driving lane in front of you. Never depend on Traffic-Aware Cruise Control to slow down the vehicle enough to prevent a collision. Always keep your eyes on the road when driving and be prepared to take corrective action as needed. Depending on Traffic-Aware Cruise Control to slow the vehicle down enough to prevent a collision can result in serious injury or death.
Warning: Driving downhill can increase driving speed, causing Model S to exceed your set speed. Hills can also make it more difficult for Model S to slow down enough to maintain the chosen following distance from the vehicle ahead.
Warning: Traffic-Aware Cruise Control may occasionally brake Model S when not required based on the distance from a vehicle ahead. This can be caused by vehicles in adjacent lanes (especially on curves), or by stationary objects.
Traffic-Aware Cruise Control is particularly unlikely to operate as intended in the following types of situations:
• The road has sharp curves.
• Visibility is poor (due to heavy rain, snow, fog, etc.).
• Bright light (oncoming headlights or direct sunlight) is interfering with the camera's view.
• The radar sensor in the center of the front grill is obstructed (dirty, covered, etc.).
• The windshield area in the camera's field of view is obstructed (fogged over, dirty, covered by a sticker, etc.).
Caution: If your Model S is equipped with Traffic-Aware Cruise Control, you must take your vehicle to Tesla Service if a windshield replacement is needed. Failure to do so can cause Traffic-Aware Cruise Control to malfunction.
Warning: Many unforeseen circumstances can impair the operation of Traffic-Aware Cruise Control. Always keep this in mind and remember that as a result, Traffic-Aware Cruise Control may not slow down or may brake or accelerate Model S inappropriately. Always drive attentively and be prepared to take immediate action.
Warning: Traffic-aware cruise control may not brake/decelerate for stationary vehicles, especially in situations when you are driving over 50 mph (80 km/h) and a vehicle you are following moves out of your driving path and a stationary vehicle or object is in front of you instead. Always pay attention to the road ahead and stay prepared to take immediate corrective action. Depending on Traffic-Aware Cruise Control to avoid a collision can result in serious injury or death.
It's "Traffic-Aware Cruise Control".
That may well be its proper and legal name, but even Tesla and Musk refer to it as "autopilot" in Tweets and other media on a regular basis. On the other hand, anyone clever enough to have earned enough money to buy a Tesla really ought to be clever enough to know that it's not a bloody AI autopilot.
Add technology that forces people to pay attention so they can't use their laptops like the idiot reporter did who wrote a story about his use of Tesla's autopilot, or the idiot driver who was apparently watching Harry Potter.
Simply have sensors in the steering wheel that require a hand remain on it, and every minute or so have a light flash on the windshield via HUD that requires some sort of acknowledgment from the driver, like squeezing the wheel or whatever, within a second or two. Failure to do these things will result in autopilot warning you that it is disengaging and you will be forced to drive it yourself. Then you have no opportunity to drive "hands free" and have to keep your eyes on what is in front of you, rather than on a phone, laptop, DVD player or whatever.
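Concretely, a minimal sketch of that sort of nag loop might look like this (the `wheel`, `hud` and `autopilot` interfaces are entirely made up for illustration, not any real Tesla API):

```python
import time

CHECK_INTERVAL_S = 60.0   # prompt the driver roughly once a minute
ACK_TIMEOUT_S = 2.0       # they get a second or two to respond

def attention_monitor(wheel, hud, autopilot):
    """Hands on wheel plus periodic acknowledgment, else disengage.
    All three objects are hypothetical stand-ins for vehicle interfaces."""
    while autopilot.engaged():
        if not wheel.hands_detected():
            autopilot.warn("Hands on wheel or autopilot disengages")
            autopilot.disengage()
            return
        hud.flash_prompt()                                   # the HUD light to acknowledge
        if not wheel.wait_for_squeeze(timeout=ACK_TIMEOUT_S):
            autopilot.warn("No response - disengaging, you have the car")
            autopilot.disengage()
            return
        time.sleep(CHECK_INTERVAL_S)
```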
> Did your car manufacturer advertise the car as "drives itself" and "has an autopilot"? Do not think so.
You are confusing marketing bs with the rights and responsibilities of operating a motor vehicle on public roads. For all it matters, the manufacturer could claim that their AI is good enough that the driver could be drunk, asleep or even a minor. Your responsibility is to be in control of your vehicle at all times. Until the law permits self-driving cars, that is where it starts and ends.
"Next time I'm on a 747 and the pilot has autopilot on I won't care if he's watching Harry Potter or paying attention to the flight."
You don't have to: there's a reason why there are always 2 pilots and why protocol demands that there's always one pilot present and alert in the cockpit.
> If I was that officer I would write the driver a ticket for driving without due care and attention.
Seconded.
> I have a car with adaptive cruise control, that doesn't mean I don't have to be ready to use the brakes; it just means I don't have to use them myself as often. Most of the times the cruise control uses my brakes is when a car changes in to my lane in front of me.
I spend a reasonable amount of time adjusting the speed of my ACC to cater to speed changes well ahead that the lidar can't see, so that it doesn't have to brake much.
It was amusing to come up behind slower traffic and let it do its thing with a passenger onboard though. The braking could get quite hard with a 20mph speed differential involved.
Close, but...
The current autopilot incidents, and the entire history of human interaction with technology, belies your statement. It's really the opposite: people start out using technology like this incautiously, and as they figure out that it can't do everything, the survivors gradually become more cautious.
But then of course we breed a whole new generation of people who haven't yet learned to be careful, and build a whole new suite of technologies...
Then he was using his laptop while driving, and proudly admitting it in his article. I can't find it now, because googling "tesla autopilot laptop" gives me a million hits about the guy in Florida who died (since a laptop was found in the wreckage).
Remember when Andrew Wiles finally proved Fermat's Last Theorem? The final proof turned out to be just 'a bit' (LOL) more difficult than had originally been imagined by Fermat. Orders of magnitude more complicated.
Self-Driving Cars are probably going to be quite similar. When they finally get one actually finished, one that successfully stays out of the headlines and avoids contributing to 'interesting' tragedies, they'll look back over the intervening decade(s) and then laugh at the vast ratio between their final fully debugged system, and the 2016-era trivial kits that some had imagined would be sufficient.
It will be slightly more difficult, on two counts.
One is: "It is significantly harder than envisaged"
Two is: "The Fermat's Last Theorem commission".
I will skip one and concentrate on two. When my dad used to be on PhD thesis review commissions, they had a special "Fermat's Last Theorem" reviewer set - specifically for any "Last Theorem" submissions. That set consisted of the worst and meanest bastards in the department and destroyed the submitter in short order. It was 30 years ago, at a point when the math community considered any attempt to prove the Last Theorem to be a joke and took the piss accordingly.
So, coming back to the driverless AI car. It is indeed Fermat's Last Theorem - in order for it to be officially certified it will be subjected to a set of tests which no human learner driver would pass. Not now, not ever. In fact, it will get an examiner which will fail any human driver. Repeatedly - like the 200 morbidly obese monster I got for my first driving test. The instructor after that told me: "Sorry dude, your chances of passing with this one were about zero. She fails all young males on principle."
Why do I get the feeling "It was Autopilot's fault!" is the new "A deer ran out in front of me! (*1)" or "My Toyota just kept accelerating! (*2)" ?
Pranged your expensive new Tesla? Blame the buzzword new technology, get lots of press coverage and sympathy.
*1: "...as I was driving home from the bar in muh pickup truck at 1am... I was completely sober, honest"
*2: "...nothing to do with my new aftermarket floor mats, honest"
Or a chap who knackered his BMW and claimed a technical fault.
See, ESP did not pull his car straight when he was trying to overtake several lorries. On a curved road that was covered with a soup-like mix of ice and wet snow. Will those bloody Germans ever learn to build cars?
Exactly. I was taught, and it is obvious common sense, that the bloke in the driver's seat is responsible for the vehicle, its passengers and contents, and all other road users nearby. It beats me why anyone thinks he may abandon vigilance in any vehicle going more than 3 mph.
Let's turn back the clock a century and put an advance guard with a red flag out in front. Sheesh!
Stoneshop: "...can't pull in the map..."
Even the worst coder on Earth wouldn't rely on real-time / live map data. Nor would they ignore the huge per-GB cost ratio of mobile data versus disk space.
They'd use the data connection to update maps as required (daily, weekly), and then use today's essentially infinite disk space during real-time access.
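A cache-first sketch of that idea, with the paths, URL and refresh interval all invented for illustration:

```python
import os
import time
import urllib.request

MAP_DIR = "/var/cache/navmaps"    # hypothetical local tile store on that "infinite" disk
MAX_AGE_S = 7 * 24 * 3600         # refresh weekly, not on every lookup

def get_tile(zoom, x, y, base_url="https://tiles.example.com"):
    """Serve map tiles from disk; only touch the radio when a tile is stale."""
    path = os.path.join(MAP_DIR, str(zoom), str(x), f"{y}.png")
    if os.path.exists(path) and time.time() - os.path.getmtime(path) < MAX_AGE_S:
        with open(path, "rb") as f:       # real-time access never waits on signal
            return f.read()
    data = urllib.request.urlopen(f"{base_url}/{zoom}/{x}/{y}.png").read()
    os.makedirs(os.path.dirname(path), exist_ok=True)
    with open(path, "wb") as f:
        f.write(data)
    return data
```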
> Even the worst coder on Earth wouldn't rely on real-time / live map data.
There must be some worse than the worst coders around then - because a few navigation systems do do exactly that. Some will pre-cache data for the planned route, but there are navigation systems (mostly for mobile phones) that rely on real-time map data downloads.
It's something I tend to note in product reviews, given that mobile coverage round here isn't so much coverage as fig leaves.
I seem to recall Tesla receiving real-time "autopilot" engagement reports (so when someone turns on their autopilot Tesla knows about it straight away).
So, to me, this sounds suspiciously like the guy is blaming it on the autopilot, but pre-emptively telling us why (due to lack of signal) Tesla can't prove its autopilot was *not* engaged.
Maybe it was his fault and he's looking for something else to blame?
"Maybe it was his fault and he's looking for something else to blame?"
Of course it is his fault, because Tesla advise that Autopilot is beta software and that "The driver is still responsible for, and ultimately in control of, the car."
Like too many people today, if they do something stupid they start looking for someone to blame rather than accepting responsibility for their actions.
Obviously, one should be paying attention to the road even while autopilot is engaged, in case the car encounters something the autopilot can't manage. The problem is, it's already fairly difficult for a lot of people to pay proper attention to the road when *they're* driving. But just sitting and staring, for hours on end, without losing focus? It's just not going to happen - not reliably, anyway. It's going to be tricky until autopilots are good enough that we're actually allowed to sleep through the trip.
"Now imagine the person to be an airline pilot."
Not a valid comparison for several reasons.
- I can't say what it's like in Blighty, but here in the Colonies all that's required to get an automobile or motorcycle license is passing a written test, an eye test, and a brief driving test. Basically you only have to show passing familiarity with some laws and mechanical operation of the vehicle. Whereas to be an airline pilot takes years and lots and lots and lots and lots of training, plus certification, frequent training refreshers and re-certification. Even a private pilot's license takes a metric buttload of training and cert.
- As somebody pointed out in an earlier post, flying a commercial airline requires two pilots, and at least one must be alert and in the cockpit at all times.
- Flying is not driving. Flying is not LIKE driving. Generally once you're in the air... well, I wouldn't say you have fewer things to worry about, but it's a different set of things. Typical road hazards don't exist; most man-made hazards will be close to the ground, and airborne hazards are few and far between (though likelier to be deadly). Weather is a larger concern, as are pilot error and mechanical breakdowns.
- Keeping the above in mind, the autopilot is very clever at keeping you on a constant heading and altitude, and in combination with a properly programmed GPS can pretty much fly you from one end of the trip to the other with minimal intervention. No trees, deer or wooden posts are likely to be encountered except possibly at the endpoints of the trip, though bird strikes are more likely (but not that common).
- Oh yeah, and no flying home from the local pub. Again, I can't speak for other countries, but pilots have to be alcohol-free for a minimum of 8 hours (varies -- some airlines may increase that). But as long as my blood alcohol is within legal limits, I can jump into my car and drive -- in fact, I can anyway, just not legally. But they have to catch me to enforce it. Airline pilots? Shown the door.
Thing is, the autopilot doesn't have to worry about turns in the road -- it's not following a road. It doesn't have to worry about pedestrians, dogs, cars cutting you off, discovered checks, traffic jams, traffic signals, posted speed limits, the car ahead of you applying emergency braking... or any of thousands of road hazards or conditions. Icing is a concern, but not slippery roads.
Navigation consists of following straight lines from one point to another, (mostly) unrelated to geographical conditions on the ground. Driving a car is ALL ABOUT ground conditions.
I wish people would stop drawing this parallel, because the two are nothing alike.
"The problem is, it's already fairly difficult for a lot of people to pay proper attention to the road when *they're* driving"
The biggest problem with manual control is "tunnel vision"
It's a problem both in driving and aviation.
Reducing the load allows the driver/pilot to relax and close their eyes, OR use the load reduction to scan for a wider set of inputs (which keeps you from being bored)
"But just sitting and staring, for hours on end, without losing focus? It's just not going to happen"
That's why you're supposed to stop for a rest every hour or two. Those signs everywhere saying "Tiredness kills, take a break" aren't just there for amusement. If people ignore that and keep driving until they fall asleep, it's entirely their own fault whether or not they have cruise control.
Tesla do have to confirm that such [few] reports are not competitors trying to 'derail' their still precarious foray into the auto industry.
Despite Autopilot having the ability to sense beyond our limitations, it is not yet advisable to drink and drive using it.
Excusers and competitors have a whole gamut of future put-downs that they can roll out for use against Tesla who'll have to defend against such real and virtual hazards.
With all due respect to Tesla, which I rather like, this would be a bit "early adopter" / v1.0 for me to rely overmuch on their autopilot. Give it 2-3 yrs.
At most a short takeover during a long highway ride, if that. Coming back after one too many at the pub would be nice too, except you'd still get busted at a cop check so sobriety's likely the better bet.
"Autopilot does use radar.
It also uses ultrasonics.
The richest information source is still a forward facing camera though.
This is going to happen because people can't use their brains enough to accept that a driver aid is an aid to driving. The Tesla is not a self driving car."
So how can radar and ultrasound get confused by a white-sided truck against a brightly lit sky?
"So how can radar and and ultrasound get confused by a white sided truck against a bright lit sky?"
It got clotheslined - not scanning more than a couple of feet from the road.
I've nearly been clotheslined - by real clotheslines - when motorcycling, for the same reason. You don't expect to find an obstacle at head height with nothing underneath it.
Regardless of the Tesla's involvement, turning across oncoming traffic in such a way that this _could_ happen is an illegal turn (evasive action required) at best and more likely dangerous driving, in most jurisdictions. In that particular case there's also the question of where did this guy's dashcam get to - he never drove without one, but none was found in the vehicle. Tampering is obvious.
"It got clotheslined - not scanning more than a couple of feet from the road.
I've nearly been clotheslined - by real clotheslines - when motorcycling, for the same reason. You don't expect to find an obstacle at head height with nothing underneath it.
Regardless of the Tesla's involvement, turning across oncoming traffic in such a way that this _could_ happen is an illegal turn (evasive action required) at best and more likely dangerous driving, in most jurisdictions. In that particular case there's also the question of where did this guy's dashcam get to - he never drove without one, but none was found in the vehicle. Tampering is obvious."
I see. I know I'm using the example of the guy who died with his Tesla going under a truck, but surely the system should be looking for obstacles within the height of the vehicle to avoid such a thing? Obviously you could have low-flying birds etc, but the driver could also be asleep at the wheel and driving into a road sign? But they're selling a car that has a feature on it that is called Autopilot, and Joe Public would think it could drive itself. The fact it isn't sold as that doesn't really matter. There are plenty of products sold every day designed or sold to do a specific job but are then used for different things.
And yeah, totally agree with the dashcam thing. I would argue, though, that the next update Tesla come out with will include a way of recording when the autopilot system is engaged and disengaged...
"So how can radar and and ultrasound get confused by a white sided truck against a bright lit sky?"
The ultrasound is a very short range system - a few metres at best - good for basic blindspot and sideswipe detection, as well as basic collision avoidance.
The radar is forward facing only - and has significant range, but it is a fairly narrow antenna array, and I suspect it is therefore making a pretty much 2-D image of the world - radar reflected from a flat vertical surface 12" above the unit will return 24" above the unit - so seeing 'under' a trailer is possible.
Of course we might reasonably expect it to have seen the truck first - maybe some warning bleeps when vehicles cross the path?
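A back-of-envelope sketch of that narrow-beam point, with the mounting height, beamwidth and trailer clearance all guessed rather than taken from any Tesla spec:

```python
import math

MOUNT_H = 0.5         # m, guessed height of the radar in the grille
HALF_BEAM_DEG = 2.5   # guessed vertical half-beamwidth of a narrow array
TRAILER_GAP = 1.1     # m, guessed clearance under a semi-trailer

def beam_top(range_m):
    """Highest point the beam illuminates at a given range."""
    return MOUNT_H + range_m * math.tan(math.radians(HALF_BEAM_DEG))

for r in (5, 10, 15, 20, 30):
    verdict = "hits trailer side" if beam_top(r) >= TRAILER_GAP else "passes underneath"
    print(f"{r:2d} m: beam top {beam_top(r):.2f} m -> {verdict}")
```

If those numbers are anywhere near right, the trailer side only enters the beam beyond roughly 15 m, and a return that high can plausibly be binned as an overhead sign rather than an obstacle in the lane.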
"Model S is designed to keep getting better over time. The latest software update, 7.0 allows Model S to use its unique combination of cameras, radar, ultrasonic sensors and data to automatically steer down the highway, change lanes, and adjust speed in response to traffic. Once you’ve arrived at your destination, Model S scans for a parking space and parallel parks on your command."
While I agree with everything that's been said here about personal responsibility, one has to wonder whether all this "safety" tech in cars is actually making them less safe. Every day on the motorways I see people looking down at their mobile phones, veering and then correcting. It causes no end of near and actual rear-end shunts. If people already take such risks without the widespread use of lane assist and automatic braking, imagine what they'll do when it becomes ubiquitous.
This is bound to happen; we have a precedent with people believing sat nav over their own eyes.
There will always be stupid people, or people who make stupid decisions. It is a problem of driver education, and that tendency of people to think: well, it was OK last time I pushed the boundary, so this time I will push it a bit more.
Plus some people will watch the advert and not read the Ts & Cs or the manual, leading to issues like this.
Not sure how to mitigate it. Maybe more driver education, or new-owner courses to make sure they understand the limitations of the new shiny better, and also making sure any frivolous lawsuits are called out as such.
I guess when it is reported that the driver is an idiot, that won't capture the headlines in quite the same way.
"one has to wonder whether all this "safety" tech in cars is actually making them less safe"
You bet it is.
When I rule the world, steering wheels will have a six-inch serrated metal spike in the centre, pointing directly at the driver's chest. *That* will ensure that people drive a whole lot more carefully.
I remember, back in the day, seeing people driving in fog, their sat nav glowing a foot away from their faces in the centre of the screen - doing 80 mph. Meanwhile we proles, devoid of the mighty Sat Nav, had to stick to sub 30 mph.
The greatest tech in the universe would be hard pressed to save the stupid from themselves. I am firmly in the "no thanks, I'll stay a Luddite" brigade when it comes to driving.
"Autopilot" seems like a perfectly reasonable name for it to me. At least if you know what autopilots in normal small planes -- or even $50m ones -- actually do.
The autopilots found in virtually all aircraft will stupidly hold the altitude and direction you tell them, and will happily fly you straight into the side of a mountain or another aircraft.
Remember Germanwings Flight 9525? That was an Airbus A320, with one of the most sophisticated autopilots you'll find in a civilian aircraft. But tell it to fly at 100 ft and that's exactly what it will do.
Traditionally an autopilot's most fundamental task is keeping the plane level and relatively straight so the pilot can spend a few seconds or a minute or two reading the map.
Second is altitude hold. Simple ones just let you maintain the current altitude when you turn it on (same air pressure). More complex let you program a target altitude and will automatically climb or descend to it (but in small planes you have to watch the speed and adjust the throttle yourself).
Third is heading hold. Simple ones just maintain the current heading when you enable it. More complex let you move a "bug" on a kind of compass display to set a new heading. Historically, that's twin engine or small turboprop territory. More expensive still will automatically follow a path directly to or from a "VOR" or "ILS" radio beacon.
Fourth is speed control. Historically this is definitely only bigger planes. Otherwise you set the throttle yourself and you get whatever speed you get.
That's the historical state of the art, up until well under 20 years ago when GPS started to get approved for aircraft navigation. That lets you program in more complex and less restricted flight paths.
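For a feel of how dumb the "simple" end of that list is, here's a toy altitude-hold loop; the gain and dynamics are invented for illustration, but the behaviour is the point: it holds whatever you set, terrain be damned:

```python
KP = 0.002   # proportional gain, invented for this toy example

def altitude_hold(target_ft, alt_ft=5000.0, steps=300):
    """Toy proportional altitude hold. It flies to target_ft and stays there.
    Tell it 100 ft and it will happily hold 100 ft, mountains or not."""
    for _ in range(steps):
        error = target_ft - alt_ft
        climb = max(-30.0, min(30.0, KP * error * 60.0))  # ft/step, rate-limited
        alt_ft += climb
    return alt_ft

print(f"Settled at {altitude_hold(100.0):.0f} ft")   # -> Settled at 100 ft
```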
But it still knows nothing about mountains. At most, some autopilots know about the Minimum Safe Altitude (MSA) in the current area - that is, an altitude 1000 ft above the highest point in a 25 mile radius.
As for avoiding other aircraft, the "TCAS" system was introduced around 1990 (but very rare then). This allows planes to pick up the radar transponder replies from other aircraft, determine where they are, and warn the pilot if there is a possible collision. This is now required for aircraft with more than 19 passengers, but in almost all cases it only gives a voice warning to the pilot who must still manually respond. It is also limited to telling the pilot to climb or descend to avoid the other aircraft, never to turn.
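The warning logic itself is just geometry; here's a closest-point-of-approach sketch in the spirit of TCAS (the thresholds below are invented, real advisories use altitude-dependent tables):

```python
import math

def closest_approach(rel_pos_m, rel_vel_ms):
    """Time and miss distance of closest approach, constant velocities assumed."""
    vv = sum(v * v for v in rel_vel_ms)
    t = 0.0 if vv == 0 else max(0.0, -sum(p * v for p, v in zip(rel_pos_m, rel_vel_ms)) / vv)
    miss = math.dist([p + v * t for p, v in zip(rel_pos_m, rel_vel_ms)], (0.0, 0.0, 0.0))
    return t, miss

# Intruder 5 km ahead, 100 m above, closing head-on at 150 m/s
t, miss = closest_approach((5000.0, 0.0, 100.0), (-150.0, 0.0, 0.0))
if t < 40.0 and miss < 650.0:   # invented alert thresholds
    print(f"TRAFFIC: climb or descend (miss {miss:.0f} m in {t:.0f} s)")
```

Note the advisory is vertical only, matching the point above that TCAS never tells a pilot to turn.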
In 2013 Airbus announced the availability of retrofitting integration of TCAS with the autopilot across their range. The A380 (only) had the ability for a few years already. Given the reluctance of airlines to spend money, and the size of fleets, this will have made its way into only a tiny percentage of planes by now.
I believe that the term "autopilot" as applied to Tesla's driver assist system, while technically correct, is highly misleading and tends to raise unrealistic expectations wrt its capabilities.
Why is it "technically correct"? Because is basically does what an autopilot in an airplane or a seaship does: it keeps the craft on a preset course. (Actually the Tesla system even does quite a bit more than this, I am not aware of any aviatic or nautic autopilot sporting a collision avoidance system as advanced as the one used by Tesla - after all they don't need it.)
And that takes us straight to my second point: why is the term "autopilot" misleading? Because it makes people think that they can rely on it most of the time. Which is basically true in the air or at sea, where obstacles are rather few and literally far between, and where the autopilot just frees the crew of the tedious routine task of staying on course. The environment still needs to be watched, of course, but obstacles do not normally keep popping up all the time and in a split second.
But this is obviously not true on the road (not even on a motor-, free- or expressway) where the average density of obstacles is higher by a few orders of magnitude. The driver just has to watch the traffic all of the time, even under the best of conditions and no matter how advanced the driver assistance systems of his vehicle may be. Which, of course, makes the very concept of an "autopilot" for automobiles somewhat dubious - on the road it's either fully autonomous driving (at the moment not technically feasible) or it's mostly useless. On the road, the ability to stay on a preset course, augmented with limited collision avoidance capabilities just doesn't cut it.
It is true that Tesla clearly states the limitations of its "autopilot". The problem is that these limited capabilities are just good enough to mostly work in many situations, thus making the driver more and more careless and daring over time, but they can and will fail miserably eventually - typically in a situation where the driver has developed enough of the "contempt that goes with familiarity" for the dangers of relying on the autopilot and has, consciously or unconsciously, come to expect a level of autonomy the system cannot provide.
You can call this stupidity, which indeed it is. [sarcasm] Or you could call it bravery (the line between the two being fine at times) - after all, the odd bloke pushing the "autopilot" to its limits (and beyond) is in essence nothing but a volunteer test pilot for a system that really wants to be a fully autonomous vehicle when it grows up, but for the time being needs a lot of beta-testing and exploration of corner cases, even for its most basic sensors and algorithms.
I wonder if there is somewhere on the Tesla campus a monument to "our brave beta-testing customers of blessed memory". In any case there should be one. Dulce et decorum est pro progradu mori... [/sarcasm]
If there have been 3 accidents involving a Tesla in the course of a year in which there were 32,000 car accidents, that means there was a Tesla involved in, call it, 1 of 10,700 accidents.
Is more than one car in 10,700 on the road a Tesla, or less? I don't know but my money is on "far less."
> Is more than one car in 10,700 on the road a Tesla, or less? I don't know but my money is on "far less."
According to wikipedia: "An estimated 71,000 Model S cars have been sold in the United States through April 2016". Plus maybe another 5000 Model X. Call it 75000 total.
75000 * 10700 is just over 800 million.
Are there more than 800 million cars on the road in the USA, with population 320 million? No. It's about 260 million, including trucks and buses etc.
So, by that measure, Teslas are involved in 3x fewer accidents than the average vehicle.
I would bet that Teslas are driven at least an average amount of miles, and probably quite a bit more than average.
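Spelling that arithmetic out, using only the figures quoted in this thread (none independently checked):

```python
tesla_accidents = 3
total_accidents = 32_000
teslas_on_road = 75_000          # ~71k Model S + ~5k Model X, per above
all_vehicles = 260_000_000       # US cars, trucks and buses, per above

tesla_rate = tesla_accidents / teslas_on_road      # ~4.0e-05 per vehicle
fleet_rate = total_accidents / all_vehicles        # ~1.2e-04 per vehicle
print(f"Fleet rate is {fleet_rate / tesla_rate:.1f}x the Tesla rate")  # ~3.1x
```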
"So, by that measure, Teslas are involved in 3x fewer accidents than the average vehicle."
That only makes sense if there are only two manufacturers: Tesla, and everything else.
Lies, damned lies and statistics ;-)
You need to compare either all individual models, all marques, or all manufacturers.
This road is more of a local access road next to the interstate. It has no shoulders in the area of the accident and the "wooden stakes in the road" were guard rail supports at a culvert. Generally those stakes are 8" square or so and the guard rail has a total height of about 3 feet. The steel barrier supported by the stakes is ~18" tall. The whole system is designed to keep you from falling into a water obstacle or off a cliff. The road in this location has a total right of way of 60' consisting of two 12' lanes, a 2ft gap to the guard rail and the rest in unimproved ditches up to the private property boundary.
It is a much more scenic and interesting drive than the interstate even though it is within sight of it for the most part. At night however the animals tend to be on this road more and the limited sight lines give you less time to react to them or to other unexpected obstacles. Falling rock, downed trees, parts of other vehicles that have run over aforementioned wildlife, etc.
Generally these roads have a 70mph daytime speed limit and a 65mph night limit. Since this section is limited to 55mph it is a good indication that there are more hazardous conditions than normally encountered. It has more encroachments (driveways) than usual, more contours, and has a history of animal strikes and other accidents that have convinced the State to limit speed.
TLDR: Not a good place to use a feature that is realistically still in beta. Thanks for giving us another data point though.
Tesla has admitted that its autopilot feature was activated when one of its cars crashed on Sunday.
However, the electric carmaker has suggested that the function was not being used correctly at the time.
"This is contrary to the terms of use that are agreed to when enabling the feature and the notification presented in the instrument cluster each time it is activated.
"As road conditions became increasingly uncertain, the vehicle again alerted the driver to put his hands on the wheel. He did not do so and shortly thereafter the vehicle collided with a post on the edge of the roadway.
"Autosteer ... is best suited either for highways with a centre divider. We specifically advise against its use at high speeds on undivided roads."
http://www.bbc.co.uk/news/technology-36783345