Doing an Apple?
You're driving it wrong!
Tesla has reached a court settlement over its alleged "essentially unusable and demonstrably dangerous" Autopilot system. The automaker was sued in the US by its electric car owners last year after it failed to meet its own deadlines and targets regarding its Enhanced Autopilot feature – which was supposed to grant vehicles …
Why the photo of the hipster* wearing a tablecloth?
* If you want to be a proper old-school hipster, get this record, and not be someone who looks like they've just fallen out of the Monty Python lumberjack sketch.
Don't press wild flowers, be cool.
My first thought was that behind that naff photo was a real person - well, I guess hipsters are people too - willing (or desperate enough) to look that much of a tit.
I hope for his sake it paid enough to cover the rent, tuition fees and/or monthly consumption of over-hopped craft IPAs. (Though now his hipster cred is in shreds, he may as well drink Carling.)
At least he wasn't made to wear spectacle frames containing no glass, which is an improvement for Shutterstock.
The irony is, the guy in the article header photo looks exactly like a salesman at our local Tesla dealership (beard and all). Seriously.
is to refund all of the extra money paid up front by those folks (if they so desire it, anyway). Perhaps that means turning off some software functionality on those cars if they have some sort of super enhanced beta software that isn't distributed to the rest of the Tesla cars. The claims I have noticed mentioned by Musk/Tesla seem to revolve around the software only. Though I admit I don't track them very closely.
Preface: IANAL.
When you enter into a contract (such as when you buy a car), all parties make certain representations. For example, the buyer promises to pay the seller a certain amount and the seller promises that the car has particular performance and features etc.
In this case, the customers paid extra for the promise that they would receive certain additional features by a specified date*. Tesla appears to believe that it's operating in the wild west of software, where features are added and removed at a whim, might not even be fit for purpose, and promises of delivery dates are ethereal. It doesn't work that way with motor vehicles.
* While perhaps not common for motor vehicles, this is not an unusual clause to see in a contract. Once when I bought a property, the seller promised to have certain structural certification work completed by a particular date after the contract settlement. There was a financial penalty associated with them failing to do so.
When you enter into a contract (such as when you buy a car), all parties make certain representations... In this case, the customers paid extra for the promise that they would receive certain additional features by a specified date.
Well, if you're going to pretend this is bound by contract law, then surely you should be looking at what the contract stated the customer would receive by way of compensation if said features were not delivered on time.
If the contract didn't specify anything either way, then maybe the customer shouldn't have accepted it in the first place. And maybe Tesla's lawyers should have included a clause stating that the contract's terms contain forward-looking statements that depend on certain (currently non-existent) technologies being created and legal frameworks being in place within the allotted time frame, and that as such said features may be shipped later than promised.
PS, IANAL either.
At least in my neck of the woods, in assessing consumer law the courts tend to take the view that a promise doesn't need to be written in the contract to be considered an enforceable condition of that contract. It just needs to have been made by the seller. If, as in this case, it was written down, so much the better (and easier).
If the contract contradicts a promise the seller made, a bit like children crossing their fingers behind their back, I believe the courts often side with the customer too.
For any Aussies caught up with similar misrepresentations, you can thank your lucky stars that the Australian Consumer Guarantees explicitly cover motor vehicles unless bought at auction or from a private seller.
See here.
Specifically, some pertinent quotes:
"Products must also:
* match descriptions made by the salesperson, on packaging and labels, and in promotions or advertising
.....
* be fit for the purpose the business told you it would be fit for and for any purpose that you made known to the business before purchasing
....
* meet any extra promises made about performance, condition and quality, such as life time guarantees and money back offers"
That said, if I'm spending 6 figures on a car, I'm spending a few hundred in getting a lawyer to draft something which they'll find a little trickier to ignore.
There are some interesting things to read between the lines of Tesla's statement:
Since rolling out our second generation of Autopilot hardware in October 2016, we have continued to provide software updates that have led to a major improvement in Autopilot functionality.
In other words, it wasn't finished when we launched it. Nice, that, for a semi-safety-critical system.
This has included an extensive overhaul of the underlying architecture of our Autopilot software that enabled a step-change improvement in its machine learning capabilities. Our neural net, which expands as our customer fleet grows, is able to collect and analyze more high-quality data than ever before, which will enable us to roll out a series of new Autopilot features in 2018 and beyond.
Hmm, well, that kind of thing is tricky. You're unlikely to be able to persuade a true expert in neural net / machine learning to work on a safety critical system. They know it's their reputations on the line, and if I understand anyone that I know in the field it's far from ready for safety critical work. So, just how good are Tesla's people who are working on that kind of thing? The best in the field? I'm not convinced...
The customer response to our recent Autopilot updates has been overwhelmingly positive, so we know we’re on the right track.
Measuring the "worth" of a semi-safety-critical system by customer feedback is the wrong way to do it. Anyone killed in an Autopilot related accident can't leave negative feedback.
Measuring how well it stays on track is a better way. Doing that measurement in safe circumstances (i.e. not on the public highway where people can be killed, like has already happened) is the correct way.
And are they being deliberately dense in using the phrase "on the right track", when not being on the right track is what kills people whose vehicle drives them into concrete barriers?
That said, as time passed since we first unveiled Hardware 2, it eventually became clear that it was taking us longer to roll out these features than we would have liked or initially expected. We want to do right by those customers, so as part of a proposed settlement agreement for a class action lawsuit filed last year, we’ve agreed to compensate customers who purchased Autopilot on Hardware 2 vehicles who had to wait longer than we expected for these features.
Er, we're slowly realising that it's an impossible thing to make work properly (or as advertised).
If the settlement is approved by the court, customers will receive different amounts depending on when they purchased and took delivery of their cars. Although the settlement is specific to customers in the US, if it is approved by the court, we’ve decided to compensate all customers globally in the same way. There’s no legal obligation to do so, but it's the right thing to do.
And we're keeping the majority of your money.
And are they being deliberately dense in using the phrase "on the right track", when not being on the right track is what kills people whose vehicle drives them into concrete barriers?
Well, not so many deaths yet, so they are gathering mostly field data, it seems.
The scariest quote in the whole article, and at the heart of what is wrong with "Autopilot":
<quote>Enhanced Autopilot feature – which was supposed to grant vehicles the ability to drive themselves semi-autonomously under human supervision.</quote>
People are bad at passively supervising and then taking over with no notice when a critical decision has to be taken. It looks to me like a large factor in the Uber crash.
The idea that the car can drive itself, then when it has a problem seamlessly hand control to the until-then passenger, is deeply flawed, and it is a design that is going to kill people!
Snowy, you've encapsulated the problem perfectly.
The same logic saw the passengers and crew of AF447 die, despite having a trained crew who were supposed to be able to take back control when automation failed. By definition automation allows humans to disconnect - that's part of the purpose - but the problem is that humans are not very good at switching from dozing and disinterest to alert, aware and in control of complex fast changing situations.
AVs will never be a credible product until their control logic accepts that it can't just go "oh shit man, can't cope, good luck mate, it's over to you!".
A big difference between AF447 and a self driving car is that the pilots had a loooong time to identify and take the correct action, and still didn't do it. Handled properly it takes a long time for an airliner to glide down from 40,000ft.
Whereas a self driving car is going to go from "all OK" to "oops, over to you" seconds from disaster. If you're lucky.
So if highly trained pilots can't do it in 30 minutes, what hope for a tired, over worked and over reliant car owner?
"So if highly trained pilots can't do it in 30 minutes, what hope for a tired, over worked and over reliant car owner?"
In the case of AF447, it was about three and a half minutes to go from max altitude to hitting the deck. Hazarding a guess, based on the voice recordings, the captain actually managed to figure out what the issue was about two minutes before the impact, at which point it was probably too late to correct the stall.
Your point is still valid: there's still a very short window to make the required corrections, and those corrections need to be made on partial data, otherwise the autopilot would have solved it. Doing this when you've not really been paying attention to the road/flight is going to be especially challenging.
Studying AF447 is helpful because the interaction of several automated systems, and exactly how they are reporting, is vital to understanding what is happening. Plus communicating to the other pilot what you think is happening means you don't start giving the system conflicting commands.
I use AF447 as a teaching example for exploring what your software does with errors in its input, and what the knock-on effects can be.
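As a minimal sketch of that teaching point (all names and the tolerance value here are hypothetical, purely for illustration): with redundant sensors, the question is whether the software surfaces a disagreement or silently carries on with bad data.

```python
# Hypothetical illustration: three redundant airspeed sensors, as on an
# airliner. Instead of silently averaging, flag disagreement so the
# failure is surfaced rather than propagated downstream.
from statistics import median

DISAGREE_KNOTS = 20  # illustrative tolerance, not a real certification value

def fused_airspeed(readings):
    """Return (value, healthy) for a list of redundant sensor readings.

    The median tolerates a single wild reading; 'healthy' goes False as
    soon as any sensor strays too far from it, so the caller must decide
    what to do rather than carry on with bad data.
    """
    m = median(readings)
    healthy = all(abs(r - m) <= DISAGREE_KNOTS for r in readings)
    return m, healthy

# Normal case: all sensors agree, fused value is trustworthy.
print(fused_airspeed([272, 274, 273]))

# Iced pitot tube: one sensor drops out; the median still holds, but the
# disagreement is flagged so the caller can degrade gracefully or alert.
print(fused_airspeed([272, 90, 273]))
```

The knock-on-effect lesson is in the second call: a design that quietly averaged the three readings would feed a plausible-looking but wrong value to everything downstream.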
...So if highly trained pilots can't do it in 30 minutes...
As I recall, the time between AF447's autopilot handing control back to the pilots and impact with the Atlantic was about 4 1/2 minutes.
During which time the PIC thought he was doing the right thing by holding it in a stall, and the Captain recognised what was wrong about a minute after he entered the flight deck. They seem to have been addressing the problem correctly for about 1 1/2 minutes - which turned out to be too late to avoid the crash.
> During which time the PIC thought he was doing the right thing by holding it in a stall, and the Captain recognised what was wrong about a minute after he entered the flight deck. They seem to have been addressing the problem correctly for about 1 1/2 minutes - which turned out to be too late to avoid the crash.
Slightly more complex than that. The Captain (and the senior 1st officer) very quickly assessed that they were in a stall, and with plenty of time to correct. He, and the senior 1st officer, believed they were addressing the problem, but the junior 1st officer was ignored / did not comprehend that they were stalling, and held it in a stall almost until it hit the water. Hence the confusion of the senior 1st officer, who could not comprehend why they had apparently lost control of the aircraft - he assessed the situation, reacted appropriately, but did not realise he was fighting his co-pilot.
The cause of the crash was that action, although the lack of awareness of that action from the other two pilots was a major contributing factor.
None of which is relevant to semi-self driving cars, in which the control is handed back to one person (who will individually fail to grasp the situation), rather than a team (who collectively fail to grasp the situation).
There is no mistaking what autopilot is by anyone who has actually used a real autopilot, or by anyone who has read/listened to what Tesla says autopilot is. I did two Tesla test drives, and both times it was abundantly clear. Those who have muddied the waters are those without knowledge.
ORIGINS
Autopilot started with pilots. Pilots who fly aeroplanes. What autopilot typically did for them was aid in maintaining altitude, or direction, should they become incapacitated during flight. It was never intended to allow a pilot to disengage from flying, and the FAA strictly forbids it when there are able-bodied pilots on board.
Can today's autopilot take a plane from gate to gate, (at certain specific airports with the necessary equipment), without pilot intervention, avoiding collisions and bad weather along the way? Yes, but that is not how autopilot is typically used.
TESLA'S AUTOPILOT
Works in exactly the same way. If a pilot sets his autopilot, then falls asleep at the yoke, and the plane crashes, the NTSB does not fault the autopilot; it is pilot error. The pilot ought to have remained engaged unless incapacitated.
The same thing is true for Tesla's autopilot; the driver must remain engaged. Autopilot is intended as an aid, not a substitute for a driver. Do not confuse autopilot for self-driving capability. Anyone who thinks, "I have autopilot engaged, therefore I can read my Twitter feed," is an idiot who ought to have their license revoked.
EXPECTATIONS VS DELIVERY
I am no longer waiting for Windows 95 to finally deliver what Microsoft promised, because I switched to Linux over fifteen years ago. With each new Windows release, they come closer. Windows 10 has almost fulfilled the promises of Win 95, and we had to pay for every upgrade to the current iteration. Not to mention, every iteration came later than promised, and we still don't have all of what was promised.
Some Tesla users paid for an "autopilot" feature, and received an autopilot feature. They were also told what the expectations of future iterations would offer and a timeline of these features. They have not yet received the expected features which are now beyond their timeline. If Tesla must refund, then Microsoft must do so also, as should about every software publisher. (Ubuntu has refunded every dollar I have paid for software. They have kept what was paid for support) ;-)
CONCLUSION
For Americans to demand a refund for Tesla missing a self-imposed timeline, for software features not yet delivered, is ridiculous. For Tesla to say that if US customers get a refund, then so should everyone else, (same assurances, same delays), is only fair.
You provide a slick argument ... but that doesn't make it correct.
The question revolves around whether the consumer was misled.
The autopilot feature was over-hyped from the outset ... hype created by Tesla.
You can't compare it to airplanes, nor Windows.
It was a specific feature that Joe Bloggs bought for his car.
He wasn't a highly trained pilot.
The comparison with Windows doesn't work either.
Windows was (is) a general operating system (offering thousands of tasks).
A better comparison would be buying a backup system (to add to windows).
IE. you decide to buy the backup system, but it doesn't work.
It's a mission critical element, that you have specifically bought ... but it doesn't work.
So you get your money refunded.
In reality, the auto-pilot feature would have been better as donation-ware.
Chuck some money in, and carefully have a play with it, and provide your feedback.
Instead ... pretty much all we heard was 'Tesla Autopilot'.
It seems clear that people were misled.
... otherwise, why would they hand over $5k ?
"Autopilot is intended as an aide, not a substitute for a driver."
I'm not sure I would have called it "autopilot" as that name brings to mind (of normal people, not professional pilots) certain things that "driving assistant" does not; things helped by the hype and blurb surrounding the product.
If they promised feature X, that you paid a defined amount for, by a particular date, then I cannot understand what the argument is. They refund your money on request.
As for the thin "autopilot isn't meant to drive without supervision" BS, that's just a legal get-out for Tesla during their alpha-testing development phase whilst people are still getting killed on a regular basis. There's an established name for what they have right now. It's cruise control. They could have called theirs advanced cruise control and everyone would have known what they meant.
But no. And now morons are on YouTube showing just how little attention they can pay while playing with a high-speed alpha software program on public roads.
Meanwhile Tesla CEO Elon Musk continues to complain about news coverage of car crashes that may have been caused by the Autopilot system.
Is this not a simple case of shooting the messenger, or at least trying to? If this august publication is typical he seems to have succeeded; a thread on the subject a few days ago had more people commenting on the shortcomings of "the media" than the shortcomings of Tesla and its messianic leader.
There may or may not be a law specifying that customers must be reimbursed for money paid up front for goods or services that were not delivered as promised. Nevertheless, isn't it customary to contractually specify the penalty for defaulting on one's obligations? People really write checks for $5K without asking "what happens if"? Oh, I forgot: new and shiny...
...for similar reasons. I bought a Specialized Vado 4.0, which was (and still is) touted as having the ‘Specialized Mission Control App’ - which links via Bluetooth LE and provides granular control over motor output, range, navigation and advanced logging abilities.
Problem is the Vado wasn’t, isn’t, and probably never will be compatible with the Mission Control app - at least on iOS. I requested a refund, Specialized first refused on the grounds that the App wasn’t essential to the riding experience, second it was just around the corner and would be delivered any day soon, and third I’d moved state and therefore wasn’t entitled to a refund outside the first 30 days.
Took the bastards to the Small Claims court, and luckily the judge agreed the bike wasn’t as described, and omitted key functionality which, had I been aware, would have influenced my purchasing choice to the extent that I probably wouldn’t have bought it. I got the purchase price back minus $250 for the 950 ridden miles; which I still believe I shouldn’t have had to pay, but accepted because otherwise I’d have had to spend even more serious money. For the record this was back in November 2017; the app still hasn’t been delivered for iOS.
Bastards. The attitude of “We can advertise one thing, sell you something else and you’ll just have to suck it” Will never buy from them again.
The term 'pilot' doesn't just apply to planes. It can also apply to shipping.
A quick flex of a search engine will find loads of references to yacht autopilots. Single handed long distance sailors use these to let the boat sail itself whilst they go below and cook meals, ablute, and sleep. So there is a use case where an autopilot can be left to run the show whilst the human is asleep. Therefore it is not unreasonable for a keen yachtsman to expect similar features on a car autopilot. With an alarm to wake up the driver if a hazard is detected.
As far as I know merchant ships also have autopilots and it is not unknown for the above experienced yachtsman to tell hair raising tales of nearly being run down by a freighter in the English Channel with nobody on the bridge. The tale sometimes/often features a barking dog.
So the term autopilot implies more capability than a cruise control, and can therefore be misleading.
You wouldn't normally use an autopilot on your yacht in a busy shipping lane or entering/leaving a harbour or mooring but it is perfectly fine for tootling along on the open sea.
Tesla marketing people please note.
Re yachts, shipping etc...
Teslas already have a highly advanced autopilot; an order of magnitude more advanced than systems in shipping vessels, and way more than just cruise control. Cars travel much faster, are much closer together and have to navigate far more complex environments than ships, and even then Tesla systems are nearly at the fully autonomous stage - it truly is an impressive achievement. The issue is that nearly isn't enough; Musk promised it would be fully autonomous and road-legal, and that hasn't happened yet.
It must be incredibly frustrating for Elon. The product is AWESOMELY good, and yet because he’d promised it would be INSANELY good, it’s still perceived as underperforming.
Lord Elpuss quaffed his Musk-laced Kool-Aid and opined...
"...already have a highly advanced autopilot..." No it isn't. The Concept and its SW are terrible.
"...are nearly at the fully autonomous stage..." No they're not. Not even close.
"...is AWESOMELY good..." No it isn't. It keeps crashing into things. That's not good.