Fully self-driving
Does not make the vehicle autonomous and requires careful driver supervision
Anyone want to buy my Fully Profitable investment product (does not make money and requires continual input of new capital)
Tesla has yanked the latest beta, 10.3, of its Full Self-Driving (FSD) software from participating car owners after boss Elon Musk noted the company was "seeing some issues" with the code. "Seeing some issues with 10.3, so rolling back to 10.2 temporarily. Please note, this is to be expected with beta software. It is …"
Since a learner driver is effectively an alpha or beta compared to a qualified driver, and learners are required to display L plates (and optionally P plates), perhaps any self-driving car running untested software should carry B plates, so that the rest of the world can recognise that it is likely to do something stupid.
You put it in a rolling testbed and test it thoroughly under real conditions, using professional drivers to override the system when it goes wrong.
You don't just plonk beta software on the general public, especially when the software is controlling a potentially lethal weapon.
Volvo had a fleet of Volvo 240s converted to automated driving in the 70s; likewise Mercedes and Volkswagen converted existing cars for automated driving in the 70s and 80s. These rolling testbeds were used to refine the self-driving or column driving (Mercedes, ISTR: all cars in a queue on the motorway slave off the car in front).
These are not production cars, but specially adapted testbeds (based on real cars), used to test and refine hardware and software. Alphabet's Waymo (formerly Google) also does the same. What they don't do is let beta software loose on non-professional drivers to test on public roads!
Yeah, but it's cheaper. Those test drivers don't cost you money, in fact they pay you to test your product - what's not to like?
To me it's simple: if a specific mode of automation is not legal to use in a jurisdiction, then the owner/user must not be able to enable it. This could be enforced by a penalty on the driver (for enabling it) and on the manufacturer (for allowing it to be enabled). That way the only vehicles on the roads with this software active would be the testbeds authorised by the relevant authorities.
I'm getting pretty pissed off at hearing Tesla drivers bragging about how they've been driving on the motorways with automation active that isn't legal to use. Tesla weasels out of its responsibility with the caveat that you must "maintain control of the vehicle", when we all know that drivers pay less attention once these modes are enabled.
Would these drivers be happy if I bragged about being pissed on the same motorway? Mildly tipsy?
Emissions equipment aside, there are plenty of cars that you actually own, and even the emissions components can in fact be repaired or upgraded if you follow the law.
There is a special place in hell for a car that is always online, always talking to the mothership, always recording, requires an EULA to even purchase, and threatens its "owner" if it detects unsanctioned hardware (the Ethernet port flap some time back).
"There is a special place in hell for a car that is always online, always talking to the mothership, always recording, requires an EULA to even purchase, and threatens its "owner" if it detects unsanctioned hardware (the Ethernet port flap some time back)."
Have you been car shopping lately? Because that describes most new car models nowadays. "Always connected" is basically mandated by EU law now! Lots of equipment in modern cars sits in "black boxes" that talk via proprietary protocols over CAN-bus and are serialised, so the car won't even start or run if the right codes aren't detected. Aftermarket parts may be possible, but only if you pay the stealership to program the computer through the manufacturer's proprietary (and locked-down) computer system.
Engine ECUs are nowadays a tiny (and mostly inconsequential) part of what a car does. Most of its functionality is dictated by a central computer (often the "infotainment" system) that is not the ECU itself. That includes things like controlling light modules (now driven through a CAN-bus interface instead of switching 12V on or off) and heater controls (instead of a knob pulling a cable to open or close a valve, it's now a menu three levels deep on a touchscreen driving a stepper motor on a valve, probably again through a CAN-bus interface to some intermediate module). Airbags? CAN-bus to intermediate module(s). Seat controls? Probably CAN-bus. Dials/gauges? An electronic screen that will mysteriously die two weeks after the warranty ends and will be unobtainium in 10 years, controlled through CAN-bus of course. Serialised too, so you can't put in a different one if it breaks. And no, the mileage is usually stored on several OTHER modules nowadays, not on the instrument cluster itself, so it's not that either.
There's lots and lots of stuff in modern cars that is basically proprietary electronics/programming, which makes it harder and harder to make or buy aftermarket parts to repair broken cars. You can currently keep a 15-year-old car on the road for a very long time with the spares available to purchase today. Repairing a 10-year-old car is already very tricky, as lots of stuff is no longer available. Cars that are now 5 years old will very likely be completely unrepairable 10 years from now, because some stupid piece of electronics that is completely unnecessary to the function of the car will be completely unobtainable by then and will prevent it from working. If you don't believe that, just keep an eye out for how many slightly older VAG cars you see about with the early generations of LED lighting, and how often those lights are failing. But they're monolithic units and extremely expensive to replace, so nobody does it.

And all this is WORSE with electric cars, because they use basically only proprietary equipment in their battery, BMS and driveline. Teslas are very difficult to fix by design, for instance, and lots of their functionality can be disabled remotely by Tesla at its whim. There is no way to opt out, no way to prevent that. Tesla owns your car; you're just allowed to use it. And the same will go for many others.
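To make the "lights over CAN-bus instead of switching 12V" point concrete, here is a minimal sketch of what such a command frame might look like. The arbitration ID (0x3F1) and the payload layout are invented for illustration; real IDs and payloads are proprietary and differ per manufacturer.

```python
import struct

def build_light_frame(low_beam: bool, high_beam: bool, fog: bool) -> bytes:
    """Pack a hypothetical classic CAN 2.0A frame: 11-bit arbitration ID,
    data length code, then an 8-byte payload with one status bit per lamp."""
    flags = (low_beam << 0) | (high_beam << 1) | (fog << 2)
    # big-endian: 2-byte ID, 1-byte DLC, 8 data bytes
    return struct.pack(">HB8s", 0x3F1, 8, bytes([flags] + [0] * 7))

frame = build_light_frame(low_beam=True, high_beam=False, fog=True)
print(frame.hex())  # ID, length, then the flag byte and padding
```

The point of the sketch: the headlight switch no longer closes a circuit, it asks an intermediate module (over a bus like this) to do it, and that module can refuse if serialisation codes don't match.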
The bit that irks me more is the note about safety scores. "Sent out [...] initially only to those with "perfect" safety scores, according to Musk, before being made available to those with Safety Scores of 99/100."
So your car rates your safety. I guess it won't be long before these scores are collated by data aggregators à la Equifax and sold to whoever is interested. Presumably except to yourself, you know, because of security. Then your car dealer will let you know that, "sorry, with a safety score of only 97 out of 100, you cannot have that tuning kit."
Insurances. Car rentals. Ride sharing. Inner cities. All will deny you based on some black-box AI assessing your driving skills. What can go wrong?
I think it already happens with data from phone GPS and accelerometers ("force sensors"). All phones have GPS, and even most $100-$200 cheap phones have accelerometers.
My driving insurance co used to ask me to send them my odometer reading, but that stopped a few years ago. I am assuming they determine that indirectly with cell phones now.
No, that probably has other reasons. Your insurance is based on the number of kilometers you drive annually. So you sign up for insurance for, e.g., 10,000 km/a. Then all they need is the initial odometer reading. When you have an accident, they will ask again, and if you've exceeded your limit on average, they will simply deny your insurance claim.
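The mileage check described above can be sketched as follows. The numbers (odometer readings, a 10,000 km/a limit) are invented for illustration; real policies vary in how strictly they apply the limit.

```python
def claim_within_limit(odo_start_km: int, odo_claim_km: int,
                       years_insured: float, annual_limit_km: int) -> bool:
    """True if average yearly mileage stayed within the insured allowance:
    the insurer records the odometer at sign-up, then checks the average
    at claim time."""
    avg_per_year = (odo_claim_km - odo_start_km) / years_insured
    return avg_per_year <= annual_limit_km

# e.g. insured for 10,000 km/a, claim after 2 years with 22,500 km driven
print(claim_within_limit(50_000, 72_500, 2.0, 10_000))  # prints False
```

On this reading, the insurer never needs your phone: one reading at sign-up and one at claim time are enough to deny an over-limit claim.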
.. or stationary emergency vehicles (a problem they have apparently known about for years)?
I have seriously looked at buying a T-55. Problem is it doesn't even begin to fit in my garage and getting the required paperwork for owning the (de-milled) gun is a serious pain in the neck. Not to mention the cost of transporting or fueling it (diesel usage being measured in gallons per mile instead of the other way around should give you some indication). And since afaik there's no rubber track shoes for T-54/55 track, can't really drive it anywhere either.
I wouldn't worry about the Teslas. It seems the AI has a preference for concrete barriers, tractor trailers and, in the latest version, police vehicles:
https://www.nytimes.com/2021/06/29/business/tesla-autopilot-safety.html
… whilst containing the problem by offing the "drivers"!
Oh, he got the trick of blaming the drivers from Microsoft: everything that goes wrong there is always blamed on the user and the poor schlubs having to manage it. It's never Microsoft's fault that it is the common thread between all the mass hacks of late.
So much negativity here about what's a hugely impressive combination of hardware and software engineering. Truly impressive, and tackling one of the hardest problems: trying to predict what us unpredictable meatsacks will do when behind the wheel.
The transition from human to computer driven car will be a bumpy road I'm sure, but once we're further down this road, we'll look back and wonder why we didn't do it sooner to reduce the huge number of road deaths each year.
With all due respect, that problem is not solved. The sooner that myth is laid to rest, the better. Disclosure: I am a pedestrian.
Railways are a subset of surface transportation, and have been around for nearly 200 years in the UK, and there are STILL fatal accidents.
https://en.wikipedia.org/wiki/List_of_rail_accidents_in_the_United_Kingdom
...is not a comprehensive list.
Railways are a lot, lot simpler to control and operate than cars: the carriages can go one way... or the opposite way, plus the people who operate the engines are, er, trained. The design of the vehicular safety systems is closely interlocked with the fixed ground-based signalling systems that route and signal the trains.
Now compare that with road transport.
Most programmers are used to shipping software that recovers from errors by putting up a dialog box with a "There has been an error" message and expecting the user to restart. In real life -- that is, in real time -- you can't do that. You can't work around a serious bug; you have to pull the release and start over. This is to be expected.
As a rule the worst bugs -- the ones that cause system failures immediately -- are the easiest to fix. The ones that only cause subtle problems that only seem to turn up with particular phases of the moon or cause tiny, but incremental, position errors are the tricky ones.
Who on god's earth trusts a guy that insinuates miners are paedophiles (whilst looking fairly paedo like himself), spends most of his life ramping up cryptocurrency, ignoring Covid regulations and putting his business over the health of his employees, and then produces the most unreliable vehicle money can buy?
The guy is pretty annoying at best.
I have a Tesla.
No way on earth will they achieve full self-driving with this generation of technology. For example, my car often misses temporary speed limits and emergency-brakes at odd times.
Until cars can communicate with each other and the infrastructure around them I just can't see it happening.