All full?
When the teat of venture capital starts drying up, it's time to have a long burp and count the takings (before possibly moving on to the "next big thing" venture).
The CEO of self-driving cab outfit Cruise has parked his career and strolled off into the sunset. Kyle Vogt founded Cruise Automation in 2013. The startup was acquired by General Motors in 2016, and worked to develop a driverless taxi service that commenced operations on the streets of San Francisco in 2022. Cruise has had a …
You were saying?
https://www.sfchronicle.com/bayarea/article/waymo-cruise-driverless-cars-18304792.php
Although it looks from the reported caveats on the numbers that US regulators and NHTSA are clueless. If they were competent, they'd have each incident properly defined and authenticated (so no dupes), they'd have data for vehicle miles travelled, and they'd also publish reference data alongside for accidents involving human-driven vehicles on similar duties in the same area.
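Something like this back-of-an-envelope normalisation is all it would take. A minimal sketch (field names and figures below are made up for illustration, not NHTSA's actual schema):

```python
# Hypothetical sketch of the comparison the post asks regulators to publish.
# All field names and figures are invented for illustration.

def incidents_per_million_miles(reports, vehicle_miles):
    """Deduplicate incident reports by ID, then normalise by exposure."""
    unique = {r["incident_id"] for r in reports}   # no dupes
    return len(unique) / (vehicle_miles / 1_000_000)

av_reports = [{"incident_id": "AV-001"}, {"incident_id": "AV-001"},  # duplicate filing
              {"incident_id": "AV-002"}]
av_rate = incidents_per_million_miles(av_reports, vehicle_miles=5_000_000)

# Reference baseline: human-driven vehicles on similar duties in the same area.
human_rate = incidents_per_million_miles(
    [{"incident_id": f"H-{i}"} for i in range(40)], vehicle_miles=80_000_000)

print(f"AV: {av_rate:.2f} vs human: {human_rate:.2f} incidents per million miles")
```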
OceanGate springs to mind... Anyone who says the S-word (safety) gets fired.
There was a repeat of an old documentary on the telly a few days ago which featured a self-driving car startup; I wish I could find it.
The CEO was driving the presenter around a test track at about 15mph in a standard car fitted with a shoddily built contraption that controlled the steering wheel and pedals, with a webcam on the roof to see the world. Then the control started to go unstable, and they experienced what he jokingly called "just a bit of pilot-induced oscillation" as the car started to wobble and then swerve violently.
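For anyone who hasn't met pilot-induced oscillation: a toy simulation, with entirely invented gains and delays, of how a laggy webcam pipeline plus an over-eager steering correction produces exactly that wobble:

```python
# Toy model of pilot/controller-induced oscillation.
# A proportional steering controller acting on delayed measurements:
# the correction arrives too late, overshoots, and the wobble grows.
# All numbers are invented for illustration.

DT = 0.05          # control period, seconds
DELAY_STEPS = 6    # measurement latency (0.3 s through a cheap webcam pipeline)
GAIN = 2.5         # proportional gain on lateral error

offset, velocity = 0.5, 0.0           # start half a metre off-centre
history = [offset] * DELAY_STEPS      # stale measurements the controller sees

for step in range(100):
    measured = history[-DELAY_STEPS]   # controller sees old data
    steer = -GAIN * measured           # "correct" toward the centre
    velocity += steer * DT
    offset += velocity * DT
    history.append(offset)
    if step % 10 == 0:
        print(f"t={step * DT:4.1f}s  offset={offset:+.2f} m")
```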
The documentary then jumped forward a few years, and the company was starting a robotaxi service. It may even have been the same bloke / company.
One of several identical Aimotive Inc 'disengagement reports' (this one dated 2022/06/15) states "Our software has a limitation in that we do not (yet) cancel lane changes once it has been initiated; however, we do cancel lane changes if the software deems it unsafe prior to the actual maneuver. In this case we had already started the maneuver, therefore the driver had to take over. This isn’t a software bug, it’s a known issue that is being worked on"
Which means they actually designed it not to take corrective action when a situation becomes dangerous. Quite apart from the defective grammar, this should take some serious explaining, but it seems to be OK because self-driving cars are the way forward (but obviously not safely sideways).
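Reconstructing the described behaviour as a sketch (state names and structure are my guess from the report's wording, not Aimotive's actual code), the shape of the problem is obvious:

```python
# Reconstruction of the behaviour described in the disengagement report.
# State names and checks are invented; only the overall shape comes from
# the quoted text: safety is evaluated BEFORE the manoeuvre, never after.

from enum import Enum, auto

class LaneChange(Enum):
    PLANNED = auto()
    EXECUTING = auto()
    DONE = auto()
    CANCELLED = auto()

def tick(state: LaneChange, looks_unsafe: bool) -> LaneChange:
    if state is LaneChange.PLANNED:
        # The one and only safety gate.
        return LaneChange.CANCELLED if looks_unsafe else LaneChange.EXECUTING
    if state is LaneChange.EXECUTING:
        # Known issue: no abort path exists here. If the situation turns
        # dangerous mid-manoeuvre, the human safety driver must take over.
        return LaneChange.DONE
    return state
```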
Of course it solves problems. Self-driving cars have the potential to vastly reduce the number of accidents (they probably already have fewer accidents than human drivers, but they get more scrutiny). They also allow people to concentrate on doing more interesting things than driving in traffic, like reading or sleeping or anything they want. On average, American drivers spend two years of their life driving, which is a real waste of time.
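That two-year figure is at least plausible on a back-of-the-envelope check (both inputs here are assumptions, not sourced data):

```python
# Sanity check on the "two years of your life driving" figure.
# Both inputs are assumptions, not sourced data.
minutes_per_day = 50      # assumed average US daily driving time
driving_years = 60        # assumed span from licence to giving up the keys

total_hours = minutes_per_day / 60 * 365 * driving_years
print(f"{total_hours:,.0f} hours ≈ {total_hours / (24 * 365):.1f} years behind the wheel")
```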
Cruise was a Dot Com 2.0 demoware company that suckered a very desperate GM into spending $1B (yes, billion) on what was nothing more than a very flaky piece of demo hardware running a stack of software cobbled together from various university grad projects and open-source projects.
GM obviously did zero technical due diligence before buying, because the first thing that happened after the acquisition was that they appear to have brought in, for the first time, someone with relevant technical expertise. And they immediately added actual physical sensors (LIDAR etc) to the vehicles, so they could accurately monitor what was going on outside the vehicle rather than depend entirely on half-assed hacks of purely visual object-recognition video processing running over low-QoS video links.
GM then turned around and sold half of Cruise to the ultimate Bigger Fool - Softbank. GM next spent serious money buying regulatory approval to allow these death-trap cars on the streets of SF. I've seen these cars stop in the middle of 16th St, confused by a MUNI bus. I've seen them miss some of the often very confusing / missing lane markings, markings that human drivers take as a given while driving in SF.
People who took these "Free Taxis" do not realize just how shoddy the software they are running is and just what death traps they are. About as safe as hitching a ride with a bunch of stoners returning from a Grateful Dead concert. At least crackhead and meth drivers pay attention while driving. If a tad aggressive.
Only question now is whether the Idiot Senior VP at GM who signed off on this very transparently boneheaded deal has been fired yet. Cruise had been shopped around for a while with no takers, because anyone who did any competent due diligence would have seen what a heap of crap the hardware / software was. Pure trade-show demoware quality.
The real problem is not technical, but legal / ethical. With 99.99% reliability you will still have incidents when the autonomous vehicle has a dead body under its front wheels. The passengers in the vehicle and society in general must consider that to be acceptable and not look for fault or compensation.
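Scale makes the point stark. With assumed (not sourced) figures:

```python
# Why "99.99% reliable" still means regular incidents at fleet scale.
# Figures below are assumptions for illustration only.
trips_per_day = 100_000          # a modest city-wide robotaxi fleet
failure_rate = 1 - 0.9999        # one serious incident per 10,000 trips

print(f"Expected serious incidents per day: {trips_per_day * failure_rate:.0f}")
# => 10 per day, every day, with nobody 'at fault' in the traditional sense.
```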
The usual retort to that is "but they are still better than humans". So, great, it happens less often, but it still happens, and the buck has to stop somewhere.
Right now it's with the human driver, fair or otherwise. If you were in charge and you caused death/damage/injury then it's you and/or your insurer that are liable. That's what third-party insurance is all about. You might even end up doing time.
Govts are rumbling about holding manufacturers accountable if a self-driving car causes a loss. I doubt very much that the manufacturers will sign up for that, because it means unknown, potentially significant liabilities over a very long time span. They will try very hard to shift that back onto the end consumer. What if they bought the tech in from somewhere else? What if they are no longer in business? What if the sensors are in less than perfect condition after being out in the real world for 5 years? Think your ABS warning light is annoying? Just wait...
The whole thing just hasn't been thought through, pretty much like everything Big Tech tries to force upon us while our great leaders fall at their feet hoping to pick up a few crumbs.
And I'd argue against the retort that they are better than humans. Maybe in nice, easy, predictable environments like motorways, but not in complex urban environments, or on winding, twisty country lanes, or in foul weather, which I suspect are the conditions under which the majority of accidents tend to happen. Humans make a lot of mistakes, but they also learn from those mistakes (the decent ones do). "F*ck, that was close! Better not do that again." For the software to "learn" (i.e. be updated), the provider needs to be able to inspect the near misses, and that leads to some serious privacy concerns.
Better than which humans, specifically?
The ones paying attention to traffic, weather, and road conditions? Not so much.
The ones checking their 'smart' phones, balancing their checkbooks (I witnessed this firsthand) or doing their makeup? Unquestionably.
Could the real problem be a societal one rather than a technical one?
Here in the UK the recent King's Speech (announcement of planned legislation) included a first attempt to address this.
... only the driver – be it the automated vehicle or a person – will be held accountable in the event of an accident or incident.
Non-driving responsibilities however will still remain with the person behind the wheel, such as maintaining appropriate insurance for the vehicle and ensuring proper loading, as well as responsibility during any part of the journey where the person is driving.
https://www.rac.co.uk/drive/news/autonomous-vehicles-news/driverless-cars-move-closer-to-reality-in-kings-speech/
There could be a loophole here - KillerCarCo says "everyone must upgrade to v1.3.7 because there are bug fixes". The upgrade (deliberately?) costs a small fortune. The car isn't updated because the owner can't afford it, or just hasn't paid for it yet. The car has an accident while in full self-driving mode. KCC denies responsibility because the car wasn't updated.
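Spelled out as a decision rule (everything here, names and versions included, is hypothetical), the loophole reads like this:

```python
# Hypothetical version-gated liability rule, as the loophole would read.
# All names and version numbers are invented.

REQUIRED_VERSION = (1, 3, 7)   # "everyone must upgrade to v1.3.7"

def manufacturer_liable(installed_version: tuple, was_self_driving: bool) -> bool:
    # KCC accepts liability only if the car was both self-driving AND
    # fully patched -- the cost of the patch quietly shifts the risk
    # back onto the owner.
    return was_self_driving and installed_version >= REQUIRED_VERSION

print(manufacturer_liable((1, 3, 6), was_self_driving=True))   # False: owner on the hook
```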
They're going to have to be very careful how they word this legislation, I think.
. . . the number of accidents or incidents caused by human drivers per vehicle mile over the same period of time.
Human driver gets into fender bender, not even news.
Autonomous vehicle involved even tangentially? Front page, above the fold.
Note: I view the current technology with some skepticism, but I also look back on the days when barely usable computers (even Macs!) used to crash on a regular basis, while today we have systems beyond the dreams of Seymour Cray sitting in our back pockets, performing complex tasks and running with astonishing reliability.
Computers can be improved. Humans? You tell me.