UK signals legal changes to self-driving vehicle liabilities

The UK government has promised to "clarify and update" the law to allow the introduction of self-driving vehicles to the country's roads, but it is set to be a long, technical journey. In the King's Speech this week, in which the governing party sets out its legislative program, King Charles III said ministers would "introduce …

  1. abend0c4 Silver badge

    £42 billion and 38,000 skilled jobs

    Using the Cruise metrics, that equates to around 25,000 robotaxis at a net cost of £1.6M each. Truly the "potential benefits" of which Liz Truss would be proud.
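The back-of-envelope sum, for anyone who wants to check it (the 25,000-robotaxi figure is just a Cruise-derived guess, not anything in the announcement):

```python
# Back-of-envelope check: the headline £42bn divided by an assumed
# Cruise-style fleet of 25,000 robotaxis (the fleet size is a guess).
headline_gbp = 42_000_000_000   # £42 billion, from the announcement
fleet_size = 25_000             # assumed robotaxi count

per_vehicle = headline_gbp / fleet_size
print(f"£{per_vehicle / 1e6:.2f}M per robotaxi")  # £1.68M, i.e. "around £1.6M each"
```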

    1. Doctor Syntax Silver badge

      Re: £42 billion and 38,000 skilled jobs

      I was thinking more about the 2035 target date. Safely in the future - nobody in government today will be there to be held to it, even if anyone remembers.

  2. DJV Silver badge

    "Potholes"

Given the state of some of the UK's roads, the potholes I regularly have to negotiate locally will definitely scupper many a robo-taxi!

    1. b0llchit Silver badge
      Coat

      Re: "Potholes"

That's what you call the scenic route with added entertainment. It will make your gentleman master spy franchise proud by giving a whole new meaning to shaken, not stirred. Passengers will surely offer up any and all secrets after a self-driving guided tour.

    2. S4qFBxkFFg

      Re: "Potholes"

      The simplest solution would be to beef up the vehicles themselves. I don't know where the sweet spot is though, somewhere between a Land Rover and a Challenger?

      1. Anonymous Coward
        Anonymous Coward

        Re: "Potholes"

        Boom.

    3. codejunky Silver badge

      Re: "Potholes"

      @DJV

But the driver must be fleeced to subsidise other forms of transport. How dare we expect our roads to be car-worthy.

  3. Lee D Silver badge

    This has always been necessary.

    At which point you're putting the liability on the software - which if they are the "driver" by their definition - also means: INSURANCE.

    So now although you might choose to insure your car as an asset, the "3rd-party" (main) component of the insurance should be on the system driving it.

    Then you will discover that a) nobody wants to take that on as a car manufacturer and/or b) the cost of self-driving cars / subscriptions (yep, ongoing costs of insurance will require ongoing subscriptions) skyrockets to compensate.

    And it's only at that point that, drunk as a skunk, you can get into a self-driving car and let it take you home. Until then, you are always the driver/responsible.

So, look forward to expensive subscriptions for self-driving, paying insurance for the vehicle AND 3rd-party insurance via the subscription, and companies being sued into oblivion and your car "decertified" if, for instance, something like the Dieselgate scandal comes out, or some AI is found to be terribly faulty as a knock-on effect of even one lawsuit involving the cars around the world. A recall will mean "no driver" until you update your software to a recertified version. Not to mention obsolescence when your car software is not up-to-date, or is too old to support, and now it's no longer legal to use on the road except if you're driving it yourself.

    This stuff is all "just another 20 years away" again, because the above isn't going to happen overnight no matter how much business you throw at it. And when it does, Ford etc. are then basically a software / insurance company that happens to make cars.

    1. Neil Barnes Silver badge
      Holmes

      only the driver – be it the vehicle or person – is accountable

      So... in the event that the automation does something which would result in disqualification for a fleshy meatsack driver, is it then forbidden to move until the ban is completed - six months, a year, ten years? Leaving the owner with an unusable and depreciating asset? And if the offence is punishable with prison, will there be a big HM Car Park to which the vehicle is escorted and left at His Majesty's Pleasure?

      Surely the onus - and the risk - is upon the company developing the vehicle to make sure it doesn't misbehave in these ways.

      1. Lee D Silver badge

        Re: only the driver – be it the vehicle or person – is accountable

        Why not? That's what humans get. And this self-driver is all the same entity, so yep.

        Now imagine the insurance that the manufacturers will build into it once they realise that one car accident could cripple all their cars until the courts rule it's safe to drive again. Ouch.

        The onus is entirely on whatever is in control of the vehicle. If that's me, it's on me. If that's not me, it's going to be the car.

        Like I explain to bad bosses on a regular basis in my career - I can have both the power and the responsibility, or neither, but you can't mix and match.

        1. Rob Fisher

          Re: only the driver – be it the vehicle or person – is accountable

          Imagine if one human cat accident meant that all the human driven cars were taken off the road until the bugs in all the humans could be fixed.

          1. Doctor Syntax Silver badge

            Re: only the driver – be it the vehicle or person – is accountable

            Human drivers are all different. That's why they're tested individually. The automated drivers for the same model of car should be all alike. If they're not there's an even bigger problem - how to certify the model.

            1. SCP

              Re: only the driver – be it the vehicle or person – is accountable

              That's why they're tested individually.

              If the standard of testing of automated systems was similar to that of human drivers (even in those countries that have comparatively strong driving tests) it would be a very low bar.

              Humans bring many strengths to the challenges posed by driving that are very difficult to achieve automatically, but also a great many weaknesses - and that is before you get to some of the abominations that some "designers" inflict with their road designs. This is evidenced by the statistics on road deaths/injuries. Even so, we do not insist upon periodic re-certification of drivers.

Automated driving still has a way to go before it is safe enough for common and widespread use - but you are not going to progress the technology without taking it onto the streets (in a well-planned and carefully controlled way). Part of the challenge is updating the regulatory framework that will oversee the use of those systems.

          2. LybsterRoy Silver badge

            Re: only the driver – be it the vehicle or person – is accountable

            -- one human cat accident --

            Does that translate to a human who is part cat, or an enhanced cat (C'Mell maybe) or me treading on the moggies tail?

        2. gnasher729 Silver badge

          Re: only the driver – be it the vehicle or person – is accountable

Insurance should just be handled by the insurance companies, same as now. They will eventually have data on how much damage was caused by a million self-driving cars in a year. And that may be more or less than the damage caused by a million human drivers. So the insurance premium will be higher or lower accordingly.

          1. LybsterRoy Silver badge

            Re: only the driver – be it the vehicle or person – is accountable

            -- same as now --

Now means us ICE drivers are paying higher insurance premiums because of EV drivers, so the takeaway is that human drivers will pay more because of AI-driven cars. I think I can do without that.

      2. Doctor Syntax Silver badge

        Re: only the driver – be it the vehicle or person – is accountable

        "is it then forbidden to move"

        If the software is common to all the vehicles of that type then presumably the ban should apply to all of them.

      3. hoola Silver badge

        Re: only the driver – be it the vehicle or person – is accountable

Yes, but there still has to be the ability to "ban" the vehicle.

        Let's assume there is a bug. That makes all other vehicles vulnerable so they should all be "banned".

        The company then has to take the hit providing taxis or whatever to the people who have bought the shite in the first place.

        Just like an MOT, if it is not road worthy then it cannot be legally driven.

        It will be easy enough as the vehicles are all connected to data anyway.

Just because this is software does not take away any of the responsibility. Software has a history of being bug-ridden detritus that is just about good enough. Those systems that are good are at the very top end and extremely specialised, not consumer solutions.

There is an acceptance that software can have bugs; it does not matter, it does not kill or injure, it's just a periodic inconvenience. Now you are putting the software in control of 1 1/2 tonnes of metal that is interacting with the general public without constraint.

        1. druck Silver badge

          Re: only the driver – be it the vehicle or person – is accountable

          1 1/2 tonnes

          And the rest, if it is some hideous over-sized electric SUV.

          1. Stork Silver badge

            Re: only the driver – be it the vehicle or person – is accountable

            Even something relatively modest like a Kia e-Niro is 1800kg

            1. J.G.Harston Silver badge

              Re: only the driver – be it the vehicle or person – is accountable

              I've just checked the manual, and my car is 940kg. Though, 1040kg when the driver is added. ;)

              1. Anonymous Coward
                Anonymous Coward

                Re: only the driver – be it the vehicle or person – is accountable

                mx5? sounds about the right weight, great car

        2. Justthefacts Silver badge

          Re: only the driver – be it the vehicle or person – is accountable

Sorry, how is this different to today? If a car manufacturer designs something whose petrol tank explodes, there's a product recall. The manufacturer replaces the component at their expense.

Recently, Toyota released a car where the wheel tended to fall off, because they'd forgotten to factor in that electric cars put down more torque, which strains the wheel more.

Mostly, they don't release cars until they are confident that the design is damn-foolproof. And occasionally they get it wrong, lose money, and have to pay compensation. People are carrying on as if this is the first time anything like this has happened in human history, and it just isn't.

          1. Richard 12 Silver badge

            Re: only the driver – be it the vehicle or person – is accountable

            It's a difference in kind of recall.

            At the moment, at its worst a recall generally means the vehicle can be driven slowly to a registered garage for the faulty part to be replaced before it fails catastrophically.

            It's also very unlikely that multiple vehicles will fail simultaneously as driving style and pothole counts vary.

            It is probable that faults found in self-driving software will result in all vehicles using a particular version to be immediately taken out of use until the software is fixed, verified and installed, because suddenly they're uninsurable, if not illegal.

            The same as a driver who loses their sight isn't allowed to drive home from the optician.

            1. Justthefacts Silver badge

              Re: only the driver – be it the vehicle or person – is accountable

              “It is probable that faults found in self-driving software will result in all vehicles using a particular version to be immediately taken out of use until the software is fixed, verified and installed, because suddenly they're uninsurable, if not illegal.”

Sorry, this is just total over-reaction and FUD. We've already seen early self-driving vehicles on the road. Unlike "I, Robot", when there's a bug the eyes don't suddenly turn red and the car turn into a murderous beast. They are not "all going to fail simultaneously". What happens is that under some rather rare condition, the car fails to handle it correctly. The idea that once a car has already been shown to meet some safety standard (say "fewer than 5 fatalities per billion miles"), you're suddenly going to have to take them all off the roads until the new fatal rare condition is handled, is just silly. Unless they all happen to meet the road condition "man in orange sweatshirt with GO logo, holding the hand of child wearing flashing-light wheelies, near dusk", they are no more dangerous than before it happens the first time.

        3. SCP

          Re: only the driver – be it the vehicle or person – is accountable

          That makes all other vehicles vulnerable so they should all be "banned".

          Surely that should depend on the vulnerability and the actual risk it presents. For example - should there be a type ban if a very rare set of circumstances leads to a minor accident?

Even in the case of a serious accident (life-changing injury/death) - if the circumstances are extremely unlikely to be repeated, would an instant ban be the best way forward? Such decisions are often not without their own cost in terms of consequential injury and death as people switch modes of transport.

Within the air sector, accident investigations frequently identify different degrees of remedial action. In some cases immediate grounding is needed, but often the risks can be addressed through scheduled updates to systems or scheduled maintenance.

      4. SCP

        Re: only the driver – be it the vehicle or person – is accountable

        It seems likely that as automation increases there will need to be an associated investigative body similar to AAIB or RAIB [in the UK - other countries = other bodies]. Certainly the AAIB has the power to ground unsafe systems.

        Obviously a major difference between where car-automation is going and current-day air-systems is that the pilot/flight crew is still present to handle problems - though there is interest in fully autonomous flight (particularly for freight).

    2. hoola Silver badge

      This has been my argument as well.

      As soon as control of the vehicle passes to software, the people who created that software and own all the rights etc have to be responsible and culpable when it goes wrong.

That includes prison sentences if the failure merits it. Death by dangerous driving has to be the outcome if someone is killed by a self-driving vehicle, and people at the company have to be responsible. That will be the QA team that signs off.

I would also include the occupants of the vehicle as well. If the vehicle crashes when it is supposed to be self-driving and software or sensor failure is the cause, then they are culpable.

Currently there is no responsibility on the manufacturer; it is all with the driver. The entire point of self-driving is that the occupants do not need to take over. The reality is that someone cannot, because they are unable to react fast enough anyway. If you need a "driver" in the vehicle to be responsible and able to take over, then the entire thing is a complete waste of time.

      Based on what my latest Golf appears to be capable of doing we are years away from safe self driving. It is not capable of running on the road and we must not be in the situation where to have self driving vehicles, there have to be massive upgrades to roads. That is completely pointless. Spend the money on public transport instead.

      1. codejunky Silver badge

        @hoola

        "It is not capable of running on the road and we must not be in the situation where to have self driving vehicles, there have to be massive upgrades to roads. That is completely pointless. Spend the money on public transport instead."

Why would we waste the money on public transport instead? Cars are popular for the simple fact that they actually work. You get in the car, drive to where you are going and get out, the exceptions being in crowded cities. Public transport may or may not arrive, is not pleasant, and you have to get yourself to the pickup point, travel to where the transport is doing drop-offs, then get to where you are going. And that is the best-case scenario of local travel; anything further afield is usually much worse.

        I am not convinced of automated transport on the smaller roads but on the main arteries I can see it being a good idea. Especially for trucking goods.

        1. Stork Silver badge

          Re: @hoola

Public transport can work - try visiting Switzerland. The Swiss recognise it; they approve the subsidy in referendums (referenda?).

          1. Doctor Syntax Silver badge

            Re: @hoola

However well it works, it works on specific routes. If your journey is simply that route or a part of it, it's fine. If it involves parts of two or more routes, even if there are interchange points between them, there's likely to be time wasted changing, and the journey is likely to be far from direct.

            Nevertheless, even when the journey consumed excessive amounts of time, when I used to commute into central London from Wycombe by train I didn't envy the drivers on the crowded roads we passed.

          2. An_Old_Dog Silver badge

            Public Transport

I currently live in a major metro area, and public transport here is pretty good. Out in the small towns and rural areas where I lived most of my life, it was absolutely non-existent.

        2. Anonymous Coward
          Anonymous Coward

          Re: @hoola

          "Why would we waste the money on public transport instead?"

          I don't see why my tax pounds should be wasted on roads I do not use.

          1. Anonymous Coward
            Anonymous Coward

            Re: I don't see why my tax pounds should be wasted on roads I do not use.

            Maybe you should spend some time learning about human society and how it works?

      2. Rob Fisher

        By this logic 1000 software developers are driving 100 million cars simultaneously. If those cars cause 1 death, would you jail all the software developers?

        Or divide the sentence by a billion?

        Or give them a reward for saving so many lives compared to the bad old days when humans used to drive?

        1. Doctor Syntax Silver badge

          "Or give them a reward for saving so many lives compared to the bad old days when humans used to drive?"

          Spot the evidence-free assumption.

      3. Justthefacts Silver badge

        Correct analogy, exactly wrong

“Death by dangerous driving has to be the outcome if someone is killed by a self-driving vehicle, and people at the company have to be responsible. That will be the QA team that signs off.”

No, because that's not what happens with a human driver, for the offence of dangerous driving.

        “when driving falls far below the minimum standard expected of a competent and careful driver”

If your human driving kills somebody, and yet you followed all best precautions, you *don't* get prosecuted. Run over a 5yr old child outside a school at 3pm and expect to go to jail no matter "why" it happened, because you knew you were supposed to take special precautions. Do so at 3am, with the child wearing black pyjamas, and it's the parents' fault.

        Competent and careful drivers cause approximately 5 deaths per billion miles. We know this as a measured fact.

        The equivalent for an *engineering team* is to keep the fatality rate below that rate. So long as their implementation does so, they don’t get prosecuted.
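To put rough numbers on what that bar means in practice (the fleet size and annual mileage below are purely illustrative assumptions, not real figures):

```python
# Purely illustrative: what a 5-deaths-per-billion-miles bar implies
# for an assumed fleet. Fleet size and mileage are made-up figures.
deaths_per_billion_miles = 5
fleet = 1_000_000                 # assumed number of self-driving cars
miles_per_car_per_year = 10_000   # assumed annual mileage per car

total_miles = fleet * miles_per_car_per_year   # 1e10 miles per year
expected_deaths = deaths_per_billion_miles * total_miles / 1_000_000_000
print(expected_deaths)  # 50.0 - stay under this and you match careful human drivers
```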

This is just the same as: it is still possible for engineers to design and build bridges. Occasionally those fail, and the design team is not necessarily prosecuted.

But if *in addition* the bridge architect was criminally negligent, then yes they would be. And that occurs when they fail to take the precautions and follow design practices in line with professional practice. So it should be the same here.

If, to take a relevant example, the self-driving car fails to roll off an accident victim *because this is the first time anyone in engineering ever thought of it*, so it wasn't in the test set - that may be bad, and it adds into the "does this technology kill more than 5 per billion" box, but it is not in itself negligent.

But if the company fails to fix the bug, and to add that test case into the regression tests of all future SW versions - that's a negligence and process problem.

        I think people are letting their prejudice override standard engineering best practice.

        1. An_Old_Dog Silver badge

          Re: Correct analogy, exactly wrong

          Do we even agree on what software engineering best practice is, or should be?

          There isn't any best practice, or set of best practices, that can't be subverted by a manager eager to do things faster and more-cheaply.

          1. Justthefacts Silver badge

            Re: Correct analogy, exactly wrong

            And yet somehow we manage to deploy safety-critical gear with significant software components.

  4. A Non e-mouse Silver badge

    An interesting point will be the overlap of responsibility between autonomous mode and manual mode. Surely it's not fair to 100% blame the human if the car goes back to manual mode a second before a crash?

    1. Lee D Silver badge

      It's the old hybrid-dilemma.

      If you hedge your bets and try to make a device do two different things, chances are it will do both badly and cause you more problems than either.

      As far as I see it, you would need to be buying a "self-driving car" (with a subscription because the software/insurance would be on the manufacturer of the car) or a "human-driving car".

      At that point you can abandon all controls, steering, instruments, much of the dashboard, etc. and make the car's job so much easier.

      But trying to do both in one is just a temporary solution that's never going to work well in terms of liability, insurance, etc. We're already seeing that with Tesla "Autopilot". All parties point fingers at the other and it becomes an expensive mess to sort out.

      Whereas a dedicated self or human driving car - you know exactly who's liable immediately and can just deal with the collision (never "accident") straight away.

      Self-driving cars will be a thing eventually, but they'll be a totally different thing. They'll be a personal transport unit that you hire or rent. The seats don't even need to face forward or even be seats - they could be beds! But the obsession with trying to make the car do the human's job but tolerating interference from the human, not to mention all the other humans around it, on a road built for humans and signs and signals readable by humans, plus handing back to a dead/inattentive human if it panics, and putting the onus on the human at all times... that's just a ridiculous mess of liability.

      I would be happy to see a little automated pod zooming down an isolated lane of a motorway overtaking me, with kids lying on a bed reading a book on the back seat, and mum and dad making sandwiches in the front seat (which is turned to face the kids). I'd be in sci-fi heaven.

      But what we have at the moment is idiot-hell where some twat thinks that their self-driving car is infallible, falls asleep at 70mph and kills a family, then tries to blame the manufacturer when it's not even clear if he ever turned on the self-driving at all.

      1. elsergiovolador Silver badge

        They'll be a personal transport unit that you hire or rent.

        Yes, the essential bit of "you will own nothing".

        1. Stork Silver badge

          With all the cars on lease or other finance, plenty of drivers are there already.

      2. Peter2 Silver badge

        what we have at the moment is idiot-hell where some twat thinks that their self-driving car is infallible, falls asleep at 70mph and kills a family, then tries to blame the manufacturer when it's not even clear if he ever turned on the self-driving at all.

        At the moment a "self driving car" is legally speaking no more than cruise control; it is there to assist the driver and is never in control of the vehicle. The owner is always liable for a crash and it's very difficult to get the manufacturer for anything because there is no process built into law for holding the manufacturer responsible for anything beyond manufacturing faults. If the manufacturer is moving into driving however then clearly that is no longer fit for purpose.

        I would be inclined to support a different method whereby a self driving car must pass a driving test. Additionally, at the moment nobody bothers legally crucifying a driver who kills themselves in an accident for obvious reasons. If a self driving car manages that then it needs to be prosecuted and banned for whatever period is appropriate; and that needs to remove every instance of that self driving vehicle from the roads for the protection of the public for the defined duration of the ban. And also; the manufacturer should supply insurance for the self driving portion of the car or self insure by agreeing to bear all costs and depositing the half million required in an escrow bank account as per the requirement in the highway code.

        I would be happy to see a little automated pod zooming down an isolated lane of a motorway overtaking me, with kids lying on a bed reading a book

Let's see what happens in a crash with child crash dummies sitting on a seat without a seatbelt.

How do you wear a seatbelt lying on a bed? Surely you'd get your head smashed open against the rear of the vehicle in an accident, and then be catapulted through the front window? I suspect that you're always going to be stuck with seats and seatbelts.

        1. ragnar

          I think the implication was that it's a controlled lane, separated from the meatsack drivers, entirely filled with computer controlled vehicles that are completely aware of the lane and each other, and thus crashes would be exceptionally rare except for mechanical failure.

          1. druck Silver badge

            Prescott's bloody M4 bus lane all over again.

      3. Doctor Syntax Silver badge

        "They'll be a personal transport unit that you hire or rent."

        So on Monday morning you'll rent or hire the unit that took last night's party of drunks home? I doubt it.

        1. Richard 12 Silver badge

That's a problem long solved by taxi ranks - and all other forms of public transport, actually.

          If public transport meant a pod that turned up at your house within five minutes and took you exactly where you wanted to go with few to no diversions, why would you own a car?

          Sitting in traffic and searching for a parking space is not my idea of fun.

      4. An_Old_Dog Silver badge

        Renting Transport Pods

        If you flag down a taxi, and it smells of body odor, fentanyl smoke, urine, vomit, sex fluids, and/or old McDonald's Happy Meals, you're gonna say, "Sorry, Charlie ... I'm not getting in there. I'll get a different taxi."

        If people are renting a non-personally-owned transport pod, are they going to have the chance to pass on a skanky one, or are they stuck with Hobson's choice? Because if it's Hobson's choice, people will just refuse to use the pods.

    2. Lurko

      May not be fair, but to bureaucrats that won't matter. Somebody always has to be liable. In the UK it'll be the driver because the bureaucrats hate motorists. In the US it'll be the car maker or software provider, simply because that way there's plenty of money to be made by the lawyers (easier than wringing it out of individual driver's insurance companies on a slow case by case basis).

    3. hoola Silver badge

      And in that situation it will all be logged and available for analysis.

That logging needs to be held in escrow so that the manufacturer cannot tamper with it.

  5. elsergiovolador Silver badge

    1985

Looks like the UK is heading towards a totalitarian dystopia, and self-driving cars are just the thin end of the wedge.

    Here is why:

The government could use the pretext of ensuring safety and efficiency in self-driving vehicles to install advanced surveillance systems. These systems could monitor not just traffic conditions but also keep track of citizens' movements, associations, and routines.

    With the advent of self-driving technology, the government could exert more control over where and when people travel. By requiring mandatory routes or restricting access to certain areas, the government could effectively control population movement under the guise of traffic management or environmental concerns.

    Promoting reliance on automated vehicles could lead to a decline in driving skills among the populace, making them more dependent on government-controlled transportation systems. This dependency could be leveraged to control aspects of citizens' lives, such as limiting travel during certain times or to certain locations.

    By highlighting the risks of cyber-attacks, the government could enforce strict cybersecurity measures that may include invasive monitoring of personal devices and communications under the pretext of national security.

    The government could exploit the vast amounts of data collected through self-driving vehicles for commercial gain or political manipulation. This could include selling data to third parties (as they do with NHS now) or using it to manipulate public opinion and electoral outcomes.

    By controlling the rollout and access to self-driving technology, the government could create a divide in society – those who have access to the latest technology and those who do not. This could lead to social stratification and increased control over the privileged class.

    With self-driving vehicles being highly dependent on software and remote control systems, a government with nefarious intentions could theoretically gain access to these systems. This access could be used to remotely control vehicles, directing them to crash at high speeds, making it appear as an accident. Such a method of targeting opponents would offer the government plausible deniability, as vehicle crashes can often be attributed to mechanical failures or errors in the self-driving system, rather than foul play. This tactic could be used selectively and covertly against key political opponents, journalists, activists, or anyone deemed a threat to the regime. The randomness and apparent non-connection of these accidents could make it difficult to trace back to the government. People raising concerns could be dismissed as conspiracy theorists etc. using the usual methods.

    1. heyrick Silver badge

      Re: 1985

      I'm not sure Braverman is that technically literate... (thank $DEITY)

      1. Doctor Syntax Silver badge

        Re: 1985

        Technical illiteracy is no block on her intentions. Quite the opposite, I think.

    2. Stork Silver badge

      Re: 1985

      To push through such a grand plan requires more capability than most governments have shown in recent times.

      1. J.G.Harston Silver badge

        Re: 1985

        It's good to be alive

        In 1985!

    3. NiceCuppaTea

      Re: 1985

Don't forget it will only accept electronic payments in government crypto/digital coin, so if you're on the naughty list you can't travel!

  6. Jimmy2Cows Silver badge

    38,000 new skilled jobs

    Where do they get these arbitrary numbers from? Doing what, exactly?

Manufacturers already have their R&D teams, and if they solve the problems around self-driving those teams definitely won't be growing, so it won't be that.

Manufacturers already have their building teams, and are progressively automating their build processes, so it won't be that.

    Existing mechanics will mostly retrain to be qualified for work on autonomous cars, so it won't be that.

    Taxi drivers will be displaced by autonomous cars, so it definitely won't be that.

    1. Peter2 Silver badge

      Re: 38,000 new skilled jobs

      Yes, that was my thought as well.

      Presumably the new jobs will be plugging them into chargers and cleaning the seats?

    2. Doctor Syntax Silver badge

      Re: 38,000 new skilled jobs

      "Where do they get these arbitrary numbers from?"

      Sitting round a table shouting out numbers until someone says "That sounds about right.".

      1. elsergiovolador Silver badge

        Re: 38,000 new skilled jobs

        That's literally how it works in many places.

        1. Anonymous Coward
          Anonymous Coward

          Re: 38,000 new skilled jobs

          and they call it a 'research-based decision-making process'

  7. BinkyTheMagicPaperclip Silver badge

    Automated driver handover

    I'm sorry, but what drugs are the researchers on? Neither 2 nor 40 seconds is in any way adequate. It needs *minutes*.

    Either the time to handover is short (and 40 seconds is still short), in which case a self driving car is largely useless as the driver has to be continuously alert and in a position to take control.

    Alternatively it's a self driving car that actually works within a given environment (let's say a motorway), in which case it is 'safe' to take a nap, have lunch, engross yourself in a book or computer game, where there is several minutes warning that the motorway turn off is arriving and control is needed on A/B roads. If the driver does not take control a few minutes prior to turn off, the car most probably should move to the hard shoulder and stop. That has implications both for smart motorways, and also for detection and penalties for not taking control in a timely manner.

    Can't say I'm confident or particularly enthused about the prospect of a self-driving car in the foreseeable future, even if it's highly attractive to be able to tell the car to do 4-6 hours of motorway driving and chill out, or even the prospect of starting a long-distance journey early in the morning and sleeping most of the way.
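The "take control in time or the car pulls over" behaviour described above can be sketched as a simple decision policy. This is a minimal illustration only — the thresholds, names, and `Action` states are all made up for the example, not taken from any real vehicle system:

```python
# Sketch of a handover policy: warn the driver well before control is
# needed, and if they never acknowledge, pull over and stop rather
# than hand off blind. All values are illustrative assumptions.

from enum import Enum, auto

class Action(Enum):
    CONTINUE_AUTONOMOUS = auto()
    WARN_DRIVER = auto()
    HAND_OVER = auto()
    PULL_OVER_AND_STOP = auto()

WARN_AT_S = 300    # start warning 5 minutes before the exit (assumed)
ABORT_AT_S = 120   # still unacknowledged 2 minutes out: give up safely

def handover_policy(seconds_to_exit: float, driver_acknowledged: bool) -> Action:
    """Decide what the car should do given time remaining and driver state."""
    if seconds_to_exit > WARN_AT_S:
        return Action.CONTINUE_AUTONOMOUS
    if driver_acknowledged:
        return Action.HAND_OVER
    if seconds_to_exit > ABORT_AT_S:
        return Action.WARN_DRIVER
    return Action.PULL_OVER_AND_STOP
```

The design point mirrors the comment: the unsafe transition (handing off to an unready human) is simply never reachable — the only paths out of autonomy are an acknowledged handover or a controlled stop.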

    1. elsergiovolador Silver badge

      Re: Automated driver handover

      as the driver has to be continuously alert and in a position to take control.

      That is probably going to be extremely frustrating for many people. It's like being a passenger supervising another driver. At least when driving you are doing something somewhat exciting and stimulating. Sitting in such a car would probably be the equivalent of watching paint dry, except you need to jump in and remove any fly you notice.

  8. Blofeld's Cat
    Coat

    Er ...

    "But a report from Parliament's Transport Committee highlights a number of obstacles in the vehicles' path."

    Which is rather more than the vehicle's sensors did in at least a couple of cases ...

  9. jonha

    We have two Mway junctions near town with pairs of giant roundabouts, four or five lanes. The road markings were pretty unclear to start with and are now (mostly) so faded that they're becoming a real hazard, esp for those who don't know the roundabouts. I am not sure how a self-driving car will negotiate this sort of thing... never mind who's responsible for any accidents.

  10. JimmyPage Silver badge
    WTF?

    It's so fucking simple

    it's embarrassing.

    If a car wants to drive by itself (or its manufacturers want it to) then it sits a bog-standard UK driving test.

    If it passes then - like us meat puppets - it has demonstrated a level of competence assumed to be "safe enough"

    All of this hand wringing would make sense if human drivers never made mistakes or killed people.

    1. Dave314159ggggdffsdds Silver badge

      Re: It's so fucking simple

      Passing a driving test is utterly trivial. Current cars on the roads are capable of it, to the extent that any car would ever be - they can't be seen to check mirrors, eg, so the test isn't actually applicable.

      We're way beyond that stage already. The hard stuff is basically untouched, but urban driving well enough to meet driving test standards is easy.

    2. BinkyTheMagicPaperclip Silver badge

      Re: It's so fucking simple

      Would you like to take a taxi driven by a driver that had passed their test that afternoon?

      At the minimum you'd expect a Pass Plus certified driver, so the goalposts are moving already.

    3. sabroni Silver badge
      FAIL

      Re: It's so fucking simple

      For every complex problem there is a solution that is simple, straightforward and wrong.

  11. Herring`

    Scepticism

    I think about when I am driving in traffic. I am doing stuff like establishing eye-contact with pedestrians, cyclists, other drivers. Assessing whether they know my intentions and I know theirs. Then there's classic scenarios like a football rolls across the road from between parked cars - probability that a kid chasing it might run out. Someone is in that parked car - are they going to open the door. If you've spent time on the road on a bicycle or motorbike and you're still alive, you've probably got used to assessing stuff like that.

    Motorways are probably a better option for automation. But there's still stuff there - like if you've got cars both overtaking and undertaking a middle-lane dope, they might follow the manoeuvre with attempting to both occupy the same space in the middle-lane at the same time.

    1. elsergiovolador Silver badge

      Re: Scepticism

      The problem is that rich people think the Earth is overpopulated. If they foist such a dangerous contraption on the plebs, it will have the effect of thinning the herd.

      They just need to wrap it somehow, so it is palatable to the masses.

      1. heyrick Silver badge

        Re: Scepticism

        DEATH RACE 2030

        Drivers start on the east coast with ten million dollars. Everybody that gets run over is a hundred thousand deduction. You forfeit all if the car crashes (especially if you die). What's left if/when you reach the west coast is yours to keep. Better hope it's a positive figure.

        Televise the whole thing. You'll get idiots lining up to participate. It'll be a great way to test the technology and thin the crowd a little while entertaining the rest. As for payments and sponsorship? Well, we have the technology to put big OLED panels on the cars to push endless adverts, not to mention along the route like those god-awful boards around the edges of football fields these days. And if it distracts the robodriver, oh well...

    2. Dave314159ggggdffsdds Silver badge

      Re: Scepticism

      I really don't understand why so many people, including you, are so totally confused about which are the easy parts of autonomy, and which the hard ones.

      The bits about eye contact and making intentions clear are largely irrelevant - autonomous cars will be entirely predictable and very cautious and patient compared to human drivers. But it's a fair point to suggest we might want some additional signals added to autonomous vehicles for communicating with other road users.

      The stuff about noting hazards and motorway driving are the (relatively) really easy bits. Sticking to rules instead of driving dangerously due to impatience is not an issue for computers.

      The stuff that is hard for autonomous vehicles is things like unconventional road layouts, cones round roadworks, and similar stuff humans find very easy - basically, like Captchas, where we are much better at certain kinds of image recognition.

      1. elsergiovolador Silver badge

        Re: Scepticism

        Uh...

        autonomous cars will be entirely predictable

        The issue is that you can't predict the behaviour of the environment. If the autonomous car is going to work on the basis of "if X then Y", well... good luck. It's not going to work.

        we might want some additional signals added to autonomous vehicles for communicating with other road users

        One thing a driving instructor tells you is that you should never trust the signals you see on a car; you have to use your instincts to figure out the intention.

        A classic example is roundabouts. Drivers give wrong signals all the time, take the exit at the last second, or take one they didn't signal.

        You could probably mitigate some of this if all cars on the roundabout were autonomous and could coordinate with each other, but that's never going to happen.

        The stuff about noting hazards and motorway driving are the (relatively) really easy bits.

        Nothing easy about motorways. People tend to picture a straight road, with wide lanes and nothing going on. Until you get an animal on the road, or your tyre bursts, and many other scenarios you can't write "if X then Y" for.

        The stuff that is hard for autonomous vehicles is things like unconventional road layouts, cones round roadworks, and similar stuff humans find very easy - basically, like Captchas, where we are much better at certain kinds of image recognition.

        We currently don't have a system for autonomous driving, so probably the hardest part is to build one - such technology does not exist yet. Sure, you have some systems that pretend to drive autonomously, but it's like trying to have a conversation with an AI: you can quickly tell something is "off". Mainly because current systems can't reason - contrary to what the brochures for investors say.

        1. An_Old_Dog Silver badge

          The Hard Stuff

          I don't think self-driving vehicles are equipped to deal with trees falling down on top of vehicles in the roadway. I mention this because I saw it happen once. The human driver stomped on his gas pedal (the lane ahead of him was two car-lengths clear), and the tree crashed through the cargo area of the delivery van he was driving. Death avoided! I don't think self-driving cars have sky-scanning sensors. (I think the human driver probably saw an unexpected shadow, looked up, then reacted.)

          "But how often does something like that happen?", you ask.

          Risk is the likelihood of something bad happening, multiplied by the negative consequences of it happening. I think we all set "human death" at an extremely high value of "negative consequences".
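The risk definition in the comment above (likelihood multiplied by consequence) can be put in a line of code. The numbers below are purely illustrative — none of them come from real road-safety data:

```python
def expected_risk(probability_per_year: float, consequence_cost: float) -> float:
    """Expected annual loss = likelihood of the event x cost if it happens."""
    return probability_per_year * consequence_cost

# Illustrative placeholder values only: if a death is assigned an
# extremely high cost, even a very rare fatal event can dominate a
# common but minor one in the risk calculation.
rare_but_fatal = expected_risk(1e-6, 1e9)    # rare event, huge consequence
common_but_minor = expected_risk(0.1, 500)   # frequent event, small consequence
```

This is the commenter's point in miniature: with "human death" costed high enough, `rare_but_fatal` exceeds `common_but_minor` despite being five orders of magnitude less likely.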

          1. sabroni Silver badge

            Re: I think we all set "human death" at an extremely high value of "negative consequences".

            I think we all set "our own death" at an extremely high value of "negative consequences".

            I think a majority of us set "human death" at an extremely high value of "negative consequences".

            The suits in the C suite are more concerned about dividends.

  12. Rich 2 Silver badge

    Why?

    “… ensure the potential benefits of self-driving technologies…”

    What ARE the benefits of self-driving cars? I’ve never seen an answer to this and I can’t think of any

    1. BinkyTheMagicPaperclip Silver badge

      Re: Why?

      There's been a lot of posts about this. A perfect self driving car would be brilliant.

      Go to a concert, have a number of beers, as the encore plays ring your car. It starts from its cheap car park several miles out and picks you up at a pick up and drop location. You slump dozily into the car and it drives you home as you sleep. Total cost only a few pounds, rather than dozens by going in a taxi.

      Decide to travel to the Isle of Skye. Stick bags in the car at 1am. Strap yourself in, fall asleep. Arrive prior to 9am for breakfast.

      You decide to go to *Berlin*. Flights are inconvenient. Leave at lunchtime, pop on Eurostar, have dinner in Calais. Snooze until Berlin. (not actually the best example, train may be taken in the same time, but you take the point)

      The Taxi of Mum or Dad is no longer a thing. Trip for your child is approved, car drives them there, drops off, returns home.

      You have a hospital appointment and either can't drive, and/or the parking is appalling. Car drops you off and returns home whilst you have your appointment.

      Need to send something to your friend in a hurry. Stick item in car, tell it to go to your friend. Once item is retrieved, car returns to you.

      In a utopian future with perfect AI and no energy scarcity it would be fantastic. Unfortunately it's unlikely to happen soon.

      1. Herring`

        Re: Why?

        Minor issue: if you look at traffic congestion at the moment, many cars have only one person in them. Now add a load of cars with no people in them.

        1. BinkyTheMagicPaperclip Silver badge

          Re: Why?

          Delivery services are already particularly cheap, so I can't see that happening too often if self-driving cars became a thing.

        2. graeme leggett Silver badge

          Re: Why?

          The self-driving cars won't stop in the yellow boxes, will signal their turning intentions to each other, won't stop obscuring side roads, will leave adequate braking distance, and so on.

          The traffic will flow much better than with us egotistical and fallible humans at the wheel.

      2. Doctor Syntax Silver badge

        Re: Why?

        And back in the world of real devices?

        1. BinkyTheMagicPaperclip Silver badge

          Re: Why?

          Back in the world of self driving cars as they are I see minimal advantages. Possibly lane assistance might be useful, but as it's fundamentally necessary to remain alert I see the advantage of very few assistance features.

          Auto headlights (that can be overridden) are one of the few enhancements I like. Reverse parking sensor and camera. Cruise control. I imagine I'd probably trust an auto parallel park, as that seems doable, and I'll admit I do not always get the angle correct first time.

          I've tried cars with other distance sensors and they're incredibly annoying, beeping at me when I'm 'too close' to concrete bollards on a motorway with narrow lanes and no room to safely move further away. Even rain sensitive wipers never get it right, I'd rather adjust things myself.

          1. Dave314159ggggdffsdds Silver badge

            Re: Why?

            Good motorway assistance systems are really useful without bringing autonomy into it - they free up a lot of attention to be used for general situational awareness.

            I agree the bad ones seem more like sabotage than help.

    2. werdsmith Silver badge

      Re: Why?

      What ARE the benefits of self-driving cars? I’ve never seen an answer to this and I can’t think of any

      Less people dying. Without an ego a car will not road rage, race and will become safer than human drivers (low bar).

      1. Doctor Syntax Silver badge

        Re: Why?

        "Less people dying."

        Where are the self-driving cars that achieve this?

        1. werdsmith Silver badge

          Re: Why?

          What part of “will become” are you struggling with?

        2. Dave314159ggggdffsdds Silver badge

          Re: Why?

          Current, grossly inadequate technology is _already_ safer per mile than the global average. About on a par with the US average. Still some way behind the average in countries with decent levels of driver training.

          It really isn't that hard to be better than the bad drivers out there. Problem is, everyone thinks they're one of the good drivers.

  13. Anonymous Coward
    Anonymous Coward

    just no

    If any system requires handing off to a human while still in motion, it's an immediate fail, whether that is 2 seconds or more.

    A standard dumb human does not have the response time or awareness to take over while being molly-coddled by AI.

  14. xyz Silver badge

    Calm down, calm down...

    This is brought to you by HMG... it's never going to happen. Just look at the news piece slapped in the middle of the article... something about auto lane changing by 2021. This is all just gov word crap to make them feel groovy and hip and "to infinity and beyond". Anyway, one S. Braverman got binned today, so not a total loss.

  15. Fonant

    The problem is one of defining "safe enough", as the article points out.

    We probably want self-driving cars to be safer, if not A LOT safer, than human drivers. Autonomous vehicles killing thousands of people a year is not going to be acceptable, even if we accept that death toll from fallible human drivers.

    How many UK deaths per year caused by self-driving cars is acceptable? If zero, then self-driving cars are going to have to be VERY slow and cautious out on the roads, especially when mixing with human-driven traffic.
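One way to make "safer than human drivers" concrete is to compare per-mile fatality rates rather than raw death counts, since fleet sizes differ. A minimal sketch, with entirely made-up placeholder figures (not real UK statistics) and a hypothetical "10x safer" bar:

```python
def fatalities_per_billion_miles(deaths: int, miles_driven: float) -> float:
    """Normalise a death toll by exposure so fleets of different sizes compare fairly."""
    return deaths / (miles_driven / 1e9)

# Placeholder numbers for illustration only.
human_rate = fatalities_per_billion_miles(1500, 300e9)  # hypothetical human baseline
av_target = human_rate / 10                             # e.g. demand a 10x improvement

def acceptable(av_deaths: int, av_miles: float) -> bool:
    """Is the autonomous fleet's rate at or below the assumed target?"""
    return fatalities_per_billion_miles(av_deaths, av_miles) <= av_target
```

The point of the normalisation is the article's own question: an absolute threshold ("how many deaths per year") is meaningless without the miles driven, and where the improvement factor is set is a policy choice, not an engineering one.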
