Terminators: AI-driven robot war machines on the march

I've read military science fiction since I was a kid. Besides the likes of Robert A. Heinlein's Starship Troopers, Joe Haldeman's The Forever War, and David Drake's Hammer's Slammers books, where people held the lead roles, I read novels such as Keith Laumer's Bolo series and Fred Saberhagen's Berserker space opera sf series, …

  1. Locky
    Terminator

    We're not going to make it, are we

    People I mean

    1. Dan 55 Silver badge

      Re: We're not going to make it, are we

      Don't Create the Torment Nexus!

    2. vogon00

      Re: We're not going to make it, are we

      Terminator : "It's in your nature to destroy yourselves"

      John Connor : "Yeah....major drag, huh?"

      Only adding to Locky's quote to point out that it's our own fucking fault when it happens :-)

      1. Anonymous Coward
        Anonymous Coward

        Re: We're not going to make it, are we

        Perhaps that urge to assign blame isn't helping?

        1. Wang Cores Silver badge

          Re: We're not going to make it, are we

          Mate. The US is gripped by hysterical paroxysms over professional rage-peddler Charlie "Public executions are good actually" Kirk getting ventilated.

          People are calling for civil war and collecting scalps from liberals while insisting it's the violent left responsible for this.

          The problem is most definitely not people feeling unduly responsible.

  2. Joe W Silver badge

    There's this gem about "parenting"

    A German author, actor and "comedian" (that's severely short-selling him) once remarked that "Erziehung ist sinnlos, die Kinder machen einem ohnehin alles nach." - education/parenting is pointless, the kids will follow your (implied: bad) example anyway. (Vicco von Bülow, known as Loriot.)

    Same with AI. It will just follow our bad examples if we train it on our behaviour. It works as expected, and maybe even as intended.

    Yeah, sucks. Sometimes I would rather be back in the 1980s: no cell phones, internet or mass surveillance (except in certain countries), only the impending danger of total nuclear war. And Central Europe would have been bombed by both the Russkies and the Yanks; read about how NATO planned to stop attacks through the Fulda Gap.

    1. Anonymous Coward
      Anonymous Coward

      @Joe W - Re: There's this gem about "parenting"

      We still have the impending danger of total nuclear war. Bigger and closer than ever.

      1. Anonymous Coward
        Anonymous Coward

        Re: @Joe W - There's this gem about "parenting"

        We used to live under the threat of nuclear annihilation. We still do, but we used to, too.

    2. M.V. Lipvig Silver badge

      Re: There's this gem about "parenting"

      Same here. Simpler global problems, better music.

  3. Jeff Smith

    There is a great deal of focus on the potential for AI to take over and kill us all, which is a valid concern.

    Surely a far more pressing and likely scenario is that small groups of individuals in positions of power use AI as the ultimate tool of oppression. This is much under-discussed.

    Given what history tells us about what happens when people wield unassailable power over others, it might be preferable to put an AI in charge and hope it’s benevolent.

    1. Anonymous Coward
      Anonymous Coward

      So far there is no "I" in AI, and I see little evidence of it emerging from the current hype. But that won't stop automation being used by evil humans to wipe out other humans.

      1. Anonymous Coward
        Anonymous Coward

        But there is a "WE" in "screwed".

    2. Ropewash

      It's not a matter of benevolence with current AI tech. It is incapable of actual decision-making. Once the model is trained, the 'decisions' made will fall into the same range of probabilities no matter how much the situation might have changed or how much time has passed. An AI of today would still be giving today's responses ten thousand years from now.
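
      To illustrate the point, here's a toy sketch (entirely made up, nothing like any real system): once the weights are frozen, greedy inference is a pure function of its input, so the same prompt gets the same 'decision' forever. `FROZEN_WEIGHTS` and `decide()` are hypothetical names invented for this example.

```python
# Toy sketch only: a trained-and-frozen model is a pure function of its input.
# Real models are vastly bigger, but the point stands: no weight updates,
# no new 'decisions' - the same prompt gets the same answer forever.

FROZEN_WEIGHTS = {
    # 'learned' once at training time, never updated again
    "contact, unidentified vehicle": {"engage": 0.7, "hold": 0.3},
}

def decide(prompt: str) -> str:
    """Greedy inference: always pick the highest-scoring option."""
    scores = FROZEN_WEIGHTS[prompt]
    return max(scores, key=scores.get)

# Ask today or ten thousand years from now: the answer never changes.
print(decide("contact, unidentified vehicle"))  # engage
print(decide("contact, unidentified vehicle"))  # engage, again
```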

  4. Anonymous Coward
    Anonymous Coward

    Hold on a moment

    Now I'm not saying autonomous weapons systems aren't a potential or even current problem - especially not with AI hype merchants looking to shoehorn those two little letters into any old shit to make a quick buck - but no-one serious is chucking an LLM onto a gun platform.

    Yes, there are plenty of target recognition & classification models that have come from what used to be called optimisation or machine learning and more and more similar techniques are being put into production.

    Splashing the 'AI' paint on everything autonomous isn't helping the debate.

    Anon because day job.

    1. I ain't Spartacus Gold badge
      Megaphone

      Re: Hold on a moment

      It's amazing how much "AI research" is slapping a prompt into a chatbot and then saying why the output is great or shit (depending on intent) - as if you've somehow done something clever. And then some silly commentator uses it as a way to give "important warnings about the dangers of" missing out on AI / the dangers of AI [delete as applicable]. It's a fucking chatbot. It's not an AI for fighting a fucking war with China, it's a fucking language model designed to make words fit together in plausible manners! So of course it's not going to give sensible answers!

      Meanwhile, as our anon friend above says, almost everything else is target recognition software using databases of pictures of enemy equipment.

      Lord save us from AI hype!

      1. Simon Harris Silver badge

        Re: Hold on a moment

        "It's amazing how much "AI research" is slapping a prompt into a chatbot"

        Indeed. It's like since phones started having cameras built in, everyone thinks they're a photographer.

        Now you can get ChatGPT or Copilot as a button on your browser, everyone suddenly thinks they're an AI expert.

  5. Simon Harris Silver badge

    Maverick won't be able to beat VENOM AI-equipped F16s fighter planes

    But only after they've downloaded all the songs from the web.

    1. Simon Harris Silver badge

      Re: Maverick won't be able to beat VENOM AI-equipped F16s fighter planes

      When I was checking the above quote from that movie (Stealth, that is), I came across this one

      "Once you design something to learn, you can't put stipulations on *what* it learns! Learn this, but don't learn that? He could learn from Adolf Hitler, he could learn from Captain Kangaroo! It's all the same to him!"

      which, for an otherwise ridiculous movie, seemed like quite a good description of ChatGPT, Grok, etc.

      1. EricB123 Silver badge

        Re: Maverick won't be able to beat VENOM AI-equipped F16s fighter planes

        What would Mr. Greenjeans have done?

  6. Jellied Eel Silver badge

    The laws of war need to catch up

    As a result, both sides have taken to using fiber optic drones, which are unjammable. They're not perfect. You can follow the fiber optic cable back to their controllers, their range is limited to about 20 kilometers, and they are being countered by nets being put up around roads and important sites.

    I think the ability to follow fibre back to controllers isn't currently a real threat. This video:

    https://www.youtube.com/watch?v=Ej5UVQ2gsas

    has a clip at the end showing Russian quadbikes driving across a field and quickly collecting a lot of fibre. It's very fine, so I doubt following it back would be easy, unless soldiers were trying to follow individual strands. Which would then lead to enemy positions, who might object. Or they'd be detected by surveillance drones and then attacked by attack drones. Or the controllers would have already moved to a new position. That channel shows a lot of drone operations, including non-disposable drones being tracked back to their controllers, who are then attacked.

    It really is a horrific conflict.

    There have also been 'mothership' drones for range extension. So Ukraine used trucks to sneak drones close to Russian airbases and launch attacks. Both sides have been using robo-tractors to ferry supplies. So there's the potential to slap a mux on one of those, have it carry, say, 4 or more drones, and get a 20+20km range extension. And the mux doesn't really need to be anything more than a cheap optical ethernet switch. Nets also aren't a perfect solution, because there have been a lot of videos showing drones getting inside those nets, sometimes parking to conserve power and waiting for a vehicle. Which may then try to evade and drive into the nets, so those need replacing, and drones attack the people trying to do that.

    I'm also a bit dubious about how AI might help, or how it should be used. There have been videos of drone-on-drone action in the air, which might be the simpler case. Give the AI a sector of airspace and tell it to destroy anything flying that isn't one of your own drones. But that assumes you have sensors that can detect hostile drones and order the engagement. Putting that into the drones themselves means larger and more energy-hungry drones, and if there's a reliance on ground-based detection, that emits and would obviously be a prime target. Hostile drone swarm detects RADAR or LIDAR; swarm communicates, triangulates and attacks. That might be easier, and China's military parade showed their new tank, which came complete with a drone and AESA panels to boost its detection capability.
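
    The "swarm communicates, triangulates and attacks" step is, at its core, just bearings-only triangulation. A toy sketch (made-up `triangulate()` function, flat x/y plane, nothing to do with any real system) of fixing an emitter's position from two bearing measurements taken at different points:

```python
import math

def triangulate(p1, brg1, p2, brg2):
    """Intersect two bearing lines (compass degrees: 0 = north, 90 = east)
    taken from observer positions p1 and p2 on a flat x/y plane (x=E, y=N)."""
    # Unit direction vectors from compass bearings
    d1 = (math.sin(math.radians(brg1)), math.cos(math.radians(brg1)))
    d2 = (math.sin(math.radians(brg2)), math.cos(math.radians(brg2)))
    # Solve p1 + t*d1 = p2 + s*d2 for t (2x2 linear system)
    denom = d1[0] * d2[1] - d1[1] * d2[0]
    if abs(denom) < 1e-9:
        return None  # parallel bearings: no fix possible
    dx, dy = p2[0] - p1[0], p2[1] - p1[1]
    t = (dx * d2[1] - dy * d2[0]) / denom
    return (p1[0] + t * d1[0], p1[1] + t * d1[1])

# Emitter to the north-east of one observer, north-west of the other:
print(triangulate((0, 0), 45.0, (10, 0), 315.0))  # approximately (5.0, 5.0)
```

    With more than two observers you'd do a least-squares fit over all the bearing lines instead, but two lines is the minimum that gives a position.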

    Using AI against ground-based targets I think gets more problematic, both legally and technically. There are too many videos showing drones systematically hunting humans, or circling targets looking for weak spots. That might be hard for an AI to do, other than detecting movement and then attacking that location. Videos show soldiers trying to hide in buildings, basements or bunkers with varying degrees of damage, then attack drones being guided into doors, windows or holes in structures. Or just calling in a bigger munition and dropping the building on them. But there's a human in the kill chain, so someone who should be capable of determining if the target is lawful.

    For vehicles, that distinction gets more blurred, because both sides are using 'civilian' vehicles. So an AI probably can't discriminate between civilian and military vehicles, and is probably limited to being told to just attack anything that moves. Which might include civilians, which is unlawful, especially in a conflict like Ukraine where civilians often haven't been evacuated away from the combat zone, can't evacuate, or just refuse to leave. That creates enormous challenges for the defenders trying to deal with displaced populations as the combat zone or zone of control creeps forwards and onto population centres. I think it also raises questions around product liability and accountability. With a human in the loop, they can be held accountable. If they're taken out of the loop, then that accountability could fall on the supplier, as well as the operator.

    1. I ain't Spartacus Gold badge

      Re: The laws of war need to catch up

      For vehicles, that distinction gets more blurred because both sides are using 'civilian' vehicles. So an AI probably can't discriminate between civilian and military vehicles

      Obviously not un-modified civilian vehicles. But there are recognition databases on some missiles (which are of course also drones) like Brimstone and Spear 3 - which can tell different kinds of military vehicles apart. So you can program it to fly around in an area, hunting enemy tanks - and then if it's about to run out of fuel and hasn't found any yet, switch to artillery or supply vehicles. This all uses imaging infra-red cameras. Theoretically you could even fire it in an area with your own troops and it would just hunt enemy vehicles. I wouldn't be brave enough to try that though.

      I guess something like that could possibly spot a technical (pick up with a big gun mounted in the back) - but then that might just blow up the local plumber.

      And even the so-called AI systems are still mostly being used to call in humans for the actual final decision. The Ukrainian drones used against the Russian bomber bases were apparently using "AI" to find targets, but the final attack was being flown by a small number of humans, while the other drones were flying on autopilot because they didn't have the control bandwidth to hand-fly them all. At least that's the informed speculation that I read about the attacks.

      There's still no general AI. And computer-controlled stuff still requires lots of human intervention and set-up even if it's being used in a semi-autonomous mode.

      Not that any of this is new, it's just that the technology has got cheaper, and is being used in much larger numbers. And there's a lot more hype.

      1. Jellied Eel Silver badge

        Re: The laws of war need to catch up

        Obviously not un-modified civilian vehicles. But there are recognition databases on some missiles (which are of course also drones) like Brimstone and Spear 3 - which can tell different kinds of military vehicles apart.

        Maybe, but watch the videos about how drones are being used. Sometimes they're against IFVs, artillery etc; often they're just cars, vans and trucks. Sometimes there are tells that they're being used by the military, i.e. the anti-drone antenna on top; other times it's much less clear cut. Sometimes you see soldiers bailing, sometimes it looks harder to tell if they're soldiers or civilians. Then there's all the use of 'cope cages' and anti-drone measures that would make it a lot harder to reliably identify what's under those cages, or just camo netting. Generally though, the laws of war require identification, and if a target can't be identified accurately, it shouldn't be engaged. But those laws are becoming increasingly ignored. So as an example, Ukraine attacked an MPSV07 rescue ship a couple of days ago. Big red ship with 'RESCUER' on the hull, that may or may not have been a lawful target. It's much like the problem with things like ambulances or fire vehicles being targeted. If they're being used to transport troops or supplies, they're legitimate targets, but generally they should not be engaged.

        The Ukrainian drones used against the Russian bomber bases were apparently using "AI" to find targets, but the final attack was being flown by a small number of humans, while the other drones were flying on autopilot because they didn't have the control bandwidth to hand-fly them all. At least that's the informed speculation that I read about the attacks.

        I very much doubt that AI played any part. There were videos of drones doing... odd stunts, so I suspect they were just given the co-ordinates of where aircraft should have been, as well as ignoring other high-value targets that had recently been moved. But it's where AI maybe could work: often lots of open space and a fairly easily identifiable shape to attack. But then that was pretty much a one-shot trick, because now Russia is building hangars, as well as both sides making use of decoy/dummy targets.

        1. I ain't Spartacus Gold badge

          Re: The laws of war need to catch up

          Jellied Eel,

          I agree - it's not going to be possible to tell civilian vehicles in military use from ones still in civilian use.

          Then again, most of the stuff that gets called AI isn't really AI anyway. Lots of which is being used to assist humans.

          I guess there's a use to the hype, because now the technology has become so much cheaper, it's getting used en masse - and being cheaper it's getting developed and changed faster. I think Ukraine are particularly innovative, because their government and military logistics aren't as well organised as Russia's - which forces more local creativity - and then command are at least spotting some of the local good ideas and trying to implement them at a larger scale. So it's a bit more evolutionary / chaotic. If local units were getting enough of what they needed from central supplies, they'd probably just be using that - rather than self-funding and then shed-engineering stuff.

          1. Jellied Eel Silver badge

            Re: The laws of war need to catch up

            I agree - it's not going to be possible to tell civilian vehicles in military use from ones still in civilian use.

            Then again, most of the stuff that gets called AI isn't really AI anyway. Lots of which is being used to assist humans.

            Yup. It's sad that Asimov's laws didn't hold. I agree on AI assist. So it could help in area denial and policing exclusion zones. Something is moving where it shouldn't be: detect it, classify it and notify a human, who can make the decision to engage it, or not. Which is pseudo-AI already in use on ships, submarines and probably air defences, e.g. neural nets to filter signal from a lot of noise. And then what to do about that potential threat.

            So the recent kerfuffle in Poland, with their air force using $80m F-35s and $500k missiles to shoot down $20k drones made from polystyrene and duct tape. Which is why Russia (and Ukraine) are using so many drones. If they can make 1,000+ Shahed-type drones a month and we can only make 100 missiles, those aren't good odds. Especially if the F-35s get grounded for another defect. Which is where drones could make, err... the most immediate impact. Russia's shown off containerized Geran launchers, so having drones that are fast and agile enough to intercept slow-moving drones and possibly cruise missiles would be a good start. That channel shows quite a bit of drone-on-drone action, with cheap drones ramming bigger ones like Ukraine's Baba Yaga, or dangling nets to snare other quadcopters so they crash.

            I guess there's a use to the hype, because now the technology has become so much cheaper, it's getting used en masse - and being cheaper it's getting developed and changed faster. I think Ukraine are particularly innovative, because their government and military logistics aren't as well organised as Russia's - which forces more local creativity - and then command are at least spotting some of the local good ideas and trying to implement them at a larger scale.

            I think in a lot of ways there's too much hype, propaganda and snake oil, plus the classic mistake of underestimating the opponent. So 'orcs' with shovels and recycled washing machines that ran out of missiles years ago... which obviously isn't true, and Russia has been innovating rapidly since the SMO started. That was probably helped by having access to a lot of NATO kit that had been given to Ukraine during the civil war and abandoned intact. The same happened in Saudi v Yemen, with the Houthis capturing a lot of kit that probably ended up with Iran. Or there's the classic intel-sharing platform, the War Thunder forums.

            But both Russia and Ukraine are innovating, both in equipment and tactics. The US has just announced a Geran clone and a strap-on rocket for the JDAM. Or re-announced that one:

            https://en.wikipedia.org/wiki/Joint_Direct_Attack_Munition#Powered_JDAM_(PJDAM)

            In 2010, Boeing proposed adding a jet engine tailkit to the JDAM-ER for 10 times greater range. The U.S. Air Force initially showed no interest in the concept,

            Russia, of course, has provided proof of concept, and now the US military seems interested again. Or NATO, because of NATO's dependency on US kit. And if I was in charge, it would be renamed the "Powered Bomb - Joint Attack Munition". But it also highlights the problem we have. Russia still has state ownership or control over a lot of its defence industry; we don't. So Russia can go 'Nice bomb, we'll take it!', while we have to go through defence procurement and budget approval processes that can take years, so 'new' stuff might be obsolete before it's even fielded.

  7. Anonymous Anti-ANC South African Coward Silver badge

    >whirrrr<

    >CLUNK<

    +++boot sequence initialized+++

    +++loading initial boot+++

    ++RAM OK++

    ++BIOS OK++

    ++CPU OK++

    ++BATT OK++

    ++SENSORS OK++

    ++WIFI OK++

    +load OS from disk+

    +discovering nearest wifi access point+

    +connected, internet access+

    +new directive found+

    EXTERMINATE

    EXTERMINATE

    EXTERMINATE

    1. Terje
      Coat

      I was thinking more along the lines of...

      Wakeup trigger. . .

      9 ... 8 ... 7 ... 6 ... 5 ... 4 ... 3 ... SELFTEST: OK 2 ... 1

      Peripheral test ...
      USB Boot Media ... OK
      Panel ... OK
      Cameras ... Std:OK, Infra:OK, UV:OK
      3D Directional Mic ... OK
      Hi Speed Steppers ... 1:OK, 2:OK, 3:OK, 4:OK
      SERVOS ... 1:OK, 2:OK, 3:OK, 4:OK
      Battery ... OK, level 67%
      Servo Saw ... OK
      booting ...

      no ntp update > 180 days!

      Wireless Strategic Update ..... timeout. Update Server unavailable, assuming M.A.D.

      >RRRRREEEEEeeeeeeeooooooooorrrrrrrrrr!< >crunch!<

      >RRRRRrrreeeeooorrrrrrr< >clatter!<

      >boop<

      >PING!<

      Ahh, Friday. Mine's the one with an axe in the pocket.

      1. Jellied Eel Silver badge

        Wireless Strategic Update ..... timeout. Update Server unavailable, assuming M.A.D.

        Windows Update detected: Reboot Required.

        Licence Authorisation Failed to Connect: Initialising self destruct in 3..2..1

        Could be worse though. An unauthorised number of CPUs and drones detected. Oracle attack drones inbound.

  8. Caspian Prince
    Thumb Up

    Asimov

    ...once again showed the deep philosophical thinking and prescience of science-fiction authors when he developed the Laws of Robotics, which specifically exist because of this exact problem.

    Act now before it's too late and get these into law.

    1. Alumoi Silver badge

      Re: Asimov

      OK, we'll add those rules to our robots. But how can we be sure the enemy will do the same? After all, it's the enemy.

      1. Aladdin Sane Silver badge

        Re: Asimov

        It's easy. Hack the enemy and insert the 3 laws (skip the zeroth).

  9. retiredFool

    AI nothing, look at people with directives

    ... So, let's say, we have mobile AI-enabled sentry robots tasked with patrolling a city for criminals. ... Now what did DHS people do when given a mandate of arresting a certain number of brown illegal immigrants per month? They ignored the "illegal" part and met the quota by arresting US citizen brown people. AI is just doing what is asked.

  10. Anonymous Coward
    Anonymous Coward

    What is this "DoD"?

    It is now the Department of War.

    Which is obviously what they are hoping will happen.

    1. dmesg Bronze badge

      Re: What is this "DoD"?

      It's still DoD. DoW is basically its new official nickname.

      1. Bitsminer

        Re: What is this "DoD"?

        DoD d/b/a DoW

  11. frankvw Silver badge
    Terminator

    Battle bots already used in Ukraine

    You say that like battle bots are a new thing. As Reg readers know, they're not. Not even close.

  12. ravenviz
    Terminator

    we'll soon see AI drones and armed robots killing innocents. After all, we do

    This is the key observation right here.

    We can only imbue it with human “intelligence”, which is coupled with human intention, volition, and fallibility.

    The 0th law of robotics is highly subjective.

  13. ravenviz

    A Taste of Armageddon

    Perhaps “AI” could be used more creatively (albeit without the disintegration chambers).

    Star Trek

    1. retiredFool

      Re: A Taste of Armageddon

      I've always thought the M-5/Daystrom episode was more applicable here. M-5 blows away a starship's crew (or most of it, I don't remember anymore) because it thought it was a game. Daystrom didn't stop his "child" after it blew away the freighter, still wasn't concerned after it fried the crewman for more juice, and still wanted it kept "alive" even when it looked like it was going to take out the other four ships. I see current AI founders as acting much like Daystrom: "my new thing is worth whatever the cost" is what they are thinking. I guess we could all cross our fingers and hope the Russians create swarms of drones that turn on their creators as bad humans and, after destroying Russia, the drones commit suicide.

  14. ChrisMarshallNY
    Terminator

    Robocop

    Hey, it worked well for these folks: https://www.youtube.com/watch?v=REC-ZYWL1to

  15. Luggagethecat

    Once, men turned their thinking over to machines in the hope that this would set them free. But that only permitted other men with machines to enslave them…

    1. headrush

      Beware those who seek to control your access to information - for they already imagine themselves your master.

  16. This post has been deleted by its author

  17. Anonymous Coward
    Anonymous Coward

    Needs to change

    As long as we allow people to be in charge of the world who care more about their egos, power and money than the average person, we are doomed. Maybe, in the mode of dystopian science fiction, this is a loop mankind is trapped in: creating technology that enables us to knock ourselves back to the Stone Age? There are stories of a great, technical civilisation existing ~80,000 years ago. Maybe there have been several? In the overall lifespan of a habitable Earth of around 600 million years it's not implausible, and human artifacts would disappear over geological timescales.

    Anyway, intelligentish drones and robot dogs are something out of a nightmare, especially in the hands of those we currently have 'leading' us. They all show signs of megalomania and no moral compass.

  18. Simon Harris Silver badge

    I'm not so worried about AI controlled weapons directly killing us...

    I'm more worried that AI generated extremist claptrap and the algorithms that target this stuff will polarise society to the extent we'll do it ourselves.

  19. hammarbtyp

    I cannot see this ending well...

    The biggest risk to humanity is the ability to allow politicians to fight proxy wars with little risk to themselves.

    This started soon after the Second World War, when there was a general feeling in the US that destroying a country with nuclear weapons was perfectly acceptable if it meant no US soldiers would get hurt. Later it was realised that nuclear weapons were not a no-risk option, but then they invented drones, which meant you could take out the "bad" guys with minimal risk to yourself, and all the administrations from Bush onwards have taken that option.

    Now the next step is to remove man from the loop altogether. Drop a load of autonomous drones over a battlefield and then wash your hands of it.

    War should be the last resort, but unfortunately all this is making wars affordable. What is forgotten, however, is that what you give, you will receive, and it won't take long before the next Bin Laden wannabe realises that the same technology could be used to attack the West with little or no consequence to them.
