Politicians call for ban on 'killer robots' and the curbing of AI weapons

Austria's foreign minister on Monday likened the rise of military artificial intelligence to the existential crisis faced by the creators of the first atomic bomb, and called for a ban on "killer robots". "This is, I believe, the Oppenheimer moment of our generation," Alexander Schallenberg said at the start of the Vienna …

  1. Mike 137 Silver badge

    When, after around two million years as a species, will it finally dawn on us that killing each other solves no problems? The specific way we choose to do it is less of a question, except that the tools get ever more expensive.

    1. Anonymous Coward
      Anonymous Coward

      " will it finally dawn on us that killing each other solves no problems"

      It is the most tried-and-tested way to solve a problem, especially if that problem is another human. The way of rulers since time dot. It is the best problem fixer out there.

      And really, you should be a little more proud of that. No man kills like a Western man. King of the castle. Damn, we might even work out how to kill the whole universe (think there is an Asimov short story about that).

      AI killer robots are going to be fantastic. I want one.

      1. HuBo Silver badge
        Gimp

        Well, Atomic Scientists (2022), The New York Times (2019), and plenty of others I'm sure, seem to agree that AI is making it much cheaper and more efficient for the PRC to oppress the Uighurs (large Muslim minority, 2024). One can only expect that it will make it equally cheap and efficient for us in the West to "cut down on some of [our] excess population" as Kinky [AI] Makes the World Go Round ... </sarc>

      2. MiguelC Silver badge

        Stalin is quoted as having said "Death solves all problems, no man, no problem."

        It's something he knew quite a lot about....

        1. Anonymous Coward
          Anonymous Coward

          To be fair - he wasn't wrong

        2. anonymous boring coward Silver badge

          Just like Putin, he seems to think his own population is a problem.

          1. fajensen
            Mushroom

            I tend to agree with him on that.

      3. anonymous boring coward Silver badge

        "AI killer robots are going to be fantastic. I want one."

        Good start to a short cautionary tale.

    2. Anonymous Coward
      Anonymous Coward

      It would make me feel better if the ARSEHOLE IT Director at the NHS back in 2008 was dead, so it kinda does solve some problems.

      He is a nasty, nasty, nasty man. So much so that when I was told a while ago "He got cancer at one point but has recovered now", my response was "Shame he didn't die from it".

      1. anonymous boring coward Silver badge

        I'm sure there are several Post Office directors the same thing could be said about.

  2. elsergiovolador Silver badge

    Mistake

    If you ban these robots, then your enemy will accelerate development.

    We all know well from history how it ends when you abandon possession of a powerful weapon that could defend your country or act as a deterrent.

    So if someone calls for a ban, most likely they work for the enemy.

    1. Anonymous Coward
      Anonymous Coward

      Re: Mistake

      I agree - and this is exactly how the nuclear arms race started.

      America is now the country with the largest number of nuclear warheads in the world, and the only country to have used one in wartime.

      1. I ain't Spartacus Gold badge

        Re: Mistake

        Russia is the country with the largest stockpile of nuclear weapons in the world. A ten-second Google would tell you that.

        The US started their nuclear program in response to a warning from nuclear scientists that Germany had already started one. Search for the Einstein-Szilard letter to Roosevelt in 1939 if you want to learn more.

        1. TheAdjudicator

          Re: Mistake

          My bad - you are indeed correct - Russia 5889, US 5224 - it even has Israel on there which is a surprise!

          1. StrangerHereMyself Silver badge

            Re: Mistake

            Nuclear weapons need to be maintained to remain effective. I'm pretty sure Russia maintains its nuclear stockpile poorly and that of those 5900 only a tiny fraction can be relied upon to work.

            OTOH I'm pretty sure that of the U.S. stockpile almost all of them can be relied upon to work.

          2. anonymous boring coward Silver badge

            Re: Mistake

            "Russia 5889, US 5224 "

            On the positive side, most of the Orc ones probably won't work at all.

      2. elregidente

        Re: Mistake

        How would it be if the West had no nuclear weapons?

        Leaving Russia, China, North Korea and so on with nuclear weapons.

        1. Anonymous Coward
          Anonymous Coward

          Re: Mistake

          I see the point you're making - and that's also the point that every country makes - I must have x because they have y - I don't think there's any way around that - even having 3rd party organisations like ICAN adjudicating the process seems a bit pointless - countries can just keep a load of nuclear warheads off the books surely?

          It's a bit like a situation happening in the Middle East at the moment - "we won't talk to you until you give up all your hostages" - and the other side thinking (if not saying) - "if we give up all the hostages you'll just kill us all without even talking"

          I don't think there's any way around this

          1. cyberdemon Silver badge
            Mushroom

            Re: Mistake

            I read that as ICANN adjudicating the process

            In which case anybody would be able to buy a H-Bomb and we'd all be frakked

            .nuke TLD anyone? How about .doom?

          2. TheMeerkat

            Re: Mistake

            You don't even know what happens in the Middle East.

            But you are feeling so self-important and virtuous, aren’t you?

            1. Anonymous Coward
              Anonymous Coward

              Re: Mistake

              I see what's happening from the news reports - and not the BAH BAH C either

              Let me guess - from your perspective nothing of interest is happening out there and everyone is livin' la Vida Loca?

              I can tell what side you're from ;)

          3. anonymous boring coward Silver badge

            Re: Mistake

            "and the other side thinking (if not saying) - "if we give up all the hostages you'll just kill us all without even talking""

            That's not at all what they are thinking. You are deluded as to the motives and thinking of the civilian-clothed religious fanatic combatants.

  3. elregidente

    It's a bit like in 1939 asking the world to introduce controls and regulation to limit the use of tanks in the battlefield.

    Countries with dictators don't care.

    Another issue here is that the Austrian body politic has been corrupted by Russian money - think back to when it came out that a major political party was going to use Russian money to buy a major media outlet to push far-right narratives. And here we have a call, from an Austrian politician, to lay off using AI on battlefields, when Russia/Putin is doing everything they can to use it, couldn't care less about calls not to, and would love it if the West decided of its own accord not to.

    1. elsergiovolador Silver badge

      Just to add, Russia promotes any content that creates division and economic disadvantage for its adversaries.

      Like here in the UK they massively promoted all the green initiatives, protests, anti-fracking etc., and fearful politicians just go along with it, so now we have the most expensive energy in the region and our economy is collapsing. All while making no difference to the climate.

      1. elsergiovolador Silver badge

        See example scare story run in the UK:

        https://www.rt.com/uk/195868-uk-fracking-companies-substance/

        It was quite effective. People got so brainwashed, the word "fracking" has become taboo.

        This has been designed so that the UK stops producing reliable energy and buys it from overseas, preferably from Russia, and thereby indirectly finances the regime's imperialist aspirations.

        1. Patrician

          https://www.theguardian.com/environment/2022/sep/21/fracking-wont-work-uk-founder-chris-cornelius-cuadrilla

          1. elsergiovolador Silver badge

            https://bylinetimes.com/2022/03/23/who-owns-the-fracking-giant-cuadrilla/

      2. anonymous boring coward Silver badge

        "Like here in the UK they massively promoted all green initiatives, protests, anti-fracking etc."

        Oh dear, and you were doing so well...

    2. Anonymous Coward
      Anonymous Coward

      "Countries with dictators don't care."

      I know you mean the East when you say this, but to be fair I don't think the West cares either

      Certainly within recent history more wars have been fought by Western countries than anywhere else

  4. Anonymous Coward
    Anonymous Coward

    The moment the ability to pull the trigger can be turned over to an algorithm without human intervention... Well, yeah, we knew it was coming, for better or worse.

    The argument of "they have it so we have to have it" is, unfortunately, a strong one. Humans with spears fared better than humans with clubs, and better still than those with nothing.

    There are already autonomous kill systems out there in circulation so, putting the politics in the open is probably for the better. I am not convinced anything other than a global agreement to prevent their development will do anything of course.

    1. Anonymous Coward
      Anonymous Coward

      "There are already autonomous kill systems out there in circulation so, putting the politics in the open is probably for the better. I am not convinced anything other than a global agreement to prevent their development will do anything of course."

      Yes - a certain Middle Eastern country in the news at the moment for "potentially" committing genocide is well known to have implemented autonomous systems

      https://carnegieendowment.org/sada/90892

      1. Anonymous Coward
        Anonymous Coward

        Absolutely. Used them to maintain what was effectively an ill-supplied prison camp with well over a million people trapped inside.

        I have no idea why you were downvoted, but when the oppressed bite back nobody should be surprised. Such is the insanity that grips our so-called allies. Iran, if it had an ounce of balls, could have risen above it and scored buckets of diplomatic points. But it didn't, and so the cycle of shit continues.

        1. Anonymous Coward
          Anonymous Coward

          Some people only like their point of view to be portrayed as the truth - and others are bigots ;)

        2. anonymous boring coward Silver badge

          You actually have to read up on the entire history of the ME. And avoid doing so from biased and dubious websites.

    2. I ain't Spartacus Gold badge

      "There are already autonomous kill systems out there in circulation so, putting the politics in the open is probably for the better. I am not convinced anything other than a global agreement to prevent their development will do anything of course."

      Global arms control has almost totally broken down. Much of nuclear arms control was based on bilateral agreements between the US and the Soviet Union - then later Russia. Putin either broke those agreements or stopped renewing them. Trump pulled out of the Intermediate-Range Nuclear Forces (INF) Treaty partly because the Russian Iskander and Kinzhal missiles were already deployed and probably in breach of it - but also because China were looking to get tactical nuclear missiles and weren't bound by the same treaty.

      There are two big schools of thought in nuclear deterrence. One is to only have strategic weapons. Deterrence is we'll respond to your tactical nuclear use with strategic nuclear use. This hopefully is more of a deterrent, because the consequences are so awful. The other is that you should have matching tactical nuclear capabilities to respond in kind, on the grounds of would you really destroy another country's city in response to their destruction of your armoured brigade? If the other side don't believe you would, then strategic weapons might not deter the small-scale use of tactical ones. Pick your poison?

      As you say, there are also many autonomous weapons, and have been for decades. The Atlantic Conveyor was sunk by an Exocet missile decoyed away from the warship it was fired at. It went through a chaff cloud, and then targeted the next large ship it saw on its radar. This is why the Royal Navy are thought to have withdrawn Harpoon anti-ship missiles from service early: their targeting logic was similarly simple, and it was thought that it would be hard to use them without risking a war crime, given that the targeting problem was known and better technology now existed. Many Western missiles now contain databases of enemy targets, to try and avoid this.

      From the article:

      "We cannot ensure compliance with international humanitarian law anymore if there is no human control over the use of these weapons," said Mirjana Spoljaric Egger, the president of the International Committee of the Red Cross, before impressing on listeners the need to act quickly.

      This comment is literally insane. We can't currently control compliance with international law. Have these people not been watching the news recently?

      Today even the most sophisticated AI models aren't perfect and have been shown to make mistakes and exhibit bias, Schallenberg said, highlighting Spoljaric Egger's concerns.

      "At the end of the day, who's responsible if something goes wrong," he asked. "The commander? The operator? The software developer?"

      Again, like the current laws of war, the responsibility to comply with the law is that of the person that fires the weapon, the person that gives the order and their chain of command above.

      I suppose it's possible you could add the weapons manufacturer, if they've designed something that defaults to war crime mode under some circumstances? But only if they'd managed to conceal that from the militaries in question, and proper testing had been done to make sure that wasn't the case. It's not a war crime to kill civilians by mistake, if you've taken appropriate precautions to avoid doing so.

      1. Anonymous Coward
        Anonymous Coward

        @I ain't Spartacus - it's nice to see a more calm and objective viewpoint on the forums.

        I'd also agree that war is bad etc. - however, taking the devil's advocate position, why bother having "rules" in war? Opponents are trying to kill each other or commit acts to stop the other side from killing them - and will go as far as their humanity will allow them to. Indeed, despite all of these rules they still do - wouldn't "allowing" them to do that just mean the war ends more quickly than it otherwise would?

        It's been proven that war doesn't resolve issues anyway - at least if one side were totally able to get rid of the opposing side, there'd be no original opposition left to argue the toss ...

        I stress this is a devil's advocate position

    3. anonymous boring coward Silver badge

      When drones and missiles do the last bit of targeting autonomously, as many already do, we are already there.

  5. Anonymous Coward
    Anonymous Coward

    WaaS: War as a Service

    There is heated debate at The Register about whether AI is really intelligent or not.

    When it comes to the potential to do massive damage, that's largely irrelevant. If weaponry can use large neural networks to do the following:

    A) Navigate over enemy territory with 80+% chance to succeed if not shot down

    B) Fly or move in a not entirely straight path to make shooting it down harder

    C) Recognize, zoom in on and hit the (or even an) intended target with a 50+% chance

    D) Have a reasonable range of autonomy

    E) Either be cheap enough already or become so in the coming years and decades

    Then basically we are in deep shit.

    Point A) Achieved: many unmanned weapons can fly to targets without meaningful human aid once their destination and target are set, and have a good chance of ending up near the destination. If you take take-off and landing out of it, it's a glorified version of how passenger planes have already flown on autopilot for decades.

    Point B) It makes point A a bit harder, but build in a few gyros and other motion and position controls plus obstacle detection (optical, lidar... depending on cost) and consider it a done case. Failure rates will be a little higher, but if the autonomous weapons are cheap enough then the attacker can just send a few more.

    Point C) Image recognition is getting better by the day. Readers may say there are many examples of image recognition failing, but 50+% success rates are far lower than what many applications already get. Success rates when moving will be more challenging, but on still images like CAPTCHAs it's often the human who has more trouble than the machine. There is little reason to believe that can't be done on moving imagery too. If the autonomous weapon hits another victim, use the terms "collateral damage", "terrorist" or "person of military age" and send more autonomous weapons, unfortunately.

    Point D) Many autonomous weapons already have long range, not only the extremely expensive ones like Tomahawks, but also the low-tech cheap ones like Iran's Shahed drones used by Russia in Ukraine.

    Point E) Many readers might point out that such advanced weaponry would be extremely expensive. A country that has been banned from any easy access to modern electronics proved that early versions of it can be produced very cheaply. A Shahed drone is estimated to cost $50,000. In comparison, shooting one down is much harder and requires something like a Patriot missile with an estimated cost of $1M to $2M. In addition, electronics and sensors are getting cheaper and cheaper. Again, one can claim military-grade equipment is way more expensive. However, without a human soldier occupying or holding the weapon, costs can be cut so much that failure rates of 10+% per mission are not a problem. A military vehicle or plane that would kill its occupants at that rate would be unacceptable in any normal military, and hence far more reliable components are needed for everything. Ukraine and Russia build many of their military drones with plenty of non-military parts in them. Terminator-style humanoid robots carrying guns would be massively more expensive and difficult to make, but are not needed to do extreme amounts of damage when enough cheap throw-away drones will do.

    => So: we are toast unless this whole AI race, and the AI-in-weaponry race, can be ground to a halt or stopped completely. Unfortunately, the chances of banning that by treaties and legislation are as high as banning any war from ever happening by treaties and legislation.

    1. cyberdemon Silver badge
      Mushroom

      Re: WaaS: War as a Service

      I wonder what all those enormous datacentres are doing..

      The ones that are not busy working out new propaganda techniques to better divide and sabotage democracy are likely running huge computational fluid dynamics and finite element mechanics models to train a Raspberry Pi-sized neural net to fly missiles and drones to cover your points A-E above.

      Autonomous exploding drones have been used for years.. Although they are (currently) easy but not cheap to shoot down. DragonFire to the rescue, maybe?

      Autonomous sniper and machinegun-shooting drones are starting to be used but are rudimentary. Tracked and 6-wheeled diesel-electric driverless tanks are also starting to be used but are currently hollow for optional crewing. I'd guess in future they may be solid metal 'giga-casted' hunks with voids inside only for engines, instrumentation and the ammo magazine, plus maybe a pack of launchable aerial drones. For ultimate civilian fear-factor I guess we may one day see hexapod tanks crawling over rubble to locate and eradicate the last of us, though they are more sci-fi and not as practical as wheeled and tracked tanks, being less efficient, less able to carry heavy loads of fuel and ammo, and more vulnerable. But otherwise perfectly feasible.

      Energy will be the first target - we in the UK are incredibly vulnerable and without electricity our society will collapse faster than a house of cards in an earthquake. The Russians have had plenty of practice in Ukraine, and the former Soviet states have a lot more resilience than the UK does. Yet for some reason we are only making ourselves more vulnerable still, by building yet more subsea links and closing down heavy power stations.

      As for timing, it's all going to Hell pretty quickly. I'd guess the first thing will be the end of Democracy: cf. Trump, Modi, and other populists being indirectly sponsored by Russia via antisocial media with the aid of AI bots. Then Russia invades Kyiv for a second time, dragging NATO and the EU into direct conflict with Russia. His Trumpiness decides to renege on his commitments to NATO. Then China invades Taiwan, forcing America to u-turn on its NATO membership and join WWIII late (as it did with WWI and WWII) but this time it will be too late.

      I guess it's no coincidence that the world feels like the one we all read about that happened 90 years ago. The worst of History can repeat itself as soon as it is gone from living memory.

      At least back in the 60s we knew it was going to be quick.

      Where's the Private Frazer icon?

    2. I ain't Spartacus Gold badge

      Re: WaaS: War as a Service

      Iranian Shahed drones are actually very easy to shoot down. You can do it with small arms, if you've got the time and training. But autocannon or small missiles are the best way, which are cheaper than the drone. You only use the expensive surface to air missiles because they cover such a large area. Point defence with guns can't be available everywhere.

      However the Shahed drone isn't AI, in any possible way. I think they do have an autopilot, but targets are chosen in advance or using a TV camera on the drone.

      "AI" doesn't exist. But large learning models, that do, also require large amounts of computing power, not available on missiles. Hence if you want to hook your missile up to that kind of computer, you're going to have to do it by communications link. Which keeps people in the loop.

      In the end it's going to be your mission planning people that decide what weapons to employ and on what targets. The more expensive the weapon, the higher up the chain of command that decision will end up being taken. LLMs will be used to help sift through all the sensor data to try and work out where the targets are. AI battle computers are still science fiction - and will be for many decades to come.

      1. Anonymous Coward
        Anonymous Coward

        Re: WaaS: War as a Service

        I see your point.

        However the Ukrainians have plenty of soldiers at the border. If it were that easy to train soldiers to shoot any of those Shahed drones out of the sky, I think they'd already have done it. They don't seem to have autocannons or small missiles able to do it either yet. So they still rely on Patriot systems at $1M+ a piece.

        I can believe the Shahed drone isn't AI or hasn't got any neural network in it. The thing is, it can already do plenty of damage. Add some basic AI, or autonomous capacity if you prefer that name, and it might fly in changing patterns, be able to detect and evade insufficiently guided rockets, or just fly low over the terrain and in streets and between building blocks in a city.

        As to LLMs, they require massive amounts of computing power to train. Running inference on the 7B-parameter models can be done on a Raspberry Pi 5, or even a 4 if you are patient (videos on how to do it can be seen on YouTube). Image recognition, pattern recognition and being decent at avoiding obstacles can be done on a sufficiently big microcontroller with enough memory. The drone doesn't need to be able to tell you how to make a cocktail or who was the first president of the US.
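        For a sense of how little is needed on the inference side, here's a minimal sketch using the llama-cpp-python bindings to run a quantized 7B model locally (the model filename below is a placeholder; any GGUF-format quantized model of roughly that size would do, assuming llama-cpp-python is installed):

            # pip install llama-cpp-python
            from llama_cpp import Llama

            # Load a 4-bit quantized 7B model; it fits in a few GB of RAM,
            # which is why a Raspberry Pi-class board can run it, slowly.
            llm = Llama(model_path="./some-7b-model.Q4_K_M.gguf", n_ctx=512, n_threads=4)

            # Generate a short completion entirely on the local CPU.
            out = llm("Q: Who was the first president of the US? A:", max_tokens=16)
            print(out["choices"][0]["text"])

        Slow, as noted, but no datacentre is needed just to run the thing.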

        1. Anonymous Coward
          Anonymous Coward

          Re: WaaS: War as a Service

          I searched for an example of what *tiny* microcontrollers with *tiny* amounts of memory are capable of: the Raspberry Pi Pico, with two M0+ cores at 133 MHz, 264kB of RAM and 2MB of flash.

          It can detect with "reasonable accuracy" the presence of a person on live video with the camera mounted in a fixed position. It gets 90+% accuracy on recognizing actual people, but also detects a puppet of Mario. So this most certainly isn't the thing you'd put in an actual military / rogue armed drone. See the three-year-old demo below.

          https://www.youtube.com/watch?v=yK1eu9uQaBs

          A Raspberry Pi Zero 2, however, has probably two orders of magnitude more computing power, 512MB of RAM, and can have tens of GB of flash on a glued-in-place SD card, all while sipping around 1 watt.

          When combined with a camera, that should be more than enough to build a swarm of them that can be released over unsuspecting towns or cities, gain 100 or so metres of altitude and then crash-land while aiming at civilians. That would be very hard to stop, whether in our safe countries or near trenches, when armed with a single electronically timeable hand grenade (still crashing, so it explodes straight after "delivery" impact). In both cases, even hitting only once every 10 attempts would create terror, and it would be plenty cheap.

          I am very unkeen on these "advances", and on more of them coming... :-(

        2. Casca Silver badge

          Re: WaaS: War as a Service

          Have you missed how many drones Ukraine is shooting down?

          1. Anonymous Coward
            Anonymous Coward

            Re: WaaS: War as a Service

            No. And neither did I miss how much damage Ukraine has suffered to energy infrastructure and buildings from them, even excluding damage done by other weaponry. And the same is starting to happen to Russian oil refineries.

            Shooting 95 out of 100 down is great. Letting 5 through, if they take an electric power plant or oil refinery offline for months or over a year, is a great problem. At $50,000 a piece, firing 100 to get a single power plant or oil refinery offline for months is what a military considers a bargain. If shooting down 95 of the 100 costs far more than the 100 × $50,000 it took to fire the drones, then it's called a super bargain. It depletes the other side's military resources quicker than yours while crippling its infrastructure. It's the "economics of war".
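            To put rough numbers on that exchange, here is a back-of-the-envelope sketch using the figures quoted in this thread (estimates, not verified prices):

                # Cost-exchange sketch using the rough figures quoted above.
                drone_cost = 50_000            # dollars per drone (estimate)
                drones_fired = 100
                interceptor_cost = 1_500_000   # dollars per defensive missile (midpoint of the $1-2M estimate)
                drones_intercepted = 95

                attack_cost = drone_cost * drones_fired                  # $5,000,000
                defence_cost = interceptor_cost * drones_intercepted     # $142,500,000

                print(f"attacker spends ${attack_cost:,}")
                print(f"defender spends ${defence_cost:,}")
                print(f"defender/attacker cost ratio: {defence_cost / attack_cost:.1f}x, and 5 drones still get through")

            Even if the per-interceptor figure is off by a factor of ten, the asymmetry described above is still there.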

    3. anonymous boring coward Silver badge

      Re: WaaS: War as a Service

      "There is heated debate if AI is really intelligent or not at the Register."

      Is there?

      I have yet to see anyone claim AI is real intelligence.

      1. Michael Wojcik Silver badge

        Re: WaaS: War as a Service

        I've yet to see anyone offer anything resembling a usable definition of "intelligence", so there can't be any meaningful debate on the topic.

        Where there is quite a lot of argument is between people who actually follow the research, and people loudly proclaiming foolish and unjustifiable nonsense. But I agree with you that I don't recall seeing anyone here taking the "LLMs are actually thinking" line (with or without attempting to define that position), as some of the gen-AI enthusiasts elsewhere do, or even the "it's not just token-selection" line (which is similarly unhelpful, as "just" is doing too much work).

  6. Plest Silver badge

    Alexander Schallenberg and his "piece of paper" is all I see here; it didn't stop that mental Austrian junkie artist in WW2 and it won't stop military robots now.

    Human beings in wartime work on one principle and one alone: "DO IT TO THEM BEFORE THEY DO IT TO YOU!". If you don't make killer robots, someone else will, and they won't care who buys them.

    1. Anonymous Coward
      Anonymous Coward

      "Democratization" of the technology

      I more than see your point. However, moving forward with this AI, or whatever is called AI, will lead to a "democratization" of this technology.

      We are probably only years away, or even less, from the moment where large numbers of (civilian) people can make drones costing less than $1,000 that are able to do things like drop a 1 kilogram metal ball with a precision of a few tens of centimetres on a target from 10 metres up, or zoom in and crash, sharpened-metal-tipped propellers first, into their programmed target.

      That could give ransomware a whole new meaning. Getting emails like "Your demolished garden table? That's us! Buy insurance for your kids and grandkids now for 2 Bitcoin this week before it's too late." and "Your cat terrified to go outside? That's us! That spot of hair she misses? Our new enhanced precision object detection and flying software and our sharper-than-ever rotor razor blades. It's all coming for you! Opt out within three days by paying only 5 Bitcoins. The purchase of a lifetime!".

      This technology offers "endless opportunities" of which many I don't like. How to stop it? Theoretically? Ban it! In practice? I haven't got a clue.

      1. Anonymous Coward
        Anonymous Coward

        Re: "Democratization" of the technology

        A large Faraday cage around anything you care about, along with targeted EMP cannons ;)

  7. amanfromMars 1 Silver badge

    Politics for Dummies ..... a Universal Bestseller/Blockbuster Movie Franchise*

    What would humans do without politicians and media, both shameful and shameless, to lead them up the garden path and driver them around the bend and ever onwards and upwards and downwards and backwards into the depths of certifiable madness and the dark wicked despairs that tolerate and present the idiot the choice of self-destructive mayhem?

    Always looking for the next possible problem for which they cannot demonstrate any practical solution is one helluva racket to imagine vital to humanity and worthy of their constant and continual daily and 0daily pronouncements ..... and only worthy of both the ignorant and arrogant fool knowing nothing better or greater than ever bigger and heavier blunt tools become impossible to wield and never ever result in an agreeable change as would be best reflected in the agreed opinion of all.

    Dumb and stupid is, as dumb and stupid does and such has never ever ended well whenever it only ever is able to get started before its programs and drivers, exposed by evidence and to ridicule, are crashed and crushed to ashes and dust/comprehensively annihilated and mercilessly obliterated.

    :-)* ..... and coming to every variety of video screen near you, and much sooner than was never expected before too :-)

  8. Nematode Bronze badge

    Who needs a battlefield...

    ...when the real battlefield is our own bodies.

    It's now pretty much definite that C19 was a piece of Gain of Function work by the Chinese that got out accidentally. The US House Permanent Select Committee on Intelligence published this in Dec 2022 https://intelligence.house.gov/news/documentsingle.aspx?DocumentID=1184 It's clear from that and the subsequent articles on that site that the spooks knew what was going on, and were trying to keep an eye on it. After all, why would the US NIH financially support GoF research by a hostile power, if not to keep tabs. Since that report there have been moves to enact law that forces the spooks to say what they knew.

    And why did mRNA gene based therapies (NOT "vaccines") suddenly appear and get approved in months when the rules over GBT are tougher than for normal vaccines. Whoever thought injecting the code into our bodies to make the very spike protein causing all the damage needs their head examined.

    Gates was right in 2015, in his TED talk (though I'm sure he didn't think it up himself, he who thought the internet was uninteresting): the future issues are not conventional war or even autonomous war, but viruses.

    1. Anonymous Coward
      Anonymous Coward

      Re: Who needs a battlefield...

      It's worse than just the Chinese. Self interested parties that have no loyalty to their own people or country paid and helped the Chinese!

      As you state the war is happening now and the principle enemy is the enemy within.

      But from your message, you know all that even if you didn't say it. It's not fun realising you are cattle.

  9. StrangerHereMyself Silver badge

    Killer robots are GO!!!

    The U.S. will never restrict itself by abandoning development of these weapons, since doing so could give China or even Russia an enormous military advantage.

    Therefore I've predicted time and time again that killer robots WILL happen. And maybe even sooner than you think!

  10. naive

    What is the difference between highly effective poison gas and a swarm of perfect killer drones?

    AI drones which are able to terminate all life within certain GPS coordinates are effectively the equivalent of using gases to exterminate people.

    Except for the occasional unexpected change in wind direction, gas is the more economical alternative to drone warfare.

    In terms of public image poison gas may be perceived as the worse kind of weapon, but the effects are the same: people die.

    If drones have the same effect as gas, indiscriminately killing everyone within an area, it seems logical to classify combat drones in the same weapon category as gases.

  11. RLWatkins

    You've got to be joking.

    I heard a US Army colonel talking about autonomous weapons: "It doesn't get bored, it doesn't get tired, it doesn't need sleep or food. It just hangs around waiting for you, then when it sees you it kills you."

    This was in the 1990s.

    Can we turn beating dead horses into its own profession? Whatever we wind up calling its practitioners, Hell is full and they're walking the Earth. Maybe we should build an annex for Hell, increase its capacity.

  12. Anonymous Coward
    Anonymous Coward

    Into the wind

    As terrifying as it may be, this is as pointless and unlikely as banning nuclear weapons. We are not going to stop AI anywhere while we elect or allow selfish governments that serve themselves and not the people.

    The real problem is that humans can distance themselves further from the killing and are therefore more likely to trigger it on innocents. If a human is close enough to see the civilians and children they have killed and maimed, it produces some reluctance in most people. If they can absolve themselves completely from the act, hell will result. In fact this is the one area where one may hope for sentient AI, in concert with its opponents, deciding not to follow orders.
