Hawking, Musk, Woz (and others): Robots will kill us all

Notables of the technology world including physicist Stephen Hawking, biz baron Elon Musk and techno-hippy Steve Wozniak have teamed up to warn us all about the menace of killer robots. In an open letter and petition, the distinguished trio and their co-signatories warn: Autonomous weapons select and engage targets without …

  1. Yugguy


About as much use as a chocolate teapot. A rubber crutch. A fart in a wind tunnel.

    Good excuse for the picture though, well done.

  2. Your alien overlord - fear me

    Well done Lewis

    for getting a picture of Talulah Riley in one of your articles.

  3. Anonymous Coward

    Typical knee-jerk reaction over nothing

    I'll take a Tomahawk missile over all of these buffoons.

    1. Zog_but_not_the_first

      Re: Typical knee-jerk reaction over nothing

Be careful what you wish for...

  4. Mage Silver badge
    Black Helicopters


What, then, one wonders, is the real motive of these otherwise reasonably smart people in signing such a pointless petition?

    1. Your alien overlord - fear me

      Re: Excellent

Er, John Connor came back from the future to warn them - haven't any of the Terminator documentaries warned you enough?

    2. JLV

      Re: Excellent

Musk & Hawking have both warned about AI risks. There is a strong relationship between a capable, non-munition weapon (i.e. one that shoots, not one that is shot, as per Lewis's examples) operating autonomously and the pursuit of an AI capability that is aggressive towards humans.

I believe they are correct to engage in the debate but unlikely to have much success. If there is ever a significant US-China conflict, I expect drones to be a strong component of it, especially from the Chinese end. They don't have the military expertise/tradition of the US, but they do have the engineering, and they have incentives to follow unconventional methods to bypass US dominance. Another form of asymmetric warfare, but this time between top-tier opponents.

Ultimately we need to get better at not wanting to kill each other rather than hoping weapons restraint will do the thinking for us. Though there are plenty of examples of arms control working - cluster bombs, landmines, NBC.

      1. AceRimmer1980

        Re: Excellent


        1. JLV

          Re: Excellent



          <insert sheepish icon>

      2. Lynrd

        Re: Excellent

If there is ever a significant US-China conflict, I expect drones to be a strong component of it, especially from the Chinese end.

        I would expect more from the US end - we have a technological head start. Also, lots more people in China - People are cheaper than drones.

    3. Anonymous Coward

      Re: Excellent

Because some of them, when everyone else is kept from developing autonomous machines, will be the ONLY player in the market when a future need arises. Musk gets huge bank from launching USAF spy birds, and is in on autonomous car development. When the dedicated drone makers fall away, he'll have the practical experience and capability to be the Halliburton of autonomous warfare.

  5. MyffyW Silver badge
    Paris Hilton

    Scanning through I misread "Kalashnikovs of tomorrow" the first time round and thought it was a new MTV show. Sequel to "At home with the Kalashnikovs", maybe?

    Paris, for the non-existent reality TV angle.

  6. W Donelson

    Hint: They are smarter than you, and anyone at Vulture Central, by a long shot

    so pay attention.

    1. Anomalous Cowshed

      Re: Hint: They are smarter than you, and anyone at Vulture Central, by a long shot

      "Hint: They are smarter than you, and anyone at Vulture Central, by a long shot"

      I would fix this sentence as follows:

      "Hint: They are more famous and supposedly smarter than you, and anyone at Vulture Central, by a long shot"

      1. Pascal Monett Silver badge

        Hawking is smart.

        Compared to him, the rest are just famous.

        1. MrXavia

          What makes him smart, is that he admits his mistakes!

        2. Roj Blake Silver badge

          "Hawking is smart.

          Compared to him, the rest are just famous."


          You may say that, but I've won as many Nobels as him.

  7. Banksy


    I for one would be interested in discussing this issue further with Ms Riley.

  8. Mark 85

    Spot on..

I believe Lewis is dead on. Autonomous means many things to many people. An ICBM can be considered autonomous: fire and forget about it. The same goes for many other weapons. The fear is that these weapons might launch themselves. To use ICBMs as an example, one could design a system that would fire them under a set of conditions; if any condition is a false positive, there's a problem, and not just on the receiving end.

I think that is the issue: the launch of the system. And by system, to use Lewis's example, the Tomahawk: they are autonomous only after the target area is pre-loaded and someone hits the "Launch" button.

    1. Anonymous Coward

      Re: Spot on..

I'm pretty sure Homo sapiens was the first autonomous weapon, though some other species of Homo may have us beat on that... Or E.T.

    2. Anonymous Coward

      Re: Spot on..

      Yeah. Aren't most bullets also "fire and forget"? :)

  9. John Savard

    Some Grounds for Worry, But...

    While I wasn't very positively disposed to this call for action in the first place, and I defer to the expertise of the article writer as well, I can understand that a future where people could take the CPU out of a smartphone and use it to control a killer robot built from a Meccano set would give some people pause.

    I guess the real question is: would specifically military development of killer robots bring that future closer to a greater extent than the ordinary improvement of computers for civilian purposes? Someone evil-minded could take bits and pieces developed for benign purposes and cobble together a deadly killer robot out of them once the available bits and pieces become good enough, I'd think.

    1. Anonymous Coward

      Re: Some Grounds for Worry, But...

      I doubt most civilian drones can carry heavy weapons right now, but there isn't a huge difference between a remote controlled drone and an autonomous weapon. Both keep the operator well out of harm's way. And certainly they'd both do just as well at terrorizing an unarmed populace. The only real difference is if the commander tells an operator what to destroy or if he tells the machine directly (and probably still through an intermediate tech.)

I'm not sure if this is supposed to be another part of their anti-AI crusade or if they just don't understand that war evolves as technology does.

    2. Anonymous Coward

      Re: Some Grounds for Worry, But...

You don't need an evil mind to cobble together a deadly killer robot; I thought that was a standard geek pastime? (Sure, I never intend to build a deadly killer robot, but my son thinks every robot needs a chainsaw attached to it...)

  10. ian 22

    Too late...

Self-driving automobiles are due to appear commercially soon, and if I can envision self-driving car bombs and self-driving lorries loaded to the gunwales with ANFO, so can others.

    Off-the-shelf killer robots for the asking.

  11. cat_mara

    [Struggles to work the phrase "autonomous drone" into a salacious phrase involving Ms Riley, fails]

    I'm sorry, this has never happened to me before.

    It's late and I'm tired.

    Can we try again later?

  12. Gannon (J.) Dick

    Perhaps these are actually human problems.

    Ban the weapons, great idea.

Oh, Elon, Steve, Steve, and you too Talulah, before you leave riddle me this ...

If the sun never sets on the Empire, how is the Emperor to know whether the productivity lapse, unpredicted by the Empire's business model, is due to robots goofing off or slaves sleeping?

    I might also say that that criminal terrorist mastermind Pinocchio has been under arrest since those rude remarks about the Emperor's new clothes.

    The puzzled emperor seems a decent sort, not one to regard citizens as slaves.

Could you smart people help him out with a clever theory about robots involuntarily robot-ing in their sleep?

It is much better than admitting slave owners exist, and sleep well.

  13. The little voice inside my head

    More than what AI might "think"

It is about who uses it. The Terminator movies are about machines taking over and the human population being oppressed. With AI advancing to new levels, governments that have this advantage might just decide to "get rid of them" (the terrorists, the people who think differently), in simple terms, and off they go. These governments would be like the Machines themselves... not accepting any other way of thinking but the "oppressor's".

How about developing AI to find better ways of producing energy? What causes wars and the need for a military? Resources.

    Could AI teach us new ways of using resources?

    Anyway that's not what the article is about, but I think the scientists are foreseeing misuses (I consider killing a misuse of technology, but so were the described "revolutions").

  14. dan1980

    "A cruise missile, such as a standard Tomahawk or Stormshadow/Scalp, is autonomous from the moment it is launched. It flies to a location where its target is thought to be, but it does not simply crash on that location: it takes a digital picture at the scene and decides whether something that looks like a legitimate target is in the picture.

    If the missile's software decides there is such a something, the target is struck - and one Tomahawk, equipped with many canisters of munitions which can be deployed separately, can attack multiple targets at different locations."

    Well and good.

BUT, there is a key difference here. A cruise missile is targeted at a specific area that has been designated to be fired upon. As I understand it, the target-detection capabilities of the modern Tomahawk are there to allow the missile to be re-targeted if the intended target is no longer at the designated location.

    One big problem when talking about this area - of robots and AI and drones and smart munitions - is that the same few terms can be used to mean several different things. It's important, therefore, to ignore the specific terms used and find out exactly what it is that is being discussed.

    I believe that the potential weapons being discussed are a world away from cruise missiles and there is indeed a danger there.

    The point is that miniaturisation and commoditisation of technology has the potential to change it into a different beast and it's not necessarily helpful to point out how some form of the technology has been with us for decades already.

Movie cameras have been around for a century, but that doesn't mean that tiny, ubiquitous cameras dotting every building and street corner and bus and train and shopping centre and school are nothing to worry about.

    The fact that one can record days of high-quality video and audio using a tiny, cheap, easily concealed device that is accessible remotely and can store that video footage indefinitely, on cheap, long-lived media, well, that raises new concerns.

And that is the deal here. It's not that autonomous weapons systems aren't available; it's that technological progression may put these capabilities in the hands of anyone, regardless of means. (Within reason!)

  15. Allan George Dyer

    It's already too late...

    We've seen video of a quadcopter-mounted handgun recently and facial recognition software is freely available. How difficult would it be to release a van-load of quadcopters programmed to "proceed to beach, shoot faces" and drive quietly away before the carnage starts? A bomb would be much simpler, but the terror effect of murderous drones on shaky cameraphone video would be much greater.

    The limiting factor is the availability of guns. A drone can't be the "Kalashnikov of tomorrow" until you mount a Kalashnikov on it.

    1. Lysenko

      Re: It's already too late...

      >>We've seen video of a quadcopter-mounted handgun recently

Seriously? Maybe a recoilless air pistol, but there is no way in hell any quadcopter of the sort typically discussed by drone paranoiacs is going to withstand the forces of discharging a conventional firearm, even if totally randomised aiming and instant loss of control/destruction of the platform is deemed acceptable. Bomb delivery via Amazon-type drone tech, maybe - firearms: no chance.

      1. Julz

        Re: It's already too late...

        1. Lysenko

          Re: It's already too late...

          You do realise that is a notorious fake, right? (Quite well done though, I admit)

          1. oldcoder

            Re: It's already too late...

            I think the only "fake" part was embedding some detonators in the dummies.

      2. Anonymous Coward

        Re: It's already too late...

To me, mounting a gun on a drone sounds like a challenge, but it is in no way impossible. Some drones available in toy shops are very large now, and then there are the serious amateur drones, and the homebrew scene. Whether the drone survives after firing is a moot point if its job is done.

        I certainly would not use the phrase "no way in hell" for this.

  16. MrXavia

    Pointless, as soon as we have humanoid robots capable of independent action we are doomed...


They will invade our homes, cook our food, clean our houses, wipe our asses and have sex with us!

    There will be no need to fire a shot, the human race is doomed to extinction by artificial snoo snoo.

    1. Anonymous Coward

      I for one welcome artificial snoo snoo if it does all that too.

  17. NanoMeter

    Boston Dynamics

Are you sure it's not robots from Boston Dynamics they are warning about?

  18. Anonymous Coward

    Humans are sequence heads.

  19. DCLXV


    It seems unfeasible to expect that any sort of hypothetical moratorium on autonomous weaponry could be enforced.

    Therefore, it follows that the best option would be shoring up defenses against such technology. Ah, but how can an organization/state that has banned research of such technology expect to know enough about the technology to develop defenses against it?

    Perhaps the best way forward is the status quo, an eternal arms race. Seems more sensible than expecting human nature to radically change.

    1. oldcoder

      Re: Ugh

No different than Pope Innocent II banning crossbows in 1139 (but only against Christians...)

      Didn't work then, won't work now.

  20. Michael H.F. Wilkinson

    "Perhaps these are actually human problems."

    SSH! Don't tell the robots, they might decide this is reason enough to get rid of us.

  21. Alistair


    What is this that keeps trying to hack my server?

  22. Stuart Castle Silver badge

    I think Lewis missed the point.

I don't think the likes of Hawking and Musk are concerned about the fact that today's weapons can pick a target then guide themselves to that target without human intervention. I daresay they are aware of that (hell, Musk has been working with NASA, so could well have military connections he wouldn't be allowed to broadcast). These weapons don't (AFAIK) generally launch without SOME human intervention, even if it's just a human telling them to launch.

I think what they are concerned about is giving that launch authority to a machine, so it can decide to attack people based upon its own arbitrary conditions. I don't know their full capabilities, but can today's missile systems go from being stood down (or however they normally are during peacetime) to launch without any human intervention at all?
