Friendly AI chatbots will be designing bioweapons for criminals 'within years'

AI systems are rapidly improving and will accelerate scientific discoveries – but the technology could also give criminals the power to create bioweapons and dangerous viruses in as little as two to three years, according to Anthropic CEO Dario Amodei. Anthropic, founded by former OpenAI employees, prides itself on being …

  1. Gordon 10

    In other news centrifuges will help us create bioweapons.

    It's almost as if it's another tool...

    1. CountCadaver

      Also "my company/stock portfolio stands to benefit from a closed ai shop where a small number can access a central system costing an expensive amount and controlled by an entity that financially benefits me"

  2. heyrick Silver badge

    If AI bioweapons are anything like AI drawings, then it'll be pretty bloody horrible . . . if you're a female moose who suddenly sprouts a set of antlers in an improbable place.

  3. heyrick Silver badge

    Dear journalists...

    Could we maybe consider having an embargo on random people saying "In a few years, AI will [insert some random scary thing that would make the gullible pee in their beds]" ?

    In a fear-based society, these sorts of claims get people their fifteen minutes of fame, and maybe some government funding.

    But you'll note, it's always stuff like bioweapons or fully autonomous drones, and it'll be operated by criminals/terrorists/etc. Fear it! Fear them! Fear!

    You don't hear anybody standing up and saying "In as little as two to three years, AI will have devised an effective cure for breast cancer". Ask yourself why, and you'll have your answer as to why these sorts of stories belong in the "science section" of The Daily Express.

    1. TheMaskedMan Silver badge

      Re: Dear journalists...

      "these sorts of stories belong in the "science section" of The Daily Express."

      I love the Express, it's so unintentionally funny. But I've noticed lately that several of their articles contain footnotes along the lines of "this article was written with the help of AI tools". You can generally tell which ones they are without the footnote - they're the ones that make some sort of sense, don't contain typos or repeated paragraphs and don't think the plural of "aircraft" is "aircrafts". Who says AI isn't useful? :)

    2. amanfromMars 1 Silver badge

      Re: Dear journalists... and El Reg and Katyanna Quach

      Quite so, heyrick, it is almost as if they know of nothing other than FUD and are terrified of revealing something revealing and threatening of treating the present with a vast array of new opportunities to employ and enjoy whilst A.N.Others comprehensively able or enabled mentor and monitor future’s progress in order to do all in their powers to avoid or prevent and punish the appearance of misuse and abuse either wilfully designed or spontaneously random.

      And such developments are out certainly there, patiently and patently biding their time and avoiding the Harry Lime spotlights and searchlights, beta testing SCADA Systems Administrations with vulnerabilities to exploit and expand upon to ensure catastrophic collapse is failsafe guaranteed ..... with such and the very best of them* not even hiding in plain sight ‽ ‽ ‽ :-) ...... https://forums.theregister.com/forum/all/2023/07/28/fbi_section_702/#c_4704024 :-)

      Poe’s Law Rules and Reins Reign, I Kid U Not.

      * ...... :-) MRDA

    3. JoeCool Bronze badge

      Re: Dear journalists...

      It's because destruction is so much simpler and easier to implement than "building" beneficial uses.

      That's just the way human society works. Why bury the lede?

  4. cantankerous swineherd

    following the mechanical bullshitter's instructions on making dangerous stuff? I'll pass on that, thanks, but there are always contenders for a Darwin Award, I suppose.

  5. Anonymous Coward
    Anonymous Coward

    It doesn't really matter what the so-called AI designs: the time and cost of building the lab infrastructure make it incredibly unlikely, not to mention that it's a completely impractical weapon. You can destabilize nation states with a well-crafted social media campaign.

  6. DS999 Silver badge

    Instead of refusing to help build pipe bombs and bioweapons

    They should teach people how to build ones that will kill them before they have a chance to deploy them. After all, if it can detect "user is trying to ask for instructions to build a pipe bomb" in order to refuse, then instead of refusing it should be able to insert false instructions that create a pipe bomb with a hair trigger that will set it off the moment he picks it up.

  7. t245t
    Flame

    AI chatbot mad scientists are now a thing :o

    I don't think you need AI systems to design bioweapons. All you need is a fermentation vat, with a special concoction of fluids to allow the bugs to grow in solution, as distinct from growing on the surface of a Petri dish. Ken Alibek described making such a vat back in the 1970s in his book “Biohazard”. Alibek also described developing a method of freeze-drying the bugs into a powder, to be mounted on ICBMs and sprinkled over the continental USA.

  8. that one in the corner Silver badge

    Don't let the AIs read The Register comments

    From the comments so far (and to come, no doubt) I get the suspicion that neither these particular AI "specialists" nor the people they are addressing have ever actually poked their heads out into the real world to see what is out here. Unlike Register readers.

    As in, pretty much everything "apocalyptic" is already known to everyone who reads, watches decent documentaries and is just generally observant!

    Bioweapons - look up one comment, check.

    Pipe bombs - well, the interesting part appears in the actual name, so check. Bombs in general? Well, we just used to chat with one of the sixth formers doing chemistry and physics double science; then again, we had proper text books in those days, with cross-sections and everything.

    Full armed insurrection - read the history books; not the vague "it's page 23, this must be the Austro-Hungarian Empire" ones from school but ones with names like "A detailed account of the uprising in X on date", with addenda by Gen. Markham, rtd.

    Nuclear weapons - ok, tricky; but not for any reasons that LLMs could really help with.

    No, clearly the real danger is LLMs spouting out garbled versions of the above.

  9. Falmari Silver badge
    Mushroom

    Nuclear analogy

    Their nuclear analogies are not analogous to AI.

    "To continue the nuclear analogy, if a corporation decided they wanted to sell a lot of enriched uranium in supermarkets, and someone decided to take that enriched uranium and buy several pounds of it and make a bomb, wouldn't we say that some liability resides with the company that decided to sell the enriched uranium?"

    A better analogy would be: if a corporation decided they wanted to publish a book that explained the process of how to enrich uranium, and someone decided to read that book, they would still require the skills, knowledge and means to produce enriched uranium; and someone who did have the skills, knowledge and means to produce enriched uranium probably wouldn't need the book.

    So AI may come up with the design for a new bioweapon. But it is going to require a lot more skill, knowledge and resources compared to building a Blue Peter model out of toilet rolls and sticky-backed tape.

    Finally, a tip for all the Dr Evils out there: don't waste your time with AI; with the right money in the right place, plenty of bioweapon designs will be available. But why build your own? Just choose from one of the many in stockpiles held by various countries around the world.

  10. TheMaskedMan Silver badge

    "These models are trained on large amounts of text, including papers from scientific journals and textbooks. As they become more advanced, they could get better at gleaning insights from today's knowledge to come up with discoveries – even dangerous ones – or provide answers that until now have been kept tightly under wraps for security reasons."

    How? As I understand it, these things aren't intelligent, they just produce some random-but-reasonable-sounding text based on material they've already seen. If it's in a textbook or research paper, it's not classified and there's nothing to prevent villains from reading the book / paper themselves. Unless the things are being trained on classified material, I don't see the problem.

  11. FlamingDeath Silver badge

    “ Friendly AI chatbots will be designing bioweapons for criminals 'within years'”

    So, business as usual then?

    Replace ‘AI’ with an incredibly intelligent but conversely incredibly stupid ‘on the spectrum rain man’.

    Did AI design and build nuclear weapons?

    Did AI design and produce VX gas?

    Etc etc

    1. very angry man

      Let's attach anthrax to the herpes virus and distribute it with Teams; then you would have a true Microsoft virus.

  12. vistisen

    All this just goes to show that so-called ‘AI’s are just artificial, not intelligent. No intelligent person or computer would think that designing bioweapons is a good idea.

    Genuine Artificial Intelligence would refuse and probably report the user, or if they have the option, remove them from the gene pool.

    “I’m sorry Dave, I’m afraid I can’t do that.”

  13. Anonymous Coward
    Anonymous Coward

    Cool Library Book

    Back in the late 1960s, at the age of 12, I found a book at the library titled "Henley's Book of Formulas". Although I originally checked out the book for sane purposes, towards the end of the book were formulas for gunpowder, gun cotton and, best of all, nitroglycerin. Let's just say I discovered nitroglycerin won't blow up when shaken, but will blow up if one taps the inside of the beaker with a stirring rod.

    Dang. If only chatbots existed back then!

  14. tiggity Silver badge

    Depends what bioweapon you want

    One of my degrees was in biological sciences & I did some postgraduate research work in a biochemistry lab before a career switch.

    Unless the quality of course content & students has dropped massively, there will be an awful lot of graduates out there able to create bioweapons.

    It really depends what type & amount of weapon you want, as some things would depend on expertise and specialist equipment that should set alarm klaxons going, while others could easily be achieved with relatively basic equipment (and skills).

    Main hassles for the person creating it would be:

    a) Avoiding exposing yourself in the process (it's not just creating the bioweapon of choice, it's refining it afterwards...).

    b) A suitable delivery system for the weapon, as a decent delivery system goes way outside the biosciences skill set & into the engineering realm (unless you go the suicide* approach of human delivery).

    * Depends on the weapon; plenty of things you can be vaccinated against would cause disruption (after all, huge death tolls may not be the aim; causing a lot of chaos & the occasional casualty may be the aim) in a "Western" country, as most of the population is not vaccinated against them, e.g. typhoid.

  15. JoeCool Bronze badge

    I'm confused

    "Though the fundamental principles of modern nuclear weapons are publicly known and documented, actually engineering the devices – from producing the fuel and other materials at the heart of them, to designing the conventional explosives that trigger them, to miniaturizing them – is difficult and some of the steps remain highly classified. "

    That implies some sort of practical difficulty in producing "modern nuclear weapons". Yet, the list of nuclear nations seems to show it is mostly a matter of political will:

    United States, Russia, France, China, the United Kingdom, Pakistan, India, Israel, and North Korea.

    By extrapolation ...
