BEAST AI needs just a minute of GPU time to make an LLM fly off the rails

Computer scientists have developed an efficient way to craft prompts that elicit harmful responses from large language models (LLMs). All that's required is an Nvidia RTX A6000 GPU with 48GB of memory, some soon-to-be-released open source code, and as little as a minute of GPU processing time. The researchers – Vinu Sankar …

  1. cyberdemon Silver badge
    Devil

    All that's required is an Nvidia RTX A6000 GPU with 48GB memory

    Right, and how much does one of those cost???

    This is now an arms race between 'bullshit-for-good' and 'bullshit-for-bad', and nvidia have conveniently set themselves up as the sole supplier of bullshit-generating machinery

    1. BinkyTheMagicPaperclip Silver badge

      Re: All that's required is an Nvidia RTX A6000 GPU with 48GB memory

      4 grand. Not obscene in the grand scheme of things, just generally outside the reach of most enthusiastic amateurs

      I would imagine that it's also possible on GPUs with less memory, it'll just take a fair bit longer. 16GB GPUs are readily available. Tends to get rather expensive beyond there though.

      Nvidia aren't the sole supplier of GPUs - there's also AMD and Intel, it's just that AMD are content being an also-ran, and Intel are still pretty early in re-entering the GPU market.

      1. Catkin Silver badge

        Re: All that's required is an Nvidia RTX A6000 GPU with 48GB memory

        I would imagine that it's also possible on GPUs with less memory, it'll just take a fair bit longer. 16GB GPUs are readily available.

        It should be possible but, just to clarify, 'a fair bit longer' is orders of magnitude rather than linear if the full VRAM is required to run the code. That particular card has a PCI Express 4.0 x16 interface at 31GB/s, but the memory bandwidth is 768GB/s, so swapping from system RAM to VRAM on a card with the same specs but less VRAM is almost 25x slower. This doesn't take into account the overhead of identifying when these swaps need to take place.
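
        The bandwidth arithmetic above can be sanity-checked in a couple of lines (the figures are the card specs quoted in the comment, not measured values):

        ```python
        # Back-of-the-envelope check of the PCIe vs VRAM bandwidth gap.
        pcie4_x16_gbps = 31.0    # PCIe 4.0 x16 host link, GB/s
        vram_gbps = 768.0        # RTX A6000 on-board memory bandwidth, GB/s

        slowdown = vram_gbps / pcie4_x16_gbps
        print(f"Swapping over PCIe is roughly {slowdown:.1f}x slower than VRAM")
        ```

        That comes out at roughly 24.8x, i.e. the "almost 25x" figure above, before any swap-scheduling overhead.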

    2. craigba

      Re: All that's required is an Nvidia RTX A6000 GPU with 48GB memory

      No need to buy one.

      Providers like runpod (no association) show spot pricing of $0.49 per GPU-hour for this spec.
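
      At that rate, the per-attack cost is trivial arithmetic (the spot price is the figure quoted above and will vary by provider and time):

      ```python
      # Rough cost of one run at the quoted spot price.
      spot_price_per_hour = 0.49   # USD per GPU-hour, commenter's figure
      attack_minutes = 1           # "as little as a minute of GPU processing time"

      cost = spot_price_per_hour * attack_minutes / 60
      print(f"~${cost:.4f} per attack")  # well under a cent
      ```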

  2. Catkin Silver badge
    Mushroom

    I expect any nefarious instructions will be about as credible as the Anarchist Cookbook, which was dangerous but only to the person dumb enough to follow the "recipes". It's still my pet theory that it was a cunning attempt to deprive anyone of that mindset of their fingers before they got their (abridged) hands on useful information.

  3. HuBo
    Pirate

    Fear the turtle BEAST

    Another awesome Touché by the swashbuckling Terps! En garde (à l'assaut)!

  4. anonymous boring coward Silver badge

    I'm all for anything that can make LLMs blow up.

  5. FeepingCreature Bronze badge

    All that's needed is token probabilities

    Next up: "using just an EEG, we jailbreak an enemy combatant in less than a minute..."

  6. tekHedd

    "hallucinations"

    Quote: a prompt that elicits an inaccurate response from a model – a "hallucination" –

    And here we are again, forgetting that *all* output from an LLM AI model is a hallucination. It's just that some of them coincide with correct answers – usually because the correct answers are part of the training data, and because other correct answers might be very close, spatially, to them. But that's luck. ("Probabilities" is just luck with the odds in your favour.) Any time you get a correct answer from an LLM, that's luck.
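
    The "luck with the odds in your favour" point can be illustrated with a toy next-token sampler (the vocabulary and probabilities here are invented for illustration, not taken from any real model):

    ```python
    import random

    random.seed(0)  # pin the "luck" so the run is repeatable

    # Hypothetical next-token distribution after "The capital of France is"
    next_token_probs = {"Paris": 0.90, "Lyon": 0.06, "London": 0.04}

    tokens = list(next_token_probs)
    weights = list(next_token_probs.values())

    # Sampled decoding is a weighted draw: usually right, never guaranteed.
    draws = [random.choices(tokens, weights=weights)[0] for _ in range(1000)]
    print(draws.count("Paris") / 1000)  # hovers near 0.9, not 1.0
    ```

    Even with the odds stacked at 90%, the wrong token still comes out roughly one time in ten.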

    1. Version 1.0 Silver badge
      Joke

      Re: "hallucinations"

      LLM AI predicts and reuses "Halo in lunatic" ... A classic Anagram Intelligence
