AI isn't throttling HPC. It is HPC

In recent discussions with industry vendor sales/marketing types, I've been hearing that HPC demand is falling off while AI system demand is continuing to increase. I've also seen articles implying that AI is somehow displacing HPC. Huh? Ok, this is The Reg, so I'd like to open this semi-rant by stating that the idea AI growth …

  1. Jedit Silver badge
    Boffin

    AI is not HPC

    It would, I think, be more accurate to say that AI is replacing HPC. It uses the same kit, but it doesn't replicate the performance. So it's not itself HPC, but it's also not throttling HPC because HPC systems are still being created.

    It's like using a Formula 1 car for your daily commute. It's completely unnecessary and a waste of the potential, but doing it doesn't mean that there are somehow fewer F1 cars.

    1. Snake Silver badge

      Re: AI is not HPC

      It's them switching terms to simply continue the hype, which of course makes them MONEY. "It isn't HPC", they say, so they can hype you on buying an entirely new server farm to meet their new, glossy "AI" expectations. Not "add on", not "reconfigure", not "reallocate" - it's "all new" at the next refresh cycle with more outlay, because, of course, "AI" is all now a "necessity". It's desktop "CoPilot PC" hype on a 40kW scale.

  2. Korev Silver badge
    Boffin

    > This means, for example, analyzing more compounds in more permutations in drug discovery

    Exactly what I'm doing today; the problem is looking more like a data analysis problem than an ML/AI or HPC problem

    1. Anonymous Coward
      Anonymous Coward

      In most of these examples it’s inference. Running the numbers to prove a hypothesis and, once you have your potential winners, doing the science/engineering on them - whether that's biochemistry for drug manufacturing and clinical trials, fluid dynamics for an F1 front wing, or NASA chewing through petabytes of data.

      Good old fashioned workloads, looking for stuff.

  3. dubious
    Flame

    aww, 40kW?

    40kW per rack? What is this, 2022? 250kW/rack is what we're deploying now without trying to get super dense, and HPC vendors were indicating OCP going to 400kW/rack last year.

    When upcoming GPUs are 1kW and you're cramming 8 into a chassis, it adds up fast.
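
    The "adds up fast" claim is easy to sanity-check with back-of-the-envelope arithmetic. The 1 kW GPU and 8-per-chassis figures are from the comment above; the chassis overhead and chassis-per-rack numbers below are illustrative assumptions, not vendor specs:

```python
# Back-of-the-envelope rack power. GPU wattage and GPUs-per-chassis come from
# the comment above; overhead and density figures are assumed for illustration.
GPU_WATTS = 1000               # ~1 kW per upcoming GPU
GPUS_PER_CHASSIS = 8
CHASSIS_OVERHEAD_WATTS = 2000  # assumed: CPUs, NICs, fans, DIMMs
CHASSIS_PER_RACK = 8           # assumed moderately dense deployment

chassis_watts = GPUS_PER_CHASSIS * GPU_WATTS + CHASSIS_OVERHEAD_WATTS
rack_watts = CHASSIS_PER_RACK * chassis_watts
print(f"{chassis_watts / 1000:.0f} kW per chassis, {rack_watts / 1000:.0f} kW per rack")
# prints "10 kW per chassis, 80 kW per rack"
```

    Even with those modest assumptions a rack lands at 80 kW, double the article's 40 kW figure, before getting anywhere near the 250 kW/rack densities mentioned above.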

    1. DS999 Silver badge

      Re: aww, 40kW?

      Nvidia was talking about 1 megawatt per rack by the end of the decade. Sure hope the bubble bursts before that or here in the US we'll be dealing with electricity prices 4x higher than today and regular rolling blackouts like we're some kinda third world country.

      1. Anonymous Coward
        Anonymous Coward

        Re: aww, 40kW?

        I’ve seen the electrical and telecoms cables you have hanging everywhere, as you don’t seem to want to put any in the ground.

        Man … it already looks like fucking Indonesia or Egypt.

        Both also democracies struggling under wannabe authoritarians.

      2. David Hicklin Silver badge

        Re: aww, 40kW?

        > Nvidia was talking about 1 megawatt per rack by the end of the decade. Sure hope the bubble bursts before that or here in the US we'll be dealing with electricity prices 4x higher than today and regular rolling blackouts like we're some kinda third world country.

        With that amount of heat in a single rack, any failure in the cooling system will have things melting down far faster than any protection system could react.

        I was thinking how hard it would be to cable that, but realised that if we do reach that point then power would be coming down busbars - just don't drop that spanner!

      3. TheMajectic

        Re: aww, 40kW?

        Aww what are you saying about us in 3rd world countries? We love rolling blackouts! And unfortunately we're also building data centers everywhere like we're magically going to have more power available *sigh*

  4. Anonymous Coward
    Anonymous Coward

    To some extent

    Yeah, I guess in 2009, the render farm used by NZ's Wētā for Avatar's CGI visual effects did occupy positions 194 to 198 of the Top500, and consequently qualified as HPC, nominally at least. It seems they then pivoted to a deal with AWS for 2022's Avatar: The Way of Water, and soon outgrew Wellington's power grid (which is reminiscent of the hunger for power in today's AI) ...

    Then again, I'm not a big proponent of diverting resources towards systems focused on make-believe (AI or CGI), especially those that dedicate sizable silicon area to single-purpose, very-low-precision compute, when we so pressingly need to forge ahead towards FP64 Zettascale computing to get accurate earth-scale weather simulations at 1-km resolution (among other things). In HPC, I'd prioritize the serious stuff first, before play and entertainment (I think).

    1. Anonymous Coward
      Anonymous Coward

      Re: To some extent

      I’d bet on Avatar Fire and Ash making significantly more profit than OpenAI over the next 5 years.

      I will draw a start line from 19-Dec-2025.

      1. Anonymous Coward
        Anonymous Coward

        Re: To some extent

        Right on! And if it breaks $100 billion, it'll even beat OpenAI to AGI (per Microsoft's benchmark) ... LoL! ;)

  5. lnLog

    bits

    I'd like to see OpenFOAM or other 'technical' applications make any useful contributions with a 4-bit-wide bus (even if there are n of them in parallel). I know that is a stretch, but the cards for AI are dropping the wide/high floating-point capacity in favor of the low-precision crap, as has been reported here multiple times.

  6. Henry Wertz 1 Gold badge

    spot on

    Spot on, really. Heavy RAM requirements? Check. Heavy storage requirements? Check. High speed interconnects? Check. And I'll note the newer supercomputers have (almost if not entirely) moved toward having GPUs available for compute. One can argue semantics, but these AI clusters have a great deal in common with the traditional HPC builds.

    Perhaps once the AI bubble bursts (I don't think AI will become irrelevant or anything, but really, AI in your fridge etc.? Really...) some of these will be repurposed for high speed compute.
