What's going on with Eos, Nvidia's incredible shrinking supercomputer?

Nvidia can't seem to make up its mind just how big its Eos supercomputer is. In a blog post this month re-revealing the ninth most-powerful publicly known supercomputer from last fall's Top500 ranking, the GPU slinger said Eos was built using 576 DGX H100 systems totaling 4,608 GPUs. That's about what we expected when the …

  1. Yet Another Anonymous coward Silver badge

    Cos nobody gives a flying fsck

    In the good old days, Cray would announce their latest big box, everyone below it on the list would buy one, and everyone would laugh at the foreigners for coming in at number 99.

    Now nobody cares whether your shed full of GPUs is bigger than some other government lab's shed full of GPUs.

    1. b0llchit Silver badge
      WTF?

      Re: Cos nobody gives a flying fsck

      Of course you care about the sheds full of power-sucking devices.

      You only get to boast about your numbers and performance if they are scaled by the amount of "free" power you are able to extract, combined with the number of cash registers you are allowed to plunder at nanosecond intervals.

      1. NoneSuch Silver badge
        Joke

        Re: Cos nobody gives a flying fsck

        Token, "Yes, but can it run Crysis?" reference, just to get it out of the way.

        1. ITMA Silver badge

          Re: Cos nobody gives a flying fsck

          No no no no...

          Can it run SPACE INVADERS!!!

          :)

          1. I am David Jones Silver badge

            Re: Cos nobody gives a flying fsck

            No no no no no no…

            Does it have E. coli displays???

  2. Anonymous Coward

    C'mon, let's have some journalism

    In my book, the job of the journalist is to understand and sift the issues and then, equally importantly, to write up the important and urgent matters in terms that, say, 90% of their audience can understand... and then there's this:

    "Oh, and if you're not familiar with the term AI exaFLOPS, it's a metric commonly used to describe floating point performance at lower precision than you'd typically see in double-precision HPC benchmarks, like LINPACK. In this case, Nvidia is arriving at these figures using sparse 8-bit floating point math, but another vendor, like Cerebras, might calculate AI FLOPS using FP16 or BF16."

    1. Richard 12 Silver badge

      Re: C'mon, let's have some journalism

      A few definitions would be nice.

      Links are fine, but just listing a few terms feels a little lazy.

    2. HuBo Silver badge
      Pint

      Re: C'mon, let's have some journalism

      It's the Wild West out there in AI land. No agreed-upon performance standard, just whichever one puts a bootlegger's moonshine in the best light (I reckon this one thar will drive you blind, in 10 seconds flat!).

      Equally scary is that, with no official info on the fastest Chinese exaflopping machines, analysts are forced to make assumptions and extrapolate from related publications, only to then see Chinese documentaries (or news pieces?) quote those extrapolations to describe how solidly their HPC effort is going (e.g. see the 11:35 mark in this YouTube video, which comes from a reader comment on a recent analysis piece: "according to reporting by The Next Platform").

      It will be most important for El Reg and The Next Platform to dispatch their HPC experts over to Hamburg, Germany, this May 12-16 for ISC'24 so that we can start getting some real answers to these questions (IMHO) -- first hand, from the horses' mouths!

      (Not like last year where it seems they flew to Frankfurt instead!)

  3. Snowy Silver badge

    Maybe they sold some of the DGX H100 GPUs in order to reduce order delays?

  4. EricB123 Silver badge

    Subscription Required?

    Maybe the missing exaflops are only available if you take out something like a high-performance subscription?

  5. devin3782
    Coat

    According to Jensen, the more you buy, the more you save... so my guess is that was the problem: by adding so many cores they saved too much money, so they reduce the core count, they save less, and ngreedia makes more.
