SC25 gets heavy with mega power and cooling solutions

Hydrogen-fueled gas turbines, backup generators, and air handlers probably aren't the kinds of equipment you'd expect on the show floor of a supercomputing conference. But your expectations would be wrong. At SC25, datacenter physical infrastructure took center stage with sprawling dioramas of evaporative cooling towers and …

  1. HuBo Silver badge

    Yay, nay, or ouch?

    It's interesting but unfortunate (imho) that the current AI hype-bubble is leading to this focus on high-density electricity production, especially through thermal techs. I can't help but think that what is truly needed are better ways to extract controlled electron flows directly from the otherwise disordered inner-state of matter, possibly through new metamaterials that act as one-way valves, or diodes, at the level of elemental pseudo-particles/waves (quantum or not). Maybe harvesting nuclear radiation through photovoltaics could work there to some extent (or somesuch)?

    The days of LLM tech (with its wasteful energy consumption) as lead AI prospect are probably numbered as well. For example, it seems from the work of Thomas Hubert and team (a name almost as nice as "Bert Hubert") that Gemini's AlphaProof's relative success at Math Olympiads is mostly thanks to its use of classical AI, namely using L∃∀N (for formal mathematical reasoning) and tree search (some version of the A* algorithm?) to do its do, coupled with whatever Test-Time Reinforcement Learning (TTRL) is. It may match Gary Marcus's "neurosymbolic techniques" perspective as well as that of DARPA's Shafto, and clearly doesn't work at all without the classical AI part, period.
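    [For readers unfamiliar with the tree-search component mentioned above: the A* reference is the commenter's guess, not a confirmed detail of AlphaProof. A minimal generic A* sketch, assuming an admissible heuristic `h` and a `neighbors` function supplied by the caller, looks like this:]

    ```python
    import heapq

    def a_star(start, goal, neighbors, h):
        """Minimal A* search. `neighbors(n)` yields (next_node, cost) pairs;
        `h(n)` is an admissible estimate of the remaining cost to `goal`.
        Returns the lowest-cost path as a list, or None if unreachable."""
        # Frontier entries: (f = g + h, g, node, path-so-far)
        frontier = [(h(start), 0, start, [start])]
        best_g = {start: 0}  # cheapest known cost to reach each node
        while frontier:
            f, g, node, path = heapq.heappop(frontier)
            if node == goal:
                return path
            for nxt, cost in neighbors(node):
                g2 = g + cost
                if g2 < best_g.get(nxt, float("inf")):
                    best_g[nxt] = g2
                    heapq.heappush(frontier, (g2 + h(nxt), g2, nxt, path + [nxt]))
        return None
    ```

    [With `h = lambda n: 0` this degrades gracefully to Dijkstra's algorithm; the heuristic is what makes proof-tree search tractable at scale.]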

    Accordingly, datacenters that "will exceed 400,000" GPUs sound like a huge waste of resources if their focus will be on running LLMs. If they consume 150x what El Capitan does and yet don't produce 150x the computational oomph (at proper FP64 for HPC, and INT64 for classical AI) then they are a huge waste, full stop. A proper 150x El Capitan would crank 300 FP64 ExaFLOPs, which with MxP may result in 3.0 ZettaFLOPs of performance, and finally allow for high-resolution climate simulations at Earth-scale (among others). Granted the ICON team received the Gordon Bell Prize for climate modelling yesterday for its "Computing the Full Earth System at 1 km Resolution" on JEDI, Alps, and Jupiter, but using other physically-based models, or enhancing that resolution further (e.g. to predict traveling wave derechos and traveling swirl tornados), still mandates Zettascale computing (iiuc).
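    [The back-of-envelope figures above check out, assuming roughly 2 FP64 ExaFLOPS for El Capitan and an ~10x mixed-precision (MxP) speedup — both the commenter's implied round numbers, not official benchmarks:]

    ```python
    # Commenter's assumed round numbers (not official TOP500/HPL figures):
    el_capitan_fp64_exaflops = 2.0   # ~2 FP64 ExaFLOPS assumed for El Capitan
    scale_factor = 150               # "150x El Capitan"
    mxp_speedup = 10                 # assumed mixed-precision gain

    fp64_exaflops = el_capitan_fp64_exaflops * scale_factor    # 300 ExaFLOPs
    mxp_zettaflops = fp64_exaflops * mxp_speedup / 1000        # 3.0 ZettaFLOPs

    print(fp64_exaflops, mxp_zettaflops)  # 300.0 3.0
    ```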

    Oh, and (almost unrelated) the other Gordon Bell Prize this year is for the Tsunami prediction research covered here back in August by Tobias ... cool stuff (imho)!

  2. Rich 2 Silver badge

    Groan…

    “When complete, OpenAI's first Stargate datacenter will exceed 400,000 Nvidia GPUs consuming 1.2 gigawatts of power.”

    …which might be justified if it were going to be put to use on something a tad more useful than outputting broken code, plagiarising some books, or "nudification" — like some pressing global issue or other. But alas.

    Oh, and if the hydrogen turbines are an attempt to make the whole thing look "green", it's worth pointing out (in case you didn't already know) that the vast majority of hydrogen is extracted from fossil fuels and is definitely not "green".

    1. Jimmy2Cows Silver badge

      Re: Groan…

      Shame it's not 1.21 gigawatts (sorry, err... Jiggawatts) and then it could disappear back in time up its own fundament.
