Yay, nay, or ouch?
It's interesting but unfortunate (imho) that the current AI hype bubble is driving this focus on high-density electricity production, especially through thermal technologies. I can't help but think that what is truly needed are better ways to extract controlled electron flows directly from the otherwise disordered inner state of matter, possibly through new metamaterials that act as one-way valves, or diodes, at the level of elementary pseudo-particles/waves (quantum or not). Maybe harvesting nuclear radiation through photovoltaics could work there to some extent (or somesuch)?
The days of LLM tech (with its wasteful energy consumption) as the lead AI prospect are probably numbered as well. For example, it seems from Thomas Hubert and team (a name almost as nice as "Bert Hubert") that the relative success of DeepMind's AlphaProof at the Math Olympiads is mostly thanks to its use of classical AI, namely L∃∀N (for formal mathematical reasoning) and tree search (some version of the A* algorithm?), coupled with whatever Test-Time Reinforcement Learning (TTRL) is. It may match Gary Marcus's "neurosymbolic techniques" perspective as well as that of DARPA's Shafto, and it clearly doesn't work at all without the classical AI part, period.
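For flavor, this is the kind of machine-checkable statement Lean verifies (a toy theorem of my own, nothing from AlphaProof itself, which targets vastly harder Olympiad problems):

```lean
-- Toy example: a formally verified statement in Lean 4.
-- Once the kernel accepts it, the proof is guaranteed correct;
-- that guarantee is what makes Lean useful as the "classical" half.
theorem add_comm_toy (a b : Nat) : a + b = b + a :=
  Nat.add_comm a b
```

The point being: an LLM can only ever propose such proofs, while the symbolic checker is what actually certifies them.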
Accordingly, datacenters that "will exceed 400,000" GPUs sound like a huge waste of resources if their focus is on running LLMs. If they consume 150x what El Capitan does and yet don't produce 150x the computational oomph (at proper FP64 for HPC, and INT64 for classical AI), then they are a huge waste, full stop. A proper 150x El Capitan would crank out 300 FP64 ExaFLOPs, which with MxP might translate to 3.0 ZettaFLOPs, and finally allow high-resolution climate simulations at Earth scale (among others). Granted, the ICON team received the Gordon Bell Prize for climate modelling yesterday for "Computing the Full Earth System at 1 km Resolution" on JEDI, Alps, and Jupiter, but using other physically based models, or pushing that resolution further (e.g. to predict traveling-wave derechos and traveling-swirl tornados), still mandates Zettascale computing (iiuc).
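The back-of-envelope math above, spelled out (assuming El Capitan at roughly 2 FP64 ExaFLOPs and the common ~10x speedup that mixed-precision HPL-MxP runs tend to show over FP64, both round figures of mine):

```python
# Napkin math for a hypothetical 150x El Capitan machine.
el_capitan_ef64 = 2.0               # ExaFLOPs, FP64 (rough figure)
scale = 150                         # hypothetical scale-up factor
fp64_exaflops = scale * el_capitan_ef64   # -> 300 ExaFLOPs FP64
mxp_zettaflops = fp64_exaflops * 10 / 1000  # ~10x via MxP, in ZettaFLOPs
print(fp64_exaflops, mxp_zettaflops)  # 300.0 3.0
```

Obviously real scaling is never that linear, but it gives the order of magnitude the climate folks would need.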
Oh, and (almost unrelated) the other Gordon Bell Prize this year went to the tsunami-prediction research covered here back in August by Tobias ... cool stuff (imho)!