The Biggest and The Best
Whichever creep is president --- including any of the runners-up if either old fool conks it --- will immediately order a superior supercomputer.
Europe is to build four Nvidia-Intel-powered supercomputers, one of which will be the most powerful super yet built for AI applications, the GPU giant reckons. That top-end machine, nicknamed Leonardo, is expected to reach 10 exaFLOPS albeit at FP16 precision; supercomputers tend to be benchmarked using FP64, though FP16 is …
Fair point, and kudos to Nvidia and Jensen for stepping in to help rather than sitting on their hands, but the Italian machine is >20x the AI performance of Cambridge-1. I would respectfully suggest that that is not only "not in the same league" but a completely different sport.
That top-end machine, nicknamed Leonardo, is expected to reach 10 exaFLOPS albeit at FP16 precision
All four computers are capable of running simulations at higher and lower precisions, including FP64 and FP32 as well as bfloat16 and 8-bit integer.
I don't get it. Does it mean that the first sentence gives the number in FP16 precision just to get an awesome headline figure, but then at runtime FP64 might be used? If that's the case, isn't this a scientific, EU-funded project? Shouldn't they avoid this kind of marketing trick in this context?
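For what it's worth, that's exactly the gap: a headline FP16 figure and a usable FP64 figure can differ by an order of magnitude or more, because GPU FP16 throughput is far higher than FP64. A quick back-of-the-envelope sketch (the 16:1 FP16-to-FP64 ratio below is a made-up illustrative figure, not anything from the article or a published spec; real ratios vary by GPU architecture):

```python
# Back-of-the-envelope: what a headline FP16 figure might imply at FP64.
# The ratio is purely illustrative -- actual FP16:FP64 throughput ratios
# vary by architecture and are not stated in the article.

def fp64_estimate(fp16_exaflops: float, fp16_to_fp64_ratio: float) -> float:
    """Scale a peak FP16 figure down by an assumed precision-throughput ratio."""
    return fp16_exaflops / fp16_to_fp64_ratio

headline_fp16 = 10.0   # exaFLOPS, the article's FP16 number for Leonardo
assumed_ratio = 16.0   # hypothetical FP16:FP64 throughput ratio

fp64_exa = fp64_estimate(headline_fp16, assumed_ratio)
print(f"~{fp64_exa * 1000:.0f} petaFLOPS at FP64")  # 0.625 exaFLOPS = 625 petaFLOPS
```

So under that (invented) ratio, "10 exaFLOPS" shrinks to well under 1 exaFLOPS at the FP64 precision supercomputers are traditionally ranked on, which is why quoting the FP16 number without the caveat reads like marketing.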