Nvidia signs up for an Italian Job: Building for Europe the 'world's fastest AI supercomputer' by 2022

Europe is to build four Nvidia-Intel-powered supercomputers, one of which will be the most powerful super yet built for AI applications, the GPU giant reckons. That top-end machine, nicknamed Leonardo, is expected to reach 10 exaFLOPS, albeit at FP16 precision; supercomputers tend to be benchmarked using FP64, though FP16 is presumably good enough for AI …

  1. Claverhouse Silver badge
    Happy

    The Biggest and The Best

    Whichever creep is president --- including any of the runners-up if either old fool conks it --- will immediately order a superior supercomputer.

    1. VexedGuru

      Re: The Biggest and The Best

      And meanwhile in good old Blighty … the tumbleweed rolls silently across our bleak supercomputing landscape.

      1. RM Myers
        Coat

        Good old blighty

        Have you just issued a call to Arm(s)?

        1. Vikingforties
          Coat

          Re: Good old blighty

          Nah, it'll cost an Arm and a leg.

        2. druck Silver badge
          Happy

          Re: Good old blighty

          They'll need an arm to point all those FLOPs in the right direction.

      2. Elephantpm

        Re: The Biggest and The Best

        Not exactly.

        https://www.google.com/amp/s/www.marktechpost.com/2020/10/07/nvidia-announces-cambridge-1-uks-most-powerful-supercomputer-for-ai-healthcare-research/%3famp

        1. VexedGuru

          Re: The Biggest and The Best

          Fair point, and kudos to Nvidia and Jensen for stepping in to help rather than sitting on their hands, but the Italian machine has >20x the AI performance of Cambridge-1. I would respectfully suggest that is not just "not in the same league" but a completely different sport.

  2. Gene Cash Silver badge

    The question is no longer "will it run Crysis?" but "will it edit a Slow Mo Guys video?"

  3. devTrail

    Precision

    That top-end machine, nicknamed Leonardo, is expected to reach 10 exaFLOPS albeit at FP16 precision

    All four computers are capable of running simulations at higher and lower precisions, including FP64 and FP32 as well as bfloat16 and 8-bit integer.

    I don't get it. Does it mean the first sentence gives the number in FP16 precision just to get an awesome figure, but that at runtime FP64 might be used? If that's the case, isn't this a scientific, EU-funded project? Shouldn't they avoid this kind of marketing trick in that context?
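    For anyone wondering how big the gap is in practice, here's a minimal NumPy sketch (my own illustration, not from the article): float16 carries roughly three decimal digits of mantissa, so a long accumulation stalls where float64 keeps counting.

      import numpy as np

      values = np.full(100_000, 0.1)

      # FP64: 0.1 isn't exact in binary, but 64-bit rounding error stays tiny.
      sum_fp64 = values.astype(np.float64).sum()

      # FP16: accumulate step by step so each addition rounds to half precision.
      acc = np.float16(0.0)
      for v in values.astype(np.float16):
          acc = np.float16(acc + v)

      print(f"FP64 sum: {sum_fp64:.4f}")    # ~10000.0000
      print(f"FP16 sum: {float(acc):.4f}")  # stalls far below 10000 -- once the
                                            # running total is large enough,
                                            # adding 0.1 just rounds away

    That's why the headline exaFLOPS figure and the numbers scientific FP64 runs would see are different beasts.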

  4. devTrail

    Precision for AI

    though FP16 is presumably good enough for AI

    FP16 might be good enough, but only for certain problems, and only if the data doesn't demand too much normalisation and scaling to fit half precision's narrow range.
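
    To make that concrete, a small NumPy sketch (again my own, assuming simple zero-mean/unit-variance scaling): float16 tops out at 65504, so un-normalised features overflow where standardised ones are fine.

      import numpy as np

      raw = np.array([120.0, 300.0, 4500.0], dtype=np.float16)  # unscaled features

      # Sum of squares of the raw values is ~20.3 million -- far past float16's
      # maximum of 65504, so the result overflows to inf.
      print(np.dot(raw, raw))               # inf

      # Standardise in FP32 first (zero mean, unit variance), then cast down.
      raw32 = raw.astype(np.float32)
      scaled = ((raw32 - raw32.mean()) / raw32.std()).astype(np.float16)
      print(np.dot(scaled, scaled))         # ~3.0 -- comfortably in range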
