Why we will not have a unified HPC and AI software environment, ever

Welcome to the latest Register Debate in which writers discuss technology topics, and you the reader choose the winning argument. The format is simple: we propose a motion, the arguments for the motion will run this Monday and Wednesday, and the arguments against on Tuesday and Thursday. During the week you can cast your vote on …

  1. DrDudd

    Of course we should, but we never will...

    I vote for unified standards because... it's just a vote. But back in the real world I know it will never happen, because people are too selfish to think of anyone outside their commercial bubble. Standards don't stifle innovation so long as they are flexible enough to allow change. Developing a new GPU and only telling the engineers AFTERWARDS that it needs to be ROCm compliant would be sheer madness; they won't complain (so much) if they know from the outset, though they might need to adjust designs to fit. In the end, the needs of the many customers for ease of use outweigh the needs of the few engineers who are too lazy to read about standards before putting soldering iron to wire, or who code in their favourite (but clearly non-standard) IDE.

  2. Tom 7 Silver badge

    And then the Chinese commit to RISC-V in a big way.

    I can see them being more than happy to work on GPU stuff for it in a big open way. It's in their interests, after all, and if they want to be malicious, undercutting closed-source software that might be used to undermine them, and getting most of the world involved as well....

  3. Anonymous Coward

    Boy, does the author of this article ever have the situation nailed to a tee! :)

    1. Roland6 Silver badge

      The author could have buried the body if they had gone back another 20-25 years, to the mid-1980s, when attempts were made to unify BSD and System V...

  4. oiseau Silver badge

    Companies? No, people.


    > ... companies do not compromise to be one of the crowd when it is not in their best interest.

    Actually, it is people who do not compromise to be one of the crowd when it is not in their best interest.

    It is people who make decisions on absolutely anything that a company does.

    From deciding on its mere existence to flushing 8.8 billion down the corporate toilet.

    ie: all the ifs, whats, whys, whens and whatevers of a company are ultimately decided by people.

    And although it is almost always money oriented, sometimes it is less about that and more about other aspects of people's nature.

    For a relatively close example, look at Linux and the incapacity of the Open Source movement to harness the awesome amount of resources at its disposal and put them to work in the same direction, under a set of basic standards that all distributions would agree upon and use.

    eg: using the same OS tree, so that all components in every distribution are in the same branch/place.

    But no, most everyone who can wants to do their own thing: "Look Ma, I rolled a new distro today!"

    It is as old as time itself: human nature.


    1. weladenwow

      Re: Companies? No, people.

      Companies? People? No, it's communities of interest. In this case it's government-funded research organisations such as CERN and JPL. When it becomes too costly to support 57 different ways to do one thing, common standards will happen.

      1. Roland6 Silver badge

        Re: Companies? No, people.

        The challenge is ensuring those common standards are open de jure standards and not proprietary de facto standards...

  5. amanfromMars 1 Silver badge

    If you want stagnation and deflation

    Unified standards do not deliver progress. The lowest common denominator is not a supplier of novel excellence. Existing elite legacy status quo systems, though, are an avid fan, for until very recently it quite effectively curtailed any unwelcome superior competition. That ship now keeps RMS Titanic company, 12,500 feet (3,800 metres; 2,100 fathoms) below the waves of the North Atlantic Ocean, and is never going to sail and rule the high seas again.

    1. Roland6 Silver badge

      Re: If you want stagnation and deflation

      >Unified standards do not deliver progress

      Agreed; however, they enable progress, particularly in the usage of stuff based on those standards.

      Examples include: MS-DOS, Windows, the TCP/IP protocol suite, GSM, 3GPP, ...

      All of these created markets for products built for or on these platforms.

  6. Anonymous Coward

    Universal Turing Machine..... capable of emulating ANY other computing machine.

    Why no mention here? One read/write head, one (nearly) infinite tape of symbols...... there's your "single standard".

    Problem solved!!
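    For readers who want the joke made concrete, a single-tape Turing machine really is just a head, an unbounded tape, and a finite transition table. A minimal sketch in Python (the function and table names here are purely illustrative, not from any library), running the classic binary-increment program:

    ```python
    def run_tm(tape, transitions, state="carry", blank="_"):
        """Run a one-tape Turing machine until it reaches the 'halt' state."""
        cells = dict(enumerate(tape))   # sparse tape: index -> symbol
        head = len(tape) - 1            # start at the rightmost symbol
        while state != "halt":
            symbol = cells.get(head, blank)
            state, write, move = transitions[(state, symbol)]
            cells[head] = write
            head += 1 if move == "R" else -1
        return "".join(cells.get(i, blank) for i in range(min(cells), max(cells) + 1))

    # Example program: add 1 to a binary number, working right to left.
    INCREMENT = {
        ("carry", "1"): ("carry", "0", "L"),  # 1 + carry = 0, propagate carry left
        ("carry", "0"): ("halt",  "1", "R"),  # 0 + carry = 1, done
        ("carry", "_"): ("halt",  "1", "R"),  # ran off the left edge: new leading digit
    }

    print(run_tm("1011", INCREMENT))  # → 1100  (11 + 1 = 12)
    print(run_tm("111", INCREMENT))   # → 1000  (7 + 1 = 8)
    ```

    Of course, "capable of emulating any machine" and "a usable unified software environment" are rather different claims, which is presumably the point of the comment.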

    1. Anonymous Coward

      Re: Universal Turing Machine.....

      But can it represent any QC (quantum computer)? Cf. "Could we use an LLVM-based cross-compiler to build apps for quantum computers? This alliance says yes" [recently on The Register].

      Personally I find a Turing bicycle is able to solve any otherwise intractable problem, far superior to even the most advanced QC currently available.

    2. Dan Olds

      Re: Universal Turing Machine.....

      But will it run Crysis? That's the key question, right?

  7. prof_peter

    Does it matter?

    Why would you even care if there's a unified framework? It's not like the two groups talk to each other. And it's not like the folks developing hardware for AI really care about the HPC market - to give an idea of the market size ratio, the entire top500 list would fit into US-East-1 with room to spare.

    Traditional HPC - batch systems used by physicists etc. - is becoming a smaller part of overall computing at most research universities, as other fields start using more and more computation, and typically choose post-80s environments for doing so. As the fraction of non-batch computation grows, eventually someone's going to start asking the physicists if they can just emulate their ancient environments on something newer.

  8. amanfromMars 1 Silver badge

    FTFYSNAFUBAR .... Future Progress, but not quite as you may have planned it ....

    ..... without IT and AI Quantum Leaping

    As the fraction of non-batch computation grows, eventually someone's going to start asking the physicists if they can just emulate their ancient environments on something newer. ..... prof_peter

    And/Or ...... as the factions/fraction of non-batch computation grows, eventually physicists etc. are going to start asking someone if ancient environments can be emulated on something newer‽

    Although surely the answer to that is already quite obvious.

  9. Charlie Clark Silver badge


    There are plenty of examples on both sides demonstrating that, where interoperability is required, it cannot be achieved without standards. But innovation, almost by definition, happens in the absence of standards. This is why many standards are developed retrospectively, based on existing products. WiFi is both a good and a bad example: companies scrambled to make use of unlicensed spectrum and ignored lots of good ideas, like channel management, along the way. As a result, many of the standards were always playing catch-up and were essentially marketing badges. This has changed more recently with the convergence of WiFi and mobile-phone networking, which, because of the use of licensed spectrum, tends to take a more committee-based RFC, specification, ratification approach.

    But vendors will almost always go for some kind of standards-plus approach, which allows them to support the standard but also to add their own secret sauce.
