Cloud giants 'ran out' of fast GPUs for AI boffins

Top cloud providers struggled to provide enough GPUs on-demand last week, AI experts complained to The Register. As a deadline for research papers loomed for a major conference in the machine-learning world, teams around the globe scrambled to rent cloud-hosted accelerators to run tests and complete their work in time to …

  1. Pen-y-gors

    Burning the midnight oil

    Glad it's not just undergrads who wait until the last minute to write an essay!

    1. Anonymous Coward
      Happy

      Re: Burning the midnight oil

      Glad it's not just undergrads who wait until the last minute to write an essay!

      This sounds like more than just writing the paper; it is doing the actual research that goes into it. So what are these researchers doing the rest of the time, if not working on their AI models? Is it "There's a deadline coming up, so I'd better do some work"?

  2. Anonymous Coward
    Joke

    They should have used AI...

    ... to predict when enough GPUs would be available at an acceptable price to complete their papers...

    1. stephanh

      Re: They should have used AI...

      AI: "Wait until after the conference."

    2. Named coward

      Re: They should have used AI...

      AI: Paper submissions are now open. You have 20 days to submit your paper; please don't leave it until the last moment.

      1. Tom 7

        Re: They should have used AI...you have 20 days to submit your

        You don't understand deadlines. Or 'management', who won't allow you time to do your real work until you point out that if it's not done today they won't be able to take your work to the conference.

    3. phuzz Silver badge

      Re: They should have used AI...

      The old rule with research supercomputers was: work out how long your problem will take to run on current hardware, then wait half that time and buy the new generation, which will do the job in less than half the time, saving you time overall.
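
      To put rough numbers on it (purely illustrative): if the job would take 12 months on today's kit, wait 6 months; assuming the new generation is more than twice as fast, the run then finishes in under 6 months, so you're done inside the original 12 and have spent less of it actually computing.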

  3. tomonion

    Re: Azure offer similar spot pricing tiers...

    Unless I am mistaken, Azure do not offer spot pricing.

  4. defiler

    Scale to meet demand?

    "...as AI development ramps up, hosting providers must scale to meet demand"

    But the demand isn't there 95% of the time. You can't possibly accommodate every last spike in activity and still run a business, not without it costing more than keeping the hardware in-house with the researchers.
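
    To put made-up numbers on it: capacity bought for a spike that only turns up 5% of the year sits idle the other 95%, so the provider has to recover a whole year's hardware cost from a few weeks of rentals, and the hourly price ends up many times what owning and running the same kit would cost.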

    Bottom line, if it's a critical computing resource, make sure *you* can turn the taps on and off.

    1. Brewster's Angle Grinder Silver badge

      Re: Scale to meet demand?

      Cf. the British winter and snow ploughs.

  5. Mage Silver badge

    so another Cloud disinformation

    Better to use Cloud, because it's elastic: you get more as you need it, instead of paying for your own iron.

    Yeah right.

    1. allthecoolshortnamesweretaken

      Re: so another Cloud disinformation

      Well, sometimes the elastic just snaps.

    2. Anonymous Coward

      Re: so another Cloud disinformation

      It is elastic for general-purpose CPU, storage, etc. You are never likely to saturate that supply. ML GPUs are a corner case at this point in time.

  6. Anonymous Coward

    I know that Google has, on several occasions, affected the global supply of components... remember when memory was extremely difficult to source a few years ago? Pretty crazy: "I would like to place an order for memory", "Alright, how much do you need?", "How much do you have?", "On hand in the warehouse?", "No... on this planet."
