Thirty-nine weeks: That's how long you'll be waiting for an AI server from Dell

Dell has told investors that demand for AI servers has surged, but buyers will be forced to wait 39 weeks to get their hands on the hardware due to supply chain constraints. Speaking on the computing giant's Q3 2024 earnings call, vice chair and COO Jeff Clarke said Dell shipped half a billion dollars' worth of AI servers in …

  1. Anonymous Coward
    Anonymous Coward

    "due to supply chain constraints"

    That'll be NVIDIA then.

    The sooner we move away from CUDA the better. There is evidence out there that shows AMD cards absolutely trounce NVIDIA in some AI workloads, especially with larger models, because of NVIDIA's meagre RAM offerings.

    It wouldn't surprise me at all if the next generation of NVIDIA cards are only fractionally faster, but have tons more RAM onboard and come with misleading, meaningless marketing claims like "can run AI models twice the size of previous generations"...I suspect this is why the jump in memory wasn't as large as it could have been this generation.

    At the consumer level, the RTX 4090 should have had 32GB of RAM, the RTX 4080 should have had 24GB, etc.

    Those RAM jumps alone would have made the "higher than expected" pricing a lot more reasonable.

    They have got themselves stuck in limbo right now with consumer cards...because there are rumours of the 4080 Super around the corner (which, surprise surprise, will have more RAM and only slightly higher clocks)...but it will come too late because the 50 series isn't that far away.

    There are huge numbers of people still on the 20 series that will skip the 40 series as well as the 30 series...because until there is a significant RAM bump, there is no need to upgrade...gaming aside, from a productivity standpoint there isn't a huge difference (in some cases, no difference) between a 20 series card and a 40 series card in performance terms and the only reason to upgrade is more RAM for generative AI...which NVIDIA currently isn't really offering...the difference between an 8GB card and a 16GB card for this purpose is basically nothing because you can run pretty much the same models. You need a jump to 32GB to move into a completely different category of models to see a significant difference in output quality.
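    To put rough numbers on that, here's a quick back-of-the-envelope sketch (purely illustrative): it assumes the model weights dominate VRAM use, adds about 20% headroom for activations and context, and ignores framework overhead, so treat the figures as ballpark only.

    # Rough VRAM estimate: weights plus ~20% headroom for activations/context.
    # Real usage varies with context length, framework and quantisation scheme.
    def vram_gb(params_billion, bits_per_weight, overhead=1.2):
        weight_bytes = params_billion * 1e9 * bits_per_weight / 8
        return weight_bytes * overhead / 1e9

    for size_b in (7, 13, 33, 70):
        print(f"{size_b:>3}B params: ~{vram_gb(size_b, 4):.0f} GB at 4-bit, "
              f"~{vram_gb(size_b, 8):.0f} GB at 8-bit")

    On those numbers, the 7B and 13B classes squeeze onto 8GB and 16GB cards once quantised, which is why the two feel so similar in practice, while the 30B-plus class only gets comfortable somewhere around 24-32GB.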

    1. Anonymous Coward
      Anonymous Coward

      Re: "due to supply chain constraints"

      "he sooner we move away from CUDA the better. There is evidence out there that shows AMD cards absolutely trounce NVIDIA in some AI workloads. Especially with larger models because of NVIDIAs meagre RAM offerings."

      I'd like to see that evidence, because in our tests (and we do a lot of AI stuff) AMD cards are rarely worth the trouble. Also, AMD has shown little interest in even competing with Nvidia in the GPGPU/AI field: they pretty much ignore everything non-graphics like OpenCL, and even their enterprise GPUs get lackluster support which AMD seems only too eager to end prematurely. AMD has had so many opportunities to get a foot in the door but blew it every time, and there is no sign that this is about to change in the near future.

      If any GPU maker is going to take share away from Nvidia, it will be Intel: even their 1st-gen ARC GPUs (Alchemist) perform surprisingly well, and the drivers have been improving massively since launch. Give it one or two generations and Intel ARC may well compete with Nvidia for high-performance GPUs.

      "It wouldn't surprise me at all if the next generation of NVIDIA cards are only fractionally faster, but have tons more RAM onboard and come with misleading meaningless marketing claims like "can run AI models twice the size as previous generations"...I suspect this is why the jump in memory wasn't as large as it could have been this generation."

      The jump in memory for the Geforce RTX 4000 was not as big mostly because of two things:

      1.) Graphics doesn't really need that much more RAM, especially not games. 8GB is still fine for the majority of games, and 12GB will run pretty much anything on max for the lifetime of this GPU generation

      2.) Gamers are a tiny niche of GPU buyers (and the number of consumers dabbling with LLMs is even smaller), and Nvidia doesn't really care much about some gamer spending $1200 of his savings on a Geforce card when they can sell the same GPU in different packaging for $4k-$20k (or more) to business/enterprise buyers. And with the AI craze, Nvidia's focus on datacenter GPUs has only become stronger.

      "At the consumer level, the RTX 4090 should have been 32GB RAM, the RTX 4080 should have been 24GB etc."

      No, they shouldn't. There is no need for a consumer/gaming GPU to have such large memory capacities. Neither from the POV of a user (games don't need anywhere near that amount of video memory), nor from the POV of Nvidia (why would they want to offer a $1200 GPU with pretty much the same GPU processor and VRAM as their much more expensive DC GPUs?).

      "Those RAM jumps alone would have made the "higher than expected" pricing a lot more reasonable."

      No, they wouldn't, because lack of memory isn't really the biggest issue with Geforce RTX 4000 cards. It's price, price, lack of reasonable entry/mid range GPUs, price.

      And I doubt that will change much with the next generation. My guess is the days of Nvidia being somewhat affordable in the consumer GPU space are over.

      1. Anonymous Coward
        Anonymous Coward

        Re: "due to supply chain constraints"

        "Graphics doesn't really need that much more RAM, especially not games. 8GB is still fine for the majority of games, and 12GB will run pretty much anything on max for the lifetime of this GPU generation"

        Yeah, we've heard the NVIDIA line on this, but the truth is, a lot of games hit the ceiling quite easily...so I have to wonder, is the reason games require "less RAM" because developers have got used to the constraints and limited themselves in some way?

        "No, they wouldn't, because lack of memory isn't really the biggest issue with Geforce RTX 4000 cards. It's price, price, lack of reasonable entry/mid range GPUs, price."

        Would the price seem more reasonable if the specs matched the price? If so, one could argue that the price isn't the problem, the spec is.

        I think it would, especially at the higher end.

        " lack of reasonable entry/mid range GPUs"

        Look at the GeForce 2. I'm an old-timer, relatively speaking, and I can remember GPUs going back to the mid-90s, the range that was available and the pricing. GPUs have always been relatively pricey...what has changed is the choice available...which gives the impression that GPU prices have skyrocketed...when in fact, relatively speaking, they haven't if you've always been in a certain tier.

        The GeForce 2 Ti and GeForce 2 Pro came in various flavours that could have up to 64MB of RAM. Some came with 32MB...in both variants. It was the same with the MX200, MX, MX400 and GTS. The only model of that era that had a fixed amount of RAM was the Ultra, which always came with 64MB...the mid and low end had tons of options that allowed the consumer to scale with their own price point.

        What I'm trying to say here is that memory was a choice back then, you could opt for a card with less RAM, to save a few bucks, or if you needed it, opt for a higher RAM card. You could even opt for a higher end card, with lower RAM if you needed the gaming performance but weren't interested in productivity etc or you were limited to a certain resolution....or if you had the money, you could have both! These days, that choice is gone, which is why the pricing is all to fuck. There is one card per tier...you as a consumer have no way to adjust your own pricing / expectations. You pick a tier, and that's what you get. You can't really "spec up" a GPU anymore. You have to go as high as you can afford...which was never the case in previous generations.

        If we had the same thing today, then we'd have higher-RAM variants in each tier: of the 4080 (for those who want to push that little bit further, for less money than a 4090) and of the 4070 (for those who want 1080p with various settings cranked up)...and the 4090 would have more RAM because there would be only one option.

        https://en.wikipedia.org/wiki/List_of_Nvidia_graphics_processing_units#GeForce2_series

        There is a massive area in the GPU space that nobody ever considers that would benefit from higher RAM on a lower end board...and that is render scaling. Rendering at a higher resolution and sampling down in lieu of anti aliasing etc...it requires more RAM, but has less of a performance overhead if you have the RAM for it.
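        To put some numbers on that, here's a minimal sketch (illustrative only): it assumes just one colour buffer and one depth buffer at 4 bytes per pixel each, whereas a real engine keeps many more intermediate targets, so the actual totals multiply.

        # Render-target memory grows with the square of the render scale.
        def render_target_mb(width, height, scale, bytes_per_pixel=8):
            pixels = (width * scale) * (height * scale)
            return pixels * bytes_per_pixel / (1024 ** 2)

        for scale in (1.0, 1.5, 2.0):
            print(f"1440p at {scale}x render scale: ~{render_target_mb(2560, 1440, scale):.0f} MB")

        Double the render scale and every buffer the engine keeps quadruples, which is exactly where the extra VRAM would go.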

        The direction we seem to be going in though is rendering at a lower resolution and scaling up...which is part of the argument for having "enough RAM".

        If you believe that line of bullshit, you're an idiot. It's fucking bollocks...scaling back the RAM just makes their margins bigger and if they can convince someone like you that you don't need more RAM and they can still hike the price while providing you with less, then they've fucking won and we're doomed to have shitty GPU prices for the foreseeable future...unless we start demanding what we're paying for.

        NVIDIA has been eroding choice for years to force people into higher price points...fooling people into thinking that there are only "three tiers"...it's fucking garbage and not how they used to operate. Especially with board partners.

    2. Anonymous Coward
      Anonymous Coward

      Re: "due to supply chain constraints"

      None of those you list are AI cards. See Nvidia H100 etc for the real stuff.

      1. Anonymous Coward
        Anonymous Coward

        Re: "due to supply chain constraints"

        They're pretty real if AI is to go properly mass market and not end up stuck behind paywalls and gatekeepers.

        1. Roland6 Silver badge

          Re: "due to supply chain constraints"

          Mass-market AI will be whatever Microsoft decides to require for Windows, and it will be integrated into the CPU package.

  2. Doctor Syntax Silver badge

    And yet whole warehouses are supposed to be full of stuff that can't be shifted because nobody's buying. Maybe they're so full it takes 39 weeks to find the right SKU.

  3. 43300 Silver badge

    Hardly surprising. Dell's supply chain for mid-to-high range stuff has been iffy for a while. It took us nearly a year to get some new switches last year, and we had to go for a more expensive model in the end to get the wait time down as the ETA for the ones we'd ordered kept on getting moved a few more months into the future.

    1. Gene Cash Silver badge

      > we had to go for a more expensive model

      Hm. So you rewarded them for having an iffy supply chain?

      1. Doctor Syntax Silver badge

        Alternative view - the iffy supply chain did its job.

      2. 43300 Silver badge

        We temporarily used a different brand of switches to connect the servers and storage together - bad move! Not at all reliable. In theory, of course you can mix and match - but in practice...

  4. Bitsminer Silver badge

    39 weeks? Meh

    Back when I was just a little bit older than a kid, we had to wait 14 months for a PDP-11/44.

    You youngsters don't know when you have it so good.

    1. John H Woods
      Joke

      Re: 39 weeks? Meh

      Says the person whose name suggests they've got all the GPUs ....

      1. Bitsminer Silver badge

        Re: 39 weeks? Meh

        Alas, my moniker predates NVIDIA and CUDA. It used to mean extracting info (data mining) from bits, but, you know....

        1. Yet Another Anonymous coward Silver badge

          Re: 39 weeks? Meh

          I had to get up in the morning at ten o'clock at night half an hour before I went to bed, work twenty-nine hours a day down bit mine, and pay mine owner for permission to come to work, and when we got home, our Dad and our mother would kill us and dance about on our graves singing Hallelujah.

          And you try and tell the young people of today that ... they won't believe you.

  5. aerogems Silver badge

    Dude, you're getting a Dell!

    In about 8-9 months.

    1. Roland6 Silver badge

      Re: Dude, you're getting a Dell!

      So the real question is whether it is worth getting a place in the queue and then selling Dell servers at a premium to those who can’t wait…

  6. Benegesserict Cumbersomberbatch Silver badge

    Your order is gestating...

    No coincidence: 39 weeks is how long it takes to build an unartificial intelligence, and that's with unskilled labour.

    Do we really know what's going on in their factories?

    1. Yet Another Anonymous coward Silver badge

      Re: Your order is gestating...

      How many kids with a copy of Wikipedia can you fit in a 42U?

      The ultimate mechanical Turk

  7. Ian Johnston Silver badge

    So people who order now will get their servers just in time for the AI bubble to burst? Oh well, they can store them in the same cupboard as their crypto mining rigs.

  8. MOH

    "There'll be 300 million PCs turning four years old next year"

    There's one 3-year-old PC for every 24 people on the planet???

    Come on, Reg. This is clearly nonsense.

    1. Roland6 Silver badge

      “ Worldwide PC shipments totaled 261.2 million units in 2019…”

      Gartner…
