AMD puts Intel in rear view mirror with Threadripper Pro 9000 high-end desktop chips

AMD aims to extend its lead over Intel in the high-end desktop (HEDT) and workstation arenas with its 9000-series Threadripper workstation CPUs teased at Computex this week. Compared to previous Threadrippers (TR), the 9000 series appears to be a fairly sedate update with most of the gains coming from process improvements and …

  1. AMBxx Silver badge

    350W!!

    Not sure I want to be in the same room as the cooling fans for that. Need a BIG upgrade for my UPS too.

    1. Wellyboot Silver badge

      Re: 350W!!

      Liquid cooling with a big radiator

      1. NoneSuch Silver badge
        Boffin

        Re: 350W!!

        "Liquid cooling with a big radiator"

        You do understand that a radiator RADIATES heat. The water in the loop makes this heat transfer more efficient, so your room will actually be hotter as your PC runs cooler.

        That's the First Law of Thermodynamics.

        1. FeepingCreature

          Re: 350W!!

          Liquid cooling with a big radiator stationed on your balcony :)

          1. The man with a spanner Bronze badge

            Re: 350W!!

            ...or heat your hot water tank and have a nice shower, or even heat the outdoor swimming pool.

        2. award

          Re: 350W!!

          And the second law of thermodynamics...

          Heat cannot, of itself, pass from one body to a hotter body.

          You can try it if you want to, but you'd far better notta!

    2. Hans 1
      Windows

      Re: 350W!!

      The 7000 series already required 350W, so I guess you are a bit late to the party. And yes, you will need a good power supply and cooling equipment to handle it.

    3. FeepingCreature

      Re: 350W!!

      I have a ~350W GPU and I can confirm that after a few hours of load, my room is noticeably warmer.

      1. The Dogs Meevonks Silver badge

        Re: 350W!!

        My home office has 3 systems in it: the media server that runs 24/7, a work rig that runs 30hrs a week, and my personal gaming system that's probably on for an average of 8hrs a day. It's also in a south-facing room that gets lots of direct sunlight.

        So I actually leave the doors to the office and the bedroom next door open, to allow the heat to spread throughout the upstairs rooms. It reduces the need for the radiators to be turned up during the winter. In fact, it's only the north-facing master bedroom and ensuite that have their radiators on full, as those rooms can easily be 4-5ºC colder than the rear south-facing ones... partly due to the bay window and less insulation above it than the rest of the roof. So allowing the extra heat from those rooms to spread out evens the temps upstairs and reduces the need for the heating to come on upstairs.

        It's a terribly inefficient way to heat a home, but they'd be on anyway... Might as well make the most efficient use of the waste heat that I can.

        A friend has the same issue... his solution was to buy a portable AC unit, increasing his power usage significantly. He's much cooler than me in the summer, as all I use is a fan to circulate air and open windows... But he's got a smaller house and uses 20-30% more power than me... and that's without factoring in the solar and battery system I have, which reduced my electricity import from around 5,500kWh a year to 1,500kWh, as well as earning me a little extra money off my bills from the export (£151 in 2023, £95 in 2024). This month alone has earned me £36, and the gas/electric bill is currently £25 inc. all charges.

        1. FeepingCreature

          Re: 350W!!

          I'm fully on board with using a computer to heat, but portable AC units are actually amazing. :) I have one on my balcony that I've converted to dual-hose operation, so it doesn't waste half its energy cooling fresh outdoors air. Two thirds of the year it does nothing, but in the summer months it's definitely a lifesaver.

    4. Gary Stewart Silver badge

      Re: 350W!!

      May I suggest a window air conditioning unit ;) And think of the money saved in winter ;)

      More seriously, this is not an unreasonable amount of power for a workstation CPU with current technology, using hundreds of billions of transistors running at multiple GHz. Physics can't be defied (god I hate the use of that phrase in reverse) with these working parameters, and water cooling would probably be required. Although I said it in jest above, a small window AC unit in the summer would help greatly and cost a whole lot less than just the CPU by itself.

    5. Kevin McMurtrie Silver badge

      Re: 350W!!

      Dual incomes: Cloud computing racks indoors, heated pool outdoors.

    6. Anonymous Coward
      Anonymous Coward

      Re: 350W!!

      The same TDP at 12 cores as at 96 cores... perhaps they're binning cores and putting *extremely* inefficient cores in cheaper processors. That seems odd.

      It also seems like AMD is throwing away *all* of the lower-power lead that they'd held over Intel for so long.

      Something smells funny.

      1. druck Silver badge

        Re: 350W!!

        Different base clocks. You can either have lots of cores going slowly, or a smaller number going faster, and then one or two getting the really quick boost speeds for a very limited time.
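
        A rough sketch of that trade-off, assuming the 350W figure from the article and a purely illustrative uncore/IO-die overhead (not AMD's real numbers):

        # Illustration only: at a fixed package power, the per-core budget shrinks
        # as the core count grows, which is why a 12-core part can afford much
        # higher base clocks than a 96-core part within the same 350W TDP.
        PACKAGE_POWER_W = 350      # shared TDP, per the article
        UNCORE_W = 50              # assumed IO-die/memory-controller overhead (illustrative)

        for cores in (12, 24, 32, 64, 96):
            per_core_w = (PACKAGE_POWER_W - UNCORE_W) / cores
            print(f"{cores:3d} cores -> ~{per_core_w:4.1f} W per core to spend on clocks")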

  2. FeepingCreature

    192 threads at 5.4GHz boost

    I know it won't run at 5.4GHz across the chip because of heat, but in theory, if you hooked a monster of a water cooler up to it, you could run AVX-512 BW FMA; that's 128 int8 ops per cycle, which over 192 threads at 5.4GHz would give you 132 TOPS on the CPU alone, or 32 TFLOPS if you used float32. And that's with zero dedicated matrix ops.
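
    A quick back-of-the-envelope check of those figures, taking the assumptions above at face value (every one of the 192 threads sustaining the 5.4GHz boost and issuing 128 int8 ops per cycle, which ignores SMT sharing of the vector units and any thermal throttling):

    # Sanity check of the numbers quoted above, under the stated (optimistic) assumptions.
    threads = 192
    clock_hz = 5.4e9
    int8_ops_per_cycle = 128       # a full 512-bit int8 multiply-accumulate per cycle
    fp32_ops_per_cycle = 32        # 16 fp32 lanes x 2 ops (fused multiply-add)

    tops_int8 = threads * clock_hz * int8_ops_per_cycle / 1e12
    tflops_fp32 = threads * clock_hz * fp32_ops_per_cycle / 1e12
    print(f"{tops_int8:.0f} int8 TOPS, {tflops_fp32:.0f} fp32 TFLOPS")
    # prints roughly 133 and 33, matching the ~132 TOPS / 32 TFLOPS quoted above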

    If AMD added dedicated matmul accelerator units, their CPUs could compete with their GPUs, and arguably they should. No need for ROCm, just use the CPU backend.

  3. Anonymous Coward
    Anonymous Coward

    Man...

    ...I wish the price of these was more down to Earth.

    1. Geoff Campbell Silver badge
      Go

      Re: Price

      If you have the kind of workload that genuinely requires that sort of horsepower, the cost of the CPU(s) will be a rounding error in the overall budget.

      GJC

      1. elsergiovolador Silver badge

        Re: Price

        There are plenty of hobbyists who have ideas but don't have funds. The view that only the rich can be solving problems that need massive compute power is nonsense.

        Unfortunately the gap is widening. We had a small window of time where the working class had almost equal opportunity; now it is again getting out of reach, and only the wealthy are in the game.

        1. cornetman Silver badge

          Re: Price

          > We had a small window of time where the working class had almost equal opportunity; now it is again getting out of reach, and only the wealthy are in the game.

          Sorry, that's complete bull. You can pick up a second-hand Ryzen 3700X from AliExpress for just a tad over $100 CAD: an 8-core/16-thread CPU that chomps through any developer workload I can throw at it. Multithreaded compiles are awesome. What is *actually* happening is that *all* computing is getting incredible, and affordable options that don't suck are now available to everyone at the mid- and lower levels.

          What we *do* need is better access to reasonably priced mid-level graphics cards. AMD are staying away from the very top because those are just prestige products and there isn't really much money to be made there. Intel are doing good work at the lower levels. At some point, I might check one of their GPUs out.

          1. elsergiovolador Silver badge

            Re: Price

            Sure, CPUs are cheap now. You can grab a used Ryzen, run some containers, fine-tune a model, compile your project fast. It’s never been easier to feel like you’re in the game.

            But the actual frontier - training cutting-edge AI, simulating biology, discovering materials, building tools that could reshape industries - now lives behind racks of H100s that rent for more than your rent. And let’s be clear: it’s not that ordinary people lack ideas - it’s that the barrier to test them has become financial, not intellectual.

            Back in 2010, a VC-backed company had better engineers and more reach - but their hardware looked a lot like what a determined hobbyist could piece together with off-the-shelf parts. The playing field was uneven, but not out of sight. Money bought exposure, not exclusion.

            Now? VC money buys entry. A single H100 costs £2k a month to rent. You don’t need AWS credits to scale - you need them just to begin. And while the cost of “doing something” dropped, the cost of doing anything that could compete exploded.

            So we hand out cheap dev tools and tell people the future is wide open. Meanwhile, real progress happens behind “contact us for pricing” logins and private APIs. The next big thing isn’t waiting to be discovered - it’s waiting to be afforded.

            People think being able to code means they’re in the race. But the race already left. They’re running laps in a fenced-off training ground - and calling it innovation.

            1. cornetman Silver badge

              Re: Price

              When I was starting in computing, actually owning a computer was completely beyond comprehension. They could only be afforded by large corporations.

              Later, you could buy a home computer from the likes of Commodore or Sinclair, which was capable but filled a fairly narrow niche. Not exactly cheap, but affordable. However, the cutting-edge mini and mainframe machines were again affordable only by corporations or universities.

              Now you can buy a powerful GPU and CPU and make an affordable powerhouse, but yet again, if you want the best that money can buy, you need a lot of money indeed. You *can* do AI tasks on a machine that is affordable if you are a fairly well-heeled professional, but if you want to set up something that can serve "AI" to an entire country or the rest of the world, you need a stupendous amount of money. That software, and the hardware behind it scaled *well back*, is attainable by mere mortals, but you need some decent resources.

              However, it has *ever* been true that if you wanted access to the serious cutting edge, you were talking about serious money. I don't remember there ever being a time when the cutting edge was accessible to those who were not extremely rich. That balance has ebbed and flowed over time, but it is true in all areas of technological life. I don't see that changing any time soon.

              1. elsergiovolador Silver badge

                Re: Price

                Yes, you can buy a 4090, maybe even a 5090. You can fine-tune models, run small experiments, maybe train something from scratch if you've got weeks to spare. But let’s be real: saying that makes you competitive is like saying you can open a lemonade stand and take on Amazon. Technically true, practically absurd.

                Training isn't just about running a few hundred epochs. It’s running those epochs again and again - after every dataset tweak, architecture change, preprocessing idea, or bug fix. Each cycle takes hours or days. Multiply that by the number of dead ends you’ll hit before getting anything remotely useful. You're not iterating. You’re aging.

                Meanwhile, serious labs are flooding GPUs with experiments around the clock, burning through models you haven't even read the papers for yet. You're slouched in front of a terminal at 3 am, watching your system crawl through epoch 7 of 300, knowing full well it’s dead on arrival - but you'll run it anyway, because it's all you’ve got. The fans scream, the power meter spins, and somewhere deep down you already know: even if it works, it won’t matter.

                Ten years ago, compute was still a bottleneck, but not a wall. Now it's a gate. And if you don’t have capital behind you, you’re not even in the race - you’re in the crowd, fiddling with scraps, being told it's a fair game.

                1. cornetman Silver badge

                  Re: Price

                  If you are doing serious cutting edge research in the area of AI, you aren't in that game. These enormous bit barns are leveraging massive scale to do business. They just also happen to be doing their own research because that's good business.

                  Anyone that is trying to do fundamental research by hammering problems with enormous compute isn't really doing anything new. They are trying to out-Google Google, or out-Amazon Amazon. That's not research: that's an exercise in futility.

                  1. Anonymous Coward
                    Anonymous Coward

                    Re: Price

                    Not really, because the likes of Google and Amazon don't publish a whole heck of a lot of their research in a timely manner...so you have to retread the same ground in order to figure out how to get ahead. It's all moats and walled gardens man. They don't want the competition (which is natural I guess) which leads to less progress. So yeah, Amazon and Google are doing their own esoteric research, but it's not research for the sake of progress, it's research for the sake of pumping out another subscription based service.

                    In the old days, it was always a race to the bottom...you wanted to figure out how to make the latest thing run on the oldest of crap by optimising it and refining it and eventually making it commodity tech...these days it's all about pricing people out of competing and preventing progress to keep shareholders happy.

              2. Anonymous Coward
                Anonymous Coward

                Re: Price

                "However, it has *ever* been true that if you wanted access to the serious, cutting edge, you were talking about serious money."

                Gaming kit was always the cutting edge if you go back 20 years. There was nothing above it unless you went into server territory...most of the industries that rely on GPU technology have a lot to thank the gaming industry for...most of the software developers that are older than 30 probably started out learning their craft on gaming kit which is why we have things like CUDA now.

                You may have had the odd bit of "professional" kit, but it was never orders of magnitude more expensive...like a Quadro graphics card for example. It might have been 30-50% more expensive...and for that extra money, you actually got extra features that were worth forking out for that weren't possible on "gaming" kit...like higher floating point accuracy for example...features that were more like compromises...if you wanted higher precision, you usually got less throughput (which is why Quadro cards have historically sucked at gaming)...there were actual reasons.

                The price of a Threadripper is just arbitrary at this point. AMD are just taking advantage of their market position. Nothing more.

            2. Anonymous Coward
              Anonymous Coward

              Re: Price

              So much this. Fucking "contact us for pricing" is a fucking sham...even worse is the "you need to sub to access this" bullshit which is creeping in everywhere on Azure and AWS. I got an alert the other day on AWS for me to sort something out in order to prevent an account being locked...which is fine, but the instructions sent me to a page that I couldn't use unless I paid, and thus I couldn't proceed without forking out or signing up for a 60 day trial that I will inevitably forget about and end up being billed annually for that won't get picked up until the cost review in 3-6 months time. It's nonsense.

              "So we hand out cheap dev tools and tell people the future is wide open. Meanwhile, real progress happens behind “contact us for pricing” logins and private APIs. The next big thing isn’t waiting to be discovered - it’s waiting to be afforded."

              So much this, people don't realise how walled in they are with "free" dev tools...but you can't really escape them.

          2. Kevin McMurtrie Silver badge

            Re: Price

            I don't think that's going to work for researching new AI algorithms. Good luck buying a fast 1TB of RAM too. Training takes an insane amount of resources even for tiny datasets.

            If your workload is compiling and testing ordinary app code, any modern multi-core desktop chip is fine.

          3. Anonymous Coward
            Anonymous Coward

            Re: Price

            Yeah that's just compiling though, stuff that you can just wait for.

            I have workloads that can benefit from a Threadripper but can't justify the price... like modelling a web farm setup and load testing etc. to find bottlenecks and so on. It's not a perfect model, but it's "good enough" for most stuff... for a lot of the folks I do this sort of thing for, the cost of a Threadripper is their annual hosting budget, and I'd need hundreds of customers to pay for it... at which point I'd need more staff and still wouldn't be able to afford enough Threadrippers.

            It's not just curing cancer that you need loads of compute for, there are plenty of other use cases that don't attract the same kind of capital.

            I agree with you from a hobbyist point of view, but I'm more of a "prosumer". I can make do with lower-spec stuff, but I really shouldn't have to because of arbitrary boardroom-level shareholder bullshit.

            Your GPU argument doesn't really stack up either... because with 128 PCIe 5.0 lanes, I could merrily get tons of performance out of shitloads of low-end GPUs rather than mid-tier ones... the problem with GPUs is feature lock. As in, stuff only available on newer cards (even though the older ones could support it). There are no features on the 50 series that couldn't work on a 20 series, for example... the 20 series might be a lot slower, but that doesn't mean it won't work... and second-hand 20 series cards, even the top end, are so cheap now that you could have several of them to make up the difference... if you had the PCIe lanes.

        2. Geoff Campbell Silver badge
          Facepalm

          Re: Price

          It was ever thus. Hobbyists in the '70s couldn't afford a Cray to do serious atomic modelling on. Hobbyists in the '90s couldn't afford a good CFD rig to fine-tune their racing car. Hobbyists in the '20s can't afford a room full of H100s to run AI on. So what?

          GJC

  4. Grindslow_knoll

    Performance with compatibility

    As long as you need to rewrite (or translate) code that uses CUDA to run on GPUs that don't support it, it won't catch on, unless you're at a scale where you can pay for that overhead and the performance difference or availability makes it worthwhile.

    I'm not a fan of NVidia and I like AMD's offering, but each time I have to recommend something in a quote, it's almost always a no (at least for GPUs).

    It's quite curious how this is still not a solved issue. Is the CUDA API so legally encumbered that at least partial compatibility isn't feasible? (Not a lawyer.)

    (Yes, I'm aware of ROCm, OpenCL, and HIP, but they're a non-trivial overhead for most end users.)

    1. FeepingCreature

      Re: Performance with compatibility

      PyTorch mostly runs on both. I've had pretty good experiences; that is to say, things that are not explicitly written for NVidia (i.e. direct shader code, nvcc calls, etc.) will mostly just work now. ROCm is a function-for-function reimplementation of CUDA in the first place; I assume they just didn't want to fight a lawsuit over it.
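
      For what it's worth, a minimal sketch of what "mostly just work" looks like in practice (assuming a PyTorch build with either CUDA or ROCm support installed): ROCm builds of PyTorch expose AMD GPUs through the same torch.cuda API, so ordinary device-agnostic code needs no NVidia-vs-AMD branch.

      import torch

      # ROCm-enabled PyTorch reports AMD GPUs via torch.cuda, so this one check
      # covers NVidia, AMD, and CPU-only machines alike.
      device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

      model = torch.nn.Linear(1024, 1024).to(device)
      x = torch.randn(64, 1024, device=device)
      print(model(x).shape, device)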

  5. Rich 2 Silver badge

    R9700 and AI

    I read “workstation GPU” and thought “ah, a GPU for ’normal’ (i.e. not games) stuff”

    So can this Graphics Processing Unit actually be used as a graphics card? To render graphics? Or is it just a pointless power-gobbling AI “thing”?

  6. cookiecutter

    Amazing what happens

    When you prioritise research and creating products rather than grifting and stock buybacks

  7. Bitsminer Silver badge

    350 watts

    Or 0.469 horsepower.

    That's a lot of hay.

    1. FirstTangoInParis Silver badge

      Re: 350 watts

      So basically a Shetland Pony. I’ll never look at them the same way again. Never mind how many hands high, how many Watts?

  8. Creslin

    So slower than a 5-year-old Nvidia RTX 3090 with 24GB

    For AMD to compare a 16GB Nvidia card against their 24GB card when running a 23GB model tells us everything we need to know.
