AMD pushes 64-core 4.2GHz Ryzen Threadripper Pro workstation processors

AMD has revealed a new range of microprocessors intended for use in workstations. The Ryzen Threadripper Pro comes in four variants, detailed below, and has been juiced to ensure that users of demanding desktop applications can run reams of RAM and have bulk bandwidth to shunt data around. Here’s what you can get your hands on …

  1. NATTtrash

    AMD has said the new silicon will be confined to “OEMs and system integrators,” which sadly seems to rule out individual chip sales to enthusiasts who fancy building their own machines based on the new processors.

    That's a pity, but maybe also good protection against hurting myself. News like this always gives me an itch "to build it and see what it can do". Then again, I must also admit that such a system might be a bit of overkill for my daily Solitaire use case nowadays...

    1. Eek

      I believe it uses a different socket and motherboard to the non-Pro chips - which, given the limited market size, means non-OEM motherboards would be very expensive.

    2. PhoenixKebab
      Thumb Up

      Overkill?

      I feel that one core per playing card could be a little excessive.

      1. chuckufarley Silver badge

        Re: Overkill and over powered?

        If it's like the Ryzen 3000 series then the lowest idle clock supported will be 2200MHz per core. That 280 Watt TDP on the high end would then be book-ended with high idle power draw, which means C-State 3 or higher would be needed after just a few minutes of idle to save money.

        1. Boothy

          Re: Overkill and over powered?

          As far as I'm aware Ryzen puts cores to sleep if not currently in use. I can't imagine Threadripper being any different.

        2. prismatics

          Re: Overkill and over powered?

          According to independent measurements from the overclocking scene (known for going to excessive lengths in pursuit of world records etc.), a single Zen 2 core in its default configuration under Windows can drop as low as 200 MHz and 0.2 V core voltage, for a power draw of about 120 milliwatts per core (not the entire package, just a single core). Since Threadripper Pro has 64 of these, a simple linear extrapolation gives an idle power consumption of about 8 W, which is right in line with (about double) what my Threadripper 3970X 32-core setup at home consumes when sitting booted up, doing nothing, mouse untouched.

          Cool'n'Quiet it is.
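          That linear extrapolation is easy to sanity-check; a quick sketch using the figures quoted above (observed numbers from the overclocking scene, not official AMD specs):

```python
# Back-of-envelope idle power for a 64-core part, using the
# per-core figure quoted above (an observed value, not a spec).
CORES = 64
IDLE_WATTS_PER_CORE = 0.120  # ~120 mW per deeply idle Zen 2 core

idle_package_watts = CORES * IDLE_WATTS_PER_CORE
print(f"Estimated idle draw: {idle_package_watts:.2f} W")  # ~7.68 W
```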

          1. chuckufarley Silver badge

            Re: Overkill and over powered?

            That is odd because in Linux if I open a terminal while my system is idle and run "watch -n 5 grep MHz /proc/cpuinfo" I see ~2200 MHz on all cores.
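            For what it's worth, the same check can be scripted; here's a rough sketch that just parses the "cpu MHz" fields out of /proc/cpuinfo (field name as it appears on x86 Linux):

```python
import re

def cpu_mhz(cpuinfo_text):
    """Pull the per-core 'cpu MHz' readings out of /proc/cpuinfo text."""
    return [float(m) for m in re.findall(r"cpu MHz\s*:\s*([\d.]+)", cpuinfo_text)]

try:
    with open("/proc/cpuinfo") as f:
        speeds = cpu_mhz(f.read())
    if speeds:
        print(f"{len(speeds)} cores: {min(speeds):.0f}-{max(speeds):.0f} MHz")
except OSError:
    pass  # not on Linux
```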

            1. Anonymous Coward
              Anonymous Coward

              Re: Overkill and over powered?

              What kernel are you running?

              To accurately read CPU speeds with Ryzen 3000 series you need a 5.4 or later kernel.

              1. chuckufarley Silver badge

                Re: Overkill and over powered?

                5.4.0-40-generic #44-Ubuntu SMP Tue Jun 23 00:01:04 UTC 2020 x86_64 x86_64 x86_64

                1. Anonymous Coward
                  Anonymous Coward

                  Re: Overkill and over powered?

                  And all BIOS/AGESA updates are applied?

                  Depending on purchase date, there were some issues with power management caused by manufacturers trying to squeeze a little more performance out of their products.

            2. Anonymous Coward
              Anonymous Coward

              Re: Overkill and over powered?

              Is this down to the scheduling? On my PC there seems to be a background 1-2% CPU utilization on each core even when the system is "idle". I know sod all about Linux, but my guess is that all the processes have been split across all the cores to minimize the amount of switching between tasks. Are there any settings which can avoid using some of the cores until utilization on the active ones exceeds a threshold (i.e. not assign any tasks to a core until the others are exceeding 10% utilization or something). That would allow the unused ones to sleep properly.

              Corrections (or links to sites to educate myself on this topic) welcome.
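              Not scheduler tuning as such, but Linux does let you pin work to a subset of cores so the rest can sleep; a minimal sketch using the standard affinity call (taskset -c does the same from the shell):

```python
import os

# Restrict this process (pid 0 = self) to CPU 0 only; the scheduler
# then leaves the remaining cores free to enter deep sleep states.
os.sched_setaffinity(0, {0})
print(os.sched_getaffinity(0))  # {0}
```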

              1. chuckufarley Silver badge

                Re: Overkill and over powered?

                I don't know what Ubuntu would be scheduling across all cores when I have a load average of 0.1 or less. That doesn't rule it out, but I think it makes it unlikely.

                1. Gordon 10

                  Re: Overkill and over powered?

                  Two possibilities.

                  Ubuntu's power management may be worse than Windows' for these CPUs - wouldn't be the first time a CPU ID needed putting into a table somewhere.

                  OR

                  Ubuntu may not have the porky pie patch that Windows has....

    3. Bronek Kozicki

      I wonder which UK-based "OEMs and system integrators" will start producing workstations with these CPUs inside.

    4. 9Rune5

      Hang on...

      That there CPU contains more cores than there are cards in a deck of cards. Surely that must give some sort of boost to your game of solitaire?!?

      But yeah, your comment echoes my own thoughts. I would love to have that beast on my desk, but putting all those cores to good use is a headscratcher for sure. Can it play Crysis?

      1. hnwombat
        Pint

        I'd kill for one of those, with two or more chips in, in fact. I do agent-based modeling with one process per agent-- I could load one of those machines in a heartbeat. :-)
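        One process per agent maps neatly onto a worker pool; a minimal sketch of the idea (the update rule below is a stand-in, not the poster's actual model):

```python
from multiprocessing import Pool

def step_agent(state):
    """Stand-in for one agent's update rule."""
    return state * 2 + 1

if __name__ == "__main__":
    agents = list(range(64))      # say, one agent per core
    with Pool() as pool:          # defaults to os.cpu_count() workers
        agents = pool.map(step_agent, agents)
    print(agents[:4])  # [1, 3, 5, 7]
```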

    5. RLWatkins

      > protection against hurting myself

      [grin] Yeah. At full tilt this CPU consumes a bit more than 1/3 horsepower worth of electricity.

      If I want one of these, I want an experienced technician to build it.

  2. Anonymous Coward
    Anonymous Coward

    Yawn.

    A new CPU is out, that's a bit faster than the previous one. That's happened more or less every six months for the last 50 years. There will be an even better one along in six months time.

    The only relevant factor in determining the speed of a computer is its age. The computer you buy today will be much faster than the one you bought 3 years ago and much slower than the one you buy in 3 years time. The relative advantage of any particular chip compared to the others available at the time, is negligible in comparison.

    1. Halfmad

      I could walk into any shop and buy bread; hell, I could pick up three different loaves on the same shelf and get different best-before dates.

      Same applies to computers: it's not WHEN you buy it but WHAT you bought, and in many cases longevity is more linked to the motherboard than to the incrementally updated components that plug into it.

    2. Peter2 Silver badge

      This is not just "a bit faster than the previous one". This is an entirely new level of power so monumentally massive that it's mindblowingly absurd.

      AMD took us from single cores to dual, then to quad cores. We've largely been stuck on the question of "do you have 4, 6 or 8 cores" since.

      A new option of "do you fancy sixty-four physical cores in your desktop PC" is going to be quite popular with a certain market segment - 3D rendering, for instance, where a single PC could now push more performance than a small render farm. Or games designers, if somebody wanted to produce a worthy successor to Crysis, which lest we forget was designed on overpowered hardware, and nobody could play it at full detail and full resolution for about a century after it was released.

      And this probably also gives notice that AMD will be upping core counts quite a bit for desktops in the future, which will no doubt be greeted cheerily at Intel HQ this morning.

      Intel's response has to be either to produce better processors to compete (in which case everybody wins) or to cut prices from their current significant premium (in which case everybody wins). So everybody wins, even those who want to pay more for an Intel processor.

      1. 9Rune5

        ...or Intel continues to not respond, in which case AMD wins.

        And that is 100% okay with me.

        1. Qumefox

          It's not OK with me. We need competition to keep progress moving. Without it, you just end up with AMD doing the same thing Intel did for the last decade: jack squat, because with no real competition there was no need to innovate.

          The absolute best outcome for consumers is Intel and AMD staying neck and neck, trading blows at rough parity. When one side gets too far ahead for too long, you end up with things like 14nm++++++++++++++++++++++++++++++. If AMD gets to be the dominant player, it'll end up being Zen (number)+++++++++++++++++++++++++++++

    3. Boothy

      Quote : A new CPU is out, that's a bit faster than the previous one.

      Technically, these are actually slower than the previous ones, at least on paper.

      As an example, the top two Threadrippers, Pro vs 'old'...

      3995WX (64 core) clocks at 2.7/4.2 GHz (base/boost). (new)

      3990X (64 core), clocks at 2.9/4.3 GHz (base/boost). (existing)

      3975WX (32 core) clocks at 3.5/4.2 GHz (base/boost). (new)

      3970X (32 core), clocks at 3.7/4.5 GHz (base/boost). (existing)

      Plus the older X versions are all unlocked, so can overclock, whereas the WX pro versions are locked, so can't be overclocked.

      As far as I can see, this is about thermal stability for the OEMs, making case design easier, plus warranty support etc. Plus the introduction of ECC memory, and the increase in the amount of RAM supported.

      As I see it, the existing X parts are basically aimed at enthusiasts and early adopters, who perhaps don't mind a bit of instability, or even building their own machines.

      Whereas the new WX is aimed at people who want an off-the-shelf, solid system that can happily sit running at 100% all day long without cooking itself.

      1. Fenton

        8-channel memory will, however, give it a massive performance boost in some scenarios.
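        The arithmetic behind that is straightforward, assuming DDR4-3200 and 64-bit channels (theoretical peak only; sustained figures will be lower):

```python
# Theoretical peak bandwidth of an 8-channel DDR4-3200 setup.
channels = 8
transfers_per_sec = 3.2e9   # DDR4-3200: 3200 MT/s
bytes_per_transfer = 8      # 64-bit wide channel

peak_gb_s = channels * transfers_per_sec * bytes_per_transfer / 1e9
print(f"Theoretical peak: {peak_gb_s:.1f} GB/s")  # 204.8 GB/s
```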

        1. NetBlackOps

          Out of all the specs, that's what caught my attention. Quite a common scenario here.

    4. Piro Silver badge

      Wrong

      There's no way you can say, in blanket terms, that a PC you buy today will be faster than the one you bought 3 years ago.

      Maybe you've forgotten that laptops exist, or budget products exist?

      The work laptop I received at my new job was slower than the work desktop PC I had 3 years before it.

      1. Korev Silver badge

        Re: Wrong

        A quick look at Tom's Hardware's CPU hierarchy "bins" will show that 5+ year old CPUs are still perfectly fast and you won't get a noticeable increase in speed for many workloads...

  3. Brex
    FAIL

    ARM will rule them all

    Since Apple is coming soon with ARM-powered Macs that will dominate these irrelevant offerings by AMD in both heat dissipation and overall performance, why should we care?

    iPads/iPhones are already more powerful than most laptops on the market; now imagine a multiple-core Mac with 100x the processor power of iDevices. So only unemployed geeks will be worried about using any of these weak Ryzen processors.

    1. Anonymous Coward
      Anonymous Coward

      Re: ARM will rule them all

      You should know that any post here praising the Cupertino 'tat slinger' will be downvoted, especially one slagging off AMD, who are liked here simply because they are not Chipzilla.

      The ARM-powered Macs will be fair game for the Apple haters here. The 'Walled Garden' taunts will get even worse.

      1. Anonymous Coward
        Anonymous Coward

        Re: ARM will rule them all

        Err the post said "...imagine a multiple-core Mac with 100x the processor power of iDevices." OK. Now I'm imagining AMD's Ryzen-based chips keep advancing, and they produce a CPU with more power than that. Now I'm imagining that Intel get their act together in the next few years and produce their own chips which are just as competitive. Imagination is a wonderful thing.

        None of this imagining is providing any useful contribution to the debate. Not every downvote is because the poster mentioned Apple.

      2. Peter2 Silver badge

        Re: ARM will rule them all

        AMD is not liked here simply because they aren't Chipzilla.

        AMD is liked because they have a very long history of creating innovative products and forcing competition on both speed and price, going back to when Intel tried to wipe them out by refusing to license the 386. AMD responded by tinkering with the 286 design until it ran at double the clock speed of the original 386 while maintaining compatibility with existing motherboards, meaning you could simply swap the processor and have a "286" with double the speed of the early 386s. Somewhat handy if you were on a budget, given that computers back then cost rather more than they do now, especially once you take inflation into account.

        This continued with the introduction of mixed 32/64-bit processors, and also multi-core processors. None of these would be with us if not for AMD creating them; Intel is quite unlikely to have done so on their own.

        With no AMD competition we'd likely still be running single-core processors in desktops and massively expensive Intel Itanium processors in servers. Companies only up their game through competition, and despite Intel's very well known anti-competitive behaviour, AMD has provided that competition, repeatedly either pulling Intel forwards in terms of performance or dragging their prices down.

        1. Roo

          Re: ARM will rule them all

          This isn't intended as a nitpick - but when you say "processors" you appear to mean x86-compatible processors. The RISC guys were generally ahead on these milestones by a few years. Case in point: MIPS shipped the first 64-bit microprocessor in 1991, and IBM shipped their multicore POWER4 in 2001. :)

      3. Glen 1

        Re: ARM will rule them all

        The Apple "hate" largely comes from Apple itself not responding well to being called out on it's bullshit.

        As in no longer replying to requests for comment if you're not prepared to brown nose them. Biting the hand that feeds IT and all that.

        Microsoft might not have a glowing reputation around these parts, but I don't think *they* ever blacklisted el reg.

        It's as if you're new around here...

        1. P. Lee

          Re: ARM will rule them all

          Apple don't love tech. They provide "just enough" CPU for x% of their use cases.

          Possibly sensible, but neither fun, cool, nor interesting to most people here.

          They make sports cars which look great but aren't that quick.

          Sometimes they do dumb stuff, like a virtual brake (Esc) button.

          1. Glen 1

            Re: ARM will rule them all

            Sensible like a $999 + tax monitor stand?

            We call it the "Apple Tax" for a reason.

    2. Captain Scarlet
      Trollface

      Re: ARM will rule them all

      "iPads/iPhones are already more powerful than most laptops"

      Can they run Crysis reasonably?

      1. Tom 7

        Re: ARM will rule them all

        "iPads/iPhones are already more powerful than most laptops" is the software so shit they need that kind of power or do people generally run massively complicated calculations while LOLing to each other?

        1. Anonymous Coward
          Anonymous Coward

          Re: ARM will rule them all

          They are so powerful because they have been used as test beds for Apple's laptop/desktop ARM processor evolution, which has benefitted iPad/iPhone users. A very sensible strategy from Apple.

        2. Paul Shirley

          Re: ARM will rule them all

          You haven't used a mass market consumer grade laptop recently have you? It's a very low hurdle.

          1. Captain Scarlet

            Re: ARM will rule them all

            Yeah, that's why I tend to get second-hand HP ProBooks - try to get one with a mechanical HD as they often go cheaper on eBay. Most have built-in Windows keys now and are simple to upgrade.

            Pretty sure business machines from other brands like Lenovo are just as easy to get full disassembly documentation for, so upgrades aren't the nightmare they are on the crappy £300 notebook computers on offer.

    3. RLWatkins

      Re: ARM will rule them all

      Sure, ARM makes good chips, but judging by their specs it's difficult to see how they'll "dominate" any "irrelevant" AMD chips, which presently outperform them.

      Moreover, anyone who thinks an Apple handset or tablet outperforms "most laptops" hasn't seen a modern laptop in a while.

      Tell us the truth: is this post satire?

      1. Anonymous Coward
        Anonymous Coward

        Re: ARM will rule them all

        AMD won’t be irrelevant, but you don’t seem to know what the current laptop market is, nor how well the A12/13 perform. They comfortably keep up with my work laptop, which admittedly is only an i7 Kaby Lake. The majority at work have much slower laptops than that. (Over 1000 employees)

        Yes at the high end there will be laptops which are faster than what we currently see but they are not the majority sold. We are also yet to see what Apple are going to put in desktops/laptops. The current iPads are using a 2 year old CPU.

        1. Anonymous Coward
          Anonymous Coward

          Re: ARM will rule them all

          Without telling us what you're running on both your A12/13 device and your laptop, your comments are meaningless.

          Different software on different hardware runs differently. Who knew?

          Come back and tell us that your A12/A13 device is keeping up with your laptop whilst running several applications in true multitasking mode, and like-for-like software, and then we can have a discussion.

          1. Anonymous Coward
            Anonymous Coward

            Re: ARM will rule them all

            Or we can just wait until Apple release, and you will see you don't know what the fuck you are talking about. When you get a fucking clue, then we can have a discussion.

  4. Mike 137 Silver badge

    Questions

    Given 2 TB of RAM, I wonder how long it will take for software bloat to eat up this performance.

    It also seems to me, as an engineer, that 2 TB of RAM is a huge target for transient bit errors, so I wonder what the long-run reliability will be.
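    The scale of that worry can be sketched with an assumed error rate (the FIT figure below is purely illustrative; real rates vary by orders of magnitude between studies and DIMMs):

```python
# Rough expected-error estimate for a 2 TB memory pool.
# FIT = failures in time, per 10^9 hours; the per-GiB rate here is
# an assumed illustrative value, not a measurement for any real DIMM.
GIB = 2048                  # 2 TB of RAM
FIT_PER_GIB = 50            # assumed transient bit-error rate
HOURS_PER_YEAR = 8766

errors_per_year = GIB * FIT_PER_GIB * HOURS_PER_YEAR / 1e9
print(f"~{errors_per_year:.2f} expected bit errors per year")  # ~0.90
```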

    1. Robert Sneddon
      Mushroom

      ECC

      One of the improvements over the existing Threadripper workstation CPUs these new devices promise is support for ECC memory. It's mentioned in The Fine Article as well as all the press reports I've seen around.

      Error-correction support for RAM is long overdue for modern commodity desktops and laptops now that 16GB is "a good start" for most out-of-the-box machines and 64GB is affordable and attainable even on medium-level hardware. The AMD Ryzen desktop I'm planning to build this winter will start with 64GB (because I run Firefox) but I've got the option on existing motherboards for 128GB of RAM as a mid-life kicker a few years from now. I'd really like that RAM to be ECC but it's not going to happen, at least with the existing and affordable mobo offerings currently on the market.

      1. -tim

        Re: ECC

        The Ryzen Matisse and later support ECC. I don't have terabytes of RAM, but I've had different systems report three uncorrectable ECC errors this year, with one doing a system panic. All new systems will be getting ECC from now on.

  5. This post has been deleted by its author

  6. LB45

    Lots of cores and 2 TB

    Crossfire (cuz AMD) a couple of high end video cards and it should be able to run the new MS Flight Sim with multiple screens.

    Of course should and can are two different things.

  7. rich_a

    I need one of these for worldcommunitygrid

    If a lottery win comes up this week I'm setting up a 64 core worldcommunitygrid task muncher/space heater in the garage.

  8. Rainer

    No idea what I would do with that

    I don't game, I don't do ML, I don't do sims, I don't do 3D, I don't do video.

    Just a couple of VMs, Youtube, reading El Reg and reddit...

    1. Anonymous Coward
      Anonymous Coward

      Re: No idea what I would do with that

      Rainer,

      "No idea what I would do with that ...."

      But .... you would not turn down the offer of a 64 CPU 2TB Workstation, if offered !!!

      [If you do ..... I will take it off your hands ..... just to be friendly :) ;) ]

  9. Anonymous Coward
    Anonymous Coward

    Psst, if you use a hacked BIOS you can remove the power limits on regular Epycs... the new 280W, 4.2GHz top-speed 3995WX is just a 7702P with the 200W, 3.35GHz limit removed! A great board to put it in is the ASRock ROMED8-2T, which gives you seven x16 PCIe 4.0 slots :D

    Make sure you cool your VRMs or they'll melt XD

  10. Lorribot

    At a rough guess the top-end 64-core TRs will sell at around £3k per processor at least; if they can sell 1 million of them, that is around £3bn. All in, the workstation market could be worth as much as £5-7bn a year to AMD. Who would not be interested in that kind of cash flow?

  11. Agent Tick
    Facepalm

    Another new hardware launch (sigh)

    AMD has become the new hardware Kmart - every month or so something new...but what about their driver quality?

  12. Brex
    Mushroom

    Apple will rule them all

    It seems like my original comment ruffled a few AMD-fanboy feathers around here - yet they fail to counter any of the arguments I put forward when stating that this Ryzen development is basically irrelevant.

    Again: Geekbench shows that, even with TODAY's A12/13 processors, iDevices already outperform most of the laptops on the market today - this is a fact. And to respond to another (un)informed poster: we are talking here about processors that are used for AAA gaming, Photoshop-grade apps and advanced audio and imaging, including multitasking. Now imagine what could happen in Macs with vastly larger heat dissipation areas/cooling parts.

    And let us not forget what the acronym ARM means: "Advanced RISC Machine" - so Apple is now in a position to further leverage RISC superiority with much lower heat production and even higher core scalability.

    Both AMD and Intel will be dead in no time.

    1. NATTtrash
      Trollface

      Re: Apple will rule them all

      Now imagine what could happen in Macs with vastly larger heat dissipation areas/cooling parts.

      Indeed, that would be a "Genius" move...

      Fixing Apple's Engineering in an Hour

      https://www.youtube.com/watch?v=MlOPPuNv4Ec

      ...acronym ARM means: "Advanced RISC Machine"

      Well, maybe it comes as a kind of new discovery for you, but most of us here kinda know what ARM is. I personally already had one in the 80s.

    2. Anonymous Coward
      Anonymous Coward

      Re: Apple will rule them all

      Quote: "Geekbench shows that, even with TODAY's A12/13 processors, iDevices already outperform most of the laptops on the market today"

      So you're basically saying a nice new shiny A12/A13 outperforms older laptops. So what? Not exactly relevant when the discussion here is around high-performance desktops.

      Also if you compare against a modern mobile CPU, such as the Ryzen 4900H, it leaves the A12/A13 in the dust, and that's a mobile Zen 2 part, the desktops are even faster, and Zen 3 is due out shortly which will be even faster than that.

      Quote: "we are talking here about processors that are used for AAA gaming"

      On what platform? AAA games only really exist on PC and major consoles, with a very few AAA games getting onto Mac. These platforms, all except one, use x64 based CPUs, either directly (PC & Mac) or as a custom AMD x64 solution in the major consoles, PS and XBox (which in turn, is based on a desktop AMD x64 chip anyway).

      The only major console not using x64 is the Nintendo Switch, which does have AAA games, and this does use an ARM processor. But Switch games are also very cut down graphically compared to the XBox/PS4 versions (and even more so compared to PC versions), with the platform struggling to play games like 'The Witcher 3', which on Switch has very low frame rates, poor-looking GFX, massive pop-in issues etc when compared with, say, the XBox version of the same game.

      There is no other platform around that plays AAA games using ARM that I'm aware of, and no, mobile devices do not count, as they are unable to cope with real AAA titles. Mobile titles, even if part of a AAA franchise, are mobile-specific versions, very cut down and simplified compared to the full-blown versions found on PC, XBox, PS etc. And games like Fortnite don't count either, as they are graphically very basic games, designed for high frame rates, where the non-mobile platforms have the advantage, and always will.
