When two tribes go to war... Intel, AMD tease new chips at Computex: Your spin-free summary

AMD and Intel both teased details of their upcoming processors on Sunday at Computex, the computer industry's annual jamboree in Taiwan. Here's a quick summary of what went down between the CPU-GPU rivals. AMD AMD's second-generation Zen-based 7nm Epyc data center processor, codenamed Rome, is, we're told, launching Q3 2019. …

  1. john.jones.name
    Mushroom

    benchmarks ?

    any actual benchmarks would show that for server workloads (with security patches applied for Side Channel Attacks) it looks like Intel is toast...

    that combined with custom design projects means that AMD is taking most of the data center and embedded design wins, so Intel is left with laptops and desktops... not much growth there...

    they had better build that Irish fabrication plant quick...

    1. Korev Silver badge
      Terminator

      Re: benchmarks ?

      any actual benchmarks would show that for server workloads (with security patches applied for Side Channel Attacks) it looks like Intel is toast...

      Isn't the point of benchmarks to show the actual situation in a reproducible way[0] so people don't have to speculate[1] about it?

      [0] Obviously how to benchmark is controversial and each vendor has its own preferred "method"

      [1] Pun semi-intended

  2. Tom 64
    Thumb Up

    Lookin' good

    AMD's new lineup does look good (I'll be picking up a Ryzen).

    Great to see them back in the game and giving intel a run for their money.

    1. Anonymous Coward
      Anonymous Coward

      Re: Lookin' good

      I saw some reviews of AMD's new x570 motherboard chipset, and combined with Gen 3 of Ryzen it appears they have not just stolen a march on Intel, but possibly the whole parade for a few years to come.

      I just hope Intel doesn't do a Boeing in trying to catch up.

      1. Peter2 Silver badge

        Re: Lookin' good

        Intel doesn't need to do a Boeing, they do an Intel.

        Look up their previous behavior when AMD comes out with hardware many years in advance of what they have. Or what they are doing this time, for that matter.

        1. whitepines
          Devil

          Re: Lookin' good

          Let's see...

          * Silently disable all security protections that might slow things down, while making sure their competitors have all protections turned up to max when benchmarking.

          * Cheat on the benchmarks with ICC

          * Try really hard to make all benchmarks about single-threaded performance, then issue a 1kW chip clocked to some insane speed that just barely squeaks out a win over a standard part from another vendor

          * Lie about TDP

          * Rename their lagging process (again) to something like 14nm++SuperSpiffy

          * Crank up the DRM in their PAVP / IME. Maybe if Netflix requires a new CPU, they can sell more

          * Bribe Microsoft to accidentally crash Windows 10 on install / upgrade unless the CPU is new enough

          * Spread FUD about competitors

          * Pretend to embrace open technologies until they can fix their fab, then go "whoopsie, we didn't mean that. Suckers!"

          * Engage in shady back-room deals with cheap Chinese OEMs to keep Intel prices artificially low for consumers, while shafting corporations.

          Did I miss anything?

          1. Anonymous Coward
            Anonymous Coward

            Re: Lookin' good

            * It's suspiciously hard to order AMD-powered PCs through the same large, well-known vendors who were found to have been bribed to lock AMD out of the market via the rebate scandal Intel had running.

            Dell basically doesn't stock AMD stuff. (OK, technically they have an AMD desktop for like £5k, which doesn't exactly play to AMD's traditional strength of being about as good as Intel but a lot cheaper...) Skeptical? Look at their website.

            HP sort of has a handful of AMD desktops, but good luck finding them, because they aren't obviously listed on the HP site. Have a look for cheapish AMD boxes on the HP website. Failed to find anything? Funny that.

            Now search for the HP EliteDesk 705. This is a Ryzen 5 4-core/8-thread processor with a discrete Radeon R7 430 that can do dual screens out of the box, a 256GB SSD and 8GB RAM for ~£400.

            The reseller I was using had to ask me for the SKU because they couldn't find it through their system. When they did get it, and placed an order, I got a call shortly afterwards saying that they had been told that it was out of stock and would I like a more powerful Intel equivalent for the same price?

            I declined this kind offer, only to find that the utterly unavailable order was delivered within 48 hours anyway. Call me suspicious, but that looks like a deliberate effort to keep the volume of AMD hardware shifted down, while (probably) allowing them to claim that they aren't doing the same anti-trust stuff this time around as they did last time.

    2. John Gamble

      Re: Lookin' good

      Yeah, I'm in the market for a new desktop, and the 3700X is looking good (the power use drop was one of the selling points for me).

      Haven't decided which OS I'm installing on it, though. I'd want to see what the BSDs and Linuxes are doing with it first.

      1. Tom 64
        Pint

        Re: Lookin' good

        It will probably be very well behaved under a modern linux kernel (e.g on fedora 30). Keep your eye on phoronix after release.

  3. John Savard

    Definitely of Interest

    One thing that had me wondering was why only 12 cores? But on further thought, it seems like a stroke of genius. Making a top-of-the-line part, and using only one of these fancy multi-chip cases, with two dies with six good cores is a better way to sell those cores than a six-core part that would not be exciting and would have to sell at a low price. As well, issues with not enough memory bandwidth to feed more than 8 cores are at least reduced.

    1. chuckufarley Silver badge

      Re: Definitely of Interest

      "One thing that had me wondering was why only 12 cores? But on further thought, it seems like a stroke of genius. Making a top-of-the-line part, and using only one of these fancy multi-chip cases, with two dies with six good cores is a better way to sell those cores than a six-core part that would not be exciting and would have to sell at a low price. As well, issues with not enough memory bandwidth to feed more than 8 cores are at least reduced."

      If I remember right, the chips are built from multiple dies "glued" together so the impact of manufacturing defects is reduced. Rumors state the Ryzen 3000 line has a 70% yield rate across the board. Rumor also has it that the Ryzen 3000 chips will support DDR4 at 5000MT/s overclocked and 3200MT/s out of the box.
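      A rough sketch of why smaller dies help yield, using a simple Poisson defect model with made-up numbers - purely illustrative, not AMD's or TSMC's actual figures:

        # Illustrative only: fraction of defect-free dies under a Poisson
        # defect model, comparing one small chiplet against a monolithic die
        # of the same total area. Defect density and areas are guesses.
        import math

        defect_density = 0.2                 # defects per cm^2 (hypothetical)
        chiplet_area = 0.8                   # cm^2 per small chiplet (guess)
        monolithic_area = 8 * chiplet_area   # same silicon in one big die

        def zero_defect_yield(area_cm2, d0=defect_density):
            return math.exp(-d0 * area_cm2)

        print(f"per-chiplet yield:    {zero_defect_yield(chiplet_area):.1%}")
        print(f"monolithic die yield: {zero_defect_yield(monolithic_area):.1%}")
        # A chiplet with one bad core can still be sold with that core fused
        # off, which is how dies with six good cores end up in 12-core parts.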

      1. phuzz Silver badge
        Thumb Up

        Re: Definitely of Interest

        "Rumors state the Ryzen 3000 line has a 70% yield rate"

        AMD used to sell three-core CPUs. These were originally four-core parts where one core was faulty, so AMD just disabled the faulty one and sold it for cheap. Perhaps they'll do the same with some of the Ryzen 3 left-overs? A sept-core (hept-core?) processor would be an interesting upgrade.

        1. Peter2 Silver badge

          Re: Definitely of Interest

          They did. I personally know somebody who had one, and re-enabled the disabled core and used it as a quad core without any problems.

          1. whitepines
            Facepalm

            Re: Definitely of Interest

            re-enabled the disabled core and used it as a quad core without any problems.

            Without any *known* problems. The thing could be silently corrupting every thousandth result of a specific operation, but until it corrupts enough of the OS's core functionality or whatever game was being played, the damage stays hidden.
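            A minimal sketch of the sort of consistency check that would surface that kind of silent error - purely hypothetical, and nothing to do with any vendor's own validation: repeat a deterministic calculation and flag any run that disagrees with the reference.

              # Re-run a deterministic workload and compare against a reference;
              # a silently faulty core shows up as a mismatched result.
              import hashlib

              def workload() -> bytes:
                  acc = 0
                  for i in range(1, 200_000):
                      acc = (acc * 6364136223846793005 + i) % (1 << 64)
                  return acc.to_bytes(8, "little")

              reference = hashlib.sha256(workload()).hexdigest()
              for run in range(100):
                  if hashlib.sha256(workload()).hexdigest() != reference:
                      print(f"run {run}: mismatch - possible silent corruption")
                      break
              else:
                  print("all runs matched the reference result")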

    2. Anonymous Coward
      Anonymous Coward

      Re: Definitely of Interest

      “One thing that had me wondering was why only 12 cores? ”

      My initial guess was that the 7nm process was producing too many defects or that they had issues with thermals, particularly when combined with the delay/information black hole around threadripper.

      Based on the news yesterday, I’m less pessimistic. Thermals are about what was expected so no obvious issues there and performance/clocks are a little better than I expected.

      Either the 16 core parts will become Threadripper parts with full cache and 12/16 cores or AMD are maxed on production and all fully working parts are being used for servers. Yes I know servers are supposed to be released in Q3, but “custom” Intel chips were making it into cloud data centres months before they hit mainstream.

    3. eldakka
      Holmes

      Re: Definitely of Interest

      One thing that had me wondering was why only 12 cores?
      So that they've got something up their sleeve to snatch the headlines back in 6 months - announcing a 16-core consumer part.

      1. Robot W

        Re: Definitely of Interest

        Exactly. Because there is no market need to ship a 16 core part at this time.

        It also allows them to release the 12 core part at $499 now, which also allows them to release a 16 core part at a slightly higher price later, perhaps $599.

        1. -tim
          Coat

          Re: Definitely of Interest

          I've been hunting for a lower power AMD Ryzen appliance type server with no luck. I can't be the only one who is replacing very old gear with newer and finding I don't need anywhere close to what a modern server delivers. I want 1 RU, dual power supplies, ECC, dual ethernet, lights out management and the ability to put about the slowest modern cpu I can find in it. Not everyone can virtualize everything and the load is never going to need the power of a modern cpu.

          1. Anonymous Coward
            Anonymous Coward

            Re: Definitely of Interest

            You're building a file server?

            :)

        2. chuckufarley Silver badge

          Re: Definitely of Interest

          "Because there is no market need to ship a 16 core part at this time."

          I do not think that there is a "Consumer" market for 8 or 12 core chips at this point in time. Any workload that needs more than 4 cores (and therefore requires more processing power than a laptop CPU could provide) is not really the kind of thing the general consumer is likely to engage in. While there are some notable exceptions (e.g. gaming, content creation, and perhaps software development) I think that high core counts in consumer parts will just add idle cores most of the time.

          Based on that I am willing to predict that after the marketing hype battles between Intel and AMD have calmed down these more powerful "Consumer" CPUs from both brands will eventually be considered niche products and be folded back into the workstation and server class SKUs in order to cut the costs of production and maximize profits for all of the manufacturers in the channel. I don't think this will happen in less than five years but I have been wrong before.

          1. Anonymous Coward
            Anonymous Coward

            Re: Definitely of Interest

            "I do not think that there is a "Consumer" market for 8 or 12 core chips at this point in time. Any workload that needs more than 4 cores (and therefore requires more processing power a laptop CPU could provide) is not really the kind of thing the general consumer is likely to engage in."

            Processing raw files from digital cameras tends to hammer the CPU quite a bit, especially with 20-45MP being commonplace in DSLRs and CSCs, and especially if you're rendering 100+ raw files into JPEGs. Somewhat of a "niche" hobby, but definitely within the consumer sphere...
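            For what it's worth, that kind of batch conversion parallelises nicely across however many cores you have. A minimal sketch, assuming the third-party rawpy and imageio packages are installed; the directory names and file extensions are placeholders:

              # Convert a folder of camera RAW files to JPEG, one worker per core.
              import multiprocessing as mp
              from pathlib import Path

              import imageio
              import rawpy

              SRC = Path("raw_files")   # hypothetical input directory
              DST = Path("jpegs")       # hypothetical output directory

              def convert(path: Path) -> str:
                  with rawpy.imread(str(path)) as raw:
                      rgb = raw.postprocess()   # demosaic etc. - the CPU-heavy bit
                  out = DST / (path.stem + ".jpg")
                  imageio.imwrite(out, rgb)
                  return out.name

              if __name__ == "__main__":
                  DST.mkdir(exist_ok=True)
                  files = sorted(SRC.glob("*.CR2")) + sorted(SRC.glob("*.NEF"))
                  with mp.Pool() as pool:       # defaults to one worker per core
                      for name in pool.imap_unordered(convert, files):
                          print("wrote", name)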

            1. John H Woods Silver badge

              Re: Definitely of Interest

              ... and gaming is even less niche

              1. Anonymous Coward
                Anonymous Coward

                Re: Definitely of Interest

                PC gaming accounts for around 5m x86 processors a year out of around 250m total. And dropping, with the sales emphasis moving to consoles now they’re on x86 too.

                Ie, it's niche, at about 2%

                1. Anonymous Coward
                  Anonymous Coward

                  Re: Definitely of Interest

                  I mainly intended my home computer for productivity (office, etc *yawn*) when I built it about a year and a half ago. However, I made sure it would be at least competent at games (Ryzen 7 1700, GTX 1070, etc.), and I figure I'm probably not alone there in the home computer crowd. Therefore, you've likely got people buying computers for "productivity" as far as the survey results would show, but who are still interested in gaming performance.

      2. Anonymous Coward
        Anonymous Coward

        Re: Definitely of Interest

        "So that they've got something up their sleeve to snatch the headlines back in 6 months - announcing a 16-core consumer part."

        Don't forget that AMD do not have unconstrained production with TSMC - they have to pay for any wafers baked and AMD are competing with a lot of other big companies for wafer baking time.

        While keeping something up their sleeve for later is good marketing sense given the performance details released so far, selling as many chips per wafer as possible is even more important if you want to show what's up your sleeve in 6 months.

        You know the old saying - "a 16-core CPU up your sleeve is better than empty pockets and no one willing to produce your marvellous designs"

    4. a pressbutton

      Re: Definitely of Interest

      One more than eleven?

    5. takyon

      Re: Definitely of Interest

      Unmentioned in the article are two 6-core Ryzen 5 CPUs:

      Ryzen 5 3600X = 6 cores, 12 threads, 3.8 - 4.4 GHz, 95 W, $249

      Ryzen 5 3600 = 6 cores, 12 threads, 3.6 - 4.2 GHz, 65 W, $199

      They were in a press release, not mentioned at the Computex keynote. They will also be available on July 7 (or more realistically, the day after, as July 7 is a Sunday).

      I think they can make the 16-core part work with the available bandwidth, they just don't need to. For once, Intel is the one not putting competitive pressure on its rival. AMD can announce a 16-core "3950X" or something later in the year, with the "50" again alluding to the company's 50th anniversary.

      1. diodesign (Written by Reg staff) Silver badge

        "Unmentioned in the article"

        Yeah, they weren't included in the keynote but are in the linked-to announcement. I'll throw them into the article, too.

        C.

  4. chuckufarley Silver badge

    I have been watching...

    ...the Ryzen 3000 rumors closely for months. Our company plans on jumping off the train wreck with Intel inside when July 7th comes. We are a small shop but over the next year will be replacing every single Intel CPU with new AMD parts. Our test systems with the first-gen Epyc CPUs have convinced the managers and bean counters to go AMD when the new server chips are released. The one question everyone is asking: can AMD's supply meet our demand?

    1. Anonymous Coward
      Anonymous Coward

      Re: I have been watching...

      Ditto here - I just specced a dual DC setup, and I may just keep a few VM boxes off the procurement list so that I can stick those on an AMD chassis to see what the difference is.

      For the rest it's too early; I want those new CPUs and associated board chipsets to be in use for at least 6-12 months before I stick them anywhere mission critical.

  5. Charles 9

    Ray Tracing

    Honest question. nVidia's new RTX architecture is supposed to bring great improvements in real-time ray tracing. What is the word on RDNA relative to this?

    1. Anonymous Coward
      Anonymous Coward

      Re: Ray Tracing

      Word on the street is that Sony and AMD were jointly working on this, and whilst AMD can include it in the consumer products, they can't include it in their OEM parts, essentially meaning the next XBox (if Microsoft bother with another one), will have some serious deficiencies compared to PS5 and PC gaming.

      1. Down not across

        Re: Ray Tracing

        essentially meaning the next XBox (if Microsoft bother with another one), will have some serious deficiencies compared to PS5 and PC gaming.

        If? They've already announced Scarlett (Lockhart and Anaconda) being in development. Rumours include DirectX ray tracing support, but rumours are rumours. Maybe we'll find out more in June at E3.

        1. Anonymous Coward
          Anonymous Coward

          Re: Ray Tracing

          Microsoft say lots of things. If the sheer number of about-turns last gaming generation is anything to go by, they could have already canned things.

          1. Down not across

            Re: Ray Tracing

            Microsoft say lots of things. If the sheer number of about-turns last gaming generation is anything to go by, they could have already canned things.

            So does Sony. Backwards compatibility in hardware, ah sorry not for you in Europe. Want to run Linux, sure go ahead, actually we've changed our minds and dropped it now you bought the console.

            About-turns are the way the gaming industry goes. Of course they can still can it, but, despite having released specs, so could Sony. In gaming nothing is certain until it has actually been released for real, and even then it might not be what was anticipated.

            1. Anonymous Coward
              Anonymous Coward

              Re: Ray Tracing

              Butt hurt fanbot alert. Nobody was talking about Sony. Both these things you mention have a very sane reason behind them. The hardware BC was removed because everyone complained about the PS3 price (be careful what you ask for, springs to mind). The Linux support wasn't just pretty much unusable; it also became a security liability.

              Microsoft's turnarounds had no sane reason. Every week the Xbox One had some huge about-turn and fumble in its product direction and feature set. It also happened with the Xbox 360, where they split their user base and removed the hard drive to make a cheaper model, and game developers just coded for that, thereby gimping the more expensive hard-drive-equipped models forever more...

  6. Kobblestown

    The Ryzen 9 3900X has 70MB of cache (6MB L2 + 64MB L3)! Holy mother of God!

    Now I know what CPU would Jesus run...

    1. Anonymous Coward
      Anonymous Coward

      And that’s with one core complex disabled - the 16 core parts hit 72MB and I guess the server parts will hit 64 cores/144MB cache...

      1. Kobblestown

        Well, the server parts should actually be twice that. Do the math: 2 chiplets - 72MB max, 8 chiplets - 288MB max. Most people can still remember the times when that was considered a good amount of main memory. Man, that's about the amount of main memory the PS3 had!
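        The arithmetic behind those figures, as a quick sanity check (512KB of L2 per core and 32MB of L3 per chiplet, as per the numbers quoted elsewhere in this thread):

          # Total cache = per-core L2 + per-chiplet L3.
          L2_PER_CORE_KB = 512
          L3_PER_CHIPLET_MB = 32

          def total_cache_mb(cores, chiplets):
              return cores * L2_PER_CORE_KB / 1024 + chiplets * L3_PER_CHIPLET_MB

          print(total_cache_mb(12, 2))   # Ryzen 9 3900X:         70.0 MB
          print(total_cache_mb(16, 2))   # 16-core desktop part:  72.0 MB
          print(total_cache_mb(64, 8))   # 64-core Epyc Rome:    288.0 MB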

        1. Tomato42

          For quite a few years now PCs have had more cache memory than my first PC had in total, and main memory faster than the cache in my first PC (I distinctly remember being amazed at seeing an over-1GB/s benchmark for it).

          now we have copper network cables with more bandwidth than that!

          1. John Brown (no body) Silver badge
            Coat

            640KB should be enough for anyone.

            1. defiler

              Pfft - 48KB was enough for Manic Miner.

              1. John Brown (no body) Silver badge

                "Pfft - 48KB was enough for Manic Miner."

                Pfft! Spoiled rich kids with 48KB Speccies. Never knew they were born! Manic Miner played perfectly well on a 16KB Speccy, as we poor, hard-done-by kids knew!

                1. Anonymous Coward
                  Anonymous Coward

                  Hah! Luxury!

                  Try wumpus on a KIM-1 in 1KB of RAM. And no posh screen - a hex display and that was it.

                  :)

                  (Now waiting for someone to come up with PDP-11 games, or first having to wire in some magnetic cores or read in cards to fire up a game :) ).

            2. GrumpenKraut
              Thumb Up

              > 640KB should be enough for anyone.

              If it is first level cache: yes.

              Enough for the next few years, that is.

            3. Anonymous Coward
              Anonymous Coward

              Acorn Atom - 6kB will be enough - oops, the case dimensions we were given were external and not internal, so the PCB won't fit. Sigh, cut out a column of RAM chips to shrink the width so it fits, and 5.75kB will be enough.

              1. Michael Wojcik Silver badge

                5.75kB

                Luxury! The first[1] machine I did any real coding for had only 4 KB of RAM.

                I think that was the smallest physical memory space of any general-purpose machine I worked on, but I also wrote some programs for the HP-15C, which had, what, 2 KB? for its RPN goodness.

                [1] Yeah, I'm a relative newcomer to the industry, compared to some of the greybeards around here. Though later I used older machines. If memory serves, the oldest systems I actually wrote code on were a couple of PDP-11s, likely from the early 1970s.

      2. Robot W

        64-core Rome is expected to ship with 256MB of L3 cache (32MB per chiplet) plus 512KB of L2 cache per core.

    2. Marco van de Voort

      That is indeed interesting: 32MB per chiplet. I haven't seen any details yet on whether Ryzen 3000's L3 is again a less efficient exclusive cache.

      Also, many image processing benchmarks show a great "3000" improvement, but be warned that AVX performance has doubled since the "2000" series. Some benchmarks are biased towards AVX, so if your workloads aren't AVX-heavy, your mileage may vary.

      In the end, it is probably still best to wait until the embargoes on benchmarks are lifted (and even then, selecting the benchmark that is relevant to you is an art (*)).

      (*) I usually go with the compiler benchmarks: even though I'm in image processing, I've found these are closer to my applications' real-world performance, probably because my vision apps are not stuffed with whole-image operations.
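      One quick way to get a feel for how much a benchmark leans on wide vector units is to time the same arithmetic vectorised and as a plain loop; a minimal sketch using NumPy (whose builds generally use SIMD/AVX where available). Illustrative only - the scalar loop also pays Python interpreter overhead, so treat the ratio as indicative rather than a pure AVX measurement.

        # Same dot product, vectorised vs scalar, to show how SIMD-heavy
        # code paths can dominate a benchmark result.
        import time
        import numpy as np

        n = 2_000_000
        a, b = np.random.rand(n), np.random.rand(n)

        t0 = time.perf_counter()
        vec = np.dot(a, b)                 # SIMD/AVX path in most NumPy builds
        t_vec = time.perf_counter() - t0

        t0 = time.perf_counter()
        scalar = sum(x * y for x, y in zip(a.tolist(), b.tolist()))
        t_scalar = time.perf_counter() - t0

        print(f"vectorised {t_vec:.4f}s, scalar loop {t_scalar:.4f}s, "
              f"ratio ~{t_scalar / max(t_vec, 1e-9):.0f}x")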

    3. takyon
      Boffin

      cache me outside

      Hmm, I wonder if the huge cache will alleviate potential issues with having 2 chiplets and 12-16 cores.

      By the way, 64-core Epyc will have 32 MB of L2 cache and 256 MB of L3 cache.

  7. Anonymous Coward
    Anonymous Coward

    The 5000 was picked because it's AMD's 50th anniversary

    Maybe it's just me, but surely 50 or 50A would be more appropriate?

    e.g. RX50-1, RX50A-1 etc.

    :-D

    1. Tomato42

      Re: The 5000 was picked because it's AMD's 50th anniversary

      oh, you silly AC, everybody knows that Grundmaster 5000 is better than Grundmaster 3000 but worse than Grundmaster 7000!

  8. Permidion

    Intel: anything about spectre meltdown and related?

    Do the Intel benchmarks take into consideration the degraded performance due to mitigations for Spectre, Meltdown and other funny stuff of the same sort?

    1. GrumpenKraut

      Re: Intel: anything about spectre meltdown and related?

      No. Nor do they account for disabling hyperthreading.

      IMO benchmarks with hyperthreading off should be shown no matter what; there are reasons beyond security to disable it. On HPC clusters hyperthreading off seems to be the default, am I correct?

      1. Korev Silver badge
        Boffin

        Re: Intel: anything about spectre meltdown and related?

        On HPC clusters hyperthreading off seems to be the default, am I correct?

        Yes, in most places.
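      For anyone wanting to check what a given Linux box is actually benchmarking with, recent kernels expose both the mitigation state and the SMT control knob in sysfs; a minimal read-only sketch (Linux-only, paths present on reasonably recent kernels):

        # Report Spectre/Meltdown mitigation status and the SMT setting.
        from pathlib import Path

        vulns = Path("/sys/devices/system/cpu/vulnerabilities")
        for entry in sorted(vulns.glob("*")):
            print(f"{entry.name:24s} {entry.read_text().strip()}")

        smt = Path("/sys/devices/system/cpu/smt/control")
        if smt.exists():
            print("SMT control:", smt.read_text().strip())   # 'on', 'off', ...

      Writing "off" to that control file as root is the usual way to disable SMT at runtime for an apples-to-apples run.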

  9. David Shaw

    I've just built a DIY AMD PC system from all the very cheap leftover bits in Warehouse Deals.

    I have put a £20 quad-core AMD something in the £20 AM4 mobo, so that I can flash the BIOS, then put in the lowest-spec Zen 2 Matisse hexa-core 3.2 GHz part when it is released later this summer. It's working great 'til then for email etc.

    I'm not sure if my mobo can be flashed to PCIe 4.0, at least for the M.2 and first PCIe slot - I've read that X470 and B450 boards from Gigabyte have just allowed this retrospective upgrade when paired with the Matisse CPU range. Those with a bigger budget can look at the X570 and B550, but they are just incremental upgrades from last year's models.

    (This "AMD 50" story had a mild positive benefit, as the RX 570 that I bought for about £100 came with two free games associated with the fifty-years promo.)

  10. whitepines
    Unhappy

    Too bad these are insecure because of the AMD "Platform Security Processor", which runs AMD-signed, AMD-controlled code that limits what you can do with "your" system. Plus, like the Intel IME, it presents a massive hazard to your privacy and system security - a hazard that AMD takes no responsibility for.

    I'll pass, thanks.

  11. Anonymous Coward
    Unhappy

    Sigh.

    Heartbleed? Check.

    Spectre? Check.

    NSA backdoor? Check.

    Totally undocumented 'Management' engine? Check.

    Press fawning over Intel and AMD; forgetting all the architectural blunders and security holes? Check.

    Same old shit, slightly different die? Check.

  12. Anonymous Coward
    Anonymous Coward

    About 20 years ago I was upgrading to faster kit at least once a year - sometimes several times a year. The step performance improvements were always very significant.

    For the last decade I have not considered it worth moving up from my Core i7-870 and 16GB RAM with a PCIe 2.0 graphics card.

    What is the likely speed improvement by going to the new cpus etc?

    1. cb7

      It really depends on what you use the machine for.

      If you do a lot of video encoding, the speed increase could be significant, but then some of that could also come from a graphics card upgrade.

      In day to day use, some of the speed increase comes from the faster storage interfaces the newer chips support.

      The i9-9900K is around 2-3 times faster than your chip:

      https://cpu.userbenchmark.com/Compare/Intel-Core-i9-9900K-vs-Intel-Core-i7-870/4028vsm961

  13. ColonelClaw

    "Onto the Zen 2.0 third-generation 7nm Ryzen 7 processors"

    What is it with tech companies and their bizarre naming schemes? No doubt they have their reasons, but generally '2.0' and 'third-generation' in the same name is just going to confuse the average punter.

    1. chuckufarley Silver badge

      The Average Punter...

      ...is confused by the Electoral College.

  14. anonymous boring coward Silver badge

    Thank God for AMD!
