AMD claims high-end Big Navi Radeon GPUs leave Nvidia's ray-tracing cards in the dust

AMD showed off three new graphics processor units for PC gamers on Wednesday, claiming its chips offered better performance than the ones in Nvidia's latest GeForce RTX 30 line. If you missed CEO Lisa Su's presentation, you can catch it below. In the short but sweet video, Su holds up a square metal-encased chip. Lisa Su …

  1. Def Silver badge

That's all very well, but are their drivers any good these days? And do they still ship that god-awful control panel that looks like something designed by a five-year-old with an unlimited supply of crayons?

    1. Anonymous Coward
      Anonymous Coward

      "... drivers any good these days?"

FWIW, on Linux they're better than ever (remember, FWIW), but I'm not sure about Windows, or about any OS when the hardware is running at full tilt :-/ (multiple displays suck, but they do everywhere, so...)

Considering AMD is trying to break into "AI", and already has laptops as well as game consoles, I don't think AMD cares about desktop GPUs, or at best it sees them as a testing/research platform. Of course, with Nvidia's purchase of ARM, I'm not too sure how long Nvidia will care about desktop GPUs either (it kind of looks like Nvidia wants to become AMD). Who knows, but I do wonder what percentage of any company's sales desktop GPUs represent, and whether it would be noticed if that business was lost.

      1. Gene Cash Silver badge

        on Linux they're better than ever

        That's not a very high bar...

        1. Teiwaz

          That's not a very high bar...

          Got to be better than my Desktop experience with Nvidia.

After several years, Wayland is still unreachable, and the Plasma desktop is glitchy as hell.

Really a step back from my last build, despite this build having a discrete graphics card versus an onboard AMD chip on the last one, and that was pre-amdgpu driver.

          1. Anonymous Coward
            Anonymous Coward

            Re: That's not a very high bar...

Was your build an RTX 20XX or GTX 1650, i.e. Turing-based?

The drivers (or silicon) have been plagued with lots of issues in various games.

            1. Teiwaz

              Re: That's not a very high bar...

              GTX 1080. They've had plenty of time to iron out the issues.

              But then, they're probably itching to terminate support

      2. Kevin McMurtrie Silver badge

        My Nvidia GTX 1060 has been nothing but trouble in Linux. It's missing/corrupted after a sleep/wake cycle and it took over a year to get drivers that could hit double-digit frame rates in a low-end game like Torchlight ][ at 4K. I would definitely try an AMD card, though probably not ones like these that need a dedicated mains breaker.

  2. Anonymous Coward
    Anonymous Coward

    If remotely true ...

this is going to be big! NV will have to be worried, and it may explain their "back to sanity" pricing strategy.

Hopefully, as stated above, the drivers will be good. I see no reason why not, but still.

  3. Anonymous Coward
    Anonymous Coward

    A proprietary ~7%.

I think the article should have mentioned, or maybe even highlighted, the proprietary angle of CPU+GPU. Or maybe not, I guess; maybe everyone is stumped on why you would want your CPU to directly access the memory of the GPU (and can't it already?). I see the advantages, but they seem slight to me, to the point where I'm wondering why you wouldn't just use some ASIC and throw in more system RAM. Whatever, this is still a proprietary full-stack move that kind of mimics the NVidia "GSync" crap, and to gamers an extra ~7% "performance" is a lot (of course, 7% of what, we'll have to wait and see).

    1. Demmers

      Re: A proprietary ~7%.

      If you make both the CPU and the GPU, why would you NOT want them to work better together? It's not like the GPU won't work well enough with Intel, it's just that you get a slight performance boost when paired with their own 5000 series chips. I honestly don't see the problem there. It's no different to Apple making both the hardware and the software, and people wondering why iOS runs so well on less memory than Android equivalents.

    2. Lorribot Silver badge

      Re: A proprietary ~7%.

There is an advantage to going AMD on both CPU and GPU, probably developed as part of their console work. I am sure AMD will licence the tech to Intel if Intel wants it, as they both have with other technologies, but to be honest, if you have an Intel platform you are going to lose out anyway, as Intel has yet to provide support for PCIe 4.0, and will likely do so just in time for PCIe 5.0 from AMD.

      Intel have been cruising for the last few years as they tried to get their next node up and running.

They are about to release their own discrete graphics cards, which will be built on a third party's node, not their own.

If you were building a new gaming system now, I don't see anything compelling in buying anything Intel, and an all-AMD package would make sense. For some specific use cases you could make a case for Intel, but that is an ever-shrinking pool; not even cost would make the case.

  4. BigAndos

But will you actually be able to buy one? I gave up and cancelled my 3080 preorder after a few weeks. I'm expecting demand for these cards to far outstrip supply, as everyone frustrated with NVIDIA will want one! Decided just to get a refurbished 2070S for now and maybe upgrade next year, when supply evens out and we know more about how these cards truly compare.

    1. cornetman Silver badge

      That's a big if, indeed.

      AMD did send out a letter to resellers "urging them" to put in place provisions to weed out bots and restrict order counts so that people at least have the chance to get one. Whether or not they will, time will tell.

    2. MiguelC Silver badge

      Lately, availability has been a recurring problem with AMD

I'm still waiting for Ryzen 4800U availability for my next laptop - they were announced in July but, although Lenovo has publicized the Yoga Slim 7 equipped with one, that version is nowhere to be seen - nor anyone else's, for that matter.

    3. Qumefox

I have no interest in upgrading from my 2070 Super for the foreseeable future. And while I'm not a super hardcore gamer, I do game some, and as long as the framerates of the stuff I play stay above 60fps, I see no huge need to throw money at the latest and greatest there is.

  5. cornetman Silver badge

    Point of order though on the headline:

The AMD cards pretty much match NVidia for performance, apart from the 6900 XT which, with some of their additional provisions, looks to beat the 3090 in gaming workloads. It's not mind-blowing, but (apart from raw ray-tracing performance) they seem to have taken back the performance crown. AMD's cards score a plus with significantly lower power draw as well.

    The interesting difference for me is the segmentation of the models. All of them are pretty much the same as regards their "infinity cache", main memory size and form factor. The big differentiator seems to be the number of CUs.

    A lot of people will be disappointed that there isn't a budget model for those with a more realistic bank account.

    I thought about this a bit and it may well be that they anticipate a flood of used cards hitting the market (e.g. 5700 from them, 1080/1070/1060s from NVidia) and they wouldn't like to compete against those. No doubt they are still smarting over the ex-mining glut some time ago where they allegedly couldn't shift new stock.

  6. Anonymous Coward
    Anonymous Coward

    Is it just me, or is proudly announcing that your graphics card consumes 300W of power....not a good thing?

    I would have thought it would be something to keep quiet about...

    1. Mattknz1

Amazingly, it's a good thing?

      AMD recommends a 650W PSU to go along with their 6900 XT, whereas Nvidia recommends a 750W PSU for their RTX 3090.

Some obscure YouTuber demonstrated two RTX 3090s in SLI and did manage to brown out a 1000W PSU.

      1. K

Just when I was thinking, lots of amazing new games coming along (CyberPunk, Mass Effect Remastered), so time for a CPU and GPU upgrade - but jeez, those power figures scare the crap out of me.

I run a small home lab with 5 i7-based ESXi servers and 2 NAS boxes with 12 drives between them... and in total they draw less than the requirement for just one of those cards!

    2. Amentheist
      Thumb Up

At maximum load, yes, that much, but at idle or under light load, like most modern hardware (except some AMD CPUs), they're incredibly economical in power usage, e.g. during media playback or browser hardware acceleration.

      1. seven of five Silver badge

        > except some AMD CPUs

This is not true anymore; these days Intel CPUs exceed their TDP, drawing up to 250W (but only for 56 sec max, I promise, guv), while AMD CPUs now stay within theirs.

        But yes, this used to be very different two years ago.

        1. Amentheist

Yeah, actually I've got a Zen 2 one myself; I only said that because in practice it uses a bit more electricity (as measured at the wall socket), but it's insanely faster than my 5+ year old i7.

    3. Anonymous Coward
      Anonymous Coward

      "Is it just me, or is proudly announcing that your graphics card consumes 300W of power....not a good thing?"

      GPU power consumption has been going up over recent years - a lot of recent nVidia boards draw 250-300W+, too :(

    4. mark l 2 Silver badge

Well, they can double as heaters to warm your house during the coming winter months, with those sorts of power draw numbers.

  7. Captain Obvious

I did read up on this in way more detail than this article offers

Notice how they state their card will beat NVidia in several titles and at 4K resolution? If you do a Google search, you will find that the games they listed are pretty much the only ones in which they beat NVidia on FPS. Other games perform much better on the Nvidia cards. Also, I'm not sure how many people game at 4K, but the 1080p comparisons are in Nvidia's favor, as are the 4K titles not listed by AMD (I think only about 5-6 titles performed better on AMD).


Biting the hand that feeds IT © 1998–2021