The new GPU world order is beginning to take shape

For the first time in what feels like an eternity, customers have a third choice in graphics processors with the launch of Intel's mainstream Arc GPUs. It may be that AMD and Nvidia's long-standing duopoly has come to a close. Intel's launch couldn't have come at a better time. After nearly two …

  1. Binraider Silver badge

ARC's poor performance on old stuff (due to emulation of e.g. DX9 calls) is a non-starter for me. But then I like to muck around with old rubbish. Nvidia have been off limits for too long because of crap Linux drivers and more than slightly inflated retail prices.

    And that leaves AMD with something of a monopoly for the time being, at least from my own viewpoint.

I may never touch an ARC GPU at all, but its existence is generally a good thing.

    1. JT_3K

      It's a shame. As a casual modern gamer, the only two "modern" games I'd play are GTA V and I could be convinced to take a run at the new Flight Simulator. ARC is in that price range where I'd consider one to replace my ageing GTX 1080, but I understand I'm likely to get a performance hit for that?

      Good luck to them though, someone needs to add competition in that sphere.

      1. Binraider Silver badge

It's probably a step up from a 1080. The benchmarks I've seen suggest the ARC is about equivalent to (or better than) a 3060 on current APIs, but about half the performance of a 3060 in DX9.

The drivers are still "early access", so to speak, which should be no surprise to anyone. Unless you're in the mood for being an early adopter, I'd give it six months or so for the major glitches to be ironed out.

        Gamers Nexus has done some good reviews of the card if you want to see detail that my own comments are based on.

The video encoding features, if you use them, are a genuine differentiator. Multiple reviewers say the cards make an interesting co-processor option for that purpose alone.

        1. Anonymous Coward
          Anonymous Coward

          "The video encoding features..."

Quick note: the $350 Arc A770 has 16GB of RAM, making it an amazing poor man's video scrubber.

        2. Chz

          Half the performance of a 3060 in DX9 is not a step up from a 1080!

          1. Binraider Silver badge

As I commented above, in "current" APIs it is an improvement.

The answer to "should I get one?" basically depends on what software you want to run, and your budget.

    2. The Dogs Meevonks Silver badge

It's worth considering a budget ARC GPU as a secondary card just for the onboard AV1 encoding/decoding, as even the lowest-spec cards have it.

It's locked behind the 40xx-series paywall on Nvidia cards... no idea about AMD as yet.

So if you're any kind of content creator, streamer, or work in any related field that requires excellent hardware encoding/decoding abilities... it makes far better sense to go ARC and keep an existing card than it does to fork out for anything from Nvidia anymore.
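If you do go that route, the hardware encoder is reachable through ffmpeg's Quick Sync (QSV) path. A rough sketch only — encoder and option names are from ffmpeg's QSV support, and your particular build and driver stack may differ:

```shell
# Hardware AV1 encode on an Arc card via Intel Quick Sync (QSV).
# Needs an ffmpeg build with QSV/oneVPL support and recent Intel media
# drivers; bitrate and preset values here are illustrative.
ffmpeg -hwaccel qsv -i input.mp4 \
       -c:v av1_qsv -preset medium -b:v 6M \
       -c:a copy output_av1.mp4
```

Software AV1 encoders (libaom, SVT-AV1) work on any machine but are far slower, which is the whole point of having the fixed-function block.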

      1. Sampler

        Someone watches LTT...

  2. Cederic Silver badge

    hurrah

    More options, greater competition, all fantastic.

    The midrange isn't just midrange though, it's also 'well above what is needed' these days for anybody running on 1920x1080 screens, and (without seeing the specs) I suspect it's also meeting the needs of 1440p.

    4k and VR are the only real drivers to go above the midrange these days, and not many people are bothering with those. High frame rates at 1440p with all the prettiness are (for me) preferable to mediocre framerates at 4k, even if my eyes could read a 4k screen.

    Real time ray tracing may make a difference, but given the RTX3080 collapses in a puddle if you try and use it at a mere 1440p (and I can't even tell the difference in aesthetics, just the horrific impact on frame rates) that's still very much a gimmick.

So Intel have an opportunity to meet most people's needs at a price point far lower than their competition, and hopefully run cooler (and thus quieter) too.

    1. Joe W Silver badge

      Re: hurrah

      "The midrange isn't just midrange though, it's also 'well above what is needed' these days"

      Yes. And graphics cards are not the only product where this is true...

      1. John Brown (no body) Silver badge

        Re: hurrah

        Just wait and see what MS do with Windows 12 to slow things down. Again.

      2. Prst. V.Jeltz Silver badge

        Re: hurrah

        "Yes. And graphics cards are not the only product where this is true..."

Yes, I noticed yesterday that Lidl's generic razors have 5 blades!

    2. Halfmad

      Re: hurrah

      I can't see me replacing my second hand 1080TI for another few years, it's been a faithful card since 2018.

  3. devin3782

Well, I'm going to buy an A770 for a giggle to see what it's like. I'm sure it'll have a similar driver mess to the early-2000s-era ATI Radeon 8500XT, which after the drivers were fixed was an excellent card, giving rise to the legendary Radeon 9800XT. That said, my GTX 1070 is still playing everything I want, but I would like to play with ray tracing.

I do need to check that QEMU supports resizable BAR when using PCIe passthrough. Although I think these cards support SR-IOV, for which you'd have to buy an enterprise-priced version of Nvidia's hardware if you want that feature, much like what Intel does with their CPUs if you want ECC.
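For what it's worth, resizable BAR support can be probed from the host before going down the passthrough rabbit hole. A sketch, assuming a Linux host — the device address `03:00.0` is a placeholder for whatever lspci reports for your card:

```shell
# Find the GPU's PCI address first (03:00.0 below is a placeholder):
lspci | grep -i vga

# A verbose capabilities dump shows whether the device advertises
# Resizable BAR:
sudo lspci -vv -s 03:00.0 | grep -i "resizable bar"

# On recent kernels, per-BAR resize knobs appear in sysfs when the
# device and kernel both support it:
ls /sys/bus/pci/devices/0000:03:00.0/ | grep '_resize'
```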

    1. Little Mouse

      "but I would like to play with ray tracing"

      After "playing with ray-tracing" on my co-pro'd 386DX back in 1992, I could weep reading that.

      1. Binraider Silver badge

Heh. I endured it on an A2000 with 1MB of RAM and a stock 68000 CPU. Impulse's Imagine, on a coverdisk of Amiga Format.

        It was fun at the time! Also funny how modern rendering is actually not that far removed from the tools that appeared in Imagine waaaay back in the early 90's.

        1. ChrisC Silver badge

          Similar though slightly earlier start for me, with a 1MB A500 running Sculpt 3D at the back end of the 80s. Definitely a lot of fun being around in those early days when stuff like this was first hitting the "mainstream" market, being able to get an early feel for where things were heading.

          1. juice

            Hah

            > Similar though slightly earlier start for me, with a 1MB A500 running Sculpt 3D at the back end of the 80s

            Hah! Vu 3D on the ZX Spectrum.

            Admittedly, I don't think I ever rendered anything other than the built-in goblet model, but hey.

            I'm just going to wait for someone to say that they first tinkered with 3D models by feeding paper tape into a PDP-9, getting a print out of coordinates and then chiselling said coordinates into a large lump of stone...

            And they were lucky!

            1. just another employee

              Re: Hah

              PDP-9 Ha!

              I still remember the increase in render calculation time when we moved from the abacus to the slide rule....

              ;-)

              1. Tom 7

                Re: Hah

                The accuracy dropped enormously but then so did the jitter!

            2. AIBailey
              FAIL

              Re: Hah

              I used PoV (Persistence of Vision) on my Atari ST to render an example file as a 640x480 background image for my dad to use on his PC.

              It took all night and much of the following morning, but I was rewarded with a ~900KB file on my hard drive.

              Unfortunately, the file was too large to fit on a floppy disk, and also too big to zip down (not enough memory on my 1MB ST once GEM and a zip program were loaded) so I've no idea what it eventually looked like...

        2. John Brown (no body) Silver badge

          "Heh. I endured it on a A2000 with 1MB of Ram and a stock 68000 CPU. Impulse Imagine on a coverdisk of Amiga Format."

Same here, but the GUI-based stuff never really attracted me. Like the heinous, vulgar and misguided ST user (spit!) below, I used PoV from the command line, originally on an A1200, eventually upgrading it with an 030 accelerator + FPU, which helped a lot. With some scripting, I was able to render animations and even got asked to make the title anims for a couple of PD CD collections.

By then, I was after every bit of speed I could get, so I used PARNET to network an A500+ to the A1200, and used a lock file to tell each Amiga which frame was being rendered, so when one finished a frame it would start on the one after the other Amiga was rendering. It still took at least a week to render the 50 or so frames at the final required quality though. IIRC, the A500 did about 1 frame for every 8 or so the A1200 did, but like I said, any speed increase was worth it back then :-)

Even nowadays, I still do computationally intensive stuff at the command line on FreeBSD so there are no "wasted" CPU cycles on a GUI. I do my video editing in a GUI where necessary, outputting an edit list, then do the final split/join/re-encode at the command line. POV-Ray is still around. That's next weekend already spoken for now, and since my Amiga HDD has been imaged for FS-UAE, I should still have the files on it somewhere. I wonder how long that anim will take to render on a modern fast PC :-)
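That lock-file trick still generalises nicely. A minimal sketch in Python of the claim-a-frame-atomically idea (the lock directory and frame count here are made up for illustration):

```python
import os
import tempfile

def claim_next_frame(lock_dir: str, total_frames: int):
    """Claim the next unrendered frame by atomically creating a lock file.

    O_CREAT | O_EXCL makes the creation atomic, so two render nodes
    sharing the directory (e.g. over a network mount) can never grab
    the same frame. Returns the frame number, or None if all claimed.
    """
    for frame in range(total_frames):
        lock_path = os.path.join(lock_dir, f"frame_{frame:04d}.lock")
        try:
            fd = os.open(lock_path, os.O_CREAT | os.O_EXCL | os.O_WRONLY)
            os.close(fd)
            return frame  # this node renders this frame
        except FileExistsError:
            continue  # another node got there first
    return None

# Two simulated nodes sharing one lock directory never collide:
with tempfile.TemporaryDirectory() as d:
    first = claim_next_frame(d, 3)   # -> 0
    second = claim_next_frame(d, 3)  # -> 1
```

The same pattern works with any shared filesystem, which is essentially what PARNET provided between the two Amigas.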

    2. The Dogs Meevonks Silver badge

      Picked up a 6900XT back in June for £785 and then sold my 5700XT for £150 to a friend (£100 below avg prices at the time)... So a 6900XT for £635 was a great deal... I got it from Amazon warehouse deals and the only issue was a slightly damaged open box... Contents were mint and unopened.

Sure, if I'd waited I could have saved some more money... But I'd already been waiting nearly 2yrs to replace the old card, which I got for £80 less than MSRP new... and that was fine for 1080p 100fps gaming, but I now had a 165Hz 1440p HDR monitor... and wanted to push it further.

Tried raytracing and wasn't hugely impressed... Far more impressed with FSR 2.0, and am using it on almost every game now... 1080p to 1440p, maxing out settings and getting well over 100fps on every new game released in the last year that I've tried... and maxing out the monitor on a lot more.

      Happy with the purchase... and it cost a lot less than a 4070... sorry... shitty 4080 will

  4. Anonymous Coward
    Anonymous Coward

    In a high energy price world...

Why pay £1500 for a card that's sucking down 700W anyway?

Raw power is no longer all people look for in a card; where is the efficiency gain? Nvidia are barking up the wrong tree if they think that the world and their dog can not only afford the up-front price but also stomach the ongoing running cost of their kit at the 4xxx level.

    1. Anonymous Coward
      Anonymous Coward

      Re: In a high energy price world...

      We sorely need international efficiency requirements. California already has these and they're preventing people pushing carbon emissions to play their pointless games at 4k 240fps.

      1. Binraider Silver badge

        Re: In a high energy price world...

The limits of a US-ian plug socket for current represent more or less the hard limit on power draw, hence 1300W PSUs are more or less the upper limit of what will ever practically appear in consumer space.

That a PC is into vacuum-cleaner power-draw territory is quite ridiculous in and of itself; the PPC G5 died of horrible power consumption / thermal output, as did the Pentium IV and Itanium.

        But yeah, I won't be buying a 700W Gfx card for consumer use, no matter the FPS. I can just-about justify one for work if the main piece of software we use can exploit it (not yet).

    2. Boothy

      Re: In a high energy price world...

      Where do you get 700 Watts from?

      The new top end RTX 4090 is a 450W card, same as the previous 3090 Ti, but the card runs anywhere from 50% to 100% faster than the earlier card, depending on game. So is much more efficient than the previous generation cards.

      Bear in mind these are now on a much newer TSMC node, compared to the previous Samsung node.

Unless you're running uncapped framerates, you'll likely pull less power from the wall with the 4000 cards compared to the 3000s, especially with a more regular mid-tier card rather than the 450W 4090.

      1. The_Wisest_One

        Re: In a high energy price world...

The transient spikes on the 4090 are 2x to 3x its requirements.

Hence you will need a larger PSU to be able to deal with them, otherwise it will trip and fail.

Sure, it only requires 400W to run. But it NEEDS 800W to get it started.

        The fact you're not aware of this, makes you a sucker.

        1. Boothy

          Re: In a high energy price world...

          Quote: "The transient spikes on the 4090 are x2 to x3 of it's requirements."

          No they are not, you need to update your knowledge. The x2 to x3 was just rumours, supposedly taken from early PSU testing of the 4090 chips, which were apparently mounted on modified 3090 boards early on.

          The power delivery for the 4000 cards is new, or at least updated, since the earlier 3000 cards. The actual transients are much less now.

          Gamers Nexus (who have a full lab for this type of testing), showed transient spikes being a maximum of 40% over nominal, not the 200% to 300% you're talking about.

Also, don't use cheap PSUs if you're fitting a top-end GPU, as good-quality PSUs cope with transients just fine.
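To put numbers on that, a back-of-the-envelope sizing sketch. The 40% figure is the Gamers Nexus measurement mentioned above; the rest-of-system wattage is an illustrative guess, not a spec:

```python
def psu_headroom(card_nominal_w: float, rest_of_system_w: float,
                 transient_factor: float = 1.4) -> float:
    """Rough worst-case wall draw if the GPU spikes to
    transient_factor x its nominal board power.

    1.4 corresponds to the ~40% overshoot measured in lab testing;
    3.0 would be the debunked rumour figure."""
    return card_nominal_w * transient_factor + rest_of_system_w

# 450 W card spiking 40%, plus ~250 W for CPU/board/drives:
peak = psu_headroom(450, 250)            # 880 W - a good 1000 W unit copes
rumour = psu_headroom(450, 250, 3.0)     # 1600 W - why the 3x claim is absurd
```

The gap between those two numbers is exactly why the "needs 800W to start" claim doesn't survive contact with actual measurements.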

  5. khjohansen

    i740

    .. Some of us olde geezers have not forgotten ...

  6. fishman

    Recession?

If there is a worldwide recession, Nvidia may end up sitting on a whole lot of product. And if they slash prices to move it, they will piss off many of those who paid the original price.

    1. Dave 126 Silver badge

      Re: Recession?

How do times of recession track with people playing PC games? For sure, people might hesitate to buy an expensive GPU, but conversely, some people play PC games as a cheaper option than going to the pub five nights a week.

Once you've got the hardware, PC gaming is a relatively inexpensive hobby. Well, financially at least - the health consequences of being sat down for long periods of time are becoming better known.

      1. Richard 12 Silver badge

        Re: Recession?

        Games themselves tend to sell better, but the hardware does not.

        People game more but keep their existing PC or console for longer.

  7. Anonymous Coward
    Anonymous Coward

    Optimism abounds

But the die is already cast. Intel is pushing another round of mediocre crap, and the only people excited by the prospect are system integrators who miss the days when they could add $300 to the system price for crappy built-in shared-memory graphics on the Intel CPUs. The last round of Intel graphics was literally a box-checking exercise. These new Arcs ARE better, mostly due to banishing the crippling performance hit of that shared-memory architecture.

The commitment to the meaningless middle means the project is born under a cloud, and it will likely never escape the fart-like smell it arrived with. These cards are destined to be ripped out by Christmas, children's last memory of the old cards resting at the bottom of a waste bin as they eagerly slot in the new shiny red or green present Santa left them. Probably not going to put a huge dent in Quadro sales either, which could have been closer in reach.

The one ray of sunshine seems to be for people who may want to keep the second card for video encoding, but if it catches on I'd expect AMD and Nvidia to answer it by moving similar encoding down to the midrange (with just slightly better specs, and for just a few dollars more).

    1. ecofeco Silver badge

      Re: Optimism abounds

      ...crippling performance hit of that shared memory architecture.

      That was indeed Intel's fatal flaw. But they will never get another chance with me.

  8. Lorribot

Waiting for the inevitable

Current Arc is a first-gen, get-it-out-the-door model. In that light it's a good starter for ten; Intel will iterate and get better quickly.

Nvidia seem to have missed a few pointers. They are competing with consoles that cost about $400 all in; why would you spend $900 on just a graphics card and then all the supporting hardware? The only people who can afford these things are the overpaid, or YouTubers who get them for free.

PC shipments are starting to tank and there is no crypto mining to support card shipments; Nvidia also have a massive surplus of the last model left to shift. Nvidia's answer is to raise prices even further on the new model to help support the prices of the old model, but each sale of a 3000 is someone not buying the newer model. They are also cheating/lying on model names so they can again overcharge.

This started with the 2000 models, which were around 20% overpriced, continued with the 3000s, which were supported by excess demand, and is now just running out of control.

Along with the excessive power requirements (add a new $300 PSU) and the loss of EVGA highlighting their poor partner practices, it feels like Nvidia's house of cards is teetering towards collapse in a perfect storm. It just needs AMD to come in low and hard for a knockout blow.

    Maybe ARM will buy Nvidia.......

    1. Boothy

Re: Waiting for the inevitable

Quote: "Along with the excessive power requirements (add a new $300 PSU)"

      The increased power was just rumours. If you're doing a like for like swap, the power requirements are basically the same.

      e.g. The 3090 Ti was a 450W card, the new top end 4090 is also a 450W card. (Granted a Ti variant in the future might pull more).

      1. Fading

Re: Waiting for the inevitable

        Not sure of the numbers that fit your demographic - the intersection of gamers that had a 3090Ti (and have had it long enough to want to upgrade) and are now looking to purchase a 4090 isn't going to be anywhere near the majority. 450W is excessive (my 1080Ti was considered excessive at 250W) and whilst I do have a PSU that can run a 4090, I don't have the desire to spend £1700 on just a GPU. Given the difficulties getting GPUs at reasonable prices until recently I suspect there are more people with 10 series cards looking to upgrade than 3090Ti owners looking to upgrade.

  9. Anonymous Coward
    Anonymous Coward

Had Intel targeted 3070 performance, I'd have been intrigued, but a 3060 just isn't enough to drive my monitor at full resolution. I'm looking more to AMD for my next video card; I flat-out refuse to submit to Nvidia's gouging for games.

  10. ecofeco Silver badge

    Intel can rot in hell

    I will never willingly buy another Intel GPU of any sort.

    Right now I have the best of both worlds. Both an Nvidia and Radeon in one PC. While not top spec, they are both way above average and both do their specifically assigned tasks flawlessly.

    If I really need more juice, I have controller software that lets me run them both at the same time.

    Screw Intel and their finicky, erratic crap. Power means nothing if it does not play nice with others. Which is the fatal flaw of Intel's GPU culture.

  11. GraXXoR

    inflation seems to be a self fulfilling prophecy these days...

    Companies see prices around them rising so feel they can arbitrarily raise prices and blame it on the government / the free market / stupid consumers / uncle bob.

Even though inflation is supposedly 3.x percent where I live, it's common to see prices rising by anywhere up to 75%, or the same product at the same price but with the box almost half empty (8 Weetabix instead of 12 in an identical box).

Kind of starting to get annoying.

    1. nintendoeats Silver badge

      We dry cleaned 2 coats and a pair of pants the other day. It cost half my daily pay (after taxes). I wasn't pleased.

  12. Zippy´s Sausage Factory
    Joke

    "I want a graphics card"

    "Great, an Nvidia?"

    "No, cheaper than that."

    "AMD?"

    "It doesn't actually have to be any good."

    "The Intels are over there, next to the rack of Garbage Pail Kids cards"

    1. Anonymous Coward
      Anonymous Coward

It's bad enough being number 2, as Intel is in CPUs, but it's just a distant turd in the GPU world.

  13. WilliamBurke
    Unhappy

    HPC

    The problem in high performance computing (using GPUs for scientific simulation) is that everybody is using CUDA, which locks them into NVIDIA, giving the company a monopoly in this market. It's nice that OpenACC exists, but porting to a new platform is a major effort, and the scientists just aren't doing it.

  14. JoeCool Silver badge

    Is the author positing an inverse relationship between performance and price ?

    "Even if the 12GB 4080 manages twice the performance of the 3070, at nearly twice the price, that doesn't make it better value."

    In general, I would think that doubling performance would cost at least double.

    1. nintendoeats Silver badge

      Re: Is the author positing an inverse relationship between performance and price ?

      Moore's law would like a word with you.

      1. anonymous boring coward Silver badge

        Re: Is the author positing an inverse relationship between performance and price ?

        Moore's law isn't an actual law, and I'm also pretty sure that we aren't following that curve any longer.

        The "law of diminishing returns" would be more appropriate in this case.

  15. Snowy Silver badge
    Facepalm

Something cheap enough, with enough power and stable drivers

Out of the three, none of them manage all three.

    Nvidia

Too expensive: the choice is overpriced last generation, or a very, very expensive next generation which we still cannot buy, and they have only announced the high-end cards.

    AMD

Likely to be a bit cheaper than Nvidia, but not by much, and they, like Nvidia, are just going to release the high end first; it will be well into next year before we see anything affordable.

    Intel

For what they are offering, the price is too high, and the drivers make the worst of AMD's (AMD have improved a lot) look like the gold standard of coding.

"The new GPU world order is beginning to take shape" and it is expensive. These cards were designed to mine on, and that ship sailed when mining died, when Proof of Stake replaced Proof of Work.

  16. Anonymous Coward
    Anonymous Coward

    Today

    I went outside. The resolution was incredible. And it was realistic 3D without making me feel sick. I played "drive a car to the shop that sells food and beer"

    1. Anonymous Coward
      Anonymous Coward

      Re: Today

      That "Tactile Feedback" system is amazing on that rig, isn't it? :)

  17. anonymous boring coward Silver badge

Seeing nothing exciting on the horizon (the 40 series), I upgraded to a second-hand RTX 3070.

Expensive, but my 1660 was just too slow for VR. And not as expensive as a store-bought 3070 or the silly 40 series.

  18. anonymous boring coward Silver badge

    "Even if the 12GB 4080 manages twice the performance of the 3070, at nearly twice the price, that doesn't make it better value."

I don't think it's anywhere near that fast. Interpolated extra frames don't really count in my book when you're comparing the raw power of cards.
