Nvidia releases $1,999, 8K-capable GeForce RTX 3090 Ti GPU

Nvidia is finally releasing its latest monster of a graphics processor, the GeForce RTX 3090 Ti GPU, which it said will super-power content creation applications and enable 8K gaming, assuming you're willing to part with a couple thousand dollars. Launched Tuesday, the new GPU is available in graphics cards made by Nvidia and …

  1. Victor Ludorum

    Don't all rush at once...

    At the time of writing, Scan have got the ASUS one in stock for only £1,943.99.

    1. Cederic Silver badge

      Re: Don't all rush at once...

      Convert $1999 into pounds, add 3.7% import duty, apply 20% VAT and that's only about 44 quid out.

      That's less than the UK markup on a 13" MacBook Pro (although the other thing I checked was a Surface Laptop Studio, and there the UK pre-tax price works out cheaper than the US one).
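      The conversion above can be sketched quickly; the dollar-to-pound exchange rate here is an assumption for illustration, while the duty and VAT rates are the ones quoted:

      ```python
      # Rough check of the US-to-UK price conversion.
      # The exchange rate is assumed; duty and VAT are as quoted above.
      usd_price = 1999.00
      usd_to_gbp = 0.764        # assumed exchange rate at the time
      import_duty = 0.037       # 3.7% import duty
      vat = 0.20                # 20% VAT

      expected_gbp = usd_price * usd_to_gbp * (1 + import_duty) * (1 + vat)
      uk_price = 1943.99        # the Scan/ASUS listing mentioned above
      print(f"expected £{expected_gbp:.2f}, difference £{uk_price - expected_gbp:.2f}")
      ```

      With that assumed rate, the expected UK price lands within about £44 of the listing, as described.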

    2. Anonymous Coward
      Anonymous Coward

      Re: Don't all rush at once...

      Might as well just call it what it is, the next big card in mining.

      That's the crowd with the pockets deep enough to easily afford these.

      1. Anonymous Coward
        Anonymous Coward

        Re: Don't all rush at once...

        There's a small but significant crowd of people like me using them for content creation (3D rendering, non-game Unreal Engine visuals, live shows, etc). We don't need Quadros, because they aren't really faster and cost a lot more, but we can justify this price for the extra VRAM, which is actually useful. But yeah, it's a niche, admittedly.

  2. Nick Ryan Silver badge

    From the reviews I've seen, it's only for the most gullible of users, not the most ambitious... A huge jump in price over an already overpriced card, for quite minor improvements.

    1. Zenubi

      Anyone using a RED camera system (8K) is going to be opening their wallets. This costs less than many camera add-ons (like preview screens).

      2 Grand is more or less nothing in 8K video land.

      (Yes I know that few can watch 8k - not the point - that massive frame lets editors do wonderful things before it all gets downscaled to 4k)

      1. Sgt_Oddball

        You forgot...

        It's not really 2 grand... since you'll need to fork out at least another £250-300 for a 1,000 W+ PSU. The minimum spec is 850 W, but these have already been clocked drawing 475-ish watts, so a top-shelf PSU is pretty much a must to run it for any significant length of time.

      2. Michael Habel

        >Implying that RED would let you use a cheaper, non-proprietary means of rendering when they have their own shiny shiny to sell you.

        1. Anonymous Coward
          Anonymous Coward

          I wouldn't be at all surprised if they have some proprietary software/codecs that "require" a quadro card, i.e. a mug-tax to make their hardware look less exorbitant in price.

    2. Anonymous Coward
      Anonymous Coward

      And for those of us who need high-powered cuda cores for our jobs, I suppose we're all gullible too. How wide is that brush of yours?

      1. Yet Another Anonymous coward Silver badge

        Except those of us using CUDA get to shell out even more for the A5000/A6000 versions, to have the latest compute capability and fans that work 24x7.

        1. Anonymous Coward
          Anonymous Coward

          People working in graphics (e.g. rendering using Redshift or Octane, or doing Unreal Engine stuff) don't need the latest CUDA compute cores or ECC RAM. The 3090 is the sweet spot for this kind of thing.

      2. Michael Habel

        Miners needing "cores" is perhaps more of an underlying problem than a legitimate complaint. But you make that back in, what, fifteen minutes? So what are you complaining about?

        1. Anonymous Coward
          Anonymous Coward

          I make Lego movies using BrickLink Studio. It has the Cycles engine, which renders using CUDA cores. I don't really earn that much, and I don't really appreciate your mean commentary, but thanks for contributing.

    3. MikeLivingstone

      Who can actually see the 8k benefit ?

      This really has to be peak GPU.

      I have a 32-inch 4K monitor with loads of real estate on it, and I can't actually see the pixelation. This is why Nvidia is diverting to DPUs; this is the last generation of non-commodity graphics cards. We also know using GPUs for AI is too difficult for most; this feels like a last stab at graphics before the DPU and Omniverse dream.

      1. doublelayer Silver badge

        Re: Who can actually see the 8k benefit ?

        I doubt that it will be. While there probably are people who have 8K screens for watching stuff, most of it will be people who record 8K video so they have lots of room to edit. The final product will be 4K, but edits will be less obvious. Editing and converting will still require a bunch of graphics processing. Similarly, I don't think it will be "the last generation of non commodity graphics cards" because game designers and players constantly find new ways to need even more graphics processing. They have 144 Hz screens and so, even if that rate isn't necessary (and I wouldn't know), they have a target to aim for that can stress a GPU.

        "We also know using GPUs for AI is too difficult for most": Not really. It depends what you're doing, but the people building big models tend to want GPUs to do it with. There's a good market in GPU-intensive servers from cloud providers, and I doubt they're being used to play games.

      2. anonymous boring coward Silver badge

        Re: Who can actually see the 8k benefit ?

        8K capability is mostly useful for multi-monitor and VR usage.

        1. Michael Habel

          Re: Who can actually see the 8k benefit ?

          i.e. idiots with deep pockets, then. Or people who actually believe the steaming pile of bovine excrement about how 10 of these running in some rig will make them a million a week in Dogecoin.

      3. A Non e-mouse Silver badge

        Re: Who can actually see the 8k benefit ?

        What about ray-trace rendering? Doesn't that need a huge leap in graphics card performance?

    4. DiViDeD

      Re: quite minor improvements.

      3-4% performance improvement, at the cost of pretty substantial power and cooling requirement hikes, sounds like someone in marketing said "But what if we hook it up directly to the mains and turn everything up to 11? Will it catch fire, or can we just pump more water around it?"

  3. Pascal Monett Silver badge

    I guess it's no use

    to ask if it can run Crysis ?

    Okay, okay, stop pushing . . .

    1. Jellied Eel Silver badge

      Re: I guess it's no use

      Run it with quad SLI, and it'll probably get close to 30fps.

      1. David 132 Silver badge

        Re: I guess it's no use

        I’d love to get one of these, but first I need to find an EISA-to-PCIe interposer so I can use it with my motherboard.

  4. Loyal Commenter Silver badge

    The main question is... how much slower will it be than the RTX 4080 when it arrives in six months' time, at half the price...

    1. Sgt_Oddball

      Re: The main question is...

      "half the price"

      I love your optimism...

      1. Loyal Commenter Silver badge

        Re: The main question is...

        The touted MSRP for the 4080 has reportedly been as low as £700, although getting cards at MSRP is likely to be next to impossible. You'll probably end up paying something like £1,250 for a 4080 when it lands, if you are desperate to buy one in the first six months. You can get 3080s now for £850, which is about £500 less than two months ago.

        So I think £1,000 for the 4080, rather than £2,000 for the 3090 Ti, isn't too far off the mark. You might even get one for less if you can find a "Founders Edition" on launch day, although I wouldn't hold your breath.

  5. Boris the Cockroach Silver badge

    2 minutes on sale

    and the scalpers will have bought the lot.

    Then sold them to the miners.

    And thus the masses are stuck with their Voodoo 1 cards (I guess tech may have moved on a bit... still hazy here)

    1. Timbo

      Re: 2 minutes on sale

      "Then sold them to the miners."

      I thought Nvidia (and AMD) had introduced new firmware to limit the amount of mining these GPUs can do...

      I assume this latest GPU will have such firmware installed from the start?

      1. Gob Smacked

        Re: 2 minutes on sale

        Nvidia won't cripple their new top-notch card for greenwashing purposes. But it's basically not that good for mining, as it comes with little performance gain over the regular 3090, a hefty additional price hike, and much greater PSU power needs to make it work. Mining is a strict cost/benefit game; this card won't do.

        Only users who don't have any issues with money and need the performance boost are ready to step in.

        1. Loyal Commenter Silver badge

          Re: 2 minutes on sale

          If you can get your hands on an old RTX 2060 Super, that's still better for mining from a hashes-per-watt perspective. I know, because I've got one in my gaming PC alongside a newer RTX 3060 Ti, which, because it is crippled, puts out a slightly lower hash rate even with the "LHR unlock" techniques in the current generation of mining software, at about 1.5 times the power consumption.

          It's great for gaming though, and the older card is still in there, because it's still profitable to leave it running 24/7 mining ether.

          There seems to be this misconception that you can be a gamer or a miner, but not both. If you've bought the hardware for gaming, I can see no reason not to use it to mine when you're not gaming, because the value of the cryptocurrency mined is (currently) greater than the cost of the energy used. A GPU is far from the main energy consumer in a typical household, so those bewailing the end of the world due to cryptocurrency mining are possibly over-egging the pudding a little.

          Bitcoin mining, however, well that's a different order of magnitude of power consumption. One of those rigs costs £10k and puts out more heat than a fan heater. Come to think of it, it's probably still a more cost-effective way of heating your home than buying a Dyson fan heater though.
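          The cost/benefit arithmetic behind all this boils down to daily hash revenue minus daily electricity cost; a minimal sketch, with every figure purely hypothetical:

          ```python
          # Daily mining profit = hash revenue minus electricity cost.
          # All figures are hypothetical, for illustration only.
          def daily_profit(hash_rate_mh, gbp_per_mh_day, power_w, gbp_per_kwh):
              revenue = hash_rate_mh * gbp_per_mh_day
              energy_cost = (power_w / 1000) * 24 * gbp_per_kwh
              return revenue - energy_cost

          # An efficient older card can out-earn a power-hungry flagship:
          older    = daily_profit(40, 0.05, 160, 0.28)
          flagship = daily_profit(55, 0.05, 450, 0.28)
          print(f"£{older:.2f}/day vs £{flagship:.2f}/day")
          ```

          With these made-up numbers the lower-wattage card stays profitable while the flagship runs at a loss, which is why hashes per watt, not raw hash rate, decides the matter.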

      2. Michael Habel

        Re: 2 minutes on sale

        How does that adage go again...

        "If a man can make it, a man can break it!"

    2. An_Old_Dog Silver badge

      Voodoo video card

      No problem -- they're still making new maps for Quake 1!

  6. Anonymous Coward
    Anonymous Coward

    I can finally run Crysis on my 2K monitor *sheds tears of joy*

    1. Michael Habel

      2K (as in Y2K), with 1024x768 max res....

  7. sanmigueelbeer Silver badge

    8K p*rn -- 'tis coming.

  8. anonymous boring coward Silver badge

    That's like buying a couple of iPhones.

  9. Michael Habel

    I can't wait for AMD/ATI to release some next-gen Radeon card for €1,700 that kicks the mother-loving snot out of this card... ya know, like the way it's always been. I suspect that card will drop any time now. But 2k on a videogame graphics card?


    .... Yeah, that's gonna happen. Bless the miners, for they can keep this one to themselves.

  10. Ashto5

    Love IT

    We do need these brainiacs to keep pushing the boundaries

    So if people (a limited number) can afford them, I say crack on.

    I will pick up the benefits later when we get back to normal costs.

  11. Binraider Silver badge

    When I have a house large enough to host a monitor capable of taking advantage of such a card, only then does it become relevant.

    So, as a millennial (barely) that’s not happening as long as I remain in the UK.

  12. Shepard

    I guess...

    That's another paper launch?

    Even if it is available, there is very little advantage in buying this over the plain 3090, and even some downsides, especially if you consider that next-gen cards will probably launch in November.

    Better to save your money until then, and upgrade to a 1,000 W+ ATX 3.0-compatible PSU while you wait. Oh, and make some room in that PC case of yours; I hear the new shiny will be 3.5 PCIe slots in height.

  13. andy 103

    The 24GB of GDDR6X memory is a big deal

    The 24GB of GDDR6X memory is a big deal.... because....

    I don't think any explanation was needed!

  14. tullio

    Many distributed volunteer computing projects, like Folding@home and Einstein@home, use GPUs. I am using two Nvidia GTX boards, but I cannot afford costlier GPU boards like the RTX 3090. Still, many volunteers spend a lot of money to upgrade their PCs, whether Windows, Linux or Mac.
