Don't all rush at once...
At the time of writing, Scan have got the ASUS one in stock for only £1,943.99.
Nvidia is finally releasing its latest monster of a graphics processor, the GeForce RTX 3090 Ti GPU, which it said will super-power content creation applications and enable 8K gaming, assuming you're willing to part with a couple thousand dollars. Launched Tuesday, the new GPU is available in graphics cards made by Nvidia and …
Convert $1999 into pounds, add 3.7% import duty, apply 20% VAT and that's only about 44 quid out.
That's less than the UK markup on a 13" MacBook Pro (although the other thing I checked was a Surface Laptop Studio, and there the UK pre-tax price actually works out cheaper than the US one).
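The US-to-UK price maths above can be sketched in a few lines. The exchange rate is an assumption (roughly $1.31 to the pound around the time of the launch); the duty and VAT percentages are the ones quoted:

```python
# US -> UK price estimate: convert, add import duty, then VAT.
USD_PRICE = 1999.0
USD_PER_GBP = 1.31   # assumed spot rate at the time; this moves daily
IMPORT_DUTY = 0.037  # 3.7% import duty
VAT = 0.20           # 20% UK VAT on top of the duty-inclusive price

gbp = USD_PRICE / USD_PER_GBP
landed = gbp * (1 + IMPORT_DUTY)
retail = landed * (1 + VAT)

uk_sticker = 1943.99  # Scan's ASUS price
print(f"expected UK price: £{retail:,.2f}")
print(f"gap vs the £{uk_sticker:,.2f} sticker: £{uk_sticker - retail:,.2f}")
```

With that assumed rate the gap comes out in the mid-£40s, which is where the "about 44 quid out" figure lands; a penny or two either way on the exchange rate swallows the difference.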
There's a small but significant crowd of people like me using them for content creation (3D rendering, non-game Unreal Engine visuals, live shows, etc.). We don't need Quadros, because they aren't really faster and cost a lot more, but we can justify this price for the extra VRAM, which is genuinely useful. But yeah, it's a niche, admittedly.
Anyone using a RED camera system (8K) is going to be opening their wallets. This costs less than many camera add-ons (like preview screens).
Two grand is more or less nothing in 8K video land.
(Yes, I know that few can watch 8K - not the point - that massive frame lets editors do wonderful things before it all gets downscaled to 4K.)
This really has to be peak GPU.
I have a 32-inch 4K monitor, loads of real estate on it, and I can't actually see the pixelation. This is why NVIDIA is diverting to DPUs: this is the last generation of non-commodity graphics cards. We also know that using GPUs for AI is too difficult for most; this feels like a last stab at graphics before the DPU and Omniverse dream.
I doubt that it will be. While there probably are people who have 8K screens for watching stuff, most of it will be people who record 8K video so they have lots of room to edit. The final product will be 4K, but edits will be less obvious. Editing and converting will still require a bunch of graphics processing. Similarly, I don't think it will be "the last generation of non commodity graphics cards" because game designers and players constantly find new ways to need even more graphics processing. They have 144 Hz screens and so, even if that rate isn't necessary (and I wouldn't know), they have a target to aim for that can stress a GPU.
"We also know that using GPUs for AI is too difficult for most": not really. It depends what you're doing, but the people building big models tend to want GPUs to do it with. There's a good market in GPU-intensive servers from cloud providers, and I doubt they're being used to play games.
3-4% improvements in performance at the cost of pretty substantial power and cooling requirement hikes just sounds like someone in marketing said "But what if we hook it up direct to the mains and turn everything up to 11? Will it catch fire, or can we just pump more water around it?"
The touted MSRP for the 4080 has reportedly been as low as £700, although getting cards at the MSRP is likely to be next to impossible. You'll probably end up paying something like £1,250 for a 4080 when it lands, if you're desperate to buy one in the first six months. You can get 3080s now for £850, which is about £500 less than two months ago.
So I think £1,000 for the 4080, rather than £2,000 for the 3090 Ti, isn't too far off the mark. You might even get it for less if you can find a "founders edition" one on launch day, although I wouldn't hold your breath.
Nvidia won't cripple their new top-notch card for greenwashing purposes. But it's basically not that good for mining: it offers little performance gain over the regular 3090, carries a hefty price hike, and needs considerably more PSU power to make it work. Mining is a strict cost/benefit game, and this card won't do.
Only users with no money worries who need the performance boost will be ready to step in.
If you can get your hands on an old RTX 2060 Super, that's still better for mining from a hashes-per-watt perspective. I know, because I've got one in my gaming PC alongside a newer RTX 3060 Ti which, because it is crippled, puts out a slightly lower hash rate even with the "LHR unlock" techniques in the current generation of mining software, but at about 1.5 times the power consumption.
It's great for gaming though, and the older card is still in there, because it's still profitable to leave it running 24/7 mining ether.
There seems to be this misconception that you can be a gamer, or a miner, but not both. If you've bought the hardware for gaming, I can see no reason not to use it to mine when you're not gaming, because the value of the cryptocurrency is (currently) greater than the energy cost from running it. The energy consumption is far from being the main energy use in a typical household, so those bewailing the end of the world due to cryptocurrency mining are possibly over-egging the pudding a little.
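The "mine when you're not gaming" cost/benefit argument above is just revenue per hash minus electricity. A minimal sketch, where every number (hash rates, power draw, earnings per MH/s, electricity tariff) is an illustrative assumption rather than a measurement - all of these move constantly:

```python
# Back-of-envelope daily mining profit: revenue minus electricity.
def daily_profit_gbp(hash_mh, watts, gbp_per_mh_day, gbp_per_kwh):
    """Profit in GBP from 24 hours of mining at a steady hash rate."""
    revenue = hash_mh * gbp_per_mh_day           # what the hashes earn
    electricity = (watts / 1000.0) * 24 * gbp_per_kwh  # kWh cost for the day
    return revenue - electricity

# Hypothetical figures: an older card at 40 MH/s drawing 120 W, versus
# a crippled (LHR) card at a similar hash rate but ~1.5x the power draw.
old_card = daily_profit_gbp(40, 120, 0.04, 0.28)
lhr_card = daily_profit_gbp(38, 180, 0.04, 0.28)
print(f"old card: £{old_card:.2f}/day")
print(f"LHR card: £{lhr_card:.2f}/day")
```

Under these made-up numbers both cards still come out ahead of the electricity bill, which is the whole of the argument for leaving the hardware running; the moment coin price or tariff shifts the sign, it stops making sense.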
Bitcoin mining, however, well that's a different order of magnitude of power consumption. One of those rigs costs £10k and puts out more heat than a fan heater. Come to think of it, it's probably still a more cost-effective way of heating your home than buying a Dyson fan heater though.
I can't wait for AMD/ATi to release some next-gen Radeon card for €1,700 that kicks the mother-loving snot out of this card... Ya know, like the way it's always been. I suspect that card will drop any time now. But 2k on a video-game graphics card?
Hummmmm....
.... Yeah, that's gonna happen. Bless the miners, for they can keep this one to themselves.
That's another paper launch?
Even if it is available, there is very little advantage in buying this over a plain 3090, and even some downsides, especially considering that next-gen cards will probably launch in November.
Better to save your money until then and upgrade to a 1,000+ W ATX 3.0-compatible PSU while you wait. Oh, and make some room in that PC case of yours - I hear the new shiny will be 3.5 PCIe slots tall.