They are slow...
Not the card, the manufacturer. Every IT news outlet was on the "WTF, why two different RTX 4080s?" story for more than a month. Their radar must be broken.
Just weeks after unveiling its 40-series cards powered by the all-new Ada Lovelace architecture, Nvidia said Friday it was pulling the plug on the RTX 4080 12GB before the card had even hit store shelves. “The RTX 4080 12GB is a fantastic graphics card, but it’s not named right. Having two GPUs with the 4080 designation is …
I bought my kiddo a GT 1030 after checking the benchmarks online. (It was the height of the market panic; we just needed something that worked.) Turns out I bought the newer version, which uses cheaper memory and gets half the framerate of the older one. Bastards.
Unfortunately, Nvidia has long sold the same models with DDR (slow) and GDDR (fast) memory configurations. The difference in gameplay can be night and day, as you said.
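For the GT 1030 case specifically, the gap shows up in raw memory bandwidth alone. A quick back-of-envelope in Python, using spec-sheet figures (treat the data rates as assumptions; I'm quoting them from memory):

```python
# Rough memory-bandwidth comparison of the two GT 1030 variants.
# Bandwidth ~= effective data rate (GT/s) * bus width (bytes).
# Data rates are spec-sheet figures, quoted from memory.
BUS_WIDTH_BYTES = 64 // 8  # both variants use a 64-bit memory bus

gddr5_bw = 6.0 * BUS_WIDTH_BYTES  # ~6 GT/s effective -> ~48 GB/s
ddr4_bw = 2.1 * BUS_WIDTH_BYTES   # ~2.1 GT/s effective -> ~16.8 GB/s

print(f"GDDR5 variant: {gddr5_bw:.1f} GB/s")
print(f"DDR4 variant:  {ddr4_bw:.1f} GB/s "
      f"({ddr4_bw / gddr5_bw:.0%} of the GDDR5 card)")
```

Roughly a third of the bandwidth, on a card that has little else going for it - no wonder the framerate halves.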
One really needs to dig into the specifications and read the small print when buying GPUs.
Yes, they're bastards.
Good. It's not just the RTX 4080 shipping with two different RAM amounts at launch; all sorts of cards have shipped with a certain GPU but differing amounts of RAM. The practice of shipping the same designation for different-spec GPUs (other than desktop versus mobile, maybe) is IMHO deceptive, and I'm glad they're stepping away from it.
I've been calling it a 4070 ever since the announcement... but there's actually an argument that it's a 4060 really.
When you look at things like the CUDA core count... the 12GB 4080 had about 47% of the cores of the 4090... much the same share the 3060 Ti had of the 3090 (about 46%). The 3070 had more like 56% of the 3090's CUDA cores.
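For anyone who wants to check my maths, here's the back-of-envelope as a quick Python snippet; the CUDA core counts are from the public spec sheets, so take them as assumptions rather than measured performance:

```python
# Relative CUDA core counts vs. each generation's flagship.
# Counts are from Nvidia's public spec sheets.
CORES = {
    "RTX 4090": 16384,
    "RTX 4080 12GB": 7680,
    "RTX 3090": 10496,
    "RTX 3070": 5888,
    "RTX 3060 Ti": 4864,
}

def share(card: str, flagship: str) -> str:
    """Return `card`'s CUDA core count as a share of `flagship`'s."""
    return f"{card}: {CORES[card] / CORES[flagship]:.0%} of {flagship}"

print(share("RTX 4080 12GB", "RTX 4090"))  # ~47%
print(share("RTX 3060 Ti", "RTX 3090"))    # ~46%
print(share("RTX 3070", "RTX 3090"))       # ~56%
```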
It'll be back shortly... as a 4070. They'll probably knock $100 off the MSRP, which is suicide for an xx70-class card at $799... and it leaves such a massive gap between the xx70 & xx80 class... that I can't see how they'll maintain the price of the higher cards now.
Massive blunder on their part... such greed and short-sightedness have led them down this path... they still think there's a shortage and a crypto boom... or they're banking on another one coming along shortly.
Time for AMD to shine and not try to jack up their prices... if RDNA 3 can come within 5% of the 40 series on average... at a similar price to their last gen... they can do to Nvidia what they've been doing to Intel.
As for Intel... they're targeting the lower-to-mid-range market... sure, they're not competing with Nvidia at all except on price... and if you're interested in playing older games... it's not worth it. But if you want to make use of AV1 encoding/decoding... right now, an Arc GPU, even as a secondary card to complement your existing one... could be the right call to make and save you a lot of money.
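By way of example, here's roughly how you'd drive an Arc card's AV1 encoder from Python via ffmpeg - a sketch, assuming an ffmpeg build with Intel QSV/oneVPL support, with placeholder filenames:

```python
import subprocess

# Re-encode a clip to AV1 on an Intel Arc GPU via ffmpeg's av1_qsv encoder.
# Assumes ffmpeg was built with Intel QSV/oneVPL support; "input.mp4" and
# "output.mkv" are placeholder filenames.
subprocess.run(
    [
        "ffmpeg",
        "-i", "input.mp4",   # decode on the CPU or primary GPU
        "-c:v", "av1_qsv",   # hardware AV1 encode on the Arc card
        "-b:v", "6M",        # target bitrate; tune to taste
        "-c:a", "copy",      # pass the audio through untouched
        "output.mkv",
    ],
    check=True,
)
```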
You make some good points. From what I'm reading, when you take into account the relative core count vs. the flagship and compare with what was released last gen, the 12GB 4080 was really more akin to what a 4060 Ti should have been. What might be even more awkward for Nvidia is that the same type of analysis suggests the 4080 16GB is rather gimped compared to the 4090, and in relative performance terms that card should probably have been marketed as the 4070. Right now it's left them imposing a massive price hike on the 4th-gen xx80 for a card that performs relatively worse vs. its xx90 - it's really not a good look.
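Running the same sanity check on the 16GB card makes the point starkly (again, spec-sheet core counts, so treat them as assumptions):

```python
# How cut-down is the xx80 vs. its flagship, this gen vs. last gen?
# CUDA core counts from Nvidia's public spec sheets.
CORES = {
    "RTX 4090": 16384, "RTX 4080 16GB": 9728,
    "RTX 3090": 10496, "RTX 3080": 8704,
}
print(f"4080 16GB vs 4090: {CORES['RTX 4080 16GB'] / CORES['RTX 4090']:.0%}")  # ~59%
print(f"3080 vs 3090:      {CORES['RTX 3080'] / CORES['RTX 3090']:.0%}")       # ~83%
```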
Intel has priced their flagship Arc card very low compared to its capabilities, and it's truly an impressive debut for a company that's effectively new to dedicated GPUs. The decision to use emulation for older DX titles does kneecap the card a lot, but it would be good for the market if Arc succeeds. My worry is that, without a crypto boom jacking up prices, Intel may lose interest in developing the line any further.
AMD is definitely the one to watch, but at this point I'm kind of expecting to be disappointed. They've been winning in the price/performance stakes for years, and what nvidia have pulled this time around just gives them more breathing space to stay in that zone. I'd love to see them come in like a wrecking ball with their next gen prices but I just don't see it happening.
Comparing GPU prices with a Steam Deck or any of the current-gen consoles makes it clear that the crypto boom has caused a complete failure in the GPU market, and prices are going to have to fall a long way before high-end gaming on the PC gets anywhere near mainstream again.
I suspect that falling PC sales have little or nothing to do with what's going on in the gamer market, and the corporate market is still what defines overall PC sales trends. My bet would be that a lot of firms bit the bullet on hardware rollouts and upgrades for remote-working staff during the Covid lockdowns, and those assets aren't ready to be refreshed yet; the relatively poor numbers this year represent this distortion of the refresh cycle rather than a real downturn.
I was looking for a pun about their technology vs. naming conventions - "live and let die" came to mind, about the dies on the chip - but honestly, I went from building and selling PCs 15 years ago and being a walking encyclopaedia on hardware specs to being utterly lost. Over the last 10 years all CPUs seem to have the same four-digit code, and there is nothing in the naming that even suggests whether you're getting a four-year-old dog, a two-year-old bargain, or something released this year that costs a bomb and may or may not be a knackered thoroughbred…
Seems you can still rely on onboard Intel graphics being lame, though, whatever their marketing team says to convince you of the opposite…