Nvidia admits mistake, 'unlaunches' 12GB RTX 4080

Just weeks after unveiling its 40-series cards powered by the all-new Ada Lovelace architecture, Nvidia said Friday it was pulling the plug on the RTX 4080 12GB before the card had even hit store shelves. “The RTX 4080 12GB is a fantastic graphics card, but it’s not named right. Having two GPUs with the 4080 designation is …

  1. Jou (Mxyzptlk) Silver badge

    They are slow...

    Not the card, the manufacturer. All the IT news outlets were asking "WTF, why two different RTX 4080s?" for more than a month. Their radar must be broken.

  2. Ace2 Silver badge

    Terribly misleading model names

    I bought my kiddo a GT1030 after checking the benchmarks online. (It was the height of the market panic; we just needed something that worked.) Turns out I bought the newer version, which uses cheaper memory and gets half the framerate of the older one. Bastards.

    1. Sandtitz Silver badge

      Re: Terribly misleading model names

      Unfortunately NVidia has long sold the same models with DDR (slow) and GDDR (fast) memory configurations. The difference in gameplay can be night and day, as you wrote.

      One really needs to dig into the specifications and read the small print when buying GPUs.

      Yes, they're bastards.

  3. Anonymous Coward
    Facepalm

    I don't believe it

    Nvidia managed to be even more obscure than Intel when it comes to chip names.

    Although the verb 'unlaunch' has definite potential. If only Suckerberg could unlaunch Meta.

    1. Evil Auditor Silver badge

      Re: I don't believe it

      Unlaunch Meta? I find its forthcoming crash landing more satisfying... (wishful thinking)

      1. a pressbutton

        Re: I don't believe it

        A meta unlaunch button

        It is all getting a bit self-referential.

    2. pip25
      Meh

      Re: I don't believe it

      You can't unlaunch something that has never really launched in the first place... and it's doubtful whether it ever will.

  4. seven of five

    confuses the customer...

    Soooo nice of NV to rename their card so us little customers don't get confused.

    Greedy, lying bastards. Thought they'd get away with it, did they? Well, they can fuck right off, especially at these prices.

  5. MikeLivingstone

    NVIDIA is slowly dying. They are having yield issues and are putting out rubbish products to plug the gap.

    Don't buy anything from NVIDIA; there are loads of better alternatives.

    1. Anonymous Coward
      Anonymous Coward

      So true, I'm not sure myself what chipset to buy. Will it be a Tseng Labs, an S3, a Trident, or maybe one of those newfangled 3dfx?

      1. Piro Silver badge

        ATi, or Intel even!

      2. NightFox

        Voodoo FTW!

    2. Lennart Sorensen

      So, alternatives like Intel, which can't make anything with decent performance, or ATI/AMD, which has never figured out how to write stable drivers (a shame, since their hardware is good)?

  6. Oh Homer
    Devil

    Unlaunch My Card

    I love that song.

    1. Auntie Dix
      Devil

      Unlunch My Tray / Re: Unlaunch My Card

      I unlunched what I had eaten at Chipotle, shortly thereafter. I did not know that they made bio-weapons.

  7. Henry Wertz 1 Gold badge

    Good.

    Good. As with the 4080 launching with two different RAM amounts, all sorts of cards have shipped with a certain GPU but differing amounts of RAM. The practice of shipping the same designation for different-spec GPUs (other than desktop versus mobile, maybe) is IMHO deceptive, and I'm glad they're stepping away from it.

  8. The Dogs Meevonks Silver badge

    Called it a 4070... I may have been wrong

    I've been calling it a 4070 ever since the announcement... but there's actually an argument that it's a 4060 really.

    When you look at things like the CUDA core count... the 12GB 4080 had about 47% of the 4090's cores, and so did the 3060 Ti compared to the 3090. The 3070 actually had more like 56% of the 3090's CUDA cores.
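
    The arithmetic is easy to check. Here's a minimal sketch in Python, assuming the commonly published CUDA core counts (the figures come from public spec sheets, not from the article, so treat them as my assumptions):

        # Relative CUDA core counts vs. each generation's flagship.
        # Core counts are the commonly published figures; treat them as
        # assumptions rather than anything confirmed by the article.
        cores = {
            "RTX 4090": 16384,
            "RTX 4080 16GB": 9728,
            "RTX 4080 12GB": 7680,
            "RTX 3090": 10496,
            "RTX 3070": 5888,
            "RTX 3060 Ti": 4864,
        }

        pairs = [
            ("RTX 4080 12GB", "RTX 4090"),
            ("RTX 4080 16GB", "RTX 4090"),
            ("RTX 3070", "RTX 3090"),
            ("RTX 3060 Ti", "RTX 3090"),
        ]

        for card, flagship in pairs:
            pct = 100 * cores[card] / cores[flagship]
            print(f"{card}: {pct:.0f}% of the {flagship}'s cores")

        # Prints roughly: 47%, 59%, 56%, 46%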

    It'll be back shortly... as a 4070. They'll probably knock $100 off the MSRP, which is suicide for an xx70-class card at $799... and it leaves such a massive gap between the xx70 and xx80 classes that I can't see how they'll maintain the price of the higher cards now.

    Massive blunder on their part... such greed and short-sightedness has led them down this path... they still think there's a shortage and a crypto boom, or they're banking on another one coming along shortly.

    Time for AMD to shine and not try to jack up their prices... if they can get RDNA3 within 5% of the 40 series on average, at a similar price to their last gen, they can do to nvidia what they've been doing to intel.

    As for intel... they're targeting that lower to mid-range market... sure, they're not competing with nvidia at all except on price, and if you're only interested in playing older games it's not worth it. But if you want to make use of AV1 encoding/decoding, then right now an ARC GPU, even as a secondary card to complement your existing one, could be the right call and save you a lot of money.
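
    If you do go that route, here's a rough sketch of what the encode side looks like, assuming an ffmpeg build with Intel Quick Sync (QSV) support and its av1_qsv encoder (available since roughly ffmpeg 5.1); the filenames are placeholders:

        # Rough sketch: hardware AV1 encode on an Intel Arc card through
        # ffmpeg's Quick Sync (QSV) path, driven from Python. Assumes an
        # ffmpeg build with av1_qsv support; filenames are placeholders.
        import subprocess

        subprocess.run(
            [
                "ffmpeg",
                "-i", "input.mp4",        # source clip (placeholder)
                "-c:v", "av1_qsv",        # AV1 encode on the Arc GPU
                "-global_quality", "25",  # QSV quality target, lower = better
                "-c:a", "copy",           # pass the audio through untouched
                "output.mkv",
            ],
            check=True,
        )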

    1. Scunner

      Re: Called it a 4070... I may have been wrong

      You make some good points. From what I'm reading, when you take into account the relative core count vs. the flagship and compare it with what was released last gen, the 12GB 4080 was really more akin to what a 4060 Ti should have been. What might be even more awkward for nvidia is that the same type of analysis suggests the 16GB 4080 is rather gimped compared to the 4090, and on relative performance terms that card should probably have been marketed as the 4070. Right now it's left them imposing a massive price hike on the 4th-gen xx80 for a card that performs relatively worse vs. its xx90; it's really not a good look.

      Intel has priced their flagship ARC card very low compared to its capabilities, and it's truly an impressive debut for a company that's effectively new to dedicated GPUs. The decision to use emulation for older DX titles does kneecap the card a lot, but it would be good for the market if ARC succeeds. My worry is that without a crypto boom jacking up prices, Intel may lose interest in developing the line any further.

      AMD is definitely the one to watch, but at this point I'm kind of expecting to be disappointed. They've been winning in the price/performance stakes for years, and what nvidia have pulled this time around just gives them more breathing space to stay in that zone. I'd love to see them come in like a wrecking ball with their next gen prices but I just don't see it happening.

      Comparing GPUs with the price of a Steam Deck or any of the current-gen consoles makes it clear that the crypto boom caused a complete failure in the GPU market, and prices are going to have to fall a long way before high-end gaming on the PC gets anywhere near mainstream again.

      I suspect that falling PC sales have little or nothing to do with what's going on in the gamer market, and the corporate market is still what defines overall PC sales trends. My bet would be that a lot of firms bit the bullet on hardware rollouts and upgrades for remote-working staff during the Covid lockdowns, and those assets aren't ready to be refreshed yet; the relatively poor numbers this year represent this distortion of the refresh cycle rather than a real downturn.

  9. PhilipN Silver badge

    Nerds rule OK

    Nonplussed that none of those commenting on the name noticed the possible confusion on the part of red-blooded males with another Lovelace, beginning Lin…

  10. bpfh
    Flame

    Core blimey…

    Was looking for a pun on their technology vs. naming conventions ("live and let die" came to mind, about the dies on the chip), but honestly: I went from building and selling PCs 15 years ago, and being a walking encyclopaedia on hardware specs, to being utterly lost now that all CPUs have carried the same four-number code for the last 10 years, with nothing in the naming that even suggests whether you are getting a 4-year-old dog, a 2-year-old bargain, or something released this year that costs a bomb and may or may not be a knackered thoroughbred…

    Seems you can still rely on onboard intel graphics being lame, though, whatever their marketing team says to convince you otherwise…

  11. cornetman Silver badge

    > One might argue the 12GB 4080 should have been a 4070 in the first place.

    Many people have. It was a particularly bizarre move by NVidia TBH. I don't know what Jensen was smoking that day.
