Autumn's GTC shows who Nvidia really cares about

This week’s GPU Technology Conference saw Nvidia do something we haven’t seen much of from the chip designer lately: refresh a consumer product. For the increasingly enterprise-obsessed tech giant, GTC has become less and less about GPUs for gamers and more and more about capitalizing on new and emerging markets, such as AI, …

  1. The Dogs Meevonks Silver badge

    The 12GB 4080 is a 4070, it's even using the AD104 die like previous xx70 cards have... they're just trying to mislead consumers whilst they're upping the price some 40% over the last gen.

    The only way to win is not to play.

    1. Francis Boyle Silver badge

      I play

      quite happily using my Radeon. Got tired of Nvidia's shenanigans* years ago.

      *like effectively killing hardware physics.

      1. The Dogs Meevonks Silver badge

        Re: I play

        I've been around long enough to have been a 3DFX customer: my first Voodoo II, and then a Voodoo II SLI setup, back in the latter half of the 90s. I had the top-end Voodoo 3 and then a Voodoo 5 (6500?).

        I remember when they went out of business and nvidia bought them out... promising to provide support and drivers for 3DFX owners... and then once they had that SLI IP... went nah... fuck you lot.

        I've never owned an nvidia GPU since and likely never will. My whole process of buying my PC stuff is one of value for money vs performance, and nVidia have always fallen way down the list in that regard.

        Trying to sell a 4070 for 200 more than the last gen 3080 whilst pretending it's a 4080 is a bullshit move.

        I paid £635 for a 6900XT a couple of months back after I sold my 5700XT to a mate for £150 (about £100 below avg used prices). You couldn't even get a 3070 for that price in the UK.

        nVidia admitted to withholding stock to try and keep the prices of 30xx series cards high as the market became flooded with used mining cards. I've seen used 3090 cards going for 500 now.

        This only further cements my position that nvidia are a scummy company... who are deluding themselves that the artificially high prices of the previous 18 months, during the shortages, crypto boom and scalping, are here to stay. I hope they suffer because of it.

        Fingers crossed that RDNA3 can help AMD do to nvidia what they've been doing to intel... and that they stick as close to their previous pricing as before. A 7900XT for the price of a 16GB 4080 would certainly send a message, even though it would still raise the previous gen's price by 200... even better would be to stick with the same 999 price tag.

        1. Michael Habel

          Re: I play

          Gets the noggin jogging whether nVidia might create YACC, one that can only be hashed best on their reassuringly expensive cards. Which would only start to make sense if you chose to see the card as a commercial item, sold to recoup its cost inside six months. Problem is, Ethereum has gone 100% ASIC, the Bitcoin mine has gone dry, and the other shite-tier coins just aren't worth the bother.

          Of course it also keeps me up at night wondering if they could, at least in theory, even get away with it, assuming they could manage to stay three steps behind it without anyone finding out.

          But, if nVidia think for a second anyone sane is going to blow 2k€ on a Graphics Card, they must be high AF!

          More's the genuine pity that intel's leadership sucks balls: first by getting perma-stuck at 14nm, then by not being able to develop a working ARC chip that's competitive against these two knuckleheads. 'Cause AMD are definitely not any better. $850 RX580s in late winter 2020, anyone?
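The "recoup its cost inside six months" reasoning above is just a break-even calculation. A minimal sketch of it, using entirely made-up figures (the card price, daily coin revenue, and electricity cost below are illustrative assumptions, not real mining numbers):

```python
def break_even_days(card_price, daily_revenue, daily_power_cost):
    """Days of mining until income covers the card, ignoring coin-price swings."""
    daily_profit = daily_revenue - daily_power_cost
    if daily_profit <= 0:
        return None  # the card never pays for itself
    return card_price / daily_profit

# Hypothetical example: a 2,000 card earning 15/day, burning 4/day in electricity.
days = break_even_days(2000, 15.0, 4.0)
print(round(days))  # roughly 182 days, i.e. about six months
```

If daily revenue drops below the power cost (as happened once Ethereum left GPUs behind), the break-even point never arrives at all, which is the commenter's point.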

  2. devin3782

    DLSS still just smacks of: we didn't develop the GPU for consumers, so we've saddled you with tensor cores, and so you don't think we're wasting silicon and creating a power virus, we've decided AI upscaling is worthwhile. Meanwhile AMD does upscaling without the need for AI, users would be hard pushed to tell which is which, and it works on every GPU.

    Sure it means that if you have an nvidia you can have a play with doing machine learning but I suspect that's a vanishingly small number of people.

    I guess my GTX1070 should be upgraded, but I'm not bothering. It was the first nvidia card I'd bought since the GeForce 256 all those years ago, and it's always been bothersome with DisplayPort. I've exclusively bought Radeons since then, and I think when I do finally upgrade I'll be buying a Radeon again.

  3. Soreuser

    The market leaders, first Intel and now Nvidia, are leaving a wide opening in their existing customer base for AMD or a fresh competitor to capture. They don't appear overly concerned with serving their current customers better.

    1. Michael Habel

      >Implying that some Cheeto dust covered Booty-Caller gamer is (or was) their customer. Perhaps more like EVGA's customer. But they saw the light, and bailed. There is only one way you could ever justify nVidia's insane pricing, and that is procuring their card(s) to do production work cranking out the odd bit/eth/Dog eCoins, where you stood a chance to make your money back, plus a nice bonus on top.

      Alas, Bitcoin got mined out and Ethereum closed the door on GPUs... for the moment. Given how big that market was, and still is to some, I suspect either nVidia will develop proper ASICs for them at the current prices, or some new wonder coin will crop up again. Either way this is just the eye of the hurricane.
