Nvidia will unveil next-gen GPU architecture in September

Graphics giant Nvidia plans to unveil the architecture for its next-generation consumer GPU, Lovelace, at its GTC conference in September, CEO Jensen Huang has said. On an earnings call for the quarter ended July 31, the company reported a revenue decline and forecast a further drop due to weakening demand in the …

  1. Cederic Silver badge

    cost

    Nvidia gaming GPUs have been insanely expensive for a couple of years, and now a lot of their primary gaming markets are suffering significant inflation that reduces discretionary spend.

    It doesn't surprise me that they're seeing a drop in demand for their gaming graphics cards. People will be thinking 'that PC can last another year' right now.

    1. jollyboyspecial

      Re: cost

      Of course, high prices and low availability of GPUs have been blamed on crypto mining. You could argue that GPU manufacturers have been only too happy with this situation. The problem for them is twofold: firstly, demand for GPUs for gaming has fallen as a result, and secondly, demand for GPUs for mining also seems to be falling. As such, pandering to the mining market could well have killed the gaming goose that laid the golden egg in the first place.

      But you could also argue that high end GPUs are already better than most gamers actually need and that could also account for a fall in demand.

      1. DS999 Silver badge

        Re: cost

        A lot of those GPUs gamers couldn't buy because miners were snapping them up at inflated prices are now coming back onto the market at fire sale prices as miners have thrown in the towel, so Nvidia will be competing with its own stuff.

        1. Anonymous Coward
          Anonymous Coward

          Re: cost

          I'm not so sure: a good second-hand RTX 2080 is still listed for a lot more than I paid for mine brand new.

          According to my Amazon history, I paid £369 for my RTX 2080, which was an upgrade from a GTX 970, which was in turn an upgrade from a GTX 760...and so on. As you can see, I went from a relatively low-end card in the 700 series to a relatively high-end card in the 20 series, because that was the only way I could see a tangible jump in performance. Going from a 760 to a 960 wasn't much of an upgrade, and going from a 970 to a 1070 wasn't much of an upgrade either. I went from the 970 to the 2080 because I wanted to bump up to 1440p (which is the resolution sweet spot as far as I'm concerned; 4K is just weird for work and desktop use).

          I think the RTX 2080 has caused people like me to switch to a two-generation GPU upgrade cycle rather than upgrading every generation, because the actual benefit of upgrading has got a lot thinner in recent years. Going from a 2080 to a 3080 wouldn't really change anything for me apart from the power consumption! I'd still be at 1440p, I'd still get framerates under 165 Hz in more demanding titles, and so on.

          The jump from a 970 to a 2080 wasn't as massive as some people might think. The 970 was and still is a solid 1080p card...my oldest son still uses the 970, and everything he wants to play runs absolutely fine at around 60fps. What I did to squeeze a bit more life out of the card was put him on a 1050p (1680x1050) monitor...so from a casual glance it looks 1080p, but it's a big enough drop in resolution that it lets the 970 stretch its legs a bit more. The visual difference between 1680x1050 and 1920x1080 is marginal at worst and entirely unnoticeable at best.

          If we go way back to the early GPU days, upgrading from a Voodoo 2 to a Voodoo 3 was, compared with today's upgrades, a pretty massive jump.

          You also have to factor in that the difference between a top end AMD card vs a top end NVIDIA card doesn't actually matter anymore for the vast majority of people.

          The majority of people are also still gaming at 1080p because we haven't yet seen the leap in the mid range cards that allows for consistent performance across the board at higher resolutions.

          Even top end cards don't really get past 60fps at higher resolutions in most AAA titles. The 3090ti paired with a solid CPU can only barely top 100fps at 4K...in GTA5 (an ancient game at this point) if you turn MSAA off.

          If you run a card that is two generations old at 1080p, you're going to get around the same framerate...so what's the point in upgrading if you aren't going to go 4K, if 1080p is good enough, and if you're going to see bottlenecks with higher-end GPUs on mid-tier systems?

          My advice to anyone out there considering buying a new GPU for a bit of a performance bump is to forego the GPU upgrade entirely and move from a 1080p display to a 1050p display. You'll get a similar performance jump to buying a new GPU, for a fraction of the price. In day-to-day use you won't notice the difference...if anything you might find it more comfortable, because 1050p is a 16:10 aspect ratio whereas 1080p is 16:9, so you end up with what feels like more vertical desktop space.

          We're living in strange times right now, and as weird as it sounds...downgrading your monitor just a touch will get you a better performance uplift than upgrading your GPU.

          Even better, most 1680x1050 monitors are 75 Hz...so as an added bonus, you also get a higher refresh rate. Going from 60 Hz to 75 Hz is a significant and noticeable change in games that support changing the refresh rate...it certainly makes a difference in games like CS:GO, PUBG etc.
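
          If you want to sanity-check the pixel-count argument, the back-of-the-envelope sum is below. It's only a rough proxy for performance (nothing here is an Nvidia figure, just the two resolutions already mentioned), but it shows why the smaller panel takes the load off the GPU:

          ```python
          # Pixels per frame at each resolution; fewer pixels means less shading
          # work per frame at the same settings (a rough proxy, since real gains
          # vary by game and by where the bottleneck sits).
          full_hd = 1920 * 1080       # 2,073,600 pixels
          wsxga_plus = 1680 * 1050    # 1,764,000 pixels

          saving = 1 - wsxga_plus / full_hd
          print(f"1680x1050 renders {saving:.0%} fewer pixels per frame than 1920x1080")
          ```

          That works out at roughly 15 per cent fewer pixels to shade every frame, which in GPU-bound games tends to show up as a comparable framerate bump, before you even count the 60 Hz to 75 Hz refresh gain.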

          https://www.ebay.co.uk/itm/154898720131?hash=item2410aee583:g:CdcAAOSwe2FhlTSZ&amdata=enc%3AAQAHAAAAoNUCA6tTr2RFsSwYh8hsoYp%2Fz6%2FojdK6lKIht9bw%2F31N38ecpvI4nlpx1pGq1hw72gRWVWFXihp5LHnHuOR3vk%2BjKy4YN3hxyeSG%2FKis1d%2Bw0oywJWzVT7gXWJWrg7d4TqKX4HbDq4wbix898q8LZegnTodRXMJ850C%2FFpUuez0ZDIflCuZhMf%2BbCBfo0riXdeewK0552ed6%2BAf2KdnZDj0%3D%7Ctkp%3ABk9SR-L9wr_cYA

          There's one as an example.

          So for £49 you can improve your framerate significantly with barely any visual loss in detail (in fact you might be able to push texture detail higher because of the VRAM you save on the lower resolution), and you get a higher refresh rate. That's the dream, right?

          Bottom line:

          If you're smart, and you want to improve performance of your existing rig...take a look at your monitor first before you buy a GPU. You'll save tons of money and probably get a better uplift in performance.

  2. Il'Geller

    Meanwhile, Nvidia has not made a penny on the giant language model it created, and will not do so. The model cannot be used practically in any way and is completely stupid, since it lacks the basis of AI: the individuality (the bias, its BIOS) of a human being. AI cannot exist without individuality! So Nvidia uses AI purely for advertising, to show it is at the peak of progress. Although Nvidia is not there.

    1. Il'Geller

      I don't think The Register's censors will permit me to publish this post, but I'll try:

      Nvidia has no understanding that its enormous AI model is a combination of many contexts and even more subtexts (explanations within the texts it mentions, dictionary definitions of words, other types of annotations), which makes it impossible to discover and provide the required information. Impossible because, when searching, a jump happens from one context and subtext, from one source, from one author to another, which creates a catastrophic cacophony; and that cacophony produces the effect of absolute nonsense. Have you seen the texts that OpenAI displays? I bet Nvidia's are even worse.

      Therefore, Nvidia uses its idiotic AI model for advertising purposes only, trying to sell its processors: Nvidia doesn't know anything about AI.

      1. tojb
        Headmaster

        Dude... they fund actual university fellowships in the subject... https://www.nvidia.com/en-us/research/graduate-fellowships/

        If you are paying for someone to do a research project at UCSD or similar, then you do that because you are committed to staying at the cutting edge of the field and getting an inside track on new developments. Sheesh.

      2. Anonymous Coward
        Anonymous Coward

        5 days later, your post is still published!

        1. Il'Geller

          Yes, I'm here. The existence of AI signifies a qualitative leap in Philosophy, when it becomes a science: the necessity to follow certain steps, some solid rules, punctually. For example, that incredibly huge AI model (which Nvidia is so proud of) is a dead end. Nvidia and the others cannot use part of a mathematical formula; one needs to use it in its entirety. Nvidia has already felt this... it has lost money and time. As have the others.

  3. MrDamage

    There's some pun to be had

    With the name of the chipset, and the fact that most punters would choke at the price.

    1. Mayday
      Trollface

      Re: There's some pun to be had

      Deepthroat or deepfakes?

  4. Phil Dalbeck

    Good!

    The greedy b******ds have been wilfully screwing the core PC gamer market for the last several years by pandering to crypto miners and allowing cards to be funnelled directly to them en masse, while also allowing scalping of what does make it to the retail channel by both resellers and opportunistic hustlers.

    I hear they are currently stuck with loads of existing 3xxx series GPU inventory that crypto bros haven't bought, and are also locked into a massive non-cancellable wafer and fab-time reservation with TSMC for all the new 4xxx series chips, which miners now don't want and gamers simply won't pay for given the glut of serviceable 3xxx stock now flowing from both the warehouses and miners trying to liquidate their farms.

    "Hell mend them all" as my old mum used to say.

    1. DevOpsTimothyC

      Re: Good!

      I wonder just how many people are also looking at the cost of electricity and the specs of the new cards requiring even bigger power supplies and thinking the GPU isn't worth the TCO change

      1. Marcelo Rodrigues
        Meh

        Re: Good!

        "I wonder just how many people are also looking at the cost of electricity and the specs of the new cards requiring even bigger power supplies and thinking the GPU isn't worth the TCO change"

        I did this. I was waiting for Black Friday, but my old 1050 Ti died on me. I got a 3060 12GB, as a 3060 Ti would have forced me to upgrade the PSU - and I couldn't justify that cost.
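
        For anyone doing the same TCO sums, the electricity side is easy enough to estimate. A minimal sketch is below; the wattages, hours and tariff are all assumptions for illustration, not quoted specs, so plug in your own card's draw and your own unit price:

        ```python
        # Rough yearly electricity cost of a GPU while gaming.
        # Every number below is an illustrative assumption, not a quoted spec.
        def annual_cost_gbp(avg_watts: float, hours_per_day: float, price_per_kwh: float) -> float:
            """Cost per year for a given average draw and daily gaming time."""
            kwh_per_year = avg_watts / 1000 * hours_per_day * 365
            return kwh_per_year * price_per_kwh

        PRICE_PER_KWH = 0.34   # assumed unit price in GBP
        HOURS_PER_DAY = 3      # assumed gaming time

        for card, watts in [("current card (~200 W)", 200), ("next-gen card (~400 W)", 400)]:
            print(f"{card}: ~£{annual_cost_gbp(watts, HOURS_PER_DAY, PRICE_PER_KWH):.0f}/year")
        ```

        On those assumptions the difference is about £75 a year, before you add the cost of the bigger PSU the new card would need.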

    2. anonymous boring coward Silver badge

      Re: Good!

      Nvidia primarily make GPUs and chipsets. The actual cards are chiefly made by others, so Nvidia can't control the prices. On top of that retailers crank up the prices.

    3. pimppetgaeghsr

      Re: Good!

      Should be easy for you to buy some puts and make some money over the next few quarters, then.

  5. Anonymous Coward
    Anonymous Coward

    Waiting for panic sales when the RTX 4000 series is released

    It would seem that my only hope of buying an RTX 3000 series card (I'd really like a 3060 Ti for some 1440p gaming goodness) is to wait for all the local dealers to start panic selling their stocks of 3000 series cards when the RTX 4000 series comes out.

    On my desktop is an image of a small spreadsheet I created some time ago, which takes the MSRP for each of the 3000 series cards, converts the $ cost to local currency and adds local VAT. To date, I have never seen a 3000 series card available from any dealer for less than a 20% premium over what they should be charging. I simply won't pay it, so my old 2070 card will have to do for a while longer.
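
    If anyone wants to replicate that spreadsheet, the sums are trivial. Here's a minimal sketch; the exchange rate, VAT rate and example prices are placeholders rather than real quotes, so substitute your own:

    ```python
    # Sketch of the MSRP-vs-street-price check described above.
    # All numbers are placeholders; use the real MSRP, the day's exchange
    # rate, your local VAT rate and the dealer's asking price.
    USD_TO_LOCAL = 10.5    # assumed exchange rate
    VAT_RATE = 0.25        # assumed local VAT rate

    def fair_local_price(msrp_usd: float) -> float:
        """MSRP converted to local currency with VAT added on."""
        return msrp_usd * USD_TO_LOCAL * (1 + VAT_RATE)

    def dealer_premium(street_price: float, msrp_usd: float) -> float:
        """How far above the VAT-inclusive converted MSRP the dealer is charging."""
        return street_price / fair_local_price(msrp_usd) - 1

    # Example: a card with a $399 MSRP listed locally at 6,500
    print(f"Fair price: {fair_local_price(399):,.0f}")
    print(f"Dealer premium: {dealer_premium(6_500, 399):.0%}")
    ```

    In that example the dealer is charging about a 24 per cent premium, which is in line with the 20-per-cent-plus markups described above.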

    Certainly in northern Europe, I'd put the poor sales of RTX 3000 series cards purely down to profiteering by the dealers.

    1. Anonymous Coward
      Anonymous Coward

      Re: Waiting for panic sales when the RTX 4000 series is released

      "But... but... but... Supply chain problems!!! It has NOTHING to do with greed." *snivelling*

    2. Snowy Silver badge
      Coat

      Re: Waiting for panic sales when the RTX 4000 series is released

      Not sure the 3060 Ti is a big enough upgrade over the 2070 for me to consider buying it, unless the price is well below the MSRP.

      It may even be worth skipping the 3000 series altogether and looking at a 4060 sometime in the second half of 2023.

  6. Anonymous Coward
    Anonymous Coward

    Most of their 30 series cards that will handle a 1440p or higher monitor are in the $1,000 range (CAD).

    Just how much do they EXPECT people to fork over to play games?
