Nvidia just can't grab a break. Revenues up, profit nearly doubles... and stock down 20%

Nvidia has turned in growth in revenue and profit, but has been punished for missing its guidance in the third quarter of its fiscal 2019, all amid a continuing sharp drop in demand from crypto-currency miners. Its stock fell as much as 20 per cent after it reported on Thursday: Revenue of $3.18bn, up 21 per cent year-on- …

  1. whitepines
    Facepalm

    Couldn't have anything to do with their driver-based DRM, lock-in, and active hostility toward open source (including purposefully locking out nouveau at the hardware level), could it?

    Nah, let's just blame that crypto mining stuff. While we add more stuff to the driver EULA. And AMD opens more of their driver stack, with Linux as a first-class citizen for compute....

    1. Mark 85

      If being overstocked and blaming it on cryptocurrency "demand" is the reason, that's a pretty poor excuse, since they should have seen that one coming. They used to be the go-to video cards for gamers after buying up a couple of their competitors who were just as good but lower in price.

    2. Anonymous Coward
      Anonymous Coward

      Oh please

      It isn't like they haven't had the same policies for years. You think NOW people suddenly care as much as you about DRM and open source? Sorry, but most customers are running Windows and don't have any issues there.

      Anyone refusing to buy Nvidia stuff over these issues in the last quarter was refusing to buy it five years ago - you can't lose customers who you've never had.

      1. whitepines

        Re: Oh please

        See any Windows supercomputers in the upper reaches of the Top 500? Didn't think so. Now where does NVIDIA make a ton of money on larger cards again?

        The wind is starting to blow a different direction. People I've talked to in that space are just about fed up with NVIDIA and their tactics, yet are stuck because CUDA (not Windows). If NVIDIA pushes more and more restrictions, and AMD can somehow catch up in perf/watt, there will be a shift, period.

        1. Anonymous Coward
          Facepalm

          Re: Oh please

          And do you believe those working on supercomputers care much about DRM and not having open source drivers?

          Of course you're free not to buy nVidia if it doesn't fit your ideology - just like nVidia could easily not care about the few people who buy hardware following an ideology...

          1. Lee D Silver badge

            Re: Oh please

            You mean those people running mechanical, engineering and physics simulations who can't afford to have errors creep in from the implementation of something that might be there "because it makes this game run faster"?

            Yeah, I think they care immensely. Especially when they're doing months or years of calculations and need a stable base to run it on, a predictable and consistent interface to do so, to squeeze every inch out of their hardware, and to do so without unrelated gaming/media functions or driver bugs rearing their head.

            1. whitepines

              Re: Oh please

              Someone who finally gets it! Errors, or just plain incorrect behaviour, in the proprietary stack don't go over well when you end up idling a multi-hundred-million pound supercomputer (or throwing away the results from it) because of some bug you have to a) allow NVIDIA to reproduce and b) wait for NVIDIA to come out with a fix. Even if it's something simple that you could have tracked down and fixed yourself (since you probably consider getting your machine up and running again a higher priority than NVIDIA does).

              And don't get me started on debugging obscure CUDA/OpenCL faults through the black box of that whole stack....

              1. BigSLitleP

                Re: Oh please

                If you're plugging a gaming graphics card into an engineering rig, you deserve all the problems you get and shouldn't be allowed near an engineering rig.

            2. Chz

              Re: Oh please

              "Yeah, I think they care immensely. Especially when they're doing months or years of calculations and need a stable base to run it on, a predictable and consistent interface to do so, to squeeze every inch out of their hardware, and to do so without unrelated gaming/media functions or driver bugs rearing their head."

              Then they shouldn't use gaming cards and gaming drivers, maybe? Teslas and Quadros exist for a reason. If you're too cheap for them, then them's the breaks. Oh the horror, people charge more for enterprise-class hardware and software, never would have thought it.

              1. whitepines

                Re: Oh please

                What part of "we don't get open drivers for those either" didn't you understand?

                Tell me, when my OpenCL application on a Linux cluster does this:

                #0 0x00007f5df188c201 in ?? () from /usr/lib/libcuda.so.1
                #1 0x00007f5df1893d0f in ?? () from /usr/lib/libcuda.so.1
                #2 0x00007f5df189412b in ?? () from /usr/lib/libcuda.so.1
                #3 0x00007f5df18944b1 in ?? () from /usr/lib/libcuda.so.1
                #4 0x00007f5df18b8ebd in ?? () from /usr/lib/libcuda.so.1
                #5 0x00007f5df188988a in ?? () from /usr/lib/libcuda.so.1

                and I'm not lucky enough to have direct access to an NVIDIA engineer dedicated to my site, how am I supposed to proceed?
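
                About the only thing the application side controls is checking return codes at the OpenCL API boundary. Below is a minimal sketch of that kind of host-side checking (my own illustration against the standard OpenCL C API - the check() helper is made up for the example, and it's no substitute for being able to see inside the library):

                /* Minimal sketch: host-side error checking at the OpenCL API
                 * boundary. Once a call succeeds and execution moves inside the
                 * closed vendor library, the stripped ??-style frames shown above
                 * are all a debugger can give you. */
                #include <stdio.h>
                #include <stdlib.h>
                #include <CL/cl.h>

                /* Hypothetical helper: report the failing call and its error code. */
                static void check(cl_int err, const char *what)
                {
                    if (err != CL_SUCCESS) {
                        fprintf(stderr, "%s failed with OpenCL error %d\n", what, err);
                        exit(EXIT_FAILURE);
                    }
                }

                int main(void)
                {
                    cl_platform_id platform;
                    cl_device_id device;
                    cl_uint count = 0;

                    check(clGetPlatformIDs(1, &platform, &count), "clGetPlatformIDs");
                    check(clGetDeviceIDs(platform, CL_DEVICE_TYPE_GPU, 1, &device, &count),
                          "clGetDeviceIDs");

                    /* Everything past this point runs through the vendor's closed
                     * stack; error codes at the boundary are the last diagnostic
                     * the application itself controls. */
                    printf("Platform and GPU device found; vendor stack loaded.\n");
                    return 0;
                }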

                1. Anonymous Coward
                  Anonymous Coward

                  "I'm not lucky enough to have direct access to an NVIDIA engineer"

                  Evidently your work is not valuable enough. And are you sure the bug is in libcuda and not elsewhere? What did the OpenCL people tell you?

                  1. Anonymous Coward
                    Anonymous Coward

                    Re: "I'm not lucky enough to have direct access to an NVIDIA engineer"

                    How's the weather in Santa Clara today at nVidia HQ?

              2. MonkeyCee

                Re: Oh please

                "Then they shouldn't use gaming cards and gaming drivers, maybe? Teslas and Quadros exist for a reason. If you're too cheap for them, then them's the breaks."

                If the gaming cards are no good for datacentre use, then why am I prohibited (unless I fork over the same fees as for having Teslas) from using my gaming cards in one? I'm allowed to mine crypto in that situation, but fold proteins or run a neural network on it? That'll be 10k+ pa thanks.

                It's a bad choice for nVidia, since they have alienated the academic market. It's the same reason I have academic versions of software that cost me nothing but would normally be 10-20k pa in fees: when I find a way to improve a current implementation or create a new one, they get the benefit from it. Plus any time I use it in anger, someone is paying the full-fat fee for that pleasure.

              3. whitepines
                Holmes

                Re: Oh please

                "Then they shouldn't use gaming cards and gaming drivers, maybe? Teslas and Quadros exist for a reason. If you're too cheap for them, then them's the breaks."

                So in your little world, who decides what research is worthy of having access to computer time? Historically that was academics; now you seem to be saying the fattest pocketbook (i.e. whoever can pay NVIDIA enough to get attention) dictates what can or cannot be simulated or investigated. That's bloody dangerous; in the past, well-funded ideas put forward by people with money, but not brains, have often turned out to be wrong, while grassroots academic research from smart, but poorer, people has tended to be right.

                Doesn't matter too much, I suppose, with AMD out there now. NVIDIA can bask at the top for a while, ignoring all the warning signs and milking this cash cow as long as they can, until they suddenly wonder what happened when the bottom falls out, just as it has now with mining.

                1. Anonymous Coward
                  Anonymous Coward

                  "who decides what research is worthy"

                  Yes, because academics are always poor lads with empty pockets and don't work for institutions that get millions in funding from governments and industry for their research... and you can still develop on a cheap card, and if your research is interesting, ask to run it on far bigger and more sophisticated systems.

                  And most of the time those researchers don't have the skills required to spelunk into a graphics driver, nor does it look to me like researchers are complaining they can't use nVidia cards because of their closed drivers.

                  Those using supercomputers for critical computations will check the processing pipeline for issues.

                  People like you are just interested in the "purity" of your gaming rigs... ah, those infidel proprietary drivers - get a life...

                2. Chz

                  Re: Oh please

                  "So in your little world, who decides what research is worthy of having access to computer time? Historically that was academics, now you seem to be saying the fattest pocketbook (i.e. whoever can pay NVIDIA enough to get attention) dictates what can or cannot be simulated or investigated."

                  Academia has always got discounts, and still does. What in the hell are you talking about? Universities have always shelled out a LOT of money for that kit, even with the discounts. Do you really think money doesn't talk in academia? That's crazy.

                  And yes, it has ALWAYS been the case that researchers with the right friends can call up IBM, Nvidia, you name it, and get this stuff at cost. Below cost, if it's headline-grabbing enough. It's purest fantasy to even suggest that academics actually get to decide research priorities when there are massive expenses involved.

            3. Anonymous Coward
              Anonymous Coward

              Re: Oh please

              Do you also believe that the people building supercomputers don't have nVidia engineers working with them - and probably even access to code under NDA? Those aren't your bedroom PC, believe me...

              Nor are the specific drivers nVidia builds for high-end processing the same as those for your little gaming rig. But it looks like your perspective on IT is very narrow, narrowed by those who brainwashed you into believing that a very strict definition of open source - politically driven by people like Stallman - is the only acceptable one.

              1. whitepines
                WTF?

                Re: Oh please

                You seem to be the only one bringing "religion" into this discussion. I've worked on large projects and worked directly with the technical folks that literally design the hardware and software of these supercomputers at very large corporations. Don't believe for a second that they get special treatment or access to source code; if anything that shows your naivete and blind trust in sales personnel...

                1. Anonymous Coward
                  Anonymous Coward

                  Re: Oh please

                  I haven't worked directly with nVidia (we do use their hardware and drivers, without issues), but I have worked with other companies supplying high-end (and very expensive) hardware - and we always had good support and direct access to engineers and code when needed; often we had specific contacts assigned to us. Even their sales personnel were helpful in getting issues resolved in the shortest time when they arose.

                  Sorry if you've been so gullible as to choose the wrong, unreliable suppliers because you trusted the wrong salesman...

                  I've brought in "religion" exactly because for some people "open source" becomes a religious obsession. There's a lot of code I've had access to - usually under NDA - which was never "open sourced" in the GPL sense.

                  Face it - a lot of companies will never "open source" their code. Live with it, and stop whining when they don't.

          2. katrinab Silver badge
            Facepalm

            Re: Oh please

            "And do you believe those working on supercomputers care much about DRM and not having open source drivers?"

            Yes I do, given that a supercomputer isn't an off-the-shelf product that you buy in PC World, and optimising these drivers for best performance is very important.

            1. Anonymous Coward
              Anonymous Coward

              "optimising these drivers for best performance is very important"

              And who do you believe will optimize those drivers? Maybe exactly the ones who know the hardware they run on best?

              1. Anonymous Coward
                Anonymous Coward

                Re: "optimising these drivers for best performance is very important"

                "And who do you believe will optimize those drivers? Maybe exactly the ones who know the hardware they run on best?"

                This is a key point. I'd like to see the Nvidia drivers open-sourced or, at least, enough of them that they could talk to an open OpenCL implementation (people seem to be missing that about the AMD drivers: AMDGPU-PRO is not open source). However, I know people who run a medium-sized cluster (i.e. not a supercomputer, but a lot of capacity) in a university computer science department, and I don't believe they have the capability to be debugging driver code. That's not to say there aren't a lot of clever people there, but the people employed to run the facility are not driver developers, and the academic staff and students are there to solve research problems, not spend months debugging hardware. There are few things less inspiring of confidence than running code that depends entirely on something a PhD student has written, because even if they're very talented, most don't care about other people's corner cases.

                Open source it and for someone somewhere it will be their topic of interest, and everyone benefits. But the idea that departments of that size will stay away from Nvidia because they want to be able to fix their own driver code is a fantasy. These people are still buying Nvidia.

                As for the "what are you doing using consumer products" line: we're talking silicon here. If it's consumer it might not be up there in terms of reliability or capacity, but a bug isn't going to respect consumer/professional boundaries when they share the same architecture - see https://www.techarp.com/guides/complete-meltdown-spectre-cpu-list/5/ and look for Xeon. In theory people using GPUs should understand what they can mean for precision (though the days of 16-bit-only are past), but then they should also in theory know about the differences between x87 and SSE calculations... In practice these things are increasingly bought to plug ML algorithms into, and the attitude to precision is that it'll just get taken care of in the algorithm.
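
                To make the precision point concrete, here's a tiny illustration of my own (plain CPU C, nothing GPU-specific): the same accumulation done in single and double precision drifts apart in exactly the way the "it'll get taken care of in the algorithm" attitude quietly assumes away.

                /* Sum 0.1 ten million times in float and in double.
                 * The exact answer is 1,000,000; the single-precision sum drifts
                 * visibly because each addition is rounded to roughly 7 significant
                 * digits, while the double-precision sum stays very close. */
                #include <stdio.h>

                int main(void)
                {
                    float  sum_f = 0.0f;
                    double sum_d = 0.0;

                    for (long i = 0; i < 10000000; i++) {
                        sum_f += 0.1f;
                        sum_d += 0.1;
                    }

                    printf("float  sum: %f\n", sum_f);  /* noticeably off */
                    printf("double sum: %f\n", sum_d);  /* close to 1000000 */
                    return 0;
                }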

            2. Anonymous Coward
              Anonymous Coward

              Re: Oh please

              Just because NVidia won't release the drivers as open source doesn't mean they won't make the source available under license/NDA to someone buying thousands of cards for a top 500 cluster.

      2. Starace
        Flame

        Re: Oh please

        Well their recent tactics in the cloud / virtualisation market certainly pissed me off.

        Deliberately stopping things from working because it suits their market segmentation is pretty annoying.

        And making me pay license fees to actually use the horribly expensive dedicated hardware I've bought is taking the piss.

        Kicked them right down my shopping list. Their kit is good but I can't tolerate their behaviour.

        1. MonkeyCee

          Re: Oh please

          "And making me pay license fees to actually use the horribly expensive dedicated hardware I've bought is taking the piss."

          Can't upvote this enough.

          Either it's my effing kit or not.

          1. Pascal Monett Silver badge

            Well, one must admit that, these days, the trend is: it's not.

    3. steve 124

      Or the fact that a flagship video card three years ago cost $300-$350 (within reach of a hardcore gamer, barely, but reachable)... and now the new RTX is $900-$1100 and their three-year-old cards are still $450-$550.

      I'm an engineer and make a very good living - no kids, no wife - and I won't be buying an RTX (and I am definitely an early adopter: I preordered my Oculus and bought my GTX 1080 for $600, because there wasn't a choice).

      AMD needs to step up their game and give these jerks some competition again. Nvidia's winning streak over the last few years has given them Hollywood Syndrome. They think everyone loves them because they are awesome and they don't realize we're only hanging out with them because AMD is over in the corner blowing snot bubbles and smelling their finger.

      Please let AMD's 7nm cards have something to compete with RTX; we need performance competition to drive the tech forward and the prices down. This reminds me of the 9800 Pro vs Nvidia back in the early 2000s, when ATI jacked those card prices up until no one would buy them.

      Anyways, Nvidia, cut your prices in half (which is still too expensive) and you'll sell three times as many cards... I don't know their overhead per card, but market share has its value too.

  2. FlamingDeath Silver badge

    human $values

    They're fucked up right?

  3. Anonymous Coward
    Anonymous Coward

    Hopefully the ending of Moore's law will mean AMD catch up and competition will deliver better value for money.

  4. Christopher Reeve's Horse

    Maybe it's just greed!

    Certainly for the domestic graphics card market, costs have inflated grossly and disproportionately against the performance increase. A new 2080ti costs £1,200 or more. This is utterly mental! The previous-generation 1080ti was an already excessive £650, and the new cards are nowhere near touching double the performance. It's just greedy pricing.

    And to add, the already very expensive 1080 series had been the performance leader for SOOOOO long that most of the high-end market was already running one, and few will justify the extortionate re-investment at this point for such a small gain.

    1. Spazturtle Silver badge

      Re: Maybe it's just greed!

      "A new 2080ti costs £1,200 pounds or more. "

      Don't forget that the 2080ti has a habit of killing itself after a few weeks or even setting itself on fire. Nvidia have just put out a statement admitting that they released faulty cards.

      Apparently the QA machines were faulty and reported that every card passed the tests, so essentially no QA has been done on the 2080ti cards. Your card may die a month after you buy it, or a month after the warranty is up - who can say? Not Nvidia, because no QA was done.

  5. rav

    The pain is far from over for nVidia. Demand for nVidia GPUs will drop this year.

    Intel’s 10nm failure is going to hit nVidia pretty hard.

    Intel has already announced they will be producing 25% LESS - essentially cutting CPU shipments to the PC-building sector by 2 million units in Q4 2018 - in order to cover server and laptop commitments on their already overworked and obsolete 14nm process. Obviously Intel fears AMD's EPYC market share gains in the server and HPC space so much that they treat the desktop space as a forlorn hope.

    According to Digitimes and a recent piece on Tom's Hardware, "Intel Cuts CPU Shipments by 2M, Motherboard Makers May Suffer – Report", this does not bode well for nVidia. If mobo makers are going to suffer, then nVidia is going to get hammered.

    If Intel is cutting shipments, then NVDA is losing sockets to put GPUs in.

    The nVidia attachment rate to Intel motherboards is virtually 100%. Intel is essentially reducing the sockets available to nVidia by 2 million units through the end of this year.

    The AMD Radeon attachment rate to Intel motherboards is relatively low, so the decrease in Intel silicon will have zero impact there. And of course the attachment rate of nVidia to AMD motherboards is extremely low.

    AMD will grab market share gains in desktops and laptops from both Intel and nVidia.

    Jensen Huang could have at least warned of a miss several weeks back. Unless they had NO CLUE.

  6. Crazy Operations Guy

    If Moore's Law is ending

    Then hopefully we can finally get developers to actually optimize and clean up their code, rather than just relying on the fact that next year's chips are going to be good enough to run their inefficient crap well enough that the user doesn't care.

    Video chips are literally millions of times more powerful than they were in the 1980s, but graphics performance has really only increased by a factor of a thousand since.
