Gambling on Crypto
Well there's your problem.
You can't make a silk purse out of a sow's ear, so found Nvidia's management team last night as they reported a pig of a quarter that saw revenues and profit crash. For Nvidia's Q4 ended 27 January, reported revenues slumped 24 per cent year-on-year to $2.21bn, operating expenses went up by a quarter to $913m and profit from …
I dabbled in bitcoin back in the early days, forgot all about it for years until I saw that the price had rocketed... Sold a few when it passed $5k, sold a few more when I realised it had passed $10k... and dumped the rest when it hit £15k... If I'd waited a little longer, I too would be mortgage free... Instead I still have £13k to pay off.
If I'd taken it more seriously 8yrs ago... I'd have had more than the 37.xx I did have... @ a little over $19k that 37 alone would have been over $700,000 (around £510,000)... Instead I ended up with a quarter of that... But it does mean that with a much lower mortgage for the last 5yrs, I can cut back on hrs and work part time, do some volunteering and now plan on retiring a lot earlier.
It also put a new roof on my house, replaced my car with a decent 2nd hand one and built a new gaming PC.
Some managed to cash out early enough to make a profit... some lost a whole lot... which is the danger of speculating on things... I didn't have the bottle to ride it all the way. But I'm very happy I did dabble all those years ago.
I don't know if all the downvotes are a reaction to the anecdotal nature of your post (which seems a bit unfair, since you were explicit about that), or sour grapes, or just general hostility toward cryptocurrencies. But for the record, while I'm not a fan of cryptocurrencies and never invested in them, I don't see any reason to be angry at someone who did, and timed the market well enough to make a profit.
As AC wrote downthread, it's speculation, and sometimes speculation pays off. Personally I'm a cautious investor - I'm too lazy to try to do well at any other strategy, and I avoid gambling because I'm afraid I might like it - but if someone else wants to take a chance with some of their disposable income, that's no skin off my nose.
Assuming they're not thereby supporting something I feel is immoral or some such, I suppose. But when it comes to that I expect my 401(k) portfolio probably includes Nestle and other corporations with rather vile behavior, and tu quoque aside I'm reluctant to cast a lot of stones.
Seems a bit early to say that. If they made strong profitable sales in a one-off opportunity market, then what was the gamble?
Only a gamble if:
(Returns on crypto-market sales) < (Losses on inventory/future sales + Dedicated but not fully depreciated production plant)
At the moment I can see no writedowns in the accounts, although there are slower sales to clear inventory. Having experienced the exorbitant price of replacing a failed graphics card a year ago, I suspect that Nvidia (and AMD) will have done handsomely out of the crypto-mining craze. All we're seeing now is the heat coming out of the GPU market as cryptominers move on to other get-rich-quick schemes.
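A toy numeric version of that break-even condition, with entirely made-up figures (none of these are Nvidia's actual numbers):

```python
# Hypothetical figures, purely illustrative -- not Nvidia's actual accounts.
crypto_sales_return = 1_500  # $m earned selling GPUs into the mining boom
inventory_writedown = 600    # $m of unsold stock written down after the bust
stranded_plant = 200         # $m of dedicated, not-yet-depreciated plant

# The boom-time sales were only a losing gamble if they fail to cover
# what the bust destroyed.
losing_gamble = crypto_sales_return < (inventory_writedown + stranded_plant)
print(losing_gamble)  # False with these numbers
```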
But landfilling means zero RoI for the product. Investors won't like that and would rather nVidia fire-sale the stock to at least get something back, and bargain hunters such as these probably won't be interested in the top-of-the-line stuff in any event, meaning there would be little if any overlap.
Just back-of-the-napkin math...
If an iPhone X costs Apple $370 to build, and they sell it for $1,000... and Apple has to buy everything in it (like the screen).
I'm going to guess the gross margin on a $1000 video card is A LOT larger for Nvidia -- especially considering (I believe) they fabricate their own GPUs. They've already invested in the factory and tooling, the marginal cost for more GPUs is tiny.
Most of the investment behind the ROI you're worried about isn't the $100 in silicon and labor to assemble each card... it's in the intellectual property: the design, the manufacturing know-how, the factory, and the tools to fill it.
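To put rough numbers on that comparison (the $370 and $100 are the illustrative costs above, not published figures):

```python
# Back-of-the-napkin gross margins -- all figures are illustrative guesses.
iphone_price, iphone_cost = 1000, 370        # Apple buys in most components
card_price, card_marginal_cost = 1000, 100   # silicon + labor per extra card

iphone_margin = (iphone_price - iphone_cost) / iphone_price
card_margin = (card_price - card_marginal_cost) / card_price

print(f"iPhone: {iphone_margin:.0%}, video card: {card_margin:.0%}")
# iPhone: 63%, video card: 90% -- once the fixed costs (design, tooling)
# are sunk, each extra card sold is almost pure gross margin.
```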
========
Used to work at a newspaper. If there was a problem with the folders or inserters downstream of the press, they'd rather run 5-10,000 copies of a 100,000-paper press run straight into the dumpster than stop the presses. It would have cost them that much in time to start back up again.
In any well-matured industrial process, your variable input costs per unit are pennies, if not fractions of a penny, on the dollar.
The crypto cards are not designed for video use. There is no physical video output. The official drivers for crypto cards are locked to prevent video output. Some tech writers/youtubers have managed to use them for gaming using unofficial and questionable methods in very specific hardware configurations, but for the most part the cards are currently useless for the average home user. They may be useful for non-crypto compute, but as far as I can tell no one has embraced GPU for production compute. Most data processing is IO bound, not compute.
As of right now nVidia would rather the card makers destroy the surplus inventory of crypto cards than provide unlocked drivers and guides to use them for video processing. They're trying to prevent the gaming channel from being flooded with the surplus crypto supply, pulling down gaming card prices further.
Used crypto cards are showing up on the secondhand market for cheap, but again without an official unlocked support path for video use from nVidia the cards are mostly useless.
They may be useful for non-crypto compute, but as far as I can tell no one has embraced GPU for production compute
CPU-GPU architectures are pretty common for high-performance scientific computing, and for running convolutional neural networks as part of "deep learning" systems. There are certainly non-crypto production applications.
Granted, there aren't a lot of CPU-GPU applications for general business use, which as you say is typically I/O-bound. But even for traditional business there are potential applications such as Dynamic Stochastic Vehicle Routing, which is of interest to many firms that deal in logistics.
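For a flavour of what non-crypto GPU compute looks like in practice, here's a minimal sketch using CuPy, a NumPy-workalike that runs on CUDA GPUs (assumes a CUDA-capable card and the cupy package installed; purely illustrative):

```python
# Offload a dense matrix multiply to the GPU with CuPy.
import cupy as cp

a = cp.random.rand(4096, 4096)     # arrays are allocated in GPU memory
b = cp.random.rand(4096, 4096)

c = a @ b                          # runs as a CUDA kernel on the card
cp.cuda.Stream.null.synchronize()  # wait for the kernel to finish

print(float(c.sum()))              # copy the scalar result back to the host
```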
They've been squeezing the market with overly expensive products.
Other than the 2080Ti, which is staggeringly fast, this generation of cards leans heavily on the RTX functionality, which is poorly supported, not particularly fast, and difficult to get working at an acceptable frame rate.
AMD have not been able to counter this, as their higher-end products use expensive HBM2 memory, and their new mid-range architecture is some time off release. NVidia have changed their architecture more than AMD has in the last decade, and it shows in their success.
Your entire last paragraph is wrong. First, do you think Nvidia's high-end cards do _NOT_ use HBM? Because they do. Second, Nvidia isn't "successful" because AMD is not; they're "winning" in the desktop market because they charge 50% more for anything of theirs, AND then charge others 50% more simply for using Nvidia APIs. It's just massive greed.
Combine the above obvious fact with AMD bagging Google for game streaming, and it's no wonder that this year's Nvidia spot at E3 can basically be seen as a thief begging you to forget all this (slowly of course, they still want to rob you in the meantime).
P.S. Since AMD holds the keys to HBM, I'm still wondering if AMD won't make an unethical play with that fact.
HBM is only used in the very high-end Nvidia cards - the nose-bleedingly expensive Tesla cards, and a few Quadros. Most of their cards use GDDR. At the price points where AMD use HBM (as low as a few hundred pounds), it's a major component of the cost, and this hinders their ability to compete.
AMD's architecture is not as advanced as Nvidia's. AMD cards run slower and hotter, require more power, and are noisier than the Nvidia alternative at a similar price point.
Nvidia are winning in the consumer space because they have a massive investment in driver quality, work closely with game developers, develop new features that work well, and have (relative to AMD) quieter cards.
AMD's strengths lie in embedded systems (wildly successful in consoles and other appliances), APUs, and open source support (unfortunately a tiny market).
Additionally their recently released Radeon VII has truly excellent compute (double precision) performance, and is a bargain if that is your requirement.
It doesn't really give me a lot of pleasure to recommend Nvidia, because they have a repeated history of proprietary products and obstructive open source support, and they keep pushing their prices up. Unfortunately their products tend to work well, and a fair few of the technologies they do develop (such as GSync) tend to work better than AMD's alternatives, even where AMD's are more open.
"at a similar price point."
Sure, but you can't just look at the price and stop. If people did that, they wouldn't buy Nvidia to begin with, as the Nvidia tax is very real now even to loyalists. Your mention of GSync reminds me of that, and of how Nvidia announced at E3 that they're creating a compatibility list for "unofficial" Nvidia products *. That in itself is a clear sign they are trying to combat their own "Nvidia tax".
Your suggestion that if you want a faster card you should choose Nvidia isn't debatable; you're right. But then why is Nvidia now apparently sitting on hard-to-sell 2080s? I don't see any answer other than that they are charging 50% more for the same thing; HOWEVER, to be fair, they do give you more for that 50% extra.
Coca-Cola could charge an artificial $4 for a 12oz can during a gold rush, but for how long? I think Nvidia is about to discover a similar answer.
* Nvidia mentioned "products" in relation to their compatibility list, but I'm not sure if by products they mean "monitors" or something else. If there is something else, I'm not sure what it is outside of software, but that too might as well fall under monitors, as that is what it is used for... I'm actually curious about this (possible new A.I. streaming software coming?).
It's not an Nvidia tax, it's a graphics card tax, as AMD frequently either doesn't have a comparable product or has no will to compete. Hard to sell 2080s? Perhaps, because they're not a vast improvement over the prior generation, but AMD only has the Radeon VII which last time I looked was slower, a bit more expensive, hot, noisy, and basically only worth it if you either need excellent compute or open source drivers.
Gsync vs Freesync is different. When Gsync came out it was expensive, but worked. Freesync was cheap but rather variable. Freesync has now had a second revision and improved to the point it's providing competition to Nvidia, so yes, they're having to take steps to counter AMD.
The compatibility list is for monitors. There are a few monitors Nvidia rate 'fully compatible' and they're testing all the other monitors to expand the list. There's a lot of Freesync monitors out there that basically aren't that great, and wouldn't meet the Freesync 2 standards.
AMD usually don't have the resources to compete properly, but for a consumer, Nvidia does a lot more work. It was embarrassing when Nvidia released 3D Vision for stereoscopic 3D, which worked quite well and came with free drivers for supported monitors. AMD suggested a couple of third-party suppliers as an afterthought (and to be fair, one of the third parties is quite good, but did cost another thirty quid on top of an already expensive graphics card).
Not a surprise that the RTXs aren't selling that much yet - there are (at last check) only 21 games that support the whizz-bang live ray tracing effects. Paying a premium for special effects that only appear in 21 games? There are only so many people who are rich/stupid enough to pay for that at launch...
My 4-year-old mid-range card still does everything I want quite nicely, thanks. If there were such a thing as a mid-range-priced card now, I might consider upgrading it. The £100-200 card market, however, is all but nonexistent. Crypto is an easy blame-shifter, but one must also consider whether products people want are on sale. A £400 GPU? Not necessary. And a £50 one is usually not an upgrade. There is also more competition - Skylake onboard graphics are surprisingly good for the cost, so bottom-end cards are even more pointless. A serious analyst will acknowledge crypto, but it's not the only story in the graphics world.
I bought an RX580 a couple of years ago for the bargain price of £102 (thanks to Amazon's pre-order guarantee and a pricing error where it dropped by over £130 before going unavailable).
With my new Ryzen gaming system, I'm gaming at 1440p on high/very high/ultra settings and getting 50fps and upwards in everything except Far Cry 5 and Assassin's Creed Origins, where I'm getting 40fps.
There are no bad frame dips, and as I've only got 75Hz FreeSync monitors... no real need to have higher framerates anyway, as anything over 75 isn't going to be noticed.
So I see no need to replace it anytime soon... I will, however, be keeping an eye on the new Navi GPUs when they come out later in the year.
I was nearly tempted to get an RTX2060 as that's faster than the old 1070ti... but then I saw the prices.
I might keep an eye out for the newly announced GTX1660, which is a 2060 without the RTX side... and if that's in the mid-£200 range and is still faster than a 1070ti... I'll think about that as a contender... after I've seen what Navi is like.
I must also admit that I've never fully forgiven nVidia for what they did to 3DFX after buying out their IP all those years ago... I had a Voodoo 5, and we got hung out to dry with no drivers; the 3DFX TV card I had lost support too, and it was only thanks to other fans that we had XP drivers at all after that.
But I can't hold a grudge forever... and I put value for money vs performance over any preferred vendor. So if an nVidia card offers better VFM vs perf... I'll get that... I don't fanboi on anything.
That’s exactly why for the past few upgrades I’ve simply bought the card at the top of PassMark's value-for-money list: https://www.videocardbenchmark.net/gpu_value.html
This list should be everyone’s go-to when they want a new graphics card. If I buy a card on here at the same time one of my mates buys the best card money can buy, we'll both still need to upgrade at the same time. 2x the cost never = 2x the performance.
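The ranking boils down to benchmark score per unit of money. A minimal sketch with invented scores and prices (not real PassMark data):

```python
# Value-for-money, PassMark-style: benchmark score per pound spent.
# Both scores and prices here are made up for illustration.
cards = {
    "MidRangeCard": (9_000, 220),    # (benchmark score, price in GBP)
    "FlagshipCard": (16_000, 1_100),
}

for name, (score, price) in cards.items():
    print(f"{name}: {score / price:.1f} points per pound")
# The flagship costs 5x as much for under 2x the performance.
```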
More like: you're praying it will.
It will. But drop the mining shit, okay? Or at least make a special mining line and leave a line for those who are actually interested in gaming.
Remember those guys? Gamers? Those who put you where you are now?
They did make specialised mining cards. The main specialisation was removing all the video outputs from standard GPUs! Which is going to be part of this problem, because while they do still work as GPUs, it's a pain involving passthrough to another GPU to make them work.
So that's a pile of GPUs rendered worthless. Doubt they'll do that again.
"The significant volatility in our gaming business over the last few quarters has been challenging to model"
What a lot of tosh! So, Jonny Gamer in China is not gonna cough up 1200 USD (aka 6+ months' income) for an, admittedly, high-end GPU?
What a freaking surprise. And so many others in the US/Europe won't either??? Even dudes with Swiss incomes still have some remnants of sanity on this...
And they won't pay 800 for performance on the level of my beloved 970 either?
And this was hard to forecast, presumably; hence the "volatility"!
The only volatility I see here is the amazing amount of empty space between both ears at NVidia.
"You gambled with Crypto and got burned. I have no sympathy for you since it was us gamers that got screwed in all of this."
NVidia didn't get burnt. They made a ton of money from crypto. Five times more than from gaming (just guessing). And now crypto is mostly over, so they make less money. The crypto money is still in their pocket. Now people who bought shares when profits went up because of crypto, they will be in pain. But NVidia isn't.
But Nvidia? Imagine someone offered you four times your current salary for three years, and after three years you are back to your old salary. Did you get burnt? No, you didn't. You made lots of money.
Machine learning and other GPGPU workloads are usually written using nVidia's proprietary CUDA API (wrapped in TensorFlow et al) rather than the open OpenCL one. As another poster pointed out, nVidia imposed onerous licensing restrictions: you can no longer legally use CUDA on a GTX1080 or similar in a datacentre; you have to use even more overpriced Pro cards like Quadro or Tesla. A naked cash grab if there ever was one.
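For reference, this is roughly what the CUDA dependency looks like from the TensorFlow side: the Python code never mentions CUDA, but the ops dispatch to nVidia's libraries underneath. A minimal sketch, assuming a GPU build of tensorflow is installed (illustrative only):

```python
# TensorFlow wraps CUDA/cuDNN; from Python you only pick a device.
import tensorflow as tf

print(tf.config.list_physical_devices("GPU"))  # enumerate CUDA devices

with tf.device("/GPU:0"):          # place the ops on the first GPU
    x = tf.random.normal((2048, 2048))
    y = tf.matmul(x, x)            # executed by nVidia's libraries via CUDA
print(float(tf.reduce_sum(y)))
```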
There are an awful lot of video cards in my datacenter. Sure, they aren't pushing the pixels to a local display, but they are pushing them through the network to the massive fleet of thin clients we have for the engineers and graphic designers. (This way we can re-dedicate the unused render capability to power the physics simulation farm when AutoCAD isn't eating the cards.)
Nvidia tried to have their cake and eat it too. If they want to get rid of the inventory, there is already a solution: they need to release drivers that allow the crypto boards without video headers to be used as video cards. This has already been proven. Linus Tech Tips did a video about a month ago where they picked up one of these crypto boards and, with some driver tweaking, were able to pass the video through the bus to the on-board video ports, much the same way gaming laptops let you use the Intel graphics for light duty and the discrete GPU for games and heavy-duty graphics. Nvidia would then have a channel for selling through the surplus. This was also shown not to impact frame rates dramatically.
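For the curious, the same render-on-one-GPU, display-on-another trick exists on Linux as "PRIME render offload". A minimal sketch, assuming an Nvidia driver that supports the offload environment variables and a glxgears binary on the PATH (illustrative only, not the exact method used in the video):

```python
# Launch a program so it renders on the Nvidia GPU while another
# device (e.g. integrated graphics) owns the display output.
import os
import subprocess

env = dict(
    os.environ,
    __NV_PRIME_RENDER_OFFLOAD="1",       # render on the Nvidia GPU...
    __GLX_VENDOR_LIBRARY_NAME="nvidia",  # ...via the Nvidia GLX stack
)
subprocess.run(["glxgears"], env=env)    # frames are scanned out elsewhere
```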
What a surprise that BTC is still falling, given the inherent ponzi-scheme design whereby later coins require (exponentially?) more work to produce. There ought to be a law against that sort of thing.
Therefore the gradual fall in value must disguise a severe fall in demand. Get out while you still can, I'd say.
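The "more work" intuition comes from proof-of-work itself: each extra zero bit demanded of the hash doubles the expected effort. A toy miner in Python (note that real Bitcoin difficulty retargets against total network hashrate rather than rising on a fixed schedule):

```python
# Toy proof-of-work: find a nonce whose SHA-256 hash falls below a target.
import hashlib
from itertools import count

def mine(block: bytes, difficulty_bits: int) -> int:
    target = 2 ** (256 - difficulty_bits)  # smaller target = more work
    for nonce in count():
        digest = hashlib.sha256(block + nonce.to_bytes(8, "big")).digest()
        if int.from_bytes(digest, "big") < target:
            return nonce  # ~2**difficulty_bits hashes expected

print(mine(b"example block header", 20))   # roughly a million hashes
```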
We've heard from the beneficiaries, but the losers seem rather quiet. I wonder, could crypto losses trigger a global recession?
The way I see it, the big risks at the moment are:
Crypto
Isolationism - USA and No Deal Brexit
Brexit generally - it's the biggest piece of idiocy since the Tories dreamed up "Austerity" and flatlined the economy, all part of their scheme to send us back to the 1950s (i.e. no worker protection, no HSE, USA-style fire-at-will, no safety net, no opportunities elsewhere for many) but with a large surveillance state to keep us in line.
Stagnant wages - no matter what the media say, I know plenty whose hourly wages are the same as they were 18 years ago, and it's a double hit due to inflation. These aren't unskilled workers either; these are skilled manufacturing workers - something that should be the backbone and powerhouse of the economy - who were laid off multiple times through mismanaged companies, with wages dropping at every re-hire. Only now are some of them back to where they were at the turn of the century.
Weak consumer demand, due to stagnant wages and a rapidly increasing cost of living. If consumers don't buy, the rest falls apart - this despite a ream of newspapers printing "Biggest Boxing Day turnout in a decade", "Thousands queue for a bargain", "Relief for retailers" when the reality was virtually no one showed up, and those who did weren't buying much. Orwell would recognise much of this.
Glad to hear this.
Nvidia has betrayed its dedicated consumers by hiking the cost of the cards.
If they'd sold the 2080 for 500, the 2080ti for 800, OR made the 2070 NVLink-capable, I'd have purchased one.
Instead they figured they'd milk the consumer for every last cent; a trait that's shared by Apple.
And it's backfired for both.
This article prompted me to look for a card which will allow me to watch UHD output on my TV.
This doesn't seem to feature in the general selection criteria. Apparently you only need a high resolution card if you are a gamer.
Perhaps this is a non-issue because most current generation cards support UHD output?
I am idly wondering about a UHD monitor, but it would have to be significantly larger than my HD monitor for me to be able to read text when using the full resolution. There doesn't seem much point in upping the default font size. I currently have two monitors on my main system, which works well for me (full HD landscape and a lower-res, smaller portrait), so I think I would need a screen the width of the two combined to get the same usable real estate. Then up a bit more to make text readable again.
However without looking at UHD monitors it is difficult to visualise.
For UHD television, a dedicated GPU is completely unnecessary. Even the Intel HD 4600 can do 3840x2160@60 on a standardized video stream (H.264, H.265, VP8, VP9, etc). I have a NUC plugged into my TV and it has no problem with 4K streams coming down from Netflix, Hulu, and a few other services.
My on-board Intel graphics from about 2 years ago lets me output to a pair of 4k displays. It's not going to win any awards for performance, but it is fine for watching videos.
My 10-year-old MacBook, which has DisplayPort but not Thunderbolt, can go up to QuadHD over DisplayPort, but only 1080p with my HDMI adaptor. So obviously you should check the spec sheet, but I think pretty much anything should be fine if you just want lots of pixels.
it is fine for watching videos
Well, yeah. The base-level built-in video in my nine-year-old Thinkpad is fine for watching videos.
I spent many years watching videos on NTSC televisions, with 483 scan lines and (just under) 30 interlaced frames per second, and a rather casual attitude toward color. That worked fine, too; I was able to see and comprehend the image, so I could follow along with the narrative. I always thought that was the point.
In my callow youth, I even found it acceptable - indeed a bit exciting - to watch video on a black & white NTSC set. I recall enjoying any number of monster movies and thrillers and the like in that format. It wasn't 4K, but somehow we muddled on. Well, what did we know?
Proof of work (like ETH or BTC) is dying!
There's no justification for spending so much money on expensive GPUs and throwing money away on power.
Therefore the GPU companies will continue to decline (as will the PoW coins).
There are better technologies around.
Take a look at existing PoS - proof of stack (like NXT) -
or even better, new tech like PoT - proof of trust (like RYA).
proof of stack
Proof of stake, I think you'll find. I'm not aware of any stack-based cryptocurrency value metrics.
(Though, curiously, in the cryptocurrency context, there's a quite interesting use of the queue metaphor. Ross Anderson, Mansoor Ahmed, and colleagues at the Cambridge Cybercrime Centre have shown that the Clayton's Case model of FIFO tainting is useful in tracking stolen cryptocurrencies.)
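A toy illustration of that FIFO rule (the Clayton's Case idea that the earliest money into an account is the earliest money out), with made-up deposits:

```python
# FIFO tainting: when stolen and clean funds are mixed in one account,
# withdrawals are deemed to spend the earliest deposits first.
from collections import deque

deposits = deque([("stolen", 30), ("clean", 50), ("stolen", 20)])

def withdraw(amount: int) -> int:
    """Return how much of this withdrawal is tainted under the FIFO rule."""
    tainted = 0
    while amount > 0 and deposits:
        label, value = deposits.popleft()
        used = min(value, amount)
        if label == "stolen":
            tainted += used
        if used < value:                  # partially spent: keep the remainder
            deposits.appendleft((label, value - used))
        amount -= used
    return tainted

print(withdraw(40))  # 30 -- the first 30 in were stolen, the next 10 clean
```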