And once more Nvidia lives up to its reputation for always trying to distract from competitors' product announcements by announcing something of its own. All is fair in love and technology.
Nvidia on Monday upped the memory specs of its Ampere A100 GPU accelerator, which is aimed at supercomputers and high-end workstations and servers, and unveiled InfiniBand updates. Compared to the A100 chip unveiled in May, the new version doubles its maximum built-in RAM to 80GB, and increases its memory bandwidth by 25 per cent.
I'll use whatever... Intel, AMD, Nvidia, etc. I don't care. These timed announcements are actually a good sign, though, because they show these corporations are watching each other and competing.
P.S. I'm really getting tired of people calling fancy pattern recognition "Artificial Intelligence". Even a hamster on a wheel recognises that if it keeps running, the wheel keeps moving (kind of like these "AI" articles).
Apologies for the over-used, tiresome meme, but I actually mean it in the sense that I'm curious: these are still labelled GPUs, but are they actually still usable as graphics processors? Or have they become so specialised that, despite the name, there isn't much 'G' left in there, and it's all CUDA and AI/ML and TensorFlow and what-not?
Biting the hand that feeds IT © 1998–2021