AMD threatens to go medieval on Nvidia with Epyc and Instinct: What we know so far

AMD teased its next generation of AI accelerators at CES 2026, with CEO Lisa Su boasting that the MI500-series will deliver a 1,000x uplift in performance over its two-year-old MI300X GPUs. That sounds impressive, but, as usual, the devil is in the details. In a press release that followed Monday's keynote, AMD clarified those …

  1. DryBones

    Predictive Text, Collective Delusion

    I have it on very good authority from someone whose Master's focus was on actual AI that what we are looking at is not AI. It will never be AI; it cannot be AI. An LLM is a giant ball of expected responses to inputs. It does not reason, and it is very easy to poison, twist, and otherwise tamper with. We have at least one article where they admitted that AI could never be "secured". Expert systems, neural networks that have been trained on matched patterns and work efficiently in specialized spaces not meant for abstraction, are much better solutions in almost all solution spaces.

    And these companies want to build it into everything, and appear in fact to be committing several flavors of financial fraud (round-tripping, for one) chasing the hope that by throwing enough computing power at this thing, they will have something close enough to a sentient mind that they own, to please their shareholders. We should just call it Skynet and get it over with.

    We are seeing a cost spiral to prop up a bubble that is starting to develop leaks. Micron is getting out of consumer memory. AMD and Nvidia are doubling and tripling prices. The resource costs in terms of power and water are enormous, and they are concealed from those who should be concerned by them. Meanwhile, Intel appears not to have forgotten that consumers got them where they are, and is developing GPUs for the mass market, not data centers.

    I, for one, am watching carefully and will be taking note of those that abandon their end users to chase imaginary profits... and those that do not.

    1. Like a badger Silver badge

      Re: Predictive Text, Collective Delusion

      Meanwhile, Intel appears to have not forgotten that consumers got them where they are, and are developing GPUs for the mass market, not data centers

      Unfortunately they'll still be affected: BoM costs have gone through the roof, so whilst the product may well be excellent, the chances of it hitting that value/performance mix gamers want is negligible.

  2. Sorry that handle is already taken. Silver badge
    Mushroom

    What has CES become?

    It was once known as the Consumer Electronics Show. What the fuck do I want a multi-million dollar neural compute node rack for?

    1. gv
      Joke

      Re: What has CES become?

      To play Doom?

    2. I am David Jones Silver badge

      Re: What has CES become?

      To replace your old multi-million dollar neural compute node rack?

  3. Lon24 Silver badge

    Will nobody think about me?

    AMD is throwing itself into a make or break competition with Nvidia for the AI business. Micron have dumped their consumer brand to go after the AI memory business. RAM and flash companies are swerving their output to AI.

    Doesn't matter whether they burn or prosper - their attention, investment and product is not coming my way. At the diminishing tail end of the business, we have been dumped. Available consumer stock prices have escalated and are expected to escalate further.

    Yet until AI, the consumer memory business was a profitable one. It still is, though declining, as more and more of us hold off on upgrades or new kit because it's no longer cost effective. It's amazing how you can curb your demand for RAM or storage if you have to. Squeezing out the accumulated bloat is not too difficult, but it is a one-time gain. Surely the field is open to new entrants who ignore the risky, technically cutting-edge AI market and capture the steady, less technically demanding consumer market at what were profitable prices?

    I've heard the Chinese DDR5 rumours but where else?

    1. EnviableOne Silver badge

      Re: Will nobody think about me?

      The RAM cycle will kick back in: all those fabs doing other things will switch back to RAM and flood the market, driving prices back down before they become unaffordable. Then they switch to other things, prices go up, and they switch back again to make hay with the new prices.

  4. simonlb Silver badge
    Stop

    1000x Performance Uplift

    This just means the hallucinations occur more often and are more weird. Still with no benefit to 98.7% of the general population.

    1. druck Silver badge

      Re: 1000x Performance Uplift

      The remaining 1.3% being AI snake oil salesmen.

  5. andy the pessimist

    medieval

    How do GPUs and AI become medieval? Neither existed in medieval times.

    What am I missing here?

    1. LBJsPNS Silver badge

      Re: medieval

      https://www.imdb.com/title/tt0110912/characters/nm0000609/

      Pulp Fiction. Read the first listed quote which I will not post word for word here as I don't want to get banned.

  6. Irongut Silver badge
    FAIL

    AMD's 'X' Chips

    > AMD's X chips have historically featured 3D-stacked cache dies located under or on top of the CCD

    No they haven't, that would be AMD's X3D chips. See Ryzen 7 5800X vs Ryzen 7 5800X3D.

    Historically AMD have used 'X' to denote overclockable parts.

    Did you use an LLM to write this article?

    1. Rozzy

      Re: AMD's 'X' Chips

      Yes, they have, in the enterprise EPYC products. See AMD EPYC 9684X, aka 'Genoa-X'.

  7. harrys Bronze badge

    they actually had a sad american trump sycophant tosser on stage

    its all about the (tax payers) money honey :)
