Desktop AI isn’t happening, says AMD, and might not for quite a while

AI-on-the-desktop is not yet a thing, and its uses may not be apparent for some time, according to Justin Galton, director and worldwide segment leader for AMD’s commercial client business. Speaking to The Register at an event in Sydney today, Galton said AMD has put its dedicated AI accelerator into just one CPU – the Ryzen …

  1. m4r35n357 Silver badge

    Quite the understatement!

    s/Desktop //

    Not that it will stop them selling "it" to you . . .

    1. ecofeco Silver badge

      Re: Quite the understatement!

      You just saved me dozens of words.

      I would like to add that first we have to have actual AI. Still not there.

  2. that one in the corner Silver badge

    Why shove this into the CPU

    when these workloads are also being targeted by GPUs - where you get far more processing units and associated RAM than you'll get onto the CPU die. Plus the obvious fact that it is easier to add multiple GPU cards than to chuck out the CPU (and likely the entire motherboard) for the next round of CPUs.

    Could it possibly be that AMD's GPUs are losing ground to nVidia's in this arena so they want to convince you that using a much smaller co-processor closer to the CPU is better somehow?

    Moving co-processors into the CPU is a Good Thing - once there is a genuine *general* need for them and there is room for the entire unit: floating point and the MMU in with the CPU makes sense. Not convinced that the last condition is anywhere near met for AI co-processors (if only because the models seem to be getting larger and more "AI farm in the server room"-like than desktop-sized).

    1. Anonymous Coward
      Anonymous Coward

      Re: Why shove this into the CPU

      I'm assuming it's more for supplemental tasks rather than the brute force workloads GPUs are tasked with.

    2. RichardBarrell

      Re: Why shove this into the CPU

      > GPUs - where you get far more processing units and associated RAM than you'll get onto the CPU

      Speed yes, capacity no. Video RAM (GDDR) has higher throughput than normal RAM (DDR), but it's sold at fairly eye-watering prices per byte of capacity, and you don't get very much of it even on the biggest, most expensive GPUs. nVidia's current datacenter GPUs top out at 180GB.

      It is cheaper to buy a server with multiple TBs of RAM in it than to buy a GPU with 80GB of RAM on it. An H100 with 80GB of RAM costs north of £30k.
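      The price gap can be sketched as a quick back-of-the-envelope calculation using the figure quoted above (£30k for an 80GB H100); the server-DDR price used for comparison is an illustrative assumption, not a quote:

      ```python
      # Rough cost-per-gigabyte comparison using the figures in the comment above.
      # The server-DDR price (~£3/GB) is an illustrative assumption.
      h100_price_gbp = 30_000   # "north of £30k" for an H100
      h100_capacity_gb = 80     # 80GB of on-card memory

      ddr_price_per_gb = 3      # assumed £/GB for bulk server DDR

      gpu_cost_per_gb = h100_price_gbp / h100_capacity_gb
      print(f"GPU memory: ~£{gpu_cost_per_gb:.0f}/GB vs server DDR: ~£{ddr_price_per_gb}/GB")
      print(f"Ratio: ~{gpu_cost_per_gb / ddr_price_per_gb:.0f}x")
      ```

      Even with generous assumptions for the DDR side, the per-byte premium on GPU memory is two orders of magnitude, which is the point being made about capacity.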

    3. big_D Silver badge

      Re: Why shove this into the CPU

      Probably because most people don't have graphics cards or slots to put them in. Most have laptops or compact desktops with integrated graphics.

      Also, GPUs are much more efficient than current CPUs at AI tasks, but they are still a long way from being optimised for AI. Then there are bandwidth issues: the graphics RAM is quick, but you still have to shovel all the data over the bus to the graphics card.

      This is where integrated designs, like the Apple Silicon range, excel: fast memory directly integrated into the chiplets.

      Integrating the NPU into the CPU, along with graphics cores, makes sense for a lot of devices where GPUs are not needed, or where there is no energy or thermal headroom to cope with them.
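      The "shovel the data over the bus" cost can be sketched with some illustrative numbers - the bandwidth and model-size figures below are assumptions for the sake of the estimate, not measurements:

      ```python
      # Best-case estimate of copying model weights to a discrete GPU over PCIe.
      # All figures are illustrative assumptions.
      pcie4_x16_gbps = 32.0   # theoretical PCIe 4.0 x16 bandwidth, GB/s
      model_size_gb = 7.0     # e.g. a 7B-parameter model quantised to 8 bits

      transfer_s = model_size_gb / pcie4_x16_gbps
      print(f"Copying {model_size_gb:.0f} GB of weights over PCIe 4.0 x16: "
            f"~{transfer_s:.2f} s (theoretical best case)")
      ```

      That one-off copy is tolerable, but any workload that has to keep streaming data back and forth across the bus pays it repeatedly - which is where a unified-memory design avoids the trip entirely.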

  3. terry 1
    Coat

    Another fad

    Bit like 3D TV etc

  4. big_D Silver badge

    Intel Meteor Lake AI

    I thought the AI cores were only going in the Core Ultra 5/7/9 and the Core 3/5/7 weren't getting AI cores?

  5. Snowy Silver badge
    Joke

    Depends

    On how big your desk is and how dumb your "Ai" can be for your work.
