Intel bets you'll stack cheap GPUs to avoid spending top dollar on Nvidia Pros

When it comes to AI accelerators, Intel isn't very competitive, and its newly announced Battlemage workstation cards don't do much to change that. But at least they're cheap. Really cheap. For the purposes of AI, we can mostly ignore the $299 Intel Arc B50, which is positioned as a more traditional workstation GPU for graphics …

  1. cyberdemon Silver badge
    Gimp

    It's Intel

    So Linux drivers will be shit or nonexistent, and the product will be killed off entirely in a few years.

    Other than my obvious cynicism re. Intel, the ability to shun cloud LLMs and instead run them locally would be quite nice. Not sure I'd spend the requisite $4000 to stack 8 of them for 192GB though
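    The "$4000 for 192GB" figure above works out if you assume eight 24 GB Battlemage cards at roughly $500 apiece (both the per-card price and VRAM are assumptions inferred from the totals in the comment, not stated by Intel here). A quick back-of-the-envelope check:

    ```python
    # Sanity-check the commenter's stack maths: 8 cards, assumed
    # 24 GB VRAM and ~$500 street price per card.
    cards = 8
    vram_per_card_gb = 24      # assumed per-card VRAM
    price_per_card_usd = 500   # assumed per-card price

    total_vram_gb = cards * vram_per_card_gb     # 192 GB total
    total_cost_usd = cards * price_per_card_usd  # $4000 total

    print(total_vram_gb, total_cost_usd)  # → 192 4000
    ```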

    1. NoneSuch Silver badge
      Linux

      Re: It's Intel

      Ordinarily, yes. However, Intel is on the back foot and needs to re-establish itself. They might even *gasp* listen to their customer base for once. Sticking to a lower-cost GPU may pay off for them in the end, if certain profit-motivated executives can be sidelined for a while.

      Linux drivers will suck for 3-6 months after release, but that has typically been the case.

      1. Yet Another Anonymous coward Silver badge

        Re: It's Intel

        Yes, but if you ignore the extra cost and complexity of all the extra motherboards, power, cooling, rack space, interconnects, management issues and latency from many more, weaker cards, I think you'll find they have a compelling business case. If only they had the software support that Nvidia has.

    2. Rich 2 Silver badge

      Re: It's Intel

      From what little I’ve seen and read, the new Intel cards work pretty well on Linux.

      nVidia meanwhile are making a total balls-up of their latest offerings, regardless of OS.

      I’m particularly aggrieved with nVidia because they have removed power control support from their Linux driver for (I think I have this right) 3000 series and older. Which means my laptop now cooks itself. Despite a forum chain of complaints as long as your arm, nVidia are completely ignoring the problem and offering no explanation.

      My next PC will not include an nVidia graphics card in it

      1. Yet Another Anonymous coward Silver badge

        Re: It's Intel

        Nvidia couldn't care less about your laptop or the consumer in general.

        The code and tools to tune CUDA and interconnects on HPC get a lot more attention

        1. collinsl Silver badge

          Re: It's Intel

          Nah, it's AI they're after. I do HPC work and they're reducing their support for FORTRAN in their software, which a lot of HPC jobs still use. The problem is that traditional HPC is a small market for them now, since they're making 10x the money on AI ($22bn last quarter) than on anything else.

  2. williamyf Bronze badge

    Intel already has an AI-only solution in the form of the Gaudi chips, mostly developed by Habana Labs, with some sprinkles of Nervana and Movidius technology.

    Problem is, if you wanted to repurpose these for HPC (say, weather modelling, nuclear modelling, geological seismic oil/gas surveys, movie/VFX rendering), the results were abysmal.

    Meanwhile, Arc GPUs are decent to mediocre at AI, but decent to good for HPC. While these are marketed at AI, I'd not be surprised if the vast majority of these "Battlematrix" cards end up doing HPC.

    PS: Intel was/is hard at work to produce a hybrid of Gaudi and Arc, but hit some snags. "El Reg" and sister site "The Next Platform" have covered this in varying depth.
