AMD to open source Micro Engine Scheduler firmware for Radeon GPUs

AMD plans to document and open source its Micro Engine Scheduler (MES) firmware for GPUs, giving users more control over Radeon graphics cards. It's part of a larger effort, confirmed by AMD earlier this week, to make its GPUs more open source at both a software level, with respect to the ROCm stack for GPU programming, and a …

  1. Anonymous Coward

    Is this all to do with self-driving cars, or has George Hotz realised that's actually a very difficult problem to solve?

    1. Dave 126 Silver badge

      Nah, Hotz's Tiny Box isn't to do with cars. His concept is that individual users should have their own AI/ML system running on their own hardware. It's a response to people's privacy concerns regarding cloud-based ML applications.

      To that end, he's designed a system that is limited by the power available from two domestic wall sockets. He says that AMD GPUs currently offer the most bang for the buck, largely because hardly anybody uses them for ML applications, thanks to their shoddy software.
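      To put that power budget in perspective, here's a quick back-of-the-envelope (the outlet ratings are my assumptions for typical US circuits, not figures from Hotz's post):

      ```python
      # Back-of-the-envelope: power available from two domestic wall sockets.
      # All figures here are assumptions (typical US 120 V / 15 A circuits,
      # with the usual 80% continuous-load derating), not from the source.
      volts, amps, derate = 120, 15, 0.8

      per_socket_w = volts * amps * derate      # 1440 W continuous per socket
      total_w = 2 * per_socket_w                # 2880 W for two sockets

      print(f"raw:     {2 * volts * amps} W")   # 3600 W
      print(f"derated: {total_w:.0f} W")        # 2880 W
      ```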

      https://geohot.github.io/blog/jekyll/update/2023/05/24/the-tiny-corp-raised-5M.html

  2. HuBo

    Navi coolz

    Sounds like great news to me ... the more open source the firmware, the better the potential for innovation! And the RX 7900 XTX-based tinybox looks quite neat, especially if it can run FP64 HPC loads at 11.5 TF/s on less than 3 microwaves' worth of power, 3.2 kW -- for those of us not that interested in AI/ML.

    I'm not 100% clear though on how the RTX 4090 version can give 991 TF/s at FP16 unless the cards are doubled up (and then power should exceed what the PSU provides). Maybe Hotz's team meant FP8 in its specs table for this one (green team)?

    1. Anonymous Coward

      Re: Navi coolz

      It says GPU RAM: 144 GB

      As both cards have 24 GB RAM apiece, that leads to 6 GPUs. Possibly the designer wanted very low-level access to drivers and hardware to limit power usage via underclocking/undervolting. That would also align with the $15,000 to $25,000 price tag. If so, "tiny" is a relative thing.
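      Spelling out the arithmetic (a quick sketch using only the figures quoted in this thread; the per-GPU price is just the quoted range divided through):

      ```python
      # GPU count implied by the spec sheet: total GPU RAM / RAM per card.
      # Figures as quoted in this thread.
      total_ram_gb = 144
      ram_per_card_gb = 24

      gpus = total_ram_gb // ram_per_card_gb
      print(f"{gpus} GPUs")                                         # 6 GPUs
      print(f"${15000 / gpus:,.0f}-${25000 / gpus:,.0f} per GPU")   # $2,500-$4,167
      ```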

      1. HuBo

        Re: Navi coolz

        OK, I think I see it then: from page 30 of the NVIDIA ADA GPU ARCHITECTURE V2.02, one gets 165.2 "Peak FP16 Tensor TFLOPS with FP32 Accumulate" (and 6 x 165.2 TF/s ≈ 991 TF/s). So it's a mixed-precision Tensor figure (hopefully applicable to one's workload), whereas the reported RX 7900 XTX perf would be for standard (non-tensor) mode.
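        Which checks out numerically (a one-liner, using the whitepaper's per-card figure quoted above):

        ```python
        # Aggregate FP16 Tensor throughput for six RTX 4090s, using the
        # per-card "Peak FP16 Tensor TFLOPS with FP32 Accumulate" figure
        # from the Ada whitepaper quoted above.
        per_card_tflops = 165.2
        cards = 6

        print(f"{per_card_tflops * cards:.1f} TF/s")   # 991.2 TF/s
        ```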

        1. HuBo

          Re: Navi coolz

          Then again, for HPC (FP64) one may be better served by a single MI210 (e.g. $9,000), which gives 22.6 TF/s for just 300 W. That's more in line with what teams in the (limited-budget) Student Cluster Competition (SCC) commonly do, now that I think about it.
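          A rough perf-per-watt comparison makes the point (just a sketch from the figures quoted in this thread, not official numbers):

          ```python
          # FP64 perf-per-watt: a single MI210 vs the RX 7900 XTX tinybox,
          # using only the figures quoted in this thread.
          mi210_tflops, mi210_w = 22.6, 300
          tinybox_tflops, tinybox_w = 11.5, 3200

          print(f"MI210:   {mi210_tflops / mi210_w * 1000:.1f} GF/s per W")      # ~75.3
          print(f"tinybox: {tinybox_tflops / tinybox_w * 1000:.1f} GF/s per W")  # ~3.6
          ```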

      2. Anonymous Coward

        Re: Navi coolz

        Now all you need is a tiny desktop nuclear reactor to power it.

  3. Groo The Wanderer Silver badge

    Are the Intel cards even relevant? Didn't Intel axe that line of products?

    1. Francis Boyle

      Intel cards have always been relevant. Good, on the other hand? (Where "good" means performant, not affordable.)

  4. FeepingCreature Bronze badge

    Hell yes! Finally!

    Maybe now somebody can actually figure out why it crashes all the time under load, since AMD are clearly not up to it!

    This card has been out for a year and a half, I should not be seeing MES crashes in my kernel log.

    1. TReko

      Re: Hell yes! Finally!

      Maybe.

      nVidia is overpriced, but there's a reason they are used. AMD GPUs have so many edge cases where they just don't work. The people who design their drivers and RTL just don't seem to care.

      AMD cards are OK for games, where an error just causes a bad pixel or texture. But using them for reliable processing is harder, which is a pity as nVidia could do with some decent competition.

  5. Grogan Silver badge

    I haven't bought an Nvidia card (for anyone, or anything) in 15 years. So many early deaths because of the PCB substrate problems Nvidia was denying back then. When they couldn't deny it anymore, it was supposedly just "mobile graphics" with the problem, but that wasn't so. It just manifested more quickly there. Every single one of them died within 2 years.

    I even had an almost-new 8800GT card that spent most of its life in the closet (it was only used for a few weeks before I decided I wanted a more powerful card). I sold it with a PC and ended up having to replace it for the guy on my dime, 6 months later. Having a conscience is expensive (it cost me money to sell that PC, in the end).

    So... that relegated Nvidia to "I don't care if they are the last company on earth..." AND "Charlie Demerjian was right all along!"

    It soon became less desirable to have Nvidia on Linux and I never gave Nvidia a thought again for my own use either.

    I don't care about GPGPU computing though. I'm just a chump that buys graphics cards for normal use in computers.

    As far as I'm concerned, it's just another consequence of human greed. Nobody cared about that shit (outside of some scientific computational use, or some off-screen 3D rendering, etc.) until cryptocurrency mining, and now even a mid-range graphics card that would have cost you $250 to $300 is $600. If it isn't, beware... those cheaper brands like Assrock, with their 25-cent resistors and capacitors they should have spent 50 cents on, won't save you any money a month from now (past the seller's return period) when you can't even get an answer to claim the 3-year manufacturer's warranty.
