Boffins unveil open source GPU

It's a kitten rather than a roar right now, but if the MIAOW project unveiled at last week's Hot Chips conference can get legs, the next year could see the launch of the world's first “open GPU”. The result of 36 months' development (so far) by a team of 12 developers, MIAOW – the Many-core Integrated Accelerator of Wisconsin …

  1. Hans 1

    Slowly ousting proprietary junk out of this world, one chip at a time ...

    1. Pascal Monett Silver badge

      That proprietary junk has given me over a decade of beautiful game visuals.

      I do look forward to an entirely Open Source hardware computing platform. That way we might have some measure of assurance that our tablets and phones of the future are not 5-Eyes-compliant right out of the box.

    2. PleebSmash

      The slow death of Moore's law could be great for open source hardware, since the performance gap contracts.

  2. Mage Silver badge
    Thumb Up

    Real Hardware: the GPU has been implemented on an FPGA.

    FPGAs are very power hungry compared to ASICs, but it's real HW. Since the Verilog can be compiled to an ASIC design instead, it now just needs the IP and money sorted to get a consumer chip, a chip for a MoBo or a card.

    This is quite exciting. So anyone with a suitable FPGA dev kit and the knowledge to use such can test out and adapt this?

  3. CaptainBanjax


    If it uses FPGA and can be utilised as a compute board as well as a GPU this will be picked up by crypto miners all over.

    We might see more efficient mining clients if the board truly does end up being open.

    I suspect some proprietary chips will creep in somewhere.

    After all, the Raspberry Pi was intended to be open source, but it has proprietary chips on it. The Broadcom GPU still has fairly naff drivers.

    Didn't the Raspberry Pi Foundation run a competition at some point to get Quake 3 running on the board at over 30fps?

    1. Mage Silver badge

      Re: Hmm... Mining Clients

      No, the FPGA is for prototyping. The same FPGA (or better an ASIC) specially programmed for mining is going to be better than this GPU implemented using an FPGA.

      An FPGA is programmable HW, a standard part anyone can buy, for prototypes or low volume. (A custom chip needs 10K to 1M pieces to be economic, and you check the design works by doing an FPGA version first. An ASIC can cost $100K to $1M in NRE, or even more for multi-layer, large die, small geometry, etc.)
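That prototype-vs-volume trade-off boils down to a break-even calculation: the ASIC's one-off NRE is amortised against its lower per-unit cost. A minimal sketch, with entirely hypothetical cost figures (real NRE and unit prices vary by orders of magnitude with process, die size and volume):

```python
# Rough FPGA-vs-ASIC break-even sketch. All figures below are made-up
# illustrations, not real quotes.

FPGA_UNIT_COST = 50.0    # dollars per FPGA; no up-front NRE
ASIC_NRE = 500_000.0     # one-off mask/engineering cost for the ASIC
ASIC_UNIT_COST = 5.0     # dollars per packaged ASIC

def total_cost_fpga(units):
    """Total cost of shipping `units` boards using FPGAs."""
    return FPGA_UNIT_COST * units

def total_cost_asic(units):
    """Total cost of shipping `units` boards using a custom ASIC."""
    return ASIC_NRE + ASIC_UNIT_COST * units

def break_even_units():
    """Volume at which the ASIC's NRE pays for itself.

    Solves FPGA_UNIT_COST * n == ASIC_NRE + ASIC_UNIT_COST * n for n.
    """
    return ASIC_NRE / (FPGA_UNIT_COST - ASIC_UNIT_COST)

print(break_even_units())  # ~11111.1 units with these figures
```

Below the break-even volume the FPGA wins despite its higher unit price, which is why one-off prototypes like MIAOW's live on FPGAs.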

    2. Mage Silver badge

      Re: Hmm... Raspberry Pi

      The design is meant to be as open source (SW & HW) as feasible. The Pi is basically a now-obsolete ARM phone chip on a breakout board. Any complete, non-trivial computer design, even an open source one, is going to have some (or mostly) proprietary chips; the USB/Ethernet chip is probably proprietary. The real issue is full documentation, so you can write drivers from scratch. Almost all chips are proprietary.

    3. Anonymous Coward
      Anonymous Coward

      Re: Hmm...

      Several Bitcoin-mining ASICs are already in use. TBH the compute difficulty of mining is now so high that GPUs don't cut the mustard.
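The "GPUs don't cut the mustard" point follows from the usual back-of-envelope mining arithmetic: the expected number of hashes to find a Bitcoin block is roughly difficulty × 2^32. A sketch with illustrative figures (the difficulty and hashrates here are round-number assumptions, not measurements):

```python
# Expected time to find a Bitcoin block: difficulty * 2**32 / hashrate
# (the standard approximation). All figures below are illustrative.

DIFFICULTY = 50e9  # hypothetical network difficulty

def expected_seconds_per_block(hashrate_hs):
    """Expected seconds for a solo miner at `hashrate_hs` hashes/sec."""
    return DIFFICULTY * 2**32 / hashrate_hs

SECONDS_PER_YEAR = 31_557_600

gpu = expected_seconds_per_block(1e9)    # ~1 GH/s, a decent mining GPU
asic = expected_seconds_per_block(5e12)  # ~5 TH/s, a dedicated ASIC

print(gpu / SECONDS_PER_YEAR)   # GPU: thousands of expected years per block
print(asic / SECONDS_PER_YEAR)  # ASIC: a few expected years per block
```

At these figures the ASIC is 5,000× faster than the GPU for the same job, which is why general-purpose GPUs (open or not) left the mining game.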

  4. John Hawkins

    Kitten Kong?

    I'm showing my age...

    1. deshepherd

      Re: Kitten Kong?

      That's because the BBC don't show repeats of "The Goodies" ... though I think I heard Tim Brooke-Taylor say once or twice that they are now available on DVDs (including the "infamous" Apartheid episode that was banned from rebroadcast in case it upset the South African authorities)

  5. Anonymous Coward

    MIAOW ?

    More like: Wisconsin's Open-source OpenCL FPGA...

  6. Ru'

    But can it run Crysis?...

    1. K

      No, but it runs "Breakout" really well!

    2. GrumpenKraut Silver badge

      Of course, just really slow!

    3. Preston Munchensonton

      As a fall back, you can always play Reversi.

  7. phil 27

    From the reaction of people who were actually at the presentation, MIAOW hasn't been designed to steer clear of patents. Right now that's a non-issue, but should it take root, the big stick will come along and make its stamp... Really not the best base for a patent-free open source GPU.

    It's also missing some gfx functions such as texture mapping, and it has a single processing pipeline; when you start enabling more pipes you run into all sorts of caching and corruption issues you never spotted before, so it's not just a case of altering some parameters and resynthesising. A great start for a uni project, but there are better options out there already that aren't getting the publicity.

    If you're interested in the subject of DIY GPU processors and FPGAs, check out Jeff Bush's amazing write-up of his open GPU on FPGA. You need something with a fair number of logic elements to load his design onto, something like a DE0-Nano or similar, but you can check out the code/Verilog etc. from GitHub right now, and his blog is amazingly insightful to read.

    I'm also interested in CPU on FPGA, but it's a massive rabbit hole of learning to fall down into. There is the venerable TG68 implementation of the 68000, FPGA Arcade, Vampire V1 and a few other interesting ones released under open source licences whose sources you can download and check out, plus there are further closed projects in this space. The above is focused on m68k because it's a chipset I knew well back in the day, so I can relate past experience to bootstrap learning for the bits I don't understand. But there are Z80/6502, even Manchester computers on FPGA...

    The Chinese are getting in on the area with their own FPGA designs and fabs; Gowin Semiconductor has released two designs in the past two years to compete with Altera and Xilinx etc. Lattice is another with interesting developments, and now there is an open source toolchain (IceStorm) covering the Lattice iCEsticks to lower the difficulty bar to getting started.

    It's like that guy on here who built his own DIY wire-wrapped discrete-gate computer that took up half his house (I think he is my hero after that El Reg article...), except you can pop the dev board and USB Blaster in the desk drawer between sessions without visitors spotting you're a raving loon until you start to babble about von Neumann architecture and the like. And it's great fun giving the brain a bit of a workout compared to the level of thinking required for real work.

    1. Bob H

      There was also GPL GPU, which came out last year but seems stagnant.

  8. Nigel 11


    I think that from a software perspective, the problem isn't patented hardware. It's secret hardware. A lot of what's in current GPUs is undocumented, treated as a commercial secret or disclosed only under a non-disclosure agreement, meaning you can't write open software to use it unless you can reverse-engineer its function.

    Why do GPU manufacturers keep their chip internals secret? One explanation is that since they can't patent the techniques they are using (because of prior art), they fall back on secrecy. Another is that they *know* that they are violating someone else's IP and don't want that someone else to find out! And of course there are closed-ecosystem cartel / monopoly / NSA-backdoor / conspiracy theories too.
