Re: Compatibility?
> LibreSOC didn’t get their stuff into RISCV because they don’t know anything other than how to apply for taxpayer grants. Blaming the unfair submission process was a lot easier than admitting that the design was flawed.
As far as I know, they were looking for the process to add or request the addition of VPU and GPU instructions to the RISC-V architecture. But gaining access to read that process required signing an NDA! And that is just fricking weird: imagine if IANA required an NDA to be signed before you could even read the process that might allow you to ask to be allocated a port, e.g. TCP port 25 (SMTP) or TCP port 23 (telnet) or ... (For about 30 years the process for the entire Internet was to contact Jon. Read RFC 2441 and RFC 2468 if you have no idea who Jon Postel was.)
> In any case that was many years ago and that project has predictably not delivered a GPU.
So the planned request for new PowerISA instructions is just for fun? https://libre-soc.org/openpower/sv/svp64/
If you read their Non-Recurring Engineering (NRE) ballpark estimates for producing a 22nm chip, they plan to target multiple markets, some of which may not even require a GPU.
But they do still need to develop the 3D GPU and VPU Extensions and actually submit the request to the OpenPOWER Foundation for inclusion in PowerISA. Part of that will be SVP64 for the 16 instructions.
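For anyone not familiar with it, SVP64 is not a pile of new vector opcodes; it is a 64-bit prefix that makes the CPU loop an existing scalar instruction over consecutive registers. Here is a minimal conceptual sketch in Python (my own illustration, ignoring predication, element widths and REMAP; the function names and the simple register-stepping are assumptions, not the Libre-SOC reference model):

    # Toy model of the Simple-V / SVP64 idea: a 64-bit prefix turns one
    # scalar instruction into a hardware-managed loop over registers.
    # Illustration only -- the real spec adds predication, element widths,
    # REMAP schedules, etc., all ignored here.

    def scalar_add(regs, rt, ra, rb):
        # plain scalar PowerISA-style add: RT <- RA + RB
        regs[rt] = regs[ra] + regs[rb]

    def svp64_loop(scalar_op, regs, rt, ra, rb, vl):
        # the "prefix": repeat the unmodified scalar op VL times,
        # stepping each register number by one per element
        for i in range(vl):
            scalar_op(regs, rt + i, ra + i, rb + i)

    regs = [0] * 128                    # SVP64 extends the register file to 128
    regs[32:36] = [1, 2, 3, 4]          # RA .. RA+3
    regs[64:68] = [10, 20, 30, 40]      # RB .. RB+3
    svp64_loop(scalar_add, regs, rt=96, ra=32, rb=64, vl=4)
    print(regs[96:100])                 # [11, 22, 33, 44]

The whole point is that the scalar ISA stays unchanged and the looping lives in the prefix, which is what they would be asking the OpenPOWER Foundation to standardise.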
> Last I heard they decided to not deliver a SOC to go along with that…
So are you saying that the Libre-SOC prototype tape-out of IO and SRAM cells at 180nm was just done for giggles?
Can you give a reference to where you are getting this bad information, AC?
If the LibreSOC people were buying a bunch of off-the-shelf closed source IP and slapping blobs everywhere, they could have had a functioning chip out the door 18 months after announcing it, but fully open source with no blobs is not easy, and adding innovations on top of that is definitely not a short, fast, easy path. E.g. even HDMI is a problem (the PHY includes HDCP, which requires closed source firmware), so DVI is an option, but that limits you to a maximum resolution of 2560 x 1600. Just identifying all the potential problems takes time. If you look into what these people are trying to do, it is amazing.
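To put a rough number on that DVI limit (back-of-the-envelope only; the 165 MHz per-link TMDS clock is from the DVI spec, and the pixel-clock figures are approximate reduced-blanking values I am assuming):

    # Why dual-link DVI tops out around 2560x1600 @ 60 Hz.
    # Assumption: 165 MHz max pixel clock per DVI link, two links when dual-link.
    DVI_LINK_CLOCK_MHZ = 165
    DUAL_LINK_CAPACITY_MHZ = 2 * DVI_LINK_CLOCK_MHZ        # 330 MHz

    # Approximate reduced-blanking pixel clocks (MHz)
    PIXCLK_2560x1600_60 = 268.5    # under 330 -> fits on dual-link DVI
    PIXCLK_3840x2160_60 = 533.0    # roughly; far over 330 -> needs HDMI/DisplayPort

    print(PIXCLK_2560x1600_60 <= DUAL_LINK_CAPACITY_MHZ)   # True
    print(PIXCLK_3840x2160_60 <= DUAL_LINK_CAPACITY_MHZ)   # False

So DVI gets them a blob-free display output, but anything past 2560 x 1600 means facing the HDMI/HDCP problem again.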
And the big deal about the end product is that you could spin your own silicon and do a full security audit of everything once they are finished. You can't do that for an Intel or AMD CPU, both of which have encrypted blobs that can never be audited.