If you wanna make your own open-source chip, just Google it. Literally. Web giant says it'll fab them for free

If you're doodling your own computer chip yet wondering if you'll ever see it physically manufactured, Google is offering to fabricate it for you for free. There are a few caveats. One is that the chip design must be public and open-source: you'll submit your work by sending in a URL to the relevant Git repo. The other is …

  1. Danny 2

    I get this is a marketing promo but it's still pretty cool. I'll forward it to younger engineers as I'd stopped being cutting edge long before 2001.

    My go-to idea is medical electronics: a chip to help process brainwaves into electrical output, say for controlling a keyboard/mouse for Locked-In Syndrome.

    1. Anonymous Coward

      Where's Lester when you really need him? This would make a brilliant El Reg Special Project, with planet-wide contribution. Heck, it could even be called Lester: Light Enabled Static Transcription by Eye Recording

      1. Bronek Kozicki

        Probably enjoying a cool pint in the Heavenly Kings Arms

    2. Anonymous Coward

      Not sure how useful it is to learn...

      There is not one line of comment remaining in this code drop. As you say: it is marketing. Send your engineers to OpenCores if they want to learn something...

  2. karlkarl Silver badge

    This is really cool. It might help kickstart more open hardware.

    I am assuming it needs to be *my* project. So I can't just get a bunch of old Zet chips fabricated so I can fuel my DOS addiction?

    Danny 2's suggestion, which could actually benefit mankind, should probably take priority over my own (MS-DOS) struggles.

    1. Danny 2

      Thanks Karl.

      I was spraffing away about nothing on a forum decades ago and I got a short reply from a woman in Switzerland who said she had LIS. I asked what that meant, and four hours later, with intensive effort and the help of her nurse, she explained.

      It seemed hellish to me, worse than death. She could only move her eyes, only communicate with her eye movements. She said the internet was her lifeline to sanity. Idiotic me said: "I'm an electronics engineer with time and money, I'll be able to develop something to help her get back some control." It took me 20 seconds to make that worthless promise; it took her hours to thank me.

      She disappeared days later. That's the thing with LIS: maintaining contact is like clinging to jetsam in the Atlantic; eventually you lose hope and let go. I don't think she ghosted me, she just sank.

      I don't have a design for a chip to help LIS patients. Maybe eye monitoring is a better way to go than brain monitoring. I just know it is worth pursuing.

      I do know there are many far smarter engineers than me at El Reg. I'd suggest this as a worthy project for 40 free open source chips.

      There is a book/movie that came out since, "The Diving Bell and the Butterfly". You might not find it sympathetic because the author was a fashionista, but it can happen to anyone. One minute you are you, then you are a sack of potatoes.

      1. Richard 12 Silver badge

        Last time I looked into it, eyetracker interfaces were the state of the art - seriously expensive, and very slow.

        They should be far cheaper now that commodity hardware can do this very well, and much faster with modern autocomplete as used by smartphone keyboards.
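        That pairing of commodity tracking with prediction is easy to sketch. Below is a minimal, hypothetical dwell-time "click" heuristic of the kind gaze UIs commonly use (the function name, radius, and sample counts are all invented for illustration): a selection fires once enough consecutive gaze samples stay inside a small radius.

```python
def dwell_select(samples, radius=1.0, dwell_count=5):
    """Fire a 'select' when dwell_count consecutive gaze samples stay
    within `radius` of the first sample of the run - a common
    dwell-click heuristic in gaze-driven UIs."""
    run_start, run_len = None, 0
    for i, (x, y) in enumerate(samples):
        if run_start is None:
            run_start, run_len = (x, y), 1
        elif (x - run_start[0]) ** 2 + (y - run_start[1]) ** 2 <= radius ** 2:
            run_len += 1
        else:
            # Gaze moved away: start a new fixation run here
            run_start, run_len = (x, y), 1
        if run_len >= dwell_count:
            return i  # sample index at which the dwell "click" fires
    return None  # no fixation long enough to count as a selection

# A saccade to (5, 5) followed by a steady fixation around it
gaze = [(0, 0), (5, 5), (5.1, 5.0), (5.0, 4.9), (4.9, 5.1), (5.1, 4.9), (5.0, 5.0)]
print(dwell_select(gaze))  # -> 5
```

        With a real tracker you would feed in timestamped samples and tune the radius to the tracker's accuracy; the prediction side is then an ordinary prefix/autocomplete lookup over the selected letters.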

        Terrifying condition.

        1. Chris G

          By now the military must have some cool eye tracking kit.

          Way back in the seventies, I was at an arms fair where a four barrel 20mm AA rig attracted my attention.

          The whole thing was slaved (yes I know) to a helmet that projected a cross hair in the sky so that you just had to follow an aircraft with the cross and the cannon would follow.

          The sales drone was telling a Middle Eastern looking gentleman that they were developing an eye tracking version so that the gunner didn't have to move his head quite so much to track a target.

          Can't remember the maker but may have been Lockheed.

        2. Lomax

          > Last time I looked into it, eyetracker interfaces were the state of the art - seriously expensive, and very slow.

          I don't know when that was, but it seems things have moved on; here's a hot-off-the-press study in Investigative Ophthalmology & Visual Science:

          Accuracy and precision of the HTC VIVE PRO eye tracking in head-restrained and head-free conditions

          Alexandra Sipatchin; Siegfried Wahl; Katharina Rifai - June 2020

          They find that this consumer (US$800) VR headset can track the wearer's eyes with an average accuracy of around 4° in a 25° radius around the center of view. That's not super precise, but should be enough to support meaningful UI interactions for people with LIS.
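          As a rough back-of-envelope check (my numbers, not the study's): if each gaze target has to subtend about twice the tracker's accuracy to be selected unambiguously, a 4° tracker in a 25°-radius field still supports a few dozen distinct targets:

```python
def max_grid_targets(accuracy_deg: float, field_radius_deg: float) -> int:
    """Rough upper bound on how many distinct on-screen targets a gaze
    UI can offer, assuming each target must span ~2x the tracker's
    accuracy so a gaze sample lands unambiguously inside it."""
    target_size = 2 * accuracy_deg          # degrees per target cell
    field_diameter = 2 * field_radius_deg   # usable field, in degrees
    per_axis = int(field_diameter // target_size)
    return per_axis * per_axis

# VIVE Pro figures from the study: ~4 deg accuracy, 25 deg radius
print(max_grid_targets(4.0, 25.0))  # -> 36 (a 6x6 grid)
```

          A 6x6 grid is enough for an alphabet keyboard plus a few control keys, especially combined with word completion.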

      2. FelixReg

        Martin King, the guy behind Tegic's T9, had an eye tracker prototype back in the '90s. Intended for people who could move an eye and little else. If I recall, it involved some LEDs and sensor diode(s) around the frame of glasses. Dirt cheap to make. The idea was to pick up the general eye direction (and blinking?) and use T9-ish logic to drive the output text.
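        The T9-ish part is straightforward to sketch. Assuming, purely for illustration, eight gaze "zones" each covering a group of letters, disambiguation is a dictionary lookup on the zone sequence:

```python
# Hypothetical mapping of coarse gaze directions to letter groups.
# Each zone covers several letters; a dictionary resolves the
# ambiguous zone sequence into candidate words, T9-style.
ZONES = {
    1: "abc", 2: "def", 3: "ghi", 4: "jkl",
    5: "mno", 6: "pqr", 7: "stu", 8: "vwxyz",
}

def word_to_zones(word):
    lookup = {ch: z for z, letters in ZONES.items() for ch in letters}
    return tuple(lookup[ch] for ch in word)

def candidates(zone_seq, dictionary):
    """All dictionary words whose letters fall in the given zone sequence."""
    return [w for w in dictionary if word_to_zones(w) == tuple(zone_seq)]

words = ["good", "home", "help", "yes"]
# "good" and "home" collide on the same zone sequence (3, 5, 5, 2),
# so the UI would offer both and let the user pick.
print(candidates(word_to_zones("good"), words))  # -> ['good', 'home']
```

        The appeal for LIS use is that the tracker only needs to resolve eight directions reliably, not individual keys.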

        1. damiandixon

          If you use low-powered IR LEDs and an IR camera, eye tracking becomes a lot easier, as the iris shows up better. You just need to make sure the eye does not heat up too much. This was the basis of an eye-tracking system used in the early '90s in the R&D department I worked in at the time. It worked quite well, except for the issue of the eye feeling slightly warmer.
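          The signal-processing side of that approach can be very simple. A toy sketch of the idea (synthetic frame, hypothetical threshold): under IR illumination the pupil is distinctly dark, so its position is roughly the centroid of the below-threshold pixels.

```python
def pupil_centroid(frame, threshold=50):
    """Estimate pupil position as the centroid of dark pixels in a
    grayscale frame (IR illumination makes the pupil distinctly dark).
    frame: list of rows of 0-255 intensities."""
    xs, ys, n = 0, 0, 0
    for y, row in enumerate(frame):
        for x, value in enumerate(row):
            if value < threshold:
                xs, ys, n = xs + x, ys + y, n + 1
    if n == 0:
        return None  # no dark region found (e.g. a blink)
    return (xs / n, ys / n)

# Toy 5x5 "IR frame": bright background (200) with a dark 2x2 pupil
frame = [[200] * 5 for _ in range(5)]
for y in (1, 2):
    for x in (2, 3):
        frame[y][x] = 10
print(pupil_centroid(frame))  # -> (2.5, 1.5)
```

          A real system would add glint detection and filtering, but the core measurement really is this cheap.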

  3. Anonymous Coward

    It's a Process Design Kit.

    "which is described as a grab bag of design rules, logic and analog models and cells, specifications, and other data to turn your RTL files into actual working patterns of semiconductors,"

    Not really. It's the process-level stuff like design rules, transistor simulation models, etc., yes.

    The PDK is not strictly needed for digital design. You will need some physical library IP (digital cells, I/O, maybe some memory) along with all their views/models, plus some router tech files, RC extractor models, and DRC/LVS decks to be able to take your RTL to GDSII in 130nm (0.13μm in old money). And don't forget your test.

  4. short

    VHDL's still a verbose horror, then.

    After knocking out VHDL for FPGAs over the decades, I thought I'd have a look at how the grownups do it, in the A2I files.

    Sweet baby jesus, it's just as awful, but on a much bigger scale.

    How can VHDL possibly still be the best way of putting thoughts into silicon? They've got some nicer support libraries than I do, but this kind of crap makes me wonder if it's an intermediate representation, or if there really is no better way?

    latch_ob_buf_status_avail : tri_rlmreg_p
      generic map (width => ob_buf_status_avail_q'length, init => 0, expand_type => expand_type)
      port map (nclk        => nclk,
                act         => ob_rd_logic_act,
                forcee      => func_sl_force,
                d_mode      => d_mode_dc,
                delay_lclkr => delay_lclkr_dc,
                mpw1_b      => mpw1_dc_b,
                mpw2_b      => mpw2_dc_b,
                thold_b     => func_sl_thold_0_b,
                sg          => sg_0,
                vd          => vdd,
                gd          => gnd,
                scin        => siv(ob_buf_status_avail_offset to ob_buf_status_avail_offset + ob_buf_status_avail_q'length-1),
                scout       => sov(ob_buf_status_avail_offset to ob_buf_status_avail_offset + ob_buf_status_avail_q'length-1),
                din         => ob_buf_status_avail_d,
                dout        => ob_buf_status_avail_q);

    ob0_buf0_status_val <= ob_buf_status_avail_q(0)  and not ex6_ob0_buf0_flushed;
    ob0_buf1_status_val <= ob_buf_status_avail_q(1)  and not ex6_ob0_buf1_flushed;
    ob0_buf2_status_val <= ob_buf_status_avail_q(2)  and not ex6_ob0_buf2_flushed;
    ob0_buf3_status_val <= ob_buf_status_avail_q(3)  and not ex6_ob0_buf3_flushed;
    ob1_buf0_status_val <= ob_buf_status_avail_q(4)  and not ex6_ob1_buf0_flushed;
    ob1_buf1_status_val <= ob_buf_status_avail_q(5)  and not ex6_ob1_buf1_flushed;
    ob1_buf2_status_val <= ob_buf_status_avail_q(6)  and not ex6_ob1_buf2_flushed;
    ob1_buf3_status_val <= ob_buf_status_avail_q(7)  and not ex6_ob1_buf3_flushed;
    ob2_buf0_status_val <= ob_buf_status_avail_q(8)  and not ex6_ob2_buf0_flushed;
    ob2_buf1_status_val <= ob_buf_status_avail_q(9)  and not ex6_ob2_buf1_flushed;
    ob2_buf2_status_val <= ob_buf_status_avail_q(10) and not ex6_ob2_buf2_flushed;
    ob2_buf3_status_val <= ob_buf_status_avail_q(11) and not ex6_ob2_buf3_flushed;
    ob3_buf0_status_val <= ob_buf_status_avail_q(12) and not ex6_ob3_buf0_flushed;
    ob3_buf1_status_val <= ob_buf_status_avail_q(13) and not ex6_ob3_buf1_flushed;
    ob3_buf2_status_val <= ob_buf_status_avail_q(14) and not ex6_ob3_buf2_flushed;
    ob3_buf3_status_val <= ob_buf_status_avail_q(15) and not ex6_ob3_buf3_flushed;

    1. Jusme

      Re: VHDL's still a verbose horror, then.

      I much prefer schematics, though in the hardware community it seems that's a bit like saying you prefer to program in LOGO.

      HDLs are great for the core function blocks (state machines, counters, logic, etc.), but for me nothing beats a proper diagram* showing how it all connects together. It also keeps your mind on the structure and physical implementation - doing it all in HDL can lead to "writing code" syndrome, where one forgets it's not a programming language.

      * Not the "tangle of gates" kind of schematic spat out by synthesis tools, or the "component library" flavour with pages of boxes with no interconnecting lines, just labels on the pins. A good schematic should be a work of art, but nobody's got time for that these days, especially at modern hardware complexities.

      1. Natalie Gritpants Jr

        Re: VHDL's still a verbose horror, then.

        Ever tried to diff two versions of a schematic so you can review a bug fix by someone?

        Ever tried to figure out why a schematic design has a bug by looking at a waveform diagram of a simulation where every net is called net000123?

        Ever tried to do a state machine in a schematic?

      2. Anonymous Coward

        Re: VHDL's still a verbose horror, then.

        Schematics, aka the logical netlist, have their place. But schematic entry on a modern, large, complex ASIC is no longer possible.

        Also, you can only run gate-level (or, extremely small size permitting, transistor-level) sims. No behavioural.

        1. Mage Silver badge

          Re: VHDL's still a verbose horror, then.

          Certainly for digital / logic, the HDL is the only way.

          Analogue, which this seems partially aimed at, does usually need schematics and spice models that also add parasitic inductances and capacitances.

          1. . 3

            Re: VHDL's still a verbose horror, then.

            Great to see Chisel getting a mention. It's the way forward for open source hardware; a functional programming language for building up digital logic. Easy to grok, easy to maintain, 100% provable.

            BTW, the Google chip thing seems to top out at 40 IOs, so practically zero chance of building a fancy CPU unless it needs next to no memory or next to no connection to the outside world.

  5. Black Betty

    I would like to see a flat architecture 32 or 64 bit design.

    Zero security features. Just flat out raw compute power.

    1. Mage Silver badge

      Re: I would like to see a flat architecture 32 or 64 bit design.

      Test it with a FPGA evaluation board. No need for this chip program.

      1. Old69

        Re: I would like to see a flat architecture 32 or 64 bit design.

        "Test it with a FPGA evaluation board."

        What gate clocking speed is achieved by 130nm? My Xilinx FPGA designs in 1986 ran with an external 100ns clock - the FPGA logic gates clocked somewhat faster. The 1800 gates required manual routing to be able to cram my design on one.

        Given the number of gates on modern Xilinx FPGA chips, plus module libraries, it feels like one could handle quite complex designs. Baking in silicon as a first evaluation seems very retro and inefficient.

        1. Anonymous Coward

          Re: I would like to see a flat architecture 32 or 64 bit design.

          130nm is seen as a low-cost option, with analogue variants such as BCD easily available (as well as flash). Think: ultra low power, ultra mass market.

          Looking at speed: back in the day (15 years ago), when 130nm was cutting edge, apps processors were coming out with headline clock speeds around the 1.2-1.4GHz mark (<1ns period).

          I'm sure these days, using a modern flow and techniques, those could be revisited to easily hit 1.5-1.6GHz. But in general you can see low-power microcontrollers running up to 200MHz on 130nm.

          1. Anonymous Coward

            Re: I would like to see a flat architecture 32 or 64 bit design.

            200MHz and hot enough to roast chestnuts!

  6. This post has been deleted by its author

  7. Pascal Monett Silver badge
    Thumb Up

    I think this is a cool idea

    Good on Google for stepping up and proposing something that might turn out to benefit people in the long run. This proposal has the potential to open the doors to many projects that would like to see the light of day, but are hampered because existing chips are too power-hungry or too expensive to use.

    In that sense, it's difficult to judge Google. On the one hand, they're data thieves, making bank and literally printing money with private data they take from you; but on the other hand, Google Maps is useful, Google Translate is astonishingly powerful (even if not always entirely accurate), and now, this.

    Congratulations, Google. You've just earned yourself a Suspension-Of-Criticism voucher valid until Monday.

  8. Anonymous Coward

    If Google are involved...

    ... it can only be for their benefit.

    Their job is to slurp your data - from pictures of your house to your personal Wi-Fi data, your shopping habits, and personal information - and then keep it as their secret and sell it for profit.

    Google, along with Facebook, are the retail arm of the NSA.

    Keep feeding them at your cost.

  9. This post has been deleted by its author
