Hate Verilog? Detest VHDL? You're not the only one. Xilinx rolls out easier-to-use free FPGA programming tools after developer outcry

If you hate writing Verilog, VHDL, and other hardware design languages, used to craft computer chips and configure FPGAs, you're far from the only one. This Register vulture wrote Verilog for gate arrays in the early 2000s, and how did that work out? He ran screaming into what was, at the time, the more exhilarating world of journalism …

  1. Anonymous Coward
    Anonymous Coward

    Well - that "obvious" aid for FPGAs has only taken nearly 34 years. I always hand-crafted my Xilinx designs in 1985-88 at gate level. That gave maximum density and control over the interconnect routes for time-critical paths. Admittedly - there were only about 1800 gates to play with.

  2. DCFusor

    Easy to use but inefficient software as a boost to hardware sales is nothing new.

    It might be interesting to see how inefficient it is, how much more hardware it needs to get all but a couple of optimized jobs done, and how much easier it is to use for various types of users.

    If it's good, it'll help with the basic tradeoff - in security we call that one usability vs security - usually it's nearly a perfect trade between one and the other. The real goal is to increase the total (for which I've overloaded "dynamic range") - so, more usability with equal or better security, or more security without losing usability.

    This is an analogous case - if it truly increases the "dynamic range" - it's great. Else, remember, code is an NRE (non-recurring engineering cost) - you pay once. Hardware is pay per sale.

    1. Jim Mitchell

      Yes, this would seem to be a trade-off: less skilled people can design FPGA content, but get less good results. Of course, "less good" sure beats "none" on the results front.

      In general, C/C++ is a sequential idiom, while hardware chips can be highly parallel.
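
      To make that concrete, here's a sketch in plain C++ (no vendor tooling assumed, and the function names are illustrative): the first loop has a loop-carried dependency, so a C-to-gates tool can't do much better than one step at a time, while the second has independent iterations that could be unrolled into as many parallel multipliers as the fabric will hold.

      // Sketch only: plain C++, nothing vendor-specific assumed.
      #include <cstddef>

      // Loop-carried dependency: acc needs the previous iteration's result,
      // so hardware ends up processing elements one after another.
      float running_sum(const float *x, std::size_t n) {
          float acc = 0.0f;
          for (std::size_t i = 0; i < n; ++i)
              acc += x[i];      // each step waits on the last
          return acc;
      }

      // Independent iterations: every element can be computed at once,
      // so a synthesis tool could unroll this into n parallel multipliers.
      void scale(const float *x, float *y, std::size_t n, float k) {
          for (std::size_t i = 0; i < n; ++i)
              y[i] = k * x[i];  // no dependency between iterations
      }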

      1. DCFusor

        Yes, you have to change your thinking fairly seriously between CPU programming and FPGA (or for that matter any hardware).

        So far, I've not seen great examples of automatic parallelization translators. It's a hard problem to solve in general, and if they've managed it, it has applications far beyond mere FPGA tooling.

        Else why do so many programs, even on this Linux box, just use up 100% of one core? It's not like no one has the source, and if such a tool existed - even between relatively similar domains - they'd all scale to more cores, right?

        Time to market might matter more for some things than for others. Until one recent fiasco, it was considered a fatal flaw to need to reboot something during a fighter airplane takeoff.

        I did both hardware and software design and realization for boxes that didn't have a reset button - that was a service call at minimum. Lots of 9's. Not all problems are just the most common web-monkey stuff. Forget "ship it and we'll promise to update later" for some things. Especially the ones that matter, vs the froth intended to collect VC money.

        Where life is on the line - the medical biz - you'd hope that somewhat more care was taken than for some social media "killer app" (could be a vain hope of course).

        1. Pascal Monett Silver badge

          I would think that it is still very much a fatal flaw to need to reboot something during a fighter airplane takeoff.

        2. nerdbert

          Where life is on the line - the medical biz - you'd hope that somewhat more care was taken than for some social media "killer app" (could be a vain hope of course).

          Not so vain. I've got friends in the medical device biz, and coworkers who used to work at the biggest names in the biz, and I can say that the amount of care they take is both amazing and frustrating: from architectural chip sims where changing even the slightest thing requires approval from 3 levels of management, to simulations where even 9-sigma design margins have to pass, to the years and years of measuring lead wire designs in various configurations, aging conditions, etc that are required.

          We'll turn a new chip in 9 months in a new technology, get something working, and release it a couple of months after that.

          It'll take 7+ years to make the smallest change to a pacemaker, much longer if there's any change to something like the silicon inside it. And you know, I think I like it that way.

          But there are times that it's frustrating to see how conservative the FDA is. There are biohackers who are taking old Medtronic insulin pumps, rooting them, and connecting them to blood sugar monitor watches to make an "artificial pancreas" in which the monitoring and correction of blood sugar is automatic and far superior to the way things are done now. (Not controlling blood sugar well leads to damage that accumulates over time; the closed-loop feedback minimizes the time blood sugar spends too high, and eliminates the risk of it going too low and the person going into shock or a coma, which is even more dangerous.) In fact, one of the board members of the JDRF, the main Type I diabetes research foundation, does just that and is extremely pleased with the results.

          But Medtronic and the other manufacturers are moving very, very slowly down that path, because of the FDA, and because of concerns that they would be "practicing medicine" if they did more than provide information, and would be far more liable if something went wrong. The engineers would like to get a prototype system out there for those willing to take a chance, but the regulatory system and management are pretty firmly against it. Management is mostly doctors, and the regulators want well-structured test groups, double-blind studies, constant monitoring, human subjects regulations followed, etc. In other words, it'll be a long, long time before the public gets something through them, rather than through the hacking subculture now going on in the diabetes community.

          1. Arthur the cat Silver badge

            It'll take 7+ years to make the smallest change to a pacemaker, much longer if there's any change to something like the silicon inside it. And you know, I think I like it that way.

            As someone on his second pacemaker, I'll take that over "Fail Fast, Fail Often" any day.

    2. Brewster's Angle Grinder Silver badge

      Like it or not, in a commercial environment, time to market and ease of coding trump the quality of code. The history of software development demonstrates that.

      1. Aspen Logic

        Time to market matters, but if you cannot debug your design, then how quickly you got to gates does not matter one iota. Software development tools permit straightforward debugging techniques. Not so with logic designs, and especially not ones generated from C/C++, which actually makes the problem 10-100 times worse!

        1. Brewster's Angle Grinder Silver badge

          "...if you can not debug your design then how quickly you got to gates does not matter one iota."

          In my book, "time to market" includes ironing out the biggest bugs. And the kids will be able to use these tools to do that. Yes, they'll miss or dismiss the more subtle bugs. But the management will be happy. And the more talented ones will learn the lower-level stuff.

      2. Anonymous Coward
        Anonymous Coward

        "The history of software development demonstrates that."

        ASIC hardware development is not software development. Thinking it is can lead to wasted $Millions.

  3. Crazy Operations Guy

    Might be a long time before I use it

    Nearly all my work ends up as an ASIC, so I need a language that I can send to the Wafer Wizards, or convert to something they understand. Plus I need compatibility with FPGAs made by someone other than Xilinx.

    But I might not move to a new language anyway. I've been cranking out VHDL for 20 years now, and started with Ada a few years before that. As bad as it is, it's still the devil I know and can work with.

    I've also seen far too many fiascoes caused by companies declaring that they have created the new 'end-all and be-all' of languages and, at best, ending up with yet another language programmers have to deal with in addition to the stuff they already need to work with.

    1. Anonymous Coward
      Anonymous Coward

      Re: Might be a long time before I use it

      Vivado HLS - which I assume is the language in question - has been around for several years now. It's very good at getting area efficiency close to that of RTL code.

      However... it's a C (or C++) based language that, to be efficient, requires a coding style rather different from CPU code.

      And Intel's FPGA HLS (which is C++ based, mostly, and without the STL) also demands that you write the code in hardware-description patterns. The Intel HLS emphasizes throughput over area efficiency.

      As things stand, HLS designs are unlikely to be portable between the two vendors.
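
      For illustration, here's a rough sketch of what that coding style tends to look like. The pragma spellings are Vivado-flavoured and from memory, and NTAPS and the function name are assumptions, so treat this as a sketch rather than a tested design. The point is that it still builds and runs as ordinary C (a normal compiler ignores the pragmas), but it is written to tell the synthesiser where the parallelism is:

      // Hedged sketch of the HLS idiom, not a verified design.
      #define NTAPS 8

      void fir(const int in[], int out[], int n, const int coeff[NTAPS]) {
          static int shift_reg[NTAPS];
          // Split the shift register into individual registers so every
          // tap can be read in the same clock cycle.
      #pragma HLS ARRAY_PARTITION variable=shift_reg complete

          for (int i = 0; i < n; ++i) {
              // Ask the tool to accept a new sample every clock cycle.
      #pragma HLS PIPELINE II=1
              int acc = 0;
              // Typically fully unrolled under the pipeline: the NTAPS
              // multiplies happen in parallel rather than in sequence.
              for (int t = NTAPS - 1; t > 0; --t) {
                  shift_reg[t] = shift_reg[t - 1];
                  acc += shift_reg[t] * coeff[t];
              }
              shift_reg[0] = in[i];
              acc += shift_reg[0] * coeff[0];
              out[i] = acc;
          }
      }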

      I have heard an RTL engineer say that the main use of HLS is rapid prototyping of hardware designs; having settled the hardware architecture, it is still best to write the final code in RTL.

      1. Aspen Logic
        Alert

        Re: Might be a long time before I use it

        My experience so far working with HLS and P4 is that the Xilinx tools crank out the code for the result very quickly, but it is close to impossible to debug what is going wrong without months of guesswork. If your *entire* design is made with one of these languages, then okay. Hook up custom code around that core, and you are screwed if things do not work.

    2. Tom 7

      Re: Might be a long time before I use it

      There are some things you can simplify.

      There are some things you can automate.

      There are some things that you can't really dumb down to try and make them easier to use.

  4. razorfishsl

    Ahhh yes....... we all remember the program for the Commodore PET called "The Last One"

    The ONLY program you will EVER need to buy.

  5. Philius
    Happy

    SYCL

    https://en.wikipedia.org/wiki/SYCL

  6. Mage Silver badge

    "Programming"

    Mostly with an FPGA you are simulating hardware. If there is volume it will be automatically translated to an ASIC, which optionally might have a standard CPU core.

    Verilog looks a little like C, except it's not a programming language, it's a HARDWARE description language. More obvious with VHDL. The design becomes a configuration. It's not a program executed at runtime.

    So a C-like actual programming language is only going to be appropriate for one part of the FPGA design, and it has to be translated to a hardware configuration. This is a bad idea.

    Far better is designing a custom CPU-like "machine" that executes a separate runtime program, written by a programmer. I think this tool is for people not skilled enough to design FPGAs. FPGA design isn't programming, and pretending it is programming is daft. It's hardware design using a specification language instead of a schematic.

    I've used FPGA DSP, with Scilab to develop the DSP filter tables, which are imported and FPGA-simulated by a Xilinx tool. Then at run time the design and coefficients are loaded from Flash to the FPGA. If replaced by an ASIC (massively lower power), the filters would be defined in silicon and the coefficients would still be loaded from Flash to RAM, to allow updates or changes to the DSP performance / filter etc.
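
    A minimal C++ model of that split (the names, tap count, and Q15 scaling are illustrative assumptions, not the actual design described above): the filter structure is fixed, as it would be in silicon, while the coefficient table stays writable, so a new table can be copied out of Flash at run time without touching the datapath.

    // Sketch: fixed datapath, run-time-loadable coefficients.
    #include <array>
    #include <cstddef>
    #include <cstdint>

    constexpr std::size_t kTaps = 16;            // assumed tap count

    struct FixedFir {
        std::array<int32_t, kTaps> coeff{};      // "RAM": reloadable
        std::array<int32_t, kTaps> delay{};      // "silicon": fixed structure

        // Reload coefficients, e.g. after copying a new table from Flash.
        void load(const std::array<int32_t, kTaps>& c) { coeff = c; }

        int32_t step(int32_t sample) {
            // Shift the delay line and form the dot product, the part
            // an ASIC would freeze into gates.
            int64_t acc = 0;
            for (std::size_t t = kTaps - 1; t > 0; --t) {
                delay[t] = delay[t - 1];
                acc += int64_t(delay[t]) * coeff[t];
            }
            delay[0] = sample;
            acc += int64_t(delay[0]) * coeff[0];
            return int32_t(acc >> 15);           // assumed Q15 scaling
        }
    };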

    1. nerdbert

      Re: "Programming"

      Verilog looks a little like C, except it's not a programming language, it's a HARDWARE description language. More obvious with VHDL.

      I think that concept is the hardest to communicate to folks coming in from a traditional software framework.

      For an HDL, you have to get the concept that, unless you specify otherwise, everything runs in parallel, which is hard to communicate to a non-hardware person. And in VHDL in particular, you can blow out the size of your design if you're not careful about how you specify your states, and wind up generating far more hardware than you expect. That was one of my bigger complaints about VHDL versus Verilog: in Verilog you are much closer to the hardware, and it tends to be a bit harder to be stupid about what you're doing in terms of hardware utilization. At the system level Verilog is harder to deal with, and VHDL is a better choice, but neither of them fits the traditional programming paradigm at all.
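
      A hedged C++ sketch of that mental model (not any vendor's simulator, just an illustration): in an HDL, every register updates simultaneously on the clock edge, so simulating it in sequential C++ takes two phases - compute all the next values from the old state, then commit them together. This is roughly the behaviour Verilog's non-blocking assignments give you.

      #include <cstdio>

      struct State { int a, b; };

      // One clock tick: both "registers" read the pre-edge values.
      State tick(State s) {
          State next;
          next.a = s.b;   // like Verilog: a <= b;
          next.b = s.a;   // and:          b <= a;  (a swap, no temp needed)
          return next;    // commit both at once
      }

      int main() {
          State s{1, 2};
          for (int cycle = 0; cycle < 4; ++cycle) {
              s = tick(s);
              std::printf("cycle %d: a=%d b=%d\n", cycle, s.a, s.b);
          }
      }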

  7. Anonymous Coward
    Anonymous Coward

    JAFC2HDLC

    Just Another F***ing C to HDL Compiler
