Sandra Rivera’s next mission: Do for FPGAs what she did for Intel's Xeon

Sandra Rivera, a longtime Intel veteran and chief executive of the chipmaker's FPGA business, reprised the Altera brand during a webcast Thursday in which she shared her vision for the newly spun-off company. Speaking to the press ahead of the event, Rivera painted a bright future for Altera's FPGA business, predicting a …

  1. elsergiovolador Silver badge

    Worry

    Do you mean make it perform poorly, generate excessive heat, and cost so much that you need to get a mortgage to buy it?

    Brilliant idea!

    When the news broke that Intel was taking over Altera, gasps were heard across the globe and people were praying Intel would leave it alone.

    So what does it mean? Should people start rethinking their projects and switch to another brand?

  2. bazza Silver badge

    Dead End

    An addressable market of $55billion? Pull the other one, it's got bells on. Xilinx were pulling in revenues of just over $3billion until they were bought by AMD, and I doubt Altera under Intel's stewardship has reversed their trailing market position. I'd be stunned if between them they were pulling in more than $7billion revenue.

    The reason why there's an inventory correction going on is, I think, that a certain amount of AI Kool-Aid was drunk concerning FPGA's role in tech's latest bubble.

    One really hard question both Xilinx and Altera have to face is: just how big is the market, really? Taping out a new part these days is a very expensive business. Getting a large, complex part into production on the best silicon process costs several $billion these days. I don't think the FPGA market is too far from the point where the cost of production set-up exceeds the total market size. Xilinx, being part of AMD, is perhaps a bit immune in that AMD has some weight to exploit when it comes to getting time on TSMC's fabs. A newly independent Altera could really struggle. It feels to me like the whole technology is edging towards being unsustainable in the marketplace.

    We shouldn't be surprised if that happens. It's happened plenty of times before. There's many a useful / niche technology that hasn't been able to fund upgrades and has been swamped by alternative technologies that enjoy mass-market appeal. Anyone remember Fibre Channel? Serial RapidIO? Both replaced by Ethernet.

    FPGAs are troublesome, difficult, hard-to-program-for, worst-of-all-worlds devices, the kind of thing one uses only if one absolutely has to. Thing is, there simply aren't that many such roles left where they're actually necessary. CPUs are very capable, and if for some reason the performance of many CPU cores all featuring their own SIMD / vector units isn't enough, it's pretty simple to plug in a few teraflops of GPU. Even for the highly vaunted "they're good for radio" type work, FPGAs are often used simply to pipe raw signal data to / from a CPU where the hard work is done. I've seen projects go from blends of FPGA / CPU to just CPU, because the workload for which an FPGA was well suited is now a fraction of a CPU core's worth of compute. And with radio standards like 5G being engineered specifically to be readily implemented on commodity CPU hardware, the future looks bleaker, not brighter.

    At the lower end of the market, the problem is that it's actually pretty cheap to get low-spec ASICs made (if you're after millions). So even in lower-tech devices FPGAs will struggle, because if the product they're used in is successful in the mass market, it's worth ditching the FPGA and getting an ASIC made instead, and making more money. So FPGAs are useful only to product lines that are not runaway successes; that doesn't sound like the kind of product line that's going to return $billions.

    1. elsergiovolador Silver badge

      Re: Dead End

      I don't know. When it comes to high-speed, real-time (like zero-latency) computations or data transformations, CPUs are still lacking. It's FPGA, ASIC or nothing.

      The problem with FPGAs is poor software and tooling. It's an absolute nightmare to work with this stuff, and that's probably why adoption is low.

      1. Anonymous Coward

        Re: Dead End

        I think zero latency is a real advantage for, e.g., reading real-time sensors and writing to ring buffers, but for almost any kind of computation-heavy transformation, not so much. That would be better done by a CPU on the ring-buffered data.

        25+ years ago, I worked at a Japanese company that made highly customized inspection machines for the semicon industry. I programmed an FPGA to do just that, ring-buffering data from a camera, and then wrote C code to process the ring buffer data. Writing the FPGA code was truly horrible - every compile error was a crash with no diagnostics, and it felt like trying to run in a pool of molasses.
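
        Not the original inspection-machine code, but a minimal sketch of that split, with made-up sizes and names: a stub stands in for the FPGA/DMA side, which fills a ring of camera lines and publishes a write index, while the C side drains whatever has been published and does the heavy lifting.

        ```c
        /* Sketch only: the "FPGA" here is a software stub so the example is
         * self-contained. In the real machine the ring and the write index
         * would be filled over DMA / a memory-mapped interface. */
        #include <stdint.h>
        #include <stdio.h>

        #define LINE_PIXELS 2048u   /* pixels per camera line (assumed)          */
        #define RING_LINES  64u     /* ring depth; power of two keeps wrap cheap */

        static uint16_t ring[RING_LINES][LINE_PIXELS];  /* shared line buffer    */
        static volatile uint32_t wr_idx;  /* advanced by the "FPGA" (producer)   */
        static uint32_t rd_idx;           /* owned by the CPU-side consumer      */

        /* Stand-in for the FPGA/DMA engine: capture one line, then publish it. */
        static void fake_capture_line(uint16_t fill)
        {
            uint16_t *line = ring[wr_idx % RING_LINES];
            for (uint32_t i = 0; i < LINE_PIXELS; i++)
                line[i] = fill;
            wr_idx++;   /* publish last, so a half-written line is never visible */
        }

        /* CPU-side work on one line, e.g. a brightness sum a defect check might use. */
        static uint64_t process_line(const uint16_t *line)
        {
            uint64_t sum = 0;
            for (uint32_t i = 0; i < LINE_PIXELS; i++)
                sum += line[i];
            return sum;
        }

        int main(void)
        {
            for (uint16_t n = 0; n < 100; n++) {
                fake_capture_line(n);          /* producer: one new camera line   */
                while (rd_idx != wr_idx) {     /* consumer: drain whatever is new */
                    uint64_t sum = process_line(ring[rd_idx % RING_LINES]);
                    if (rd_idx % 25 == 0)
                        printf("line %u: sum=%llu\n", (unsigned)rd_idx,
                               (unsigned long long)sum);
                    rd_idx++;
                }
            }
            return 0;
        }
        ```

        Advancing the write index only after a line is complete is what lets the consumer side read without any locking.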

        That camera unit was also put together from separate parts in house, the camera sensors and lenses procured separately and placed on an in-house designed circuit board - all designed by a die-hard senior electrical engineer who often slept at the company, washing his socks there and leaving them to dry on the back of his chair. That was Takahashi-san, a product of Japan's Showa-era post-war die-hard generation - and they don't make those kinds of guys anymore!

        Without a Takahashi-san, nowadays it would probably be wiser to get the cameras+board+buffer-aggregation off the shelf, if at all possible.

    2. Conor Stewart

      Re: Dead End

      I don't think FPGAs are going anywhere, they have many advantages over CPUs. You mention multiple CPU cores being better but in most cases they just aren't. FPGAs for the most part have lower latency and since everything can be done separately and simultaneously that is a huge advantage too.

      Also, FPGAs are good for when you want a product that can be reconfigured later: you can update the hardware as well as the software.

      1. bazza Silver badge

        Re: Dead End

        Pretty sure they're not going to have $55billion's worth of advantage over CPUs.

        As for latency, there's nothing in particular about an FPGA as such that gives it an advantage. They do as well as they do largely because interfaces such as ADCs are there on chip, rather than at the end of a PCIe bus. If one put an ADC on a CPU, hot-wired into its memory system, that too would have low latency. CPUs these days also have a ton of parallelism and a higher clock rate.

        As ever, selection is a design choice in response to requirements. In 30+ years I've yet to encounter a project that definitively needed an FPGA, that definitively could not be done on a CPU. I've seen an awful lot of projects where the designers chose to use FPGAs fail, often badly.

        To give an idea, a modern CPU with hundreds of cores and something like AVX-512 available can execute 8960 32-bit floating-point computations in the time it takes an FPGA running at a slower clock rate to clock just once. Given that things like an FFT cannot be completed until the last input sample has arrived, there's a good chance a CPU with an integrated ADC would beat the FPGA with an integrated ADC.
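
        For what it's worth, the 8960 figure does work out under one illustrative set of assumptions (numbers picked here for the arithmetic, not taken from the above): a CPU with 140 AVX-512 cores, each completing two 512-bit FMAs per cycle, manages

        140 cores × (2 FMA units × 16 FP32 lanes × 2 ops per FMA) = 140 × 64 = 8960

        32-bit floating-point operations per CPU clock, and with the FPGA fabric clocked well below the CPU, the per-FPGA-clock figure would be higher still.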

    3. martinusher Silver badge

      Re: Dead End

      I'd guess the real problem is that they're so busy addressing the high end of the market (those incredibly expensive parts that the aerospace types of this world love) that they forgot that there's an even bigger market in commodity parts.

      Both Xilinx and Altera (sorry, I've always known them as that) have quite high barriers to entry, which leaves a space for upstarts like Lattice. Sure, their parts mainly serve the smaller, cheaper end, but 'small' and 'cheap' are relative; they're still more than adequate for a huge range of applications. The tool costs are hobby level, so the main barrier to entry has been the sheer investment in time and effort needed to move away from the established brands, a move that's then encouraged by the established brands fiddling with the types, costs and availability of parts that have been in use for years.

    4. F Seiler

      Re: Dead End

      In the linked The Next Platform article, the first slide included (titled "our 55+B FPGA opportunity") has a footnote explaining that, from a $10B market in 2023 and an extrapolated $13B market in 2028, they arrive at $55B *accumulated* over 5 years. So $10+B per year.
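
      As a rough sanity check (which five calendar years the slide sums isn't stated, so this is just an average-times-years approximation): a market growing from about $10B to $13B averages roughly $11.5B a year, and $11.5B × 5 ≈ $57.5B, in the same ballpark as the quoted $55B cumulative figure.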

  3. CowHorseFrog Silver badge

    How can someone who can't program, can't design and only talks hot air do anything?

    Why give credit to a gasbag instead of the real engineers?

    1. anonymous boring coward Silver badge

      Many CEOs are crap, so why not acknowledge when someone isn't?

      Your post sounds like simple misogyny.

      1. CowHorseFrog Silver badge

        Yes, it's so balanced to give all the credit to the CEO and ignore the hundreds or thousands below; nothing ever goes wrong when this unbalanced, exaggerated praise is given to one person over the masses.

        Just ask NK or Russia how well those countries are doing, where everybody pretends the "leader" is so fucking wonderful and disregards the efforts of all the little people with no names.

        What you can't accept is that this form of writing is nothing more than brainwashing and cult branding.

  4. Fruit and Nutcase Silver badge

    That Rivera Touch

    A nod to Messrs Morecambe and Wise in "That Riviera Touch".

    https://en.wikipedia.org/wiki/That_Riviera_Touch

  5. vincent himpe

    full circle (twice)

    Intel spun off their CPLD business as Altera, bought it back, and now spits it back out... That's twice...

  6. Steve Kerr

    FPGA's

    Looked around and they do seem to have their uses where creating bespoke hardware solutions would be expensive, and they're being used to prototype ASICs before final manufacturing, that sort of thing.

    Personally, I've only come across FPGA's in the MiSTer emulation arena where cores for FPGAs have been designed by people to fully emulate original computer hardware with all the original timings. I got hold of one of the boards after a long time trying and they're pretty funky and cool and most certainly people have put a lot of work in to get them as exact as possible.

    Sometimes for a solution you don't need a fast multicore generic x86 CPU; something lower powered will do, when all the requirements can be met in an FPGA core without the expense of tooling up to produce custom boards.

    1. martinusher Silver badge

      Re: FPGA's

      The main barrier to entry for hobby/small-scale work** is that a typical FPGA is a BGA with a bazillion pins. Unless you've got the capability to work with these parts, you have to use some kind of evaluation board.

      The parts themselves are quite cheap, cheap enough to use in production equipment (usually <$10). A typical commodity FPGA might have a bunch of RAM in it and enough cells to build a processor from plus enough logic to build custom peripherals for that processor.

      (**I think it was Microchip that pioneered the ultra-low-cost-of-entry route. These "hobby projects" are not only good training for engineers, but they often become products in their own right.) (I remember the Good Old Days when setting up an FPGA or firmware development project would cost serious money. It seemed like the H/W engineers working with Xilinx were spending more time fussing over libraries and licenses than actually doing design work. I mentioned Lattice before because they seemed to be putting the effort in to get their tools and parts to a wide market -- and the parts are definitely 'contenders'.)

  7. Chairman of the Bored

    Buy Altera? Not freaking likely!

    The elephant in the room here is that many companies across the world used Altera. Then Intel bought Altera... COVID happened... And you couldn't buy them at any price. Still can't.

    Intel apparently never reserved the fab capacity to fulfill their existing part commitments. Those of us who spent millions on IP wrapped around the MAX V and MAX 10 FPGAs were and remain screwed. About 12 to 18 months ago, while we were all redesigning our kit to use Xilinx in place of the Intel parts, we were waiting for word from Intel that MAX and Arria were going back into fab. But instead what we got was bumf about an "AI FPGA". WTF? Useful as an air horn on a chicken.

    Altera is (re)starting out behind the curve. Engineers and companies worldwide are still irritated at getting absolutely screwed. And having just spent loads of NRE on tooling, training, setting up supply chains, new board spins, etc. moving to Xilinx, there is no way in hell one can justify doing all this again to go back to Altera when it has no vision or direction other than AI bullshit.

    It will be a cold day in hell before I spec and buy an Intel/Altera FPGA. And I'm not alone.

    1. Yet Another Anonymous coward Silver badge

      Re: Buy Altera? Not freaking likely!

      You mean you wouldn't bet your own company on the future development of an FPGA line that was bought by Intel, ignored by Intel, spun off by Intel, then given a puff of AI fluffiness?

    2. CowHorseFrog Silver badge

      Re: Buy Altera? Not freaking likely!

      I know of many products that use Altera; for example, DJI uses them for their transmitters, and even the MiSTer project seems to be sourcing their FPGA parts without issue.
