Intel: Joule's burned, Edison switched off, and Galileo – Galileo is no more

Intel has discontinued three of its offerings for the Internet of Things and embedded device markets. The chipmaker said in a series of low-key product updates that it would be killing off the Edison [PDF], Galileo [PDF] and Joule [PDF] compute modules and boards over the second half of the year. The notices mark an ignoble …

  1. steamnut

    Another botched call by Intel

    Intel had more than one chance to buy the king of the IoT - ARM. But they thought that they could do better by clinging to yet more derivatives of the quirky 80x86 chipsets. Ever since the 8008, those HL registers were going to be a problem....

    Now that ARM is part of the SoftBank Group, it is too late. Intel might also have looked at Atmel at the very bottom end (Arduino territory), but they too are now in the clutches of a bigger company - Microchip.

    And even the mighty Microsoft, part of that "Wintel" combination, is looking more closely at ARM processors than ever before.

    As the PC market shrinks year on year, server farms are the only big game in town. If AMD really do manage to deliver some powerful and lower-cost server chips, and the likes of Dell and HP start buying them, then Intel will have to explain to its shareholders where it went wrong.

    1. Lyle Dietz

      Where it went wrong?

      I can answer that with one word: hubris.

      But then, I'm not the one that's making the mistakes, so I don't have to put a pleasant sheen on it.

      1. BillG
        FAIL

        Re: Where it went wrong?

        No, I can tell you where it went wrong. The documentation for these products sucked. Absolutely sucked. Distributors initially were selling the Edison and the Galileo so fast they couldn't keep them on the shelves; then people tried to return them because, beyond the demo, they could not write firmware due to the lousy docs. You couldn't even get the SPI to interface to an external EEPROM.
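
        For flavour, the kind of thing people were trying looked roughly like this - a minimal sketch, assuming the board exposes the bus through the standard Linux spidev interface; the device path and the 0x9F read-ID command are illustrative, not from any Intel doc:

        /* Read a JEDEC ID over SPI via Linux spidev. Build: gcc spi_id.c */
        #include <fcntl.h>
        #include <linux/spi/spidev.h>
        #include <stdint.h>
        #include <stdio.h>
        #include <string.h>
        #include <sys/ioctl.h>
        #include <unistd.h>

        int main(void)
        {
            int fd = open("/dev/spidev1.0", O_RDWR);   /* path is board-specific */
            if (fd < 0) { perror("open"); return 1; }

            uint8_t mode = SPI_MODE_0;
            uint32_t speed = 1000000;                  /* 1 MHz, safe for most parts */
            ioctl(fd, SPI_IOC_WR_MODE, &mode);
            ioctl(fd, SPI_IOC_WR_MAX_SPEED_HZ, &speed);

            uint8_t tx[4] = { 0x9F, 0, 0, 0 };         /* RDID command + dummy bytes */
            uint8_t rx[4] = { 0 };
            struct spi_ioc_transfer xfer;
            memset(&xfer, 0, sizeof xfer);
            xfer.tx_buf = (unsigned long)tx;
            xfer.rx_buf = (unsigned long)rx;
            xfer.len = sizeof tx;
            xfer.speed_hz = speed;
            xfer.bits_per_word = 8;

            if (ioctl(fd, SPI_IOC_MESSAGE(1), &xfer) < 1) { perror("spi"); return 1; }
            printf("ID: %02x %02x %02x\n", rx[1], rx[2], rx[3]);
            close(fd);
            return 0;
        }

        Ten lines of boilerplate on any decently documented part; on these boards you couldn't even find out which spidev node, mode or pin mux to use.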

        The quality of these devices was excellent. While these products weren't the cheapest, you got a ton of power in a small footprint. But Intel also made some stupid mistakes. Like on the Galileo Rev 1, you had to power the board first, then connect the USB connector to your PC. If you did it the other way around, you blew the board.

        I had a direct line to Intel last year & I was working on the Intel Quark. The lawyers and the accountants have taken over the company. Getting proper phone or email support absolutely required knowing the design engineer's personal email address and conversing privately, and even then you only got part of the story.

        IMO, Intel's reputation in Embedded is done. First they discontinued all their embedded microcontrollers in 1993 and screwed over lots of customers - some I knew went out of business. Then they tried XScale, and despite spending $35M in marketing they could not penetrate Embedded because too many people remembered getting screwed in 1993. A few other failures, then the Atom mess, now these very, very high profile devices. No word yet on whether the Quark will be discontinued.

        I don't see how a responsible engineer could trust an Intel sourcing decision.

    2. Anonymous Coward
      Anonymous Coward

      Re: Another botched call by Intel

      No one actually cares what the instruction set is. They care that it's cheap as shit.

      ARM is cheap as shit so that's what people are using.

      1. Anonymous Coward
        Anonymous Coward

        Re: Another botched call by Intel

        Exactly! Had Intel bought ARM, we'd probably be using MIPS or PowerPC in our phones, and getting pretty much identical performance. There's nothing magical about the ARM ISA that makes it better - it has just had way way way more resources put into it than the competition. If ARM was taken off the board, those resources would have gone elsewhere.

        Apple chose ARM for the first iPod, and first iPhone probably because they were one of ARM's founders and had a little experience with it having used it for the Newton. But had Intel owned it and licensing wasn't as attractive, they'd have chosen something else.

        By the time ARM was big and it was obvious x86 was not going to go anywhere in the Android world, the FTC would have never allowed Intel to buy ARM. There was no way Intel could have bought it instead of Softbank. They would have had to buy it a decade ago when it was still an asterisk in global CPU revenue.

        1. bazza Silver badge

          Re: Another botched call by Intel

          Exactly! Had Intel bought ARM

          You're right about Intel not being allowed to buy ARM. The competition authorities on both sides of the pond would have had to take a look at the deal. It would have been very difficult to conclude that it was "in the public interest". Same for Apple or Google.

          SoftBank, being neither a phone manufacturer nor a chip developer, was a neutral party.

          There's nothing magical about the ARM ISA that makes it better - it has just had way way way more resources put into it than the competition.

          Actually, ARM's ISA is pretty good. The code density is much better than x86; you can get more done per kilobyte of program. This maps into less RAM, fewer RAM accesses, less power, etc.

          The more important part is that you don't need millions of transistors for an ARM core. I think that even a 64-bit core (ignoring the cache, etc.) still needs only about 48,000 transistors to implement the ISA and get decent performance. This compares very well with x86, which typically needs millions of transistors. Transistors take power, a bad thing in a battery-powered device.

          The low transistor count goes all the way back to the very beginning. When Acorn were doing their first design, back in the 1980s, they had no money (compared to today), and every single transistor saved really mattered financially. So it's kind of an accident that the ARM core turned out to be very power efficient.

          ARM also have mastered the idea of specialised co-processors for popular tasks - video compression, etc. Intel have always been of the opinion "the core can do everything", which it can, but not at low power...

          Apple chose ARM for the first iPod, and first iPhone probably because they were one of ARM's founders and had a little experience with it having used it for the Newton.

          Apple weren't on the scene back in the 1980s when Acorn first started developing their own chip. In 1987 we had Acorn Archimedes computers at school with ARM2 inside. Apple came later, when ARM Holdings (the company) was founded and took on the role of developing the chips that Acorn had created. Apple sold off a load of shares in ARM Holdings in the late 1990s.

          When Apple started using ARMs in iPhones, ARMs were already pretty well established in the mobile industry, even in feature phones. For example, everything based on Symbian was ARM. They already dominated the mobile market by the time iPhone came along.

          That goes all the way back to Psion and the Psion 5, 5mx; they chose ARM for this device (even shaving some of the packaging off the chip so that it'd fit inside), running EPOC32, which Nokia bought and then proceeded to ruin and called it Symbian. It took the world a long time to work out that Nokia had ruined it (the first iPhone demonstrated just how badly), but Symbian (and therefore ARM) was literally everywhere already by the time the iPhone came along.

          I couldn't forgive Nokia for screwing it up that badly. It was the foundation of their own ultimate ruin. Had they simply taken the Psion 5MX and shoved a 3G modem chip inside, it'd have been a killer device. Had they paid Psion to keep developing it, it'd now rule the world. They didn't. Whoops.

          But had Intel owned it and licensing wasn't as attractive, they'd have chosen something else.

          You raise a very interesting point. Way back in the day, Intel inherited StrongARM, they were at the height of their powers, and ARM was still a pretty small player. Intel back then could probably have got away with buying ARM itself without perturbing the competition authorities. Instead they ditched StrongARM (Marvell got it), went and did Itanium, and ended up focusing on x86 and copying AMD's x64. Not their most glorious of moments.

          I think Itanium really stung Intel. That was their last attempt to introduce a new instruction set, and it didn't work out too well. I think that really put Intel off introducing a new ISA ever again - x86/64 it is, no matter what. Yet running x86 quickly is a battery killer - too many transistors. To succeed in the mobile space they really needed a different ISA. It's difficult to see how they could have made that succeed any time in the last 17 years, but they've not even tried.

          If Intel got back into the ARM game, their mastery of silicon processing would produce a stunning ARM SOC. They'd wipe the floor with Snapdragon, Apple's A series of devices, and everything else. It's only pride that's stopped them doing this.

          1. HmmmYes

            Re: Another botched call by Intel

            Intel have always been a bit embarrassed by the x86 ISA. It's shit, basically.

            They have a weird cycle:

            -x86 makes lots of money.

            - Intel execs get excited by the next big thing: i960 (object-oriented CPU, if I remember the spin), Itanium (not sure what it was meant to be - Intel's version of Alpha?), Pentium 4 (surface of the sun).

            - Idea blows up in their face, execs get sacked, some faceless brown-noser takes over and starts banging on about x86 everywhere.

            Intel are one of the best silicon fabricators going. No argument.

            Intel are one of the worst CPU designers. Ever. They fuck up time and time again. Intel have had two major bits of luck that bailed them out big time: IBM choosing x86 for the PC, and the Israel design team producing a low-power RISC which ran x86 (Core Duo). At the time of the Core Duo, Intel had everyone in the company pushing the Pentium 4. They knew it was an overhot piece of shit but they did not care.

            However, Intel's biggest luck has been the margins on x86 - well, Xeons these days. This allows them to outspend their competitors.

          2. druck Silver badge
            FAIL

            Re: Another botched call by Intel

            @Bazza: I don't think Intel would wipe the floor. When they got hold of the wonderful StrongARM from DEC, most of DEC's chip designers left rather than be forced onto the Itanium millstone. The Intel people then developed it into the XScale, which, despite being fast at the time due to higher clocking on a smaller process size than competitors, was a rather rubbish chip.

            Unlike Apple who have actually improved their cores over the baseline designs from ARM, Intel made it far worse in terms of instructions per clock, with a load of unnecessary barriers between execution units which have never been seen on ARM before or since - indicating they just didn't know what they were doing. The memory subsystem was horribly crippled, and the chip errata was longer than every other ARM chip that came before it combined - almost as bad as an x86.

          3. Anonymous Coward
            Anonymous Coward

            Re: Another botched call by Intel

            >Actually, ARM's ISA is pretty good.

            ARM's 32-bit ISA is a mess now, and for AArch64 they dropped stuff that people get a woody for, like conditional execution.

            Oh, and the code density thing only applies in Thumb mode, and they didn't invent that - they licensed the patents from Hitachi.

            TL;DR: no one gives a crap about the instruction set. Is it fast enough? Does it have a decent C compiler? ... No, but it has this really cool instruction that does... stop, you can keep it.

            1. druck Silver badge

              Re: Another botched call by Intel

              @Daniel Palmer: you've clearly never used ARM assembler, or any assembler for that matter.

              The original ARM instruction set is a thing of beauty: simple, orthogonal and immensely powerful. Unlike some other bloated ISAs which are effectively compiler-only, it allowed developers to write large amounts of hand-crafted assembler, making things possible on early low-power devices that couldn't be done with high-level code for several more years. (See RISC OS)
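
              To give a flavour - a hypothetical fragment, not from RISC OS, assuming an ARMv7 target built in ARM state (e.g. gcc -marm), wrapped in GCC inline asm so it compiles as C:

              #include <stdio.h>

              /* Branch-free max() via ARM conditional execution: nearly every
                 instruction can be predicated, so both moves carry a condition
                 code instead of needing a branch. */
              static int arm_max(int a, int b)
              {
                  int r;
                  __asm__("cmp   %1, %2\n\t"   /* set flags from a - b  */
                          "movge %0, %1\n\t"   /* r = a, only if a >= b */
                          "movlt %0, %2"       /* r = b, only if a <  b */
                          : "=&r"(r) : "r"(a), "r"(b) : "cc");
                  return r;
              }

              int main(void)
              {
                  printf("%d\n", arm_max(3, 7));   /* prints 7 */
                  return 0;
              }

              (In Thumb-2 the same trick needs an IT block, and AArch64 replaced general predication with CSEL - which is partly what the AC above was grumbling about.)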

              1. Martin an gof Silver badge

                Re: Another botched call by Intel

                (See RISC OS)

                Or Acorn Replay. I had one of the early Risc PC 600s (in fact it's still running - it's on at the moment and I use it every day) and I remember being absolutely blown away by video running on my desktop.

                Oh, and Replay seemed to have got the sound-video synchronisation problem licked, a problem which broadcast digital TV still doesn't seem quite to have solved.

                M.

              2. Anonymous Coward
                Anonymous Coward

                Re: Another botched call by Intel

                @druck

                No one cares about assembler. If you have to code in assembler, it had better be 8-bit and cost less than a cent per part.

      2. Dan 55 Silver badge

        Re: Another botched call by Intel

        It's also low power and customisable, something that Atom isn't.

    3. Charlie Clark Silver badge

      Re: Another botched call by Intel

      Now that ARM is part of the SoftBank Group, it is too late.

      Can't agree with that at all. Given SoftBank's level of debt, flipping ARM for a profit is a pretty likely outcome. But anti-trust concerns would probably prevent a sale to Intel.

    4. Anonymous Coward
      Anonymous Coward

      Re: Another botched call by Intel

      One major way things started to go wrong was when Otellini backed the PC group and failed to grasp the mobile opportunity handed to him by Vodafone. Intel already had a "major market sector" in PCs, and the iPhone was yet to be. The Vodafone CEO and team loved the prototypes and concept; Intel killed it.

      Please wake up.

  2. TonyWilk
    Facepalm

    Feet and firearms

    The big advantage Intel has is in continuity of supply for OEMs

    er... oh bugger.

  3. Captain DaFt

    Hardly unexpected

    Looking at the specs for the discontinued tat, it almost seems like Intel thought they could conquer the small-board market by releasing bigger, more powerful boards in a market wanting small, power-sipping ones.

    1. Anonymous Coward
      Anonymous Coward

      Re: Hardly unexpected

      >more powerful boards in a market wanting small, power sipping ones.

      Actually, a lot of people want faster boards. The cheap ARM board has plateaued on the performance side (1.2GHz quad Cortex-A7) and that's not enough for a lot of applications.

      1. Steve Todd

        Re: Hardly unexpected

        The Raspberry Pi 3 is currently using quad core 64 bit A53s, and is far from the only cheap SBC to use 64 bits. Yes, these machines are not as powerful as a full sized PC, but then they are hugely cheaper, capable of many tasks and I'd hardly describe them as plateauing.

        Where Intel shot themselves in the foot with these IoT processors was in their lack of support and documentation available to mere mortals. If you want to do pretty much anything with a Pi then you'll find the details you need on the web somewhere. With Intel it's mostly guesswork as they won't tell anyone short of a large OEM anything, and that's with an NDA in place even.

        1. Anonymous Coward
          Anonymous Coward

          Re: Hardly unexpected

          >The Raspberry Pi 3 is currently using quad core 64 bit A53s

          Does Raspbian use a 64-bit userland yet? FYI: those A53s are slower than the 32-bit big.LITTLE cluster on the ODROID XU4.

          >and is far from the only cheap SBC to use 64 bits.

          Orange Pi etc. have 64-bit Allwinner-based boards that are the same price as, or cheaper than, the actual cost of the Pi.

          >capable of many tasks and I'd hardly describe them as plateauing.

          Doing the stuff you would like to do on them, like software-defined radio, is difficult because of the lack of CPU power and memory bandwidth, and really basic stuff like decent USB.

          >Where Intel shot themselves in the foot with these IoT processors

          >was in their lack of support and documentation available to mere mortals.

          Not really. There are IoT products shipping in the hundreds of thousands of units using chips from Marvell, Broadcom (Cypress now) etc. that have documentation and SDKs which are only available under NDA, with strict distributor rules (i.e. you can't order their stuff from Digikey without approval). If you are serious about doing something, getting into vendor schemes for NDAs, access to samples etc. isn't that hard. No big vendor gives a crap about some guy that is going to produce maybe 5 units in total. It's not even worth their time maintaining a GitHub repo.

          >If you want to do pretty much anything with a Pi then you'll find the details

          >you need on the web somewhere.

          You aren't going to use a Pi in a real-world IoT product because it's far too expensive and overkill.

          >With Intel it's mostly guesswork as they won't tell anyone short of a large

          >OEM anything, and that's with an NDA in place even.

          Weird. You keep going on about the Pi - which is NDA'd and binary-blobbed up to the eyeballs - to discredit Intel, when Intel are probably the only vendor with full open-source drivers etc. for their CPUs, GPUs, NICs and so on in the mainline kernel.

          1. Steve Todd

            Re: Hardly unexpected

            @daniel - thank you for proving my point. The Pi is far from the only SBC, and even it is available with better than a Cortex-A7, thus proving your assertion on performance limits wrong. The ODROID XU4 uses the A15 paired with A7s in big.LITTLE, but the A53 uses the newer ARMv8-A instruction set so is faster at a given clock speed.

            Not having source for the GPU driver doesn't stop most IoT developers (no SoC that I'm aware of has open GPU drivers, and x86 systems with open GPU drivers normally significantly underperform the OEM BLOB). Not having source or correct details to make it talk to external devices through the likes of GPIO, SPI, I2C etc does.

            1. Anonymous Coward
              Anonymous Coward

              Re: Hardly unexpected

              >Not having source for the GPU driver doesn't stop most IoT developers (no SoC that I'm aware of has open GPU drivers,

              The i.MX6 can now run Android with no binary blobs.

              >and x86 systems with open GPU drivers normally significantly underperform the OEM BLOB).

              Intel's drivers are open source, and they are the OEM.

              >Not having source or correct details to make it talk to external devices

              >through the likes of GPIO, SPI, I2C etc does.

              Many of the parts people use for IoT don't have decent datasheets, and the existing driver in the SDK is the only reference. One part I worked with had massively useful features totally undocumented in the datasheet, yet had code to support them in the SDK.

  4. Bitbeisser

    Their biggest mistake was plain and simple greed. With the Galileo at 3x the price of an rPi and less capable in terms of peripherals, they just didn't stand a chance. Similar for the Edison and the Joule. Not too shabby, but hopelessly overpriced...

    1. Flocke Kroes Silver badge

      rPi was never the competitor

      Intel's cut-down chips had to compete against Intel's server chips. In Intel's place, would you have your fabs working flat out making big server chips that you could sell at a huge margin, or cut the number of server chips so you can make some embedded-system chips that might sell at near cost?

      1. Yet Another Anonymous coward Silver badge

        Re: rPi was never the competitor

        I would outsource the manufacture of the low-value chips to someone else.

        There is no need to do everything in-house.

        In fact you don't need a foundry at all, do you, ARM / AMD?

  5. Sil

    Intel's Problem in Chief = CEO

    The problem was not ARM.

    It was too little, too late, too expensive, and no big push.

    Brian Krzanich will be remembered as the man who single-handedly destroyed Intel: diversifying beyond reason, always looking for the glitzy press conference, and never standing by his offerings for more than a minute or two.

    Why would you trust Intel for anything new? Program after program gets deleted after 2-5 years: mobile processors, Edison, Joule, Galileo.

    Have you ever tried to order an Edison or an IoT kit from Intel and its partners?

    There is no support whatsoever behind technologies that could have become something, such as RealSense - just look at how many new laptops use Windows Hello's camera recognition.

    What's happening to Knights Landing? Where is the big push? Where is Intel fiercely battling NVIDIA, and soon AMD, for a piece of the cake?

    Even SSDs - can you remember, years ago, when Intel had a very strong position in the enterprise with the G2?

    Instead, we get crap gadgets, from wearable computers to dancing drone shows, where Intel's added value is highly questionable; musicians as ambassadors or whatever; decimated marketing personnel.

    1. HmmmYes

      Re: Intel's Problem in Chief = CEO

      Not just Krzanich. All Intel CEOs since ~96 have been the same.

      Talk to the number of people who've been through the Intel VC meat grinder.

      It goes like this: 'You are a strategic investment; we want you to work Intel silicon into your products, which we'll use as a base design to sell to third parties who'll buy our chips in the billions.'

      The VC-acquired company goes away. Puts Intel silicon in the design. The silicon does not work - full of bugs.

      The design fails. Intel reviews the business - it's not making money! Err, you removed all our sales force and made our customers use a box-shifter sales channel.

      Intel shuts the company down as it does not make money.

    2. BillM1

      Re: Intel's Problem in Chief = CEO

      As mentioned, Intel has pulled the rug from under its customers several times, including in its ASIC/foundry businesses. Intel has earned, or will earn, a reputation for short-term market evaluations and for killing product lines that come up poor. This causes a death spiral for any new Intel endeavour: customers shy away from engaging rather than risk their supply depending on Intel.

      Quite amazing, given that during the PC era Intel would eliminate weak links/high-risk suppliers in the chain for selling its PC chips - taking over the chipset business from VLSI Technology and undermining the PCB makers, RAM vendors, etc. to produce an integrated motherboard. Intel should look in an old mirror to understand why its customers might avoid Intel at any cost.

      Unfortunately, this announcement adds yet more reason to avoid Intel.

  6. oldtaku Silver badge
    FAIL

    Inevitable failure

    Yeah, this was inevitable - Intel is institutionally incapable of making a credible play for the low-end market, which takes a combination of performance (which they can do), low power (which they struggle with, since they thought beating AMD at that was good enough) and low cost (which they're too arrogant to do).

    "Hindsight is easy!" you may say, but I said this when they launched too. It's Intel, it's not desktop/laptop, it'll fail.

  7. Mikel

    Watts and price

    These are two big factors.

    Platform flexibility too. Intel neuters the platform specs to avoid competing with their pricier platforms. ARM devices are, quite simply, engineered to be as capable as possible at the given power and price.

    And GPIO. HELLO! The IoT things need to do "thing" things.

  8. Anonymous Coward
    Anonymous Coward

    Considering how stupidly overpriced this stuff was, I'm surprised it lasted as long as it did.

    I looked briefly at Galileo and then looked elsewhere when I saw how much it cost.

  9. John Smith 19 Gold badge

    You pay top $ to Intel for its instruction set

    And you do that because you've got a shed load of software (and tools) written (and tuned) to run on that ISA.

    If you don't have that investment to protect, then the x86 instruction set has to stand on its own two feet.

    At its core it's a typical 1960s/1970s microcoded complex instruction set, designed (when instruction-set design tools only existed inside mainframe manufacturers) by, essentially, hardware engineers.

    So it's got lots of kool stuff that does one single task on one single data type (which might be tied to a specific register), and which is also a nightmare to generate code for from a high-level language. In 1978 (when the 8086 was launched) this was not a high priority (except on Burroughs mainframes, which were famous for being programmed only in HLLs).

    The problem is if you sell a cheap x86 processor people start to ask WTF do they have to pay such prices for the high end stuff.

    One option (which I think some very cheap 8051 versions use) would be to go with an internal bit-serial core that retains x86 compatibility. Then there would be a reason why they were so cheap (small die size and more clocks to do stuff), but you could have many more cores as an option.

    Another would have been to keep StrongARM (then the fastest ARM implementation) and say: "It's not x86, but we sell a shed load of them and make a decent profit, and as long as people want maximum performance from ARM we will own the market." IOW, accept they are a chip company and serve whatever the market wants in the best way possible. In the long run, owning quite a lot of a big market beats owning 100% of a tiny market (as the British GEC company eventually discovered).

    But the day is coming when Intel's advantages will disappear.

    When all transistors are 1 atom wide everyone has maximum density.

    Then we will see how vital compatibility is with an architecture designed around the time "Saturday Night Fever" and "A New Hope" were on first run at your local cinema (or "multiplex" for younger readers).

    1. HmmmYes

      Re: You pay top $ to Intel for its instruction set

      CPUs will compete on how good their caches are.

  10. Anonymous Coward
    Anonymous Coward

    Big issue here

    Why would anyone ever commit to using devices from a supplier that keeps pulling the rug? This used to happen with microcontrollers in the past, but those vendors learned that long buy times are essential in many markets.

  11. Redstone
    Megaphone

    I'm going to stick up for the Joule...

    ...having played with a 570x kit.

    While the Joule kit was nearly 10x the price of the Pi, its Atom could crunch Sysbench CPU tests 20x faster than the ARM. The memory transfer rates were 4 to 5 times faster on the Joule too.

    It was supported by a well-known Linux distro in Ubuntu 16.04 and had a pretty straightforward setup procedure.

    Intel also put some effort into supporting the RealSense ZR300 camera with some really helpful example programs for the RealSense imaging and SLAM libraries. I really thought this was the beginning of something that could become pretty damn good.

    I think part of the problem is that old Intel chestnut: very hard to get documentation. They also failed to get across in their marketing that this wasn't aimed at Pi users - it was meant for people who wanted to graduate to a more serious edge-computing IoT platform.

    That's just my 2 cents, anyway.

    1. Anonymous Coward
      Anonymous Coward

      Re: I'm going to stick up for the Joule...

      I would have gone for the Joule rather than an ARM-based system for an embedded prototype project, except for one thing: the ARM supplier guarantees availability to 2028; Intel wouldn't even discuss the issue.

    2. DropBear

      Re: I'm going to stick up for the Joule...

      No-one argues more crunching power isn't welcome in certain applications - but when you're not trying to make Robot Vision, how often is all that power really needed...? IoT actually works fine even on tiny MCUs, although non-masochists certainly appreciate being able to run a proper OS on an ARM chip instead where you're not expected to juggle everything from the file system to the wireless modem single-handedly. But all the dirt cheap boards (that you can get several of for the price of a single "genuine" Arduino nowadays...) already do that - why on Earth would you want to pay a ten times premium (AND futz around with minute connectors that you need nanites to wire up, as a hobbyist) if you don't have to?!? All the rest about NDAs and such are just the tip of a very, very large iceberg...

      1. Anonymous Coward
        Anonymous Coward

        Re: I'm going to stick up for the Joule...

        There are people that do more than turning relays on and off. If you want to do software-defined radio, or machine vision in a small space with latencies better than sometime next week, you are basically in a tight spot.

        The common ARM stuff is produced on old fabs to make it cheap and doesn't clock above ~1.2GHz.

        Almost no one is shipping chips that use the higher performance cores (A15 instead of A7).

        1. Anonymous Coward
          Anonymous Coward

          Re: I'm going to stick up for the Joule...

          Not me - machine vision, SDR, etc. has FPGA written all over it. Especially the FPGA-ARM hybrid devices where the computational grunt-work can be compiled from something C-like to FPGA logic (Altera OpenCL or Xilinx SDSoC) and then called from the ARM software.

          1. Anonymous Coward
            Anonymous Coward

            Re: I'm going to stick up for the Joule...

            >Especially the FPGA-ARM hybrid devices where the computational grunt-work

            So you're talking about chips like the Xilinx Zynq, where the cheapest available board is $100, the FPGA isn't all that big, and the time you spend developing your custom cores is going to overtake the money you would have spent using pre-existing software on something x86-based - unless your day rate is a fiver and a packet of crisps.

            1. Anonymous Coward
              Anonymous Coward

              Re: I'm going to stick up for the Joule...

              > So you're talking about chips like the xilinx zynq which the cheapest available board is $100, the FPGA isn't all that big and the amount of time you spend developing your custom cores is going to over take the money you would have spent to use pre-existing software

              Well in my day job the pay is over $300/day so my employers don't care how much the devkit costs.

              For hobby purposes everything more expensive than a PI is too much, but I really like the Cypress PSoC BLE devkits where a module costs less than $10.

              Now the things that are changing in FPGA land are (a) the rise of the block-integrator tools (Xilinx IP Integrator, Altera Qsys), which drastically reduce the amount of RTL that needs to be written, and (b) C programming tools like Altera OpenCL and Xilinx SDSoC.

              So 95% of the development occurs in the low-power (and well documented!) ARM penguin-land, and one adapts and recompiles the key compute-intensive bits into the FPGA. SDSoC is particularly good for this acceleration model; OpenCL is a bit more hassle on the host side.
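
              To illustrate the split, a minimal, hypothetical host-side sketch (a plain vector-add; the names are illustrative, and on the FPGA flows the kernel is compiled offline and loaded with clCreateProgramWithBinary rather than built from source as this desktop-style version does):

              /* build: gcc vadd.c -lOpenCL */
              #include <CL/cl.h>
              #include <stdio.h>

              /* The kernel: the only bit that ends up in FPGA fabric. */
              static const char *src =
                  "__kernel void vadd(__global const float *a,\n"
                  "                   __global const float *b,\n"
                  "                   __global float *c) {\n"
                  "    int i = get_global_id(0);\n"
                  "    c[i] = a[i] + b[i];\n"
                  "}\n";

              int main(void)
              {
                  enum { N = 256 };
                  float a[N], b[N], c[N];
                  for (int i = 0; i < N; i++) { a[i] = i; b[i] = 2.0f * i; }

                  /* Boilerplate: grab the first platform/device going. */
                  cl_platform_id plat; cl_device_id dev;
                  clGetPlatformIDs(1, &plat, NULL);
                  clGetDeviceIDs(plat, CL_DEVICE_TYPE_DEFAULT, 1, &dev, NULL);
                  cl_context ctx = clCreateContext(NULL, 1, &dev, NULL, NULL, NULL);
                  cl_command_queue q = clCreateCommandQueue(ctx, dev, 0, NULL);

                  cl_program prog = clCreateProgramWithSource(ctx, 1, &src, NULL, NULL);
                  clBuildProgram(prog, 1, &dev, NULL, NULL, NULL);
                  cl_kernel k = clCreateKernel(prog, "vadd", NULL);

                  /* Shuttle data to the device, run, read back. */
                  cl_mem ba = clCreateBuffer(ctx, CL_MEM_READ_ONLY | CL_MEM_COPY_HOST_PTR, sizeof a, a, NULL);
                  cl_mem bb = clCreateBuffer(ctx, CL_MEM_READ_ONLY | CL_MEM_COPY_HOST_PTR, sizeof b, b, NULL);
                  cl_mem bc = clCreateBuffer(ctx, CL_MEM_WRITE_ONLY, sizeof c, NULL, NULL);
                  clSetKernelArg(k, 0, sizeof ba, &ba);
                  clSetKernelArg(k, 1, sizeof bb, &bb);
                  clSetKernelArg(k, 2, sizeof bc, &bc);

                  size_t global = N;
                  clEnqueueNDRangeKernel(q, k, 1, NULL, &global, NULL, 0, NULL, NULL);
                  clEnqueueReadBuffer(q, bc, CL_TRUE, 0, sizeof c, c, 0, NULL, NULL);

                  printf("c[10] = %.1f\n", c[10]);   /* expect 30.0 */
                  return 0;
              }

              The point being: the host code stays ordinary C on the ARM side, and only the kernel gets rebuilt when you move the hot loop into the fabric.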

              1. Anonymous Coward
                Anonymous Coward

                Re: I'm going to stick up for the Joule...

                >Well in my day job the pay is over $300/day so my employers don't care how much the devkit costs.

                So your employer isn't going to let you mess around for a few weeks to see if you can make something work in a Zynq, when you could drop the money on an Intel module or one of Nvidia's things to see if it works as an actual product first.

  12. John Smith 19 Gold badge
    Unhappy

    "failed to get across in their marketing that this wasn't aimed at the Pi users "

    So if you spend a lot more money you get a box with a lot more processing power.

    Now, yes, it sounds like a good deal: 20x speed for 10x money.

    But the Pi is a baseline for ARM performance, not the pinnacle.

    There are a lot of ARM based processors, and quite a lot of ARM based boards.

    1. Anonymous Coward
      Anonymous Coward

      Re: "failed to get across in their marketing that this wasn't aimed at the Pi users "

      >There are a lot of ARM based processors, and quite a lot of ARM based boards.

      But very few that are clocked above the magical 1.something GHz that aren't phones or expensive server hardware.

  13. imanidiot Silver badge

    Cost not the problem

    I suspect most serious users (the pro embedded-systems designers) realised well enough what advantage these platforms offered to justify the added costs. Availability in lower quantities for prototyping was a tad hit-and-miss, however, which already threw a spanner in the works. Getting documentation for ANYTHING on these IoT devices if you didn't have a signed contract for x million units was apparently a nightmare. I've heard stories of people developing for the platform not being allowed access to particular bits of data crucial to their implementation of Intel stuff, simply because they didn't have the correct contract/NDA/friends in high places to allow them access to that bit of Intel's document library. Even if they KNEW the data existed.

    Intel expected to just dump these boards on the market and get orders for 10 million units in year 1. Which is not how the embedded systems market works. Lots of single unit prototyping or maybe double digit unit runs happening there. Especially for specialist high power devices like these Intel devices.

    (Not to mention the ridiculous edge connector that meant you needed a breakout board to do anything with these boards.)

    One thing is for sure: by dropping the Joule only a few MONTHS after its introduction, without so much as a warning, they'll NEVER get into the embedded market ever again. They've killed any goodwill they might have had remaining.

    IoT = Intel offered Trash, in this case.

    1. Anonymous Coward
      Anonymous Coward

      Re: Cost not the problem

      Yes, devkit cost is insignificant compared to engineering time - which is exactly the cost of poor documentation.

      Unfortunately it's much the same for AMD (though they are a bit more forthcoming due to being hungrier).

  14. HmmmYes

    Talk to anyone embedding processors.

    Most will not use Intel, as Intel has shat on them at some time in the past.

    Silicon that does not work.

    Or that CPU, sold with a 10-year guarantee, has just been removed.

  15. common-sense-prevails

    three problems

    There are only 3 problems with all the Intel IOT related kit. It is:

    a) over-priced

    b) too expensive

    c) costs too much

    I really wanted to buy a Joule or two, but .......... :(

    You've got to understand your market

    1. Steve Davies 3 Silver badge
      Facepalm

      Re: three problems

      This is not a new thing.

      Many years ago I wanted a sample device from a Silly Valley maker. We would pay a decent price for a sample, but all they wanted was an order for 1000 minimum.

      It didn't matter that if the sample was successful we'd order around 50K a year. Nope, 1000 or nothing.

      We went elsewhere. About 6 months later we got a call from the new CEO of the business wondering if we'd like a quote for 100 devices. Doh!

  16. a_yank_lurker

    Cannibals

    I believe Steve Jobs commented that a company is either going to be cannibalized by competitors or it needs to cannibalize itself. His point was that the market is never static and new niches and players will appear. Some of these niches will grow big enough to hurt you unless you keep adapting. Adapting may mean letting the old products that made your name die off while new ones that better fit the market are introduced.

    Chipzilla and many others have failed to understand this basic truism. The market is ever-changing, and just because you are the current apex predator does not mean you will not be someone else's dinner in the future.

  17. Herby

    Dust??

    Rewind the clock to around 1980 or so. If IBM had not chosen Intel, they might well be dust now - probably making DRAM chips or ROMs. Of course IBM did choose Intel (they had good reason to; they owned part of it), and history was written.

    The x86 instruction set is not the best in the world and has gone through many band-aids to get where it is now. I still wonder why it is still being used. Only because of good compilers and the like, and big increases in clock speed, does it make any sense. Then again, what Intel gives in speed improvements, Microsoft takes away in bloated software.

    Life goes on.

    Me? I still like the 68k processors, but that's another story.

    1. Redstone

      Re: Dust??

      I like to imagine what might have been after the 6502 had Commodore not acquired MOS.

      1. Martin an gof Silver badge

        Re: Dust??

        I like to imagine what might have been after the 6502

        You don't have to imagine - didn't Acorn use the 6502 as inspiration for ARM?

        Granted, MOS might have done something different, but ARM is a good start I think.

        M.

  18. Anonymous Coward
    Anonymous Coward

    Intel is an ARM licensee...

    By dint of the Altera purchase, Intel is now selling FPGAs with embedded ARM processors, and documentation to the standards you'd normally expect from an embedded device manufacturer.

    I'm not sure that any of these devices are being fabbed by Intel (maybe the Arria 10s).

    The Terasic devkits with Cyclone V devices are a not-too-expensive way to get hobbyist hands on this hardware.

    1. John Smith 19 Gold badge
      Unhappy

      Re: Intel is an ARM licensee...

      Don't worry, I'm sure they'll sell that off ASAP.

  19. Stoneshop
    Devil

    Galileo – Galileo is no more

    Fiasco-hoo-ho-ho-ho-ho-ho-oh
