NASA has awarded a $50 million contract to Microchip Technology, the microcontroller giant, to develop next-generation processors that will enable space computers to be 100 times faster than they currently are. If NASA is to fulfill its goal of exploring deeper into the solar system, it's going to have to develop advanced …
ARM ate their lunch years ago, almost entirely because Microchip charged for their C compiler.
It seems they grew half a brain at some point in the last decade, but even now they want to charge for optimisations.
It does seem an odd choice, as their highest-performance processor (SAM) appears to be a single-core, 600MHz ARM9 from 2001. It's not even Cortex, it's that old.
There are advantages in going that far back in time. Everything is bigger, so less susceptible to bit flips. Alternatively you can select a more modern manufacturing process and optimise for low power and a wider temperature range. Think of every new feature as a place where radiation-induced bit flips can ruin your day in a way that you will have to work around. CPUs in space often do not have a memory management unit, so programmers do not have to deal with potential bit flips in the address translation cache.
I can easily understand NASA selecting an embedded system chip designer that has retained experience with retro-tech.
Although what you said is correct, going that far back makes it difficult (impossible?) to get the 100x performance improvement they want. To go to smaller device features they are going to need to integrate processor error detection/correction at the chip level. Depending on the device size they decide on for the needed performance, they might be able to integrate a three-way (or more) redundant voting system on a single chip. Along with the required improved radiation hardening, made more difficult by the decreased device size, this may be the only way to get the performance they want. I am very surprised they chose Microchip Technology, as I was not aware that they had much prior experience with radiation hardening.
going that far back makes it difficult (impossible?) to get the 100X performance improvement they want
Depends on how far back they're coming from. I remember not terribly long ago hearing that NASA was launching stuff using rad-hard 8-bit CPUs; anything from 2001 would go way beyond a 100x improvement over that.
Edit: a little googling shows the Mars Rover uses a PowerPC CPU similar to what you might find in a late-90s Mac (though clocked slower, probably because of being manufactured differently for the radiation hardening)
It looks like the HPSC project mentioned in the article is going to base the next one on the ARM Cortex-A53 with up to 8 cores - so there's 8x of their performance increase.
"Edit: a little googling shows the Mars Rover uses a PowerPC CPU"
Which "Mars Rover" are you referring to?
Spirit & Opportunity?
Sojourner - 80C85
Spirit & Opportunity - RAD6000 (Power 1 architecture)
Curiosity - RAD750 (PowerPC 750)
Perseverance - RAD750 (ditto)
To add a bit of variety, the New Horizons spacecraft uses a Mongoose-V (MIPS R3000) clocked at 12MHz.
As the line in the Apollo 13 film goes - "(electrical) Power is everything"
contradictions in design specs:
* must be fast
* must tolerate cosmic radiation
* must have low power consumption
Resolving those will be interesting for the engineers.
As for SAM processors, they work well for the purposes for which they were designed. Arduino uses the Atmel AVR, and Microchip bought Atmel a while back. They have not abandoned AVR support as far as I can tell, so I am happy with that. And Atmel already had SAM-series processors using Cortex-M (aka 'microcontroller') cores, which use a somewhat reduced instruction set (double-RISC?) designed for microcontroller use.
Still does not mean they cannot use the latest ARM core. What NASA probably wants is their experience with low power consumption and good performance, on-chip peripherals [which I REALLY like] and so on.
Plus for space they'll probably need sapphire substrates. I do not know if Microchip uses these at all (a quick search was inconclusive), but they may also have another way to do highly radiation-tolerant CPUs that we do not know about.
There is stuff up there in space that was launched in the 1970s and is still in communication with NASA. Of course the Voyagers are an extreme case, but it's to be expected that NASA can launch long-term missions now that will still be operational in 2050-2060. With that in mind it's good to launch with as much capability as possible, with lots of built-in flexibility that can later be upgraded with software updates, similar to what they did with Hubble.
"with lots of built-in flexibility that can later be upgraded with software updates, similar to what they did with Hubble"
Even Voyager 1 and 2 could - and did - receive software updates, and so has virtually every NASA mission since.
I think you are getting confused with Hubble's capability to have HARDWARE upgrades - at least until the Space Shuttle fleet was grounded.
The goal is 100x their current tech, which from other comments seems to be 2+ decade-old production of even older tech.
A combination of redundancy, better fault tolerance and shielding could give a current ARM CPU design, with some custom processors for certain AI tasking, more than a chance. Given the state of the art, it may be a good fit for the task... at least as a starting point.
The Sojourner rover used an 80C85 CPU.
And highest speed isn't everything. Power is a very finite resource on spacecraft. The general rule with processors is the faster they go, the more power they need.
If you run out of power - a brown-out - at a critical moment, you can have a dead spacecraft, aka LOV (loss of vehicle) and thus probably LOM (loss of mission).
Then you really are buggered...
Plus older "hardened" electronics are worth their weight in gold compared to new "non-hardened". Shielding is heavy compared to hardened electronics and costs you dearly to launch.
… the money might be better spent figuring out how to create some sort of magnetic belt (braces) shielding, mimicking our own down here, so they could just use COTS products like the rest of us - bonus: taking COTS humans et al also. Although she'd probably see through such a cheap trick, and so it remains medical experiments for the lot of you
If it was BAe, I'm guessing they'd had enough of Billions Above Estimates' idea of "competitive pricing"
I've liked PICs ever since I first heard of them. Their usual design is Harvard architecture with around 4K words of 12-bit (up to 16-bit, IIRC) instructions with byte-wide data. Very fast for such a machine. Also (IIRC) static registers, allowing the clock to slow down to 0 --> power draw down to leakage levels. This is quite relevant given the grief NASA gets when it wants to use a TEG for outer-planet missions. IIRC neither Neptune nor Uranus has had any attention since Voyager.
If they have ARM IP and a rad hard process to implement it on then yes, that's a reasonable plan.
But supplier lock-in is a major issue, as the market is too small for many companies to swallow the up-front cost of a new process.