Compiler's job
Surely that is a job for the compiler not the CPU
Intel researchers have developed a way to make the increasingly tiny processors needed to power the impending "Internet of Things" even tinier: compress the code running on them. "We compress the code, make it smaller, and save area and power of integrated on-die memory," Intel Labs senior researcher Sergey Kochuguev from ZAO …
Surely that is a job for the compiler not the CPU
Completely understand where you're coming from.
But then:
- all compilers for this processor would have to compress the code
- or the processor would have to be able to distinguish between compressed and uncompressed code.
If you were going to produce a compiler which output compressed code, then you'd either want to make that option switchable, or have it produce both compressed and uncompressed code so that you could use it for other processors.
So - start with the proof of concept demonstrator, as they have done. Then do the other stuff later, if there is sufficient interest.
Executable code is produced by a compiler _before_ it is executed and often on a different machine to the one that uses the code. This is about something that improves code efficiency _while_ it is being executed on a target device.
(Having a compiler optimise for a target architecture would be a separate consideration.)
"The reason it's better to have it done in hardware instead of by the compiler is it makes for less work for the CPU,"
How do you figure that? If both compression and decompression are done in hardware and the initial code is uncompressed, then the CPU has to burn power to compress the code in the first place, then decompress it just-in-time to execute it.
If the compression is done by the compiler, you load the compressed code directly at run-time and only have to decompress it just-in-time to execute. By having the compiler do the compression once (and it can spend a lot more time optimizing the compression, since it doesn't have to be done in real-time at run-time) you save at least half of the run-time work, probably more, since compressing is typically slower than decompressing.
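The trade-off can be sketched in a few lines of Python, with zlib standing in for whatever scheme Intel actually uses (the compressor choice, compression level, and byte pattern here are purely illustrative):

```python
import zlib

# "Compile time" (build machine): compress once, as slowly as you like.
code = bytes(range(256)) * 64              # stand-in for an executable image
compressed = zlib.compress(code, level=9)  # level 9: slowest, best ratio

# "Run time" (target device): only the cheap direction remains.
restored = zlib.decompress(compressed)
assert restored == code
assert len(compressed) < len(code)
```

The point being: the expensive compression step is paid once at build time, and the device only ever pays for decompression.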
Sounds a bit like what ARM set out to achieve with their Thumb/Thumb2 instruction sets, but taking things one step further with proper data compression rather than just shrinking the instruction format.
The article doesn't mention what architecture this is targeting, although as it's Intel I'm guessing x86 because I can't imagine them getting back into smaller embedded stuff having dumped MCS-51, i960 and XScale a long time ago. If that's the case, it makes sense for the compression/decompression to be on chip so that code compatibility can be maintained. If compression had to be done at compile time, compatibility would go out of the window.
As for 'intelligent drapes, coffee machines, toothbrushes, baby monitors, stereos, alarm clocks, supermarket shelves, air-quality sensors, and more', surely that has been ARM's bread and butter since almost forever? It's hardly new for microcontrollers to be embedded in that kind of stuff and you don't need vast amounts of processing power or memory to achieve Internet connectivity.
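The distinction drawn above between Thumb-style instruction re-encoding and proper data compression can be roughly illustrated in Python — the byte patterns are fabricated and zlib stands in for a real compressor, so treat the numbers as illustrative only:

```python
import zlib

# Fake, repetitive instruction stream: 100 fixed-width 32-bit encodings.
arm32 = b"\xe5\x9f\x10\x04" * 100   # 400 bytes

# A Thumb-style re-encoding gives a fixed saving: 16 bits per instruction.
thumb16 = b"\x9f\x04" * 100         # 200 bytes, always exactly half

# A general-purpose compressor exploits whatever redundancy exists, so
# highly repetitive code can shrink far below the fixed 2:1 ratio.
squeezed = zlib.compress(arm32, level=9)

assert len(thumb16) == len(arm32) // 2
assert len(squeezed) < len(thumb16)
```

Of course, a fixed re-encoding decodes in a handful of gates per instruction, whereas general-purpose decompression needs a real decompression unit — which is presumably where Intel's claimed 5% performance cost comes from.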
If this was some form of say UPX, then you'd have a small benefit on storage space saved, but your memory "energy footprint" would be the same as the executable would still take the same space once uncompressed.
But... if the exec is compressed once as it leaves storage>RAM and then dynamically uncompressed/compressed as the CPU fetches it (and maybe changes it), you'd have a smaller memory "energy footprint" than the original, assuming code would compress enough that the lower memory energy usage outweighed the energy the compression/decompression unit used.
As for blaming compilers, well, there are many parts of code a compiler simply can't take a guess at rewriting/optimizing. For example, code that might never execute but that the compiler can't ignore because it might execute sometimes. A unit such as this actually works in that case because it doesn't need to make assumptions; it just packs/unpacks as needed.
This seems to be a decompression unit between code storage and the CPU; completely transparent to the CPU - it asks for memory, it gets it, not realising that it was grabbed from a compressed image and decompressed on the fly. This means a smaller memory requirement (and memory is the biggest footprint on these chips), and reduced power to run the memory. Taking into account the power needed to decompress, there is still a net gain in power requirements, but a 5% loss in performance. According to Intel.
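That transparency can be modelled in a toy Python sketch — the block size, class, and method names are my invention for illustration, not Intel's design:

```python
import zlib

BLOCK = 256  # hypothetical fetch granularity

class CompressedMemory:
    """Toy model of a decompression unit between code storage and the
    CPU: code is stored block-by-block compressed, and each fetch
    decompresses only the block it needs."""

    def __init__(self, image: bytes):
        self.blocks = [zlib.compress(image[i:i + BLOCK])
                       for i in range(0, len(image), BLOCK)]

    def fetch(self, addr: int) -> int:
        # The "CPU" just asks for a byte; it never sees the compression.
        block = zlib.decompress(self.blocks[addr // BLOCK])
        return block[addr % BLOCK]

image = b"\x90" * 1000 + bytes(range(200))  # stand-in code image
mem = CompressedMemory(image)
assert mem.fetch(5) == image[5]
assert mem.fetch(1100) == image[1100]
```

The CPU-side interface (`fetch`) is just a memory read, which is why neither the compiler nor the instruction set needs to know anything about it.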
It's nothing to do with the compiler. It's nothing to do with the architecture of the instruction set.
As for the compiler in HW, look up ARM Jazelle - might be of relevance.
So that is about 7 devices per person. At present I own 2 (phone and desktop), so I suppose this means they are expecting people in rich countries to own internet-enabled versions of each of the following:
1 ) desktop
2 ) laptop
3 ) tablet
4 ) phone
5 ) car
6 ) tv
7 ) microwave
8 ) refrigerator
9 ) freezer
10) washing machine
11) tumble drier
12) oven
am I missing anything?
DVRs and game consoles would be big ones, be it an Xbox, Wii or PS3, or something like a Nintendo 3DS, PSP or PS Vita. Smart electrical meters, HVAC controls (residential or commercial), weather-aware sprinkler systems, EV charging stations, and home security systems would also add to the count.
"We compress the code [...]" but code is typically far less than data in a conventional app. Ok, they say it is for embedded stuff in which case there is likely to be little code anyway; not your full app stack on a full fat OS; so I can't see the value.
If it is for Rogue Jedi's tumble drier/microwave/car, it makes no sense as these are heavy energy users anyway. If it's for portable devices such as phones, there will be a lot more data than code. Still can't understand it.
First question, then: what niche is this technology targeting?
Might help more if the data was compressed but that's been done already I believe, in which case this might just be a land-grab for maybe-one-day-useful patent IP. That's the only thing that makes sense to me so far - any wiser suggestions welcome.
And then he reads the last para (sorry for being dumb): "intelligent drapes, coffee machines, toothbrushes, baby monitors, stereos, alarm clocks, supermarket shelves, air-quality sensors"
Well...
intelligent drapes, intelligent toothbrushes? Just die.
Coffee machines - heavy energy user anyway.
Baby monitors - a quick browse suggests some are mains powered, and the rest have wifi & speakers, which are likely to use more power than the CPU.
Alarm clocks? Oh please, WTF do I need to put intelligence in a clock for?
supermarket shelves, air-quality sensors - dunno.