Re: My cynical side thinks this is no accident.
All I'll say is that I'd like his pay package.
Apple’s top silicon lead Jeff Wilcox, who led the iGiant's push to develop homegrown chips, has left his role to start a new job at Intel. Wilcox announced he had returned to his old employer in a LinkedIn post this week. "I’m pleased to share that I have started a new position as Intel Fellow, Design Engineering Group CTO, …
So Wilcox transitions back to Intel with reams of Arm-based Proprietary Intellectual Property to share with Apple competitors/opponents?
Is that not akin to Industrial Espionage/Intellectual Property Theft?
We appear to be of a similarly suspicious and uncharitable frame of mind on this matter, ShadowSystems.
... and non-compete agreements are explicitly unenforceable in California, for the general public-policy reason that people shouldn't be able to bargain away their ability to engage in a lawful profession.
On the other hand, he'll be under a pile of NDAs and wouldn't have risen anywhere close to as far as he already has if he'd been the sort of person who obviously doesn't honour them.
> So Wilcox transitions back to Intel with reams of Arm-based Proprietary Intellectual Property to share with Apple competitors/opponents?
I wonder how much information he took with him when he joined Apple. I mean, what with him having previously spent many years working for the world's largest designer of CPUs & their supporting chips, and then moving to a company that was keen to learn how to do this?
> So Wilcox transitions back to Intel with reams of Arm-based Proprietary Intellectual Property to share with Apple competitors/opponents?
> Is that not akin to Industrial Espionage/Intellectual Property Theft?
Skills and experience are the property of nobody except the person that earned them.
"Client SOC Architecture". I wonder what his NDAs are like? Could he be targetted at doing something at Intel that is quite similar to what he did at Apple? I doubt he would be terribly interested in diving into X86-64 stuff (no-one wants to do that!), unless it is at an architecture-independent level. Intel have announced they will be doing SOC's based on ARM and RISC-V, so those, or general stuff, is my guess. (See "Integrated Device Manufacturing", IDM 2.0)
Non-competes aren't enforceable in California, so while he can't reveal/use Apple's secrets, he can work in exactly the same niche he was in at Apple without problems. .... DS999
Oh, I wouldn’t like to be relying on that information in a court of law if now working for Intel in exactly the same niche he was in at Apple without problems. It is not as if the one is not into the designing of clones and drones of the other, to do exactly the same things only better and faster, at speeds faster than light, to realise an early advantage and deliver an overwhelming lead.
It’s the true nature of such intelligently designed great games, is it not?
> Oh, I wouldn’t like to be relying on that information in a court of law if now working for Intel in exactly the same niche he was in at Apple without problems
Indeed - we may have Oracle v Google again, this time with Apple, which has an even bigger pile of cash to keep the lawyers fed in the manner to which they are accustomed.
Just to be on the safe side, if I were Intel, best not design any ICs with Rounded Corners
For years, Intel insisted that the only way to make a performant system was to divide on the CPU / DIMM SDRAM boundary. To be fair, that was partly because its enterprise customers pushed them.
But first the mobile SoC proved that soldered-in memory achieved better price/performance, despite howls of protest.
Now the M1 memory architecture shows that you can wipe the floor with the old-school approach.
Intel now want to run an SoC-style project as an experiment, because they have suddenly realised they will be out of the consumer PC and laptop market entirely within three years unless they do. Enterprise customers can do what they like.
And this guy is the person who will architect it for them, because there is too much internal inertia and passive-aggressive sabotage to turn the ship otherwise.
He really doesn’t need to bring across any secrets. He just needs to put his foot down, and say “see those SoCs published over there? Drop an x86 into that, add HBM, and you’ll be all good.”
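To make that unified-memory point concrete, here is a minimal sketch (my own illustration, assuming a Metal-capable Apple-silicon Mac; the buffer size and variable names are arbitrary): with a shared-storage buffer, the CPU writes straight into the same physical memory the GPU reads, so there is no separate upload step of the kind a CPU + discrete-GPU split requires.

    import Metal

    // Illustrative sketch only: on an Apple-silicon SoC the CPU and GPU share
    // one pool of physical memory, so a .storageModeShared buffer needs no
    // blit/upload before the GPU can use it.
    guard let device = MTLCreateSystemDefaultDevice(),
          let buffer = device.makeBuffer(length: 4096, options: .storageModeShared) else {
        fatalError("No Metal device available")
    }

    // The CPU fills the buffer in place; the GPU can read the same bytes directly.
    let floats = buffer.contents().bindMemory(to: Float.self, capacity: 1024)
    for i in 0..<1024 { floats[i] = Float(i) }

    print("Unified memory:", device.hasUnifiedMemory)   // true on M1-class parts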
Gamers, those using computers for engineering work (CAD, software dev, ML, etc.) and actual proper servers need far more memory than beancounters, browsers and general admin. They also need accelerator cards (usually a GPU, but not always).
That's the reason for the split: it lets PC builders and end users fit as much RAM and as powerful an accelerator card as they want.
Apple just decided that they don't care about those markets - on macOS, gaming is dead, CAD is nearly dead, and there haven't been any servers for a decade. The Apple Arm switch has probably killed CAD on macOS entirely as it's now impossible to use accelerator cards.
And that's fine. Macs don't make any money for Apple; they could (and probably will) stop making them altogether and make even more profit.