The future
I bet these companies can't wait for the ability to use a Men in Black-style neuralyzer to wipe all your institutional knowledge on the day you resign.
Apple's iPhone chip designers mulled creating their own homegrown server processor for the Silicon Valley giant – but were shot down by Steve Jobs after they presented the idea at an internal meeting. In the following years, Apple executives continued to refuse to give the project the green light, arguing the iGiant is a …
Post-Jobs? The man himself turned it down. He's been dead nearly a decade; can we stop idolising the arrogant asshole now?
Whilst we're at it, can we stop saying how "innovative" Apple are? They haven't "innovated" anything; literally everything they've made was already available from others. At best they merely refined it, or told you theirs was better. They're nothing more than a marketing company with a technology arm...
To be vaguely fair, Apple have, if nothing else, innovated in how they package and market products to the masses. When I repaired Apple products in the 90s it was pretty niche; outside of a few Macs in schools and at design shops there were precious few people using them.
iMacs, iPods and clamshell iBooks changed that rapidly. I can't think of an equally rapid rise in market share other than perhaps Samsung and Huawei in hardware terms.
> can we stop with saying how "innovative" Apple are, they haven't "innovated" anything
Jobsian Apple excelled at making sure everything WORKED in an integrated manner. It was chaotic between his leaving and rejoining, and it's become chaotic again since he died.
Reality Distortion Field aside, he did have a positive influence when it came to making sure stuff tied together "almost" seamlessly.
Typical post-Jobs Apple if you ask me, “Innovation? Meh.”
Jobs stopped them doing it too.
I can understand why the company might not want to commit billions to a project it has very little strategic interest in, like building its own server CPU.
But what I don’t get is why the company gives a damn when the ex-staff have voted with their feet and are pursuing the idea on their own resources anyway. People leave all the time. So what? They’re not even competing in a sector Apple cares about. Even if one of them had been a bit cheeky with the timing of when it was set up, that’s really not worth spending any lawyer time on at all.
A better approach would have been to make a small investment with very few strings attached, if only to have some goodwill and polite obligation flowing back to the Apple mothership. That way there’d be some control, and it’d be cheaper than lawyers.
Pick one or more of...
1) They dared to go against management and leave.
2) Potential to make management look foolish should they succeed.
3) Apple lost good people to this, which they perhaps could have otherwise retained.
4) Apple are run by a bunch of dicks.
There's a standard clause in contracts of employment where I work that says if you go to work elsewhere, you're not allowed to approach any former colleagues to try and recruit them into your new company/employer for 6 (or is it 12?) months.
We also have clauses in contracts with customers to prevent them from approaching staff with a view to direct employment. Indeed, we had one guy who left of his own volition to go and work for a customer; he ended up having to prove that he'd gone voluntarily and that the customer had not tried to poach him.
I agree, but I also think Jobs was probably concerned that they would have to spend a lot of time and money fighting Intel. I'm not sure of the timing, but if they were courting Intel as a supplier and weren't yet a bigger company, I can only assume they didn't want to start another fight on another front, as that generally doesn't end well.
That said, I still think that these days they should definitely be considering a takeover of AMD, and then moving from Intel to building their own in-house chips.
It's been quite a while since I was at AMD, but I see a real culture clash if they tried that.
Plus, Intel's anti-competitive behavior is a HUGE reason that AMD has never been acquired. The industry desperately wants Intel to have effective competition. Intel needs someone to keep the regulators at bay. Otherwise, AMD would never have gotten the 286 contract in the first place.
But no one wants to be the "other" when negotiating for parts with Intel.
They've only been innovative when Wozniak and engineers like him were there. Now they just buy other people's tech and rebadge it.
And if Jobs was alive, he'd be suing too. Why? Because he was a cunt, just like the Apple of today. Why? Because they are essentially shouting:
"No, you can't make the server chip! No, no, no." Jobs dies. They carry on for 10 years: "No, no, no." You quit and start up your own company to make the chip you pitched to them, the one they rejected for 10 years. "What? We rejected you for 10 years, but fucking no way are you gonna make that chip without us. Despite us not wanting it, we're gonna be petty cunts and sue you until you run out of money, because we didn't want you to quit but you did anyway. Then you won't be able to make your chip, and we won't benefit from the chip we never wanted. But at least we'll have stopped you. We'll totally forget the history of how Apple started, where Wozniak had to offer the Apple I to HP first because he'd been making it on their time; they said they didn't want it, but it was fine for him to go and make it on his own. We'll forget that part of history because we're cunts."
The colourful language was required to express how much of an arse Apple are being.
Mac Minis, probably: "or dozens of them — hard at work," according to their current website, which depicts rows of Mac Minis in racks.
The cynic in me says this is how they portrayed the Mac Mini server in days of old too.
https://www.apple.com/uk/mac-mini/
Apple outsources a lot of its cloud to Amazon and Microsoft. They keep building new datacenters but their services are growing fast enough that they can't catch up unless they start building datacenters a lot faster.
There probably isn't a huge benefit to using their own CPUs for their cloud - sure, they would be cheaper, but now that AMD is competitive, Intel is being forced to drop its server CPU pricing, so the delta is smaller than it was a couple of years ago.
The big win would be if they wanted to provide an API for developers to run functions in the cloud rather than on a user's iPad; having the same CPU would make that easier. That could be useful for tasks like Photoshop, though the user's files would need to already be in the cloud, or they'd need a very fast link (with very fast upload) to make the time spent uploading and then downloading again worthwhile.
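If Apple (or anyone) offered that kind of cloud-offload API, the client side might look something like the minimal Swift sketch below. Everything in it - the endpoint URL, RenderJob, RenderResult and offloadRender - is invented for illustration; no such Apple API exists.

    import Foundation

    // Purely hypothetical sketch - endpoint, RenderJob and RenderResult are invented.
    struct RenderJob: Codable {
        let assetURL: URL   // the file must already live in the cloud, per the point above
        let filter: String  // name of the (Photoshop-style) operation to apply
    }

    struct RenderResult: Codable {
        let resultURL: URL  // where the finished file can be fetched from
    }

    // Post the job to a cloud endpoint that, in this scenario, runs the same CPU
    // architecture as the client, so the same compiled kernels work in both places.
    func offloadRender(_ job: RenderJob) async throws -> URL {
        var request = URLRequest(url: URL(string: "https://compute.example.com/render")!)
        request.httpMethod = "POST"
        request.setValue("application/json", forHTTPHeaderField: "Content-Type")
        request.httpBody = try JSONEncoder().encode(job)
        let (data, _) = try await URLSession.shared.data(for: request)
        return try JSONDecoder().decode(RenderResult.self, from: data).resultURL
    }

The HTTP plumbing is the boring part; the point is that with the same CPU architecture at both ends, the compute code itself wouldn't need a separate server port.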
>There probably isn't a huge benefit to using their own CPUs for their cloud
Agree, however there probably is a market for a new server CPU (and associated chipset?), selling it - just like ARM, Intel, AMD etc. do - to third parties to do whatever with.
"Agree, however there probably is a market for a new server CPU"
Let's see - we have:
- x86
- ARM (promising but market unconvinced at present)
- POWER & z (not terrible, but they struggle to meet new people because of their big blue friend who's a little pushy)
- SPARC (semi-retired, mostly just happy that he's healthier than Itanium)
- Itanium (on life support, family considering switching it off)
- MIPS (I think I can...I think I can...)
- recent legacy (PA-RISC/Alpha)
- older legacy (Z80, pre-RISC 8086, 68000, VAX etc)
At an extremely simplified view, server CPUs aim to arrange different functional groups to deliver maximum sustained performance. And to deliver high performance, you increase the functional groups to the limit of your process technology, if you can afford it. Most of that list cannot afford to hit the limits of current process technology, because it is so expensive, and if you get it wrong you lose a lot of money.
It's almost as if the architecture isn't the key issue and that there's something that sits above the architecture that drives development of successful CPUs...
I suspect you're right about it being an ARM solution.
My point about what is required to succeed is that it's not the architecture or design that will allow your product to succeed; it's getting the backing and finding the market.
Working for Apple, they had some of the backing and the market - I suspect that if Nuvia do successfully produce a design, Apple will still be the market, probably as a low-power desktop chip with a scale-out option for server workloads...
Given the number of companies and the amount invested in trying to get ARM server CPUs off the ground, I suspect Nuvia require 10x the capital they have to succeed (i.e. $2-3bn), given that the current ARM server manufacturers (i.e. Ampere eMAG, ThunderX2, Neoverse N1, A64FX and Huawei) are struggling for market share. And it will be hard to beat Huawei in the long term if it remains viable.
FPGAs don’t scale up. A data centre stuffed with FPGAs is a very expensive thing to run. They’re only good for small-scale deployments, and even then only for certain applications. Plus, the bigger FPGAs are eye-wateringly expensive. And they’re a bastard to develop for; I’ve seen FPGA-based developments take so long that they failed to keep up with the software guys, who’d simply been waiting for Moore’s law to solve their problems.
If large-scale deployment is required, it’s far better to get an ASIC built from the same VHDL (or whatever) and save a ton of energy.
I'm not that confident about RISC-V, to be honest - I suspect it will go the way of MIPS: promising, but never reaching its potential.
Why? ARM will continue to provide pretty decent performance at a very low cost, so for educational purposes, it is sufficient.
For high-performance server chips, someone needs to take RISC-V designs and add all the cache logic, longer pipelines and scalability - none of it is hard from a "bright people can do it" perspective, but it is hard when each shot at it is likely to cost a few tens of millions of dollars. And if you do everything as cheaply as possible (older process nodes, low volume, low frequencies, proven design rules, etc.), I'm not sure you would match mainstream ARM performance even with lots of cores. At which point commercial ARM server chips look more practical.
TL;DR: designing your own competitive CPU is expensive and there are a lot of cheap alternatives out there on multiple architectures.
>There probably isn't a huge benefit to using their own CPUs for their cloud
Apple have the biggest coffers in the world - around $400 billion they do not know what to do with.
Building a server CPU and a kick-ass Apple server is something they could do just to keep the talent happy.
If they ever built an "Apple server" it would be for their internal cloud only. That would probably interest their engineers a lot more - building for internal use, they'd have clearly defined use cases instead of having to build something for everyone, and they'd have far more blue-sky opportunities in its design, without the constraints of selling something that would be marketable to and serviceable by external customers.
> There probably isn't a huge benefit to using their own CPUs for their cloud - sure, they would be cheaper, but now that AMD is competitive, Intel is being forced to drop its server CPU pricing, so the delta is smaller than it was a couple of years ago.
Given current server CPU chip shortages, I wouldn't be so sure about that. Once burned, Apple likes to own their supply chain.
>Once burned, Apple likes to own their supply chain.
Probably, but if they did, bursting out to the public cloud is going to be much harder.
AMD is probably not an excellent fit either. Apple need good mobile CPUs for their laptops, whereas AMD appear to be focused on consoles, the mid-range and the enthusiast desktop, missing the high-end dual/quad-core laptop segment.
I suspect Apple will focus on their ARM IP and treat servers as a cost of doing business. It will be interesting to see if that changes as the consumer market cools off, but I can't see any other cloud provider picking up Apple's tech, and I can't see Apple particularly wanting in-house proprietary server tech. I suspect choosing between AMD and Intel is just fine for them. If ARM comes to the table for scale-out, low-latency data-moving, that would be OK, but they aren't really in the compute-cloud business like the others. This might change if they start embedding ARM in their desktops, or if they make an iPad-terminal-type system.
"Got to wonder what they run the 'Apple cloud' on"
iCloud is powered by white unicorns beaming rainbows from their horns into candy fiber optic links, all in a heavenly walled garden floating in a cloud. It's not a dark room full of dirty, heartless machines running Windows and Linux.
"Incredibly, in the weeks before Apple took its ex-chief architect to court, the multi-billion-dollar behemoth privately told Nuvia to stop recruiting engineers from its ranks of techies, yet behind the scenes, the iPhone giant was trying to hire one of the startup's top designers."
Translation: "How dare you do the things that we do!! What gives you the right to do the exact same thing that we do?"
Let us not forget that Apple was one of the gang of four that settled over their illegal no-poaching pact.
Now they are trying the same thing on with these guys. I have no doubt that any ex-Apple staffers who went to the new outfit did so because it would be really cool stuff to do, and it is likely (knowing designers) that they applied rather than being deliberately poached.
Not really... Apple's strength has historically been in "technical marketing" rather than true innovation. Of course, they have innovated from time to time, but what makes Apple impressive is their ability to combine a bunch of existing stuff into something they can sell as special.
To be honest, even their marketing approach is largely a refinement of other people's. I remember a lot of Macolytes I was working with at the time gushing over wireless charging, when I'd had it for years on a Google Nexus 7 tablet and a Nokia Lumia 925 phone.
The remove-a-feature-and-jack-up-the-price approach is a favourite of car manufacturers, particularly the German companies. Back in the days when all cars had ashtrays and lighters as standard, BMW were amongst the first to remove them and charge you for the privilege of not having them fitted.
The Apple I was slightly ahead of the curve in '76, but not by much. The Apple II was just another nothing-special 8-bit machine.
The Mac GUI paradigm was stolen wholesale from Xerox.
The iPhone was NOT the first touch screen smartphone, it simply had the biggest marketing budget with a large number of dedicated Apple fanboys to give it initial traction.
As for the rest of the overpriced toys coming out of Cupertino, innovative? Give me a break.
Apple was indeed innovative because of what they did in music, and they had a touch-screen iPod out for a while before giving it a mobile radio. That was innovative.
So what if it wasn't the first touchscreen mobile on the market? Those others had piss-poor user experiences, which goes to show that their makers just wanted to be first with the technology and weren't very innovative in making it useful.
Check your history. Apple did NOT steal the GUI idea. Xerox, against the wishes of their PARC staff, took a payment of Apple stock to demonstrate networked computers, object-oriented programming and the Xerox GUI, which Jobs immediately saw as the future of personal computing.
“I had three or four people (at Apple) who kept bugging me that I get my rear over to Xerox PARC and see what they are doing. And, so I finally did. I went over there. And they were very kind. They showed me what they are working on.
And they showed me really three things. But I was so blinded by the first one that I didn’t even really see the other two. One of the things they showed me was object oriented programming – they showed me that but I didn’t even see that. The other one they showed me was a networked computer system… they had over a hundred Alto computers all networked using email etc., etc., I didn’t even see that. I was so blinded by the first thing they showed me, which was the graphical user interface. I thought it was the best thing I’d ever seen in my life.
Now remember it was very flawed. What we saw was incomplete; they’d done a bunch of things wrong. But we didn’t know that at the time, and still we thought the germ of the idea was there and they’d done it very well. And within – you know – ten minutes it was obvious to me that all computers would work like this some day. It was obvious. You could argue about how many years it would take. You could argue about who the winners and losers might be. You couldn’t argue about the inevitability; it was so obvious.
Steve Jobs on his visit to Xerox PARC – clip from Robert Cringely’s TV documentary “Triumph of the Nerds”.
> The Apple II was just another nothing-special 8-bit machine.
The Apple II was the first mass-market mainstream personal computer as we know it. (#) Due to the pace of technology, it was undeniably mediocre compared to computers released just a few years later, but it certainly wasn't "just another" 8-bit computer.
(#) Yes, there were predecessors that could be called "personal computers", such as the HP 9100A, but at $37,000 in today's money that wasn't remotely "mass market", and the Altair 8800 - part of the scene that led to the development of the Apple II - was very much a geek hobbyist toy that used flip switches (no keyboard as standard!) and was hardly mainstream-friendly. The Commodore PET and TRS-80 were the Apple II's contemporaries, but didn't hit the streets until slightly later.
To AMD to design the Zen core ... tearing it up in the laptop and data centre markets.
... to Tesla on gardening leave
... and to Intel.
Apple should've said yes....
And where did Jim Keller come from....
DEC Alpha team...
... x86-64 at AMD (better Intel than Intel)
... SiByte, bought by Broadcom for its BCM14xx line (better MIPS than MIPS)
... PA Semi ... bought by Apple (better PowerPC than IBM)
... Apple ... better ARM than ARM.
Seriously interesting career...
... PA Semi ... bought by Apple (better PowerPC than IBM)
At last, someone else who remembers PA Semi for their truly prodigious PowerPC SoC. It was fantastic.
When Apple bought PA Semi, the first thing they did was abandon PA Semi’s chips, leaving the customers high and dry. Unfortunately, one of those customers was Uncle Sam / the DoD, and Apple were obliged to resume supply. The chips had been incorporated into military kit pretty quickly...
Oh, and strictly speaking, AMD came up with x86-64, not Intel. So right now it’s AMD who are a better AMD than Intel are (apologies).
Sounds about right.
EvilCorp: "We aren't interest in your product idea."
Engineer: "OK, I'll go make it myself."
EvilCorp: "Go f___ yourself, WE don't want your product idea but nobody else can have it either, we're suing."
Make no mistake -- Apple has the PR guys to imply they are all sweetness and light, but Apple has in fact always had rather poor corporate behavior; if they had a grandma, they would have sold her off years ago to make an extra buck.
Anonymous, because they are the kind of company that'd decide they do have a grandma and try to sue for slander.
With the arrival of ubiquitous multi-threading, the separation between server and desktop CPUs became somewhat blurred. From that CPU they could have derived something for graphics workstations, high-end laptops and desktops, and so on. Given Apple's prices, and the overheating problems with parts coming from their suppliers, that CPU did have a market within their business. Saying merely that the iGiant is a consumer-focused biz seems a weak excuse; isn't it possible that they had other reasons, like undisclosed (and illegal) no-compete agreements?
There has to be more to this; it is odd corporate behaviour. Is it possible that they have some agreement with one of the existing companies to stay out of this business? It would explain why they are bending over backwards to stop this guy if they are obliged to stay out of it. Keeping that a secret from someone in his position wouldn't make sense, though.
Good read. Very interesting topic and comments.
Apple has more recently been accused of mistreating its employees, hiding money offshore, and not paying taxes. It has also been accused of violating legislation and misusing its position in markets where it has a monopoly. Evil corporations can be seen to represent the danger of combining capitalism with outsized hubris, and in real life, too, corporations have been accused of being evil. Let's sit back and wait to see what truth comes out of this bad-blood lawsuit in the Silicon Valley semiconductor industry.