
The future
I bet these companies can't wait for the ability to use a Men in Black-style neuralyzer to wipe all your intellectual knowledge on the day you resign.
Apple's iPhone chip designers mulled creating their own homegrown server processor for the Silicon Valley giant – but were shot down by Steve Jobs after they presented the idea at an internal meeting. In the following years, Apple executives continued to refuse to give the project the green light, arguing the iGiant is a …
Post-Jobs? The man himself turned it down, and he's been dead nearly a decade. Can we stop idolising the arrogant asshole now?
Whilst we're at it, can we stop saying how "innovative" Apple are? They haven't "innovated" anything; literally everything they've made was already available from others. At best they merely refined it, or told you theirs was better. They're nothing more than a marketing company with a technology arm...
To be vaguely fair, Apple, if nothing else, have innovated in how they package and market products to the masses. When I repaired Apple products in the 90s it was pretty niche; outside of a few Macs in schools and at design shops, there were precious few people using them.
iMacs, iPods and clamshell iBooks changed that rapidly. I can't think of an equally rapid rise in market share, other than perhaps Samsung and Huawei in terms of hardware.
> can we stop with saying how "innovative" Apple are, they haven't "innovated" anything
Jobsian Apple excelled at making sure everything WORKED in an integrated manner. It was chaotic between his leaving/rejoining and it's become chaotic again since he died.
Reality Distortion Field aside, he actually did have a positive influence in making sure stuff tied together "almost" seamlessly.
Typical post-Jobs Apple if you ask me, “Innovation? Meh.”
Jobs stopped them doing it too.
I can understand why the company might not want to commit billions to a project it has very little strategic interest in, like building its own server CPU.
But what I don't get is why the company gives a damn when the ex-staff have voted with their feet and are pursuing the idea anyway, on their own resources. People leave all the time. So what? They're not even competing in a sector Apple cares about. Even if one of them was a bit cheeky with the timing of when it was set up, that's really not worth spending any lawyer time on at all.
A better approach would have been to make a small investment with very few strings attached, if only to have some goodwill and polite obligation flowing back to the Apple mothership. That way there'd be some control, and it'd be cheaper than lawyers.
Pick one or more of...
1) They dared to go against management and leave.
2) Potential to make management look foolish should they succeed.
3) Apple lost good people to this, which they perhaps could have otherwise retained.
4) Apple are run by a bunch of dicks.
There's a standard clause in contracts of employment where I work that says if you go to work elsewhere, you're not allowed to approach any former colleagues to try to recruit them into your new company/employer for six (or is it 12?) months.
We also have clauses in contracts with customers to prevent them from approaching staff with a view to direct employment. Indeed, one guy who left of his own volition to go and work for a customer ended up having to prove that he'd gone voluntarily and that the customer had not tried to poach him.
I agree, but I also think Jobs was probably concerned that they would have to spend a lot of time and money fighting Intel. I'm not sure of the timing, but if they were courting Intel as a supplier and weren't yet a bigger company, I can only assume they didn't want to open another fight on another front, as that generally doesn't end well.
That said, I still think these days they should definitely be considering a takeover of AMD, then moving from Intel to building their own in-house chips.
It's been quite a while since I was at AMD, but I see a real culture clash if they tried that.
Plus, Intel's anti-competitive behavior is a HUGE reason that AMD has never been acquired. The industry desperately wants Intel to have effective competition. Intel needs someone to keep the regulators at bay. Otherwise, AMD would never have gotten the 286 contract in the first place.
But no one wants to be the "other" when negotiating for parts with Intel.
They've only been innovative when Wozniak and engineers like him were there. Now they just buy other people's tech and rebadge it.
And if Jobs was alive, he'd be suing also. Why? Because he was a cunt just like the Apple of today are. Why? Because they are essentially shouting
"No you can't make the server chip! No, no, no." Jobs dies. They carry on for 10 years: "No, no, no." You quit and start up your own company to make that chip you pitched to them, the one they rejected for 10 years. "What? We rejected you for 10 years, but fucking no way are you gonna make that chip without us. Despite not wanting it, we're gonna be petty cunts because we didn't want you to quit but you did anyway, so we'll sue you until you run out of money. Then you won't be able to make your chip and we won't benefit from the chip we never wanted. But at least we'll have stopped you. We'll totally forget the history of how Apple started, where Wozniak had to offer the Apple I to HP first as he'd been working on it on their time. They said they didn't want it, but it was fine for him to go make it on his own. We'll forget that part of history because we're cunts."
The colourful language was required to express how much of an arse Apple are being.
Mac Minis, probably – "or dozens of them — hard at work," according to their current website, which depicts rows of Mac Minis in racks.
The cynic in me says this is how they portrayed the Mac Mini server in days of old too.
https://www.apple.com/uk/mac-mini/
Apple outsources a lot of its cloud to Amazon and Microsoft. They keep building new datacenters but their services are growing fast enough that they can't catch up unless they start building datacenters a lot faster.
There probably isn't a huge benefit to using their own CPUs for their cloud - sure they would be cheaper, but now that AMD is competitive Intel is being forced to drop its server CPU pricing, so the delta is smaller than it was a couple of years ago.
The big win would be if they wanted to provide an API for developers to run functions in the cloud rather than on a user's iPad; having the same CPU would make that easier. That could be useful for tasks like Photoshop, though the user's files would need to already be in the cloud, or they'd need a very fast link (with very fast upload) to make the time spent uploading and then downloading again worthwhile.
>There probably isn't a huge benefit to using their own CPUs for their cloud
Agree, however there probably is a market for a new server CPU (and associated chipset?) and selling that - just like ARM, Intel, AMD etc. do - to third-parties to do whatever with.
"Agree, however there probably is a market for a new server CPU"
Let's see - we have:
- x86
- ARM (promising but market unconvinced at present)
- POWER & z (not terrible, but they struggle to meet new people because of their big blue friend who's a little pushy)
- SPARC (semi-retired, mostly just happy that he's healthier than Itanium)
- Itanium (on life support, family considering pulling the plug)
- MIPS (I think I can...I think I can...)
- recent legacy (PA-RISC/Alpha)
- older legacy (Z80, pre-RISC 8086, 68000, VAX etc)
At an extremely simplified view, server CPUs aim to arrange different functional groups to deliver maximum sustained performance. To deliver high performance, you increase the functional groups to the limit of your process technology, if you can afford it. Most of that list cannot afford to hit the limits of current process technology, because it is so expensive and, if you get it wrong, you lose a lot of money.
It's almost as if the architecture isn't the key issue and that there's something that sits above the architecture that drives development of successful CPUs...
I suspect you're right about it being an ARM solution.
My point about what is required to succeed is that it's not the architecture or design that will allow your product to succeed; it's getting the backing and finding the market.
Working for Apple, they had some of the backing and the market. I suspect that if Nuvia do successfully produce a design, Apple will still be the market, probably as a low-power desktop chip with a scale-out option for server workloads...
Given the number of companies and the amount invested in trying to get ARM server CPUs off the ground, I suspect Nuvia require 10x the capital they have to succeed (i.e. $2-3bn), given the current ARM server manufacturers (i.e. Ampere eMAG, ThunderX2, Neoverse N1, A64FX and Huawei) are struggling for market share. And it will be hard to beat Huawei in the long term if it remains viable.
FPGAs don't scale up. A data centre stuffed with FPGAs is a very expensive thing to run. They're only good for small-scale deployments, and even then only for certain applications. Plus, the bigger FPGAs are eye-wateringly expensive. And they're a bastard to develop for; I've seen FPGA-based developments take so long that they failed to keep up with the software guys, who'd simply been waiting for Moore's law to solve their problems.
If large-scale deployment is required, it's far better to get an ASIC built from the same VHDL (or whatever) and save a ton of energy.
I'm not that confident about RISC-V, tbh - I suspect it will go the way of MIPS: promising but never reaching its potential.
Why? ARM will continue to provide pretty decent performance at a very low cost, so for educational purposes, it is sufficient.
For high-performance server chips, someone needs to take RISC-V designs and add all the cache logic, longer pipelines and scalability. None of it is hard to do with bright people, but it is hard to do when each shot at it is likely to cost tens of millions of dollars. And if you do everything as cheaply as possible (older process nodes, low volume, low frequencies, proven design rules etc.), I'm not sure you will match mainstream ARM performance even with lots of cores, at which point commercial ARM server chips look more practical.
TL;DR: designing your own competitive CPU is expensive and there are a lot of cheap alternatives out there on multiple architectures.
>There probably isn't a huge benefit to using their own CPUs for their cloud
Apple have the biggest coffers in the world, around $400 billion they do not know what to do with.
Building a server CPU and a kick-ass Apple server is something they could do just to keep the talent happy.
If they ever built an "Apple server" it would be for their internal cloud only. That would probably interest their engineers a lot more - building for their internal use they'd have clearly defined use cases instead of having to build something for everyone, and would have a lot more blue sky opportunities for its design without the constraints of selling something that would be marketable to and serviceable by external customers.
> There probably isn't a huge benefit to using their own CPUs for their cloud - sure they would be cheaper but now that AMD is competitive Intel is being forced to drop their server CPU pricing so the delta is smaller than it was a couple years ago.
Given current server CPU chip shortages, I wouldn't be so sure about that. Once burned, Apple likes to own their supply chain.
>Once burned, Apple likes to own their supply chain.
Probably, but if they did, bursting out to the public cloud is going to be much harder.
AMD is probably not an excellent fit either. Apple need good mobile CPUs for their laptops, whereas AMD appear to be console, mid-range and enthusiast-desktop focused, missing the dual/quad-core laptop high-end.
I suspect Apple will focus on their ARM IP and regard servers as a cost of doing business. It will be interesting to see if that changes as the consumer market cools off, but I can't see any other cloud provider picking up Apple's tech, and I can't see Apple particularly wanting in-house proprietary server tech. I suspect choosing between AMD and Intel is just fine for them. If ARM comes to the table for scale-out, low-latency data-moving, that would be OK, but they aren't really in the compute-cloud business like the others. This might change if they start embedding ARM in their desktops, or if they make an iPad-terminal type system.
iCloud is powered by white unicorns beaming rainbows from their horns into candy fiber optic links, all in a heavenly walled garden floating in a cloud. It's not a dark room full of dirty, heartless machines running Windows and Linux.
"Got to wonder what they run the "apple cloud" on"
"Incredibly, in the weeks before Apple took its ex-chief architect to court, the multi-billion-dollar behemoth privately told Nuvia to stop recruiting engineers from its ranks of techies, yet behind the scenes, the iPhone giant was trying to hire one of the startup's top designers."
Translation: "How dare you do the things that we do!! What gives you the right to do the exact same thing that we do?"
Let us not forget that Apple was one of the gang of four that settled over their illegal no-poaching pact.
Now they are trying the same thing on with these guys. I have no doubt that any Apple ex-staffers who went to the new outfit did so because it was really cool stuff to do, and it is likely (knowing designers) that they applied rather than being deliberately poached.
Not really... Apple's strength has historically been in "technical marketing" rather than true innovation. Of course, they have innovated from time to time, but what makes Apple impressive is their ability to combine a bunch of existing stuff into something they can sell as special.
To be honest, even their marketing approach is largely a refinement of other people's. I remember a lot of Macolytes I was working with at the time gushing over wireless charging, when I'd had it for years on a Google Nexus 7 tablet and a Nokia Lumia 925 phone.
The remove-a-feature-and-jack-up-the-price approach is a favourite of car manufacturers, particularly the German companies. Back in the days when all cars had ashtrays and lighters as standard, BMW were amongst the first to remove them and charge you for the privilege of them not being fitted.
The Apple I was slightly ahead of the curve in '76, but not by much. The Apple II was just another nothing-special 8-bit machine.
The Mac GUI paradigm was stolen wholesale from Xerox.
The iPhone was NOT the first touch screen smartphone, it simply had the biggest marketing budget with a large number of dedicated Apple fanboys to give it initial traction.
As for the rest of the overpriced toys coming out of Cupertino, innovative? Give me a break.
Apple was indeed innovative in what they did in music, and they had a touchscreen iPod out for a while before giving it a mobile radio. That was innovative.
So what if it wasn't the first touchscreen mobile on the market? Those others had piss-poor user experiences, which goes to show that they just wanted to be first with the technology and weren't very innovative in making it useful.
Check your history. Apple did NOT steal the GUI idea. Xerox, against the wishes of their PARC staff, took a payment of Apple stock to demonstrate networked computers, object orientated programming and the Xerox GUI, which Jobs immediately saw as the future for personal computing.
“I had three or four people (at Apple) who kept bugging that I get my rear over to Xerox PARC and see what they are doing. And, so I finally did. I went over there. And they were very kind. They showed me what they are working on.
And they showed me really three things. But I was so blinded by the first one that I didn’t even really see the other two. One of the things they showed me was object oriented programming – they showed me that but I didn’t even see that. The other one they showed me was a networked computer system… they had over a hundred Alto computers all networked using email etc., etc., I didn’t even see that. I was so blinded by the first thing they showed me, which was the graphical user interface. I thought it was the best thing I’d ever seen in my life.
Now remember it was very flawed. What we saw was incomplete, they’d done a bunch of things wrong. But we didn’t know that at the time; we still thought the germ of the idea was there and they’d done it very well. And within – you know – ten minutes it was obvious to me that all computers would work like this some day. It was obvious. You could argue about how many years it would take. You could argue about who the winners and losers might be. You couldn’t argue about the inevitability, it was so obvious.”
Steve Jobs about his visit to Xerox PARC – clip from Robert Cringely’s TV documentary “Triumph of the Nerds“.
> The Apple II was just another nothing-special 8-bit machine.
The Apple II was the first mass-market mainstream personal computer as we know it. (#) Due to the pace of technology, it was undeniably mediocre compared to computers released just a few years later, but it certainly wasn't "just another" 8-bit computer.
(#) Yes, there were predecessors that could be called "personal computers", such as the HP 9100A, but at $37,000 in today's money that wasn't remotely "mass market", and the Altair 8800 (part of the scene that led to the development of the Apple II) was very much a geek hobbyist toy that used flip switches (no keyboard as standard!) and was hardly mainstream-friendly. The Commodore PET and TRS-80 were the Apple II's contemporaries, but didn't hit the streets until slightly later.
To AMD to design the Zen core ... tearing it up in the laptop and data centre markets.
.. to Tesla on gardening leave
... and to Intel.
Apple should've said yes....
And where did Jim Keller come from....
Dec Alpha team...
... x86-64 at AMD (better Intel than Intel)
... SiByte, bought by Broadcom, BCM14xx (better MIPS than MIPS)
... PA Semi ... bought by Apple (better PowerPC than IBM)
... Apple ... better ARM than ARM.
Seriously interesting career...
... PA Semi ... bought by Apple (better PowerPC than IBM)
At last, someone else who remembers PA Semi for their truly prodigious PowerPC SoC. It was fantastic.
When Apple bought PA Semi the first thing they did was abandon PA Semi’s chips, leaving the customers high and dry. Unfortunately one of those customers was Uncle Sam / DoD, and Apple were obliged to resume supply. It had been incorporated into military kit pretty quickly...
Oh, and strictly speaking, AMD came up with x86-64, not Intel. So right now AMD are a better Intel than Intel are (apologies).
Sounds about right.
EvilCorp: "We aren't interested in your product idea."
Engineer: "OK, I'll go make it myself."
EvilCorp: "Go f___ yourself, WE don't want your product idea but nobody else can have it either, we're suing."
Make no mistake -- Apple has the PR guys to imply they are all niceness and light, but Apple has in fact always had rather poor corporate behavior; if they had a grandma they would have sold her off years ago to make an extra buck.
Anonymous, because they are the kind of company that'd decide they do have a grandma and try to sue for slander.
With the arrival of ubiquitous multi-threading, the separation between server and desktop CPUs became somewhat blurred. From that CPU they could have derived something for graphics workstations, high-end laptops and desktops, and so on. Given Apple's prices, and the overheating problems with parts coming from suppliers, that CPU would have had a market within their own business. Saying merely that the iGiant is a consumer-focused biz seems a weak excuse; isn't it possible that they had other reasons, like undisclosed (and illegal) non-compete agreements?
There has to be more to this; it's odd corporate behavior. Is it possible that they have an agreement with one of the existing companies to stay out of this business? That would explain why they are bending over backwards to stop this guy, if they are obliged to stay out of it. Keeping that secret from someone in his position wouldn't make sense, though.
Good read. Very interesting topic and comments.
Apple more recently has been accused of mistreating its employees, hiding money offshore, and not paying taxes. It has also been accused of violating legislation and abusing its position where it has a market monopoly. Evil corporations can be seen to represent the danger of combining capitalism with hubris, and in real life too, corporations have been accused of being evil. Let's sit back and wait to see what truth comes out of this bad-blood lawsuit in the Silicon Valley semiconductor industry.
Biting the hand that feeds IT © 1998–2022