I thought it was about time Apple changed CPU architecture
It's all been a bit stable for too long.
For years, Apple has been said to be working on its own in-house Arm architecture to replace Chipzilla's silicon. And, while nothing is confirmed, famously reliable analyst Ming-Chi Kuo reckons they'll come in the first half of 2021. "We expect that Apple's new products in 12 to 18 months will adopt processors made by 5nm …
I rather suspect it's got nothing to do with the readiness of the tech and everything to do with the current state of the relationship between Apple and Intel.
There's probably a break clause or a volume/cost step coming up, or Intel's shortages are starting to hurt their Apple shipments.
These stories always tend to leak around those points.
Icon - bit of Intel wallet lifting approaching --->
It's easy to make claims about 'ARM having caught up with Intel CPUs', but color me a sepia-shade of skeptical until this is backed up with some solid evidence. When for example the Raspberry Pi 4 (with zippy quad Cortex-A72 cores) is promoted as a 'desktop replacement', but as Wired notes, this would be a 2012-era desktop: https://www.wired.com/review/raspberry-pi-4/
Until there are some solid apples-to-apples (excuse the pun) benchmarks that show that Apple's ARM SoCs come even remotely close to a budget AMD APU in a fair comparison, I'll take this analyst's claims with a 1 kg bag of the finest skeptic's salt.
"Erm, try looking at Geekbench results. The current Apple A13 is slightly faster at single core work than an AMD Ryzen 3800. It loses in multi core, but then it only has 2 fast cores vs AMDs 8."
Geekbench 4 benchmark from gadgetversus.com/processor/amd-ryzen-7-vs-apple-a13-bionic/:
Apple A13: single core 5,432; all cores 13,554
AMD Ryzen 7 3800X: single core 5,738; all cores 36,130
A13 vs Snapdragon 865 shows the A13 to be 93.7% of the performance of the SD.
"Now, it is a personal choice whether you like iPhone or Android devices, otherwise on paper Snapdragon 865 comprehensively beat the Apple A13 chipset."
The Geekbench score for the A13X in the latest iPad Pro is a little higher than that of the i7-8700U.
Power draw is way lower and it fits in a much smaller space.
Add more cores, up the clock speed and add the extra cooling this requires, such that it matches the Intel U-Series space and power budget, and you may well have something that is way faster than Intel.
Raw numbers are fine for the "mine's faster than yours" brigade. What we need are actual numbers for running things like Photoshop under macOS and real desktop multi-tasking results. Of course, we won't see that until Apple actually releases something.
The architecture has been fine-tuned for lower power and "co-operative" multi-tasking, with non-visible tasks sleeping most of the time. I suspect this is where Apple has been concentrating its efforts, getting their ARM design optimized for a general computer, as opposed to the more controlled environment found on an iPhone or iPad.
It will be interesting to see if they can turn that theoretical advantage into a real one.
Apple also has the advantage that they have done this before. As well as Rosetta, they had universal ("fat") binaries, which packaged both PowerPC and Intel code in one executable. They just need to bring that back into play for Intel and ARM, then developers can cross-compile to both platforms in one package.
Of course, the emulation would need to be there for software that hasn't been re-released in dual-executable format.
I did some benchmarks with an iPhone XR vs a quad-core iMac, running some math-intensive code. The same code, using all available cores, ran faster on the iPhone with its 2 fast and 4 slow cores than on the iMac.
Apple's iPhone processors are not the same as those in a Raspberry Pi.
I've been doing a few projects in 'C' in which the tests include running benchmarks on multiple platforms. One of them is a Raspberry Pi 3.
Using a wide mix of mathematical operations on arrays, on average a Raspberry Pi 3 comes in at close to 20% of the performance of a mid-range desktop Intel-compatible CPU running at 1.8 GHz. That's an average though, and isn't normalised according to how frequently those instructions would be encountered in a typical application.
On an individual basis the results can vary from the RPi 3 actually being faster than the Intel CPU down to only 10% of its speed. A lot depends on the individual operation and whether a SIMD instruction is available for that task.
A Raspberry Pi 3 is ARMv7, which only has 64-bit SIMD vectors, while the Intel CPU has 128-bit vectors (256-bit is theoretically available, but the Intel architecture is such a mess of conflicting variants that it's not practical to use it on a large scale). An ARMv8, such as is found in a Raspberry Pi 4, also has 128-bit SIMD vectors. I plan on getting my hands on a Raspberry Pi 4 to conduct more testing, but essentially it should nearly double the performance of many operations just based on having twice the SIMD width. The faster clock rate will help as well. Apple's hypothetical ARM chip, whatever that turns out to be, will I suspect be even faster than a Raspberry Pi 4 and will likely have a faster memory interface as well.
I also benchmarked an old, low end laptop dating from the Windows XP era (I put Linux on it for the test). It came in as slower than a Raspberry Pi 3.
Many of the sorts of things which do very intensive numerical operations can benefit greatly from SIMD and other such hardware acceleration. There's nothing to prevent Apple from adding special processing units or instructions targeting particular use cases, such as image or video editing, to get even better performance. GPU acceleration will play a part in this as well, and some of the really intensive tasks may make use of GPU-based calculations.
To make a long story short, the actual performance of a hypothetical Apple ARM PC will depend a lot on the specific use cases and how the application software was written. UI performance will be primarily GPU bound, and I don't expect Apple to use a slower GPU.
Based on my own experience, I expect that synthetic benchmarks that look at some sort of average will be almost meaningless when comparing CPUs which have a different fundamental architecture, and we would need to see application oriented benchmarks to get a real idea of how it would perform in real life.
Pi 3 is ARMv8 - as is the Pi 3+ and the Pi 4. It’s just that Raspbian is currently run in v7 compatibility mode for reasons. There is a 64bit/v8 kernel for testing and several other OSs such as Gentoo.
Pi 3 is A53 40nm quad core, Pi 4 is A72 28nm. For the only benchmarks I care about (Smalltalk) Pi 4 is about 3x faster than 3 and about 25% of my i7 iMac - for about 2% of the price.
Thanks, I just had a deeper look into it and it turns out that in 32 bit mode the Pi 3 is ARMv7, but in 64 bit mode it's ARMv8. Standard Raspbian OS is 32 bit only, so that apparently limits both Pi3 and Pi4 to ARMv7. I suspect that the reason for this is so they can offer a single image to cover all currently supported versions of Pi (which include the 32 bit only Pi2) instead of offering two different versions of Raspbian. Most of their user base probably won't care one way or the other.
I had just relied on looking at /proc/cpuinfo to find out what ARM version my Pi had and hadn't dug further. I still plan to buy a Pi4 for testing, but will probably put a 64 bit version of Debian or Ubuntu on it to get the best performance and improved SIMD support.
I found a blog post where someone ran both 32 and 64 bit benchmarks on a Pi3 (using Debian for the 64 bit) and he got very significant performance improvements by using 64 bit on the same hardware. Even without SIMD, 32 versus 64 bit makes a very significant difference on x86 as well, according to my testing, so that shouldn't be too surprising when it comes to ARM as well.
As Apple would undoubtedly run any hypothetical ARM laptop in 64 bit mode, that needs to go into the ARM versus x86 comparison discussion as well. Start to add all these factors up and an ARM CPU starts to look quite credible from a performance standpoint.
Thanks again, I'll be taking this into account in my future plans.
I tried to buy another Pi3 recently and they were out of stock so I had to get a 4. The 4 is a lot faster than the 3 but I was disappointed to find that it gets hot -- many cases have fans built into them. This is great for a little computer but not something I want to run 24/7.
Since Pis tend to use 'last year's processor' I can only speculate what a desktop with a state-of-the-art processing system would be like -- I suspect it would haul.
On the other hand, running a math benchmark on a multi-tasking PC is never going to give you the full performance, unless you boot straight into the benchmark with no macOS running in the background. iOS has an advantage there, in that it sleeps a lot of the background stuff.
But it is still indicative. What will be interesting is to see if their ARM architecture can do full multi-tasking and what Apple had to tweak to get it from the phone to the laptop. I think it will be very interesting.
Common math libraries such as Numpy (used with Python) are available on ARM as well as x86. ARM also have an ARM specific SIMD math library.
If you use GCC, then using SIMD vector code via built-ins (pseudo-functions which patch SIMD instructions into C code) is pretty straightforward, and actually much easier and better documented with ARM than it is with x86. If you can write GCC SIMD on x86, then doing it on ARM is a breeze in comparison. The main thing is to have a good set of unit tests to automate the testing process on multiple architectures, and that's something you really ought to have anyway.
If you are using LLVM, then the SIMD situation is not as good, as their built-in support is very limited. Auto-vectorisation (letting the compiler try to do it itself) on LLVM can help to only a very limited degree. It can work on very simple things, but often with SIMD you need to use a different algorithm than you would with generic C code and there's really no obvious way to express that in C.
If Apple suddenly start making major commits to LLVM relating to ARM support to fill in some of these gaps in compiler support (as compared to GCC), then it might indicate they have something up their sleeves with respect to future hardware releases.
Interested parties should check out the YouTube channel of one Christopher 'Haircut' Barnett, where he is using an RPi for everything for a week: https://www.youtube.com/watch?v=i3ddj9_LMgU
I use a Raspberry Pi for general tasks. If I sit down at a computer at home then it is a Pi. I still use Intel and Windows for mobile things that I can't do on a Gemini. I would really like a Pi laptop; that would mean I'd only keep a Windows machine for stuff not available for the Pi.
Who says it has to come remotely close?
For 80% of use cases 2012 chippery is fine. For everyday tasks my 2013 MBA works better in most cases than my stupidly power hungry 2018 6 core MBP.
I also have a Sony Vaio TT from 2008 with an absolutely weedy Core 2 Duo U9400 that runs Win 10 fine for my daughter.
In fact, after my work MBP the most powerful CPU I currently own is probably the one in my 10" iPad Pro.
One of the great advantages of the Apple ecosystem is the huge variety of professional software that's available for Macs. That's all currently coded for x64 hardware, so a good and efficient emulator will be required in the first instance. The interesting thing is going to be how much of that software is rewritten to make it directly compatible with the ARM architecture.
Given Apple's previous record, I expect them to stuff up at some stage and successfully upset their customers.
To be fair, the move from 32bit to 64bit was nothing more than a recompile. I'd expect this to be the same.
HOWEVER: If you've been lazy / underfunded / work in agile / etc, and used / relied upon 3rd party libraries, then I pity you, because you'll either sit in Apple purgatory, or need to dump them and write your own.
I read that as Vanity Professionals.
Really, relative to Apple's profits, the sales of kit for video editing, DTP or gaming etc. are small change. They already showed how much they cared about some users with the waste-bin model. The MacBook Air and similar might as well be an iPad with a keyboard: coffee-shop hipsters using the web, or aspiring writers using Scrivener.
I thought most/all software you download is 100% CPU specific? (ie binary?)
Apple has a pretty poor track record of backwards compatibility, even dropping 32-bit Intel support in the latest macOS, a cost-saving blunder. They could also have kept Rosetta alive, maybe isolating those apps in container-like technology to minimize security risks.
Most macOS-specific software runs slower than under Windows.
macOS only supports AMD GPUs.
Hence in this light moving to Arm might not matter much, because performance sensitive customer workloads would have used Windows anyway.
"Apple has a pretty poor track record of backwards compatibility,"
Not in my experience. I've weathered the changes from Motorola to Power to Intel without any problems or loss of software. During the same time with Microsoft mostly on Intel I've had the experience of the worst backward compatibility I can recall, as even relatively minor OS version changes have resulted in software that stops working. By the time Apple has dropped support for a previous version of software the manufacturers have released native versions as part of the upgrade process.
Newton. Apple did invest in ARM.
Steve came back, killed the Newton. Worst decision he made.
MS bailed out Dying Apple.
Then maybe the iPod because of iTunes saved Apple?
Also, the bulk of profit is from Arm-based phones (innovative carrier/data-plan marketing), with Tablet and Watch a bit behind.
So they've been doing it for 13 years.
The Apple laptops are mainly so the Genius bar and HQ laptops don't have Dell logos.
Mine's the one with a Moto Dragonball PDA in the pocket.
> Steve came back, killed the Newton. Worst decision he made.
Can you expand upon that point? Genuinely curious.
I only know that other stylus-driven devices that followed the Newton (Palm, WinCE) remained relatively niche and took a different approach to text input (having the user learn stylised letters instead of the device transcribing natural handwriting). Even today, stylus-driven phones such as the Galaxy Note are rarer than their stickless cousins.
Certainly they needed to change the approach on text input. It was too hyped and too ambitious. However, the Newton PDA in general, especially with Mac sync, would have been worth sticking with.
It would have helped them in Smart phones (9 years late to market) and the iPod (a few years late to the solid state PMP market).
As it was they had to buy in the GUI and most of the iPhone design!
They were totally struggling till the record company per track deal & the iPod. The next big boost was the carrier data deals for iPhone, till then only rich & corporates could afford data access.
The eBook and eink ereader is niche, but Amazon has 90% or more of both.
I did play with a Newton at the time and thought it lacked useful applications and that the handwriting aspect was rubbish.
> It would have helped them in Smart phones (9 years late to market) and the iPod (a few years late to the solid state PMP market).
For sure, but being late to market didn't hurt Apple's bottom line - although of course we can't overlook the role of chance when we play alternative histories :) Is it possible that if Apple had maintained the Newton line through to the mid 2000s it would have been harder for Apple to make an iPhone-like device? If we look at Symbian, whose Psion forebears were contemporaries of the Newton, we could say the legacy was in some ways a disadvantage when the state of hardware (lots more RAM, low-power GPUs, capacitive touchscreens etc.) changed.
>As it was they had to buy in the GUI and most of the iPhone design!
Some elements of the UI were bought in from Fingerworks, notably the touch gestures. Other elements of the UI, including the GUI, were Apple's - a two man research team at Apple had been playing around with iOS-like concepts for a few years before the OSX iPhone (as opposed to the competing iPod iPhone) was green-lit.
Not really the worst decision. The Newton was held back by the tech available at the time, and failed as a result. The concept of a PDA didn't start to pick up until the Palm Pilot, around 3 years later, and, while I loved my Palm Pilot, it never reached the heights of the smartphone today. The Psion organisers did well, but they didn't get the mainstream acceptance they needed. The Newton, along with Apple's misguided attempt at a digital camera, is arguably one of the causes of the company's near-failure.
Steve Jobs came back, axed a whole load of products (including the Newton), and with an injection of money not only restored Apple but built it up to today's behemoth.
Personally, I doubt Apple will move the Mac to ARM. There are still a lot of companies using some very expensive software on their Macs, which they are likely unwilling or unable to change. Yes, it's true on a basic level that to change architecture you can just change the deployment platform in Xcode, but merely doing that is unlikely to give a good experience to the user.
They may or may not have issues with Intel, but I think it more likely they will move to an AMD CPU. After all, they already have a trading relationship with AMD, and there will be few issues for users.
I would be happy to be proved wrong, but I rate the Mac-on-ARM rumour as about as believable as the "Apple building a TV" rumour that pops up every few years, or the odd "iMac as a dock for the MacBook Pro" rumour I've seen from time to time (which is actually something I'd like to see).
Apple's rumoured Arm chipsets will inevitably have to run existing software in an emulator, which will have a knock-on effect on overall performance.
Not necessarily. For an EMPowering Emulation with Sees that Deliver Virtual Realities for Earthlings to Mentor and Monitor/Host and Provide has much more of a kickstart effect upping overall performance.
Are Apple into such? Do they Exercise Lead? Do they wish to Exorcise Lead/Alter Current Direction with Basic Instructions Leading One and All to places and spaces like here to hear of what there is to see when always coming home and future promised lands bound?
There's at least one being interested in whether it be positively answerable yet?
* The Certain Fatal Mistake to Make is Not to Care to Realise the Consequences of Thoughts Bordering on Being Almighty in an Earthly Orbit. All take the greatest of care there to ensure and guarantee creative and constructive rather than destructive and depressive prevails and delivers Future Bounty to All.
I'd personally prefer to see Apple switch to AMD processors (which would keep in line with them using AMD graphics) for their "Pro" line of desktops and laptops for a good few more years yet, until we can see proof that Apple's silicon is absolutely suitable for running a macOS desktop and CPU-crunching applications (e.g. by increasing the number of cores). Moving to the A-series of ARM processors would cause headaches for those that use Macs with Windows VMs or Boot Camp, unless Apple has plans to provide compatibility with Windows on ARM. But that seems rather distant given what I've been reading about Microsoft's Surface Pro X.
ARM seems more suitable for consumers right now, but less so for more CPU- and GPU-munching professionals. I'm still slightly miffed that Apple provides such a small amount of RAM in iPhones. While I'm sure it manages memory reasonably well, I begrudge having to wait for it to close my bus ticket app in the background because something else needs the RAM (even if I don't touch the phone for the rest of the day, it takes a while to launch the app and go through the rigmarole of getting the ticket ready - TfL this isn't).
Plus, is it feasible for Apple to continue to use Intel's Thunderbolt in ARM-based Macs? What's the licensing like for that?
I've no doubt Intel will continue to license Thunderbolt to Apple, or anyone else who wants a licence, even if Apple switched to ARM for their CPUs. Intel would still like some licensing money from Apple, and I doubt they make a huge amount from selling laptop and desktop CPUs; Intel is making most of its money from the expensive datacentre chips at the moment.
But Thunderbolt 3 is to be included in the USB4 spec, so once that happens they won't need to pay for a separate Thunderbolt licence, as it will come with their USB licence.
Last I'd heard, Intel badly fumbled 10nm (giving AMD a royal red carpet), and was trying to get 7nm going because 10nm was royally FUBAR'd.
And now Apple suddenly has 5nm on the roadmap?
Is there some alien tech I've not heard of yet ?
Someone please explain. I'm in the dark here.
Biting the hand that feeds IT © 1998–2020