>Apple has experience here, following the move to Intel from PowerPC.
And Motorola 68K ==> PowerPC before that.
Apple will reportedly release its own Arm-powered Mac computers next year, marking the biggest component change since Cupertino ditched IBM's PowerPC chips in 2005 for Intel's x86 line. Citing anonymous internal sources, Bloomberg claims Apple has designed three laptop processors based on the upcoming A14 platform, with the …
And those parts of Apple that came from NeXT had experience with NeXTSTEP on several architectures (SPARC, PA-RISC, x86 and Motorola 68000), no doubt made a lot easier because running on different architectures was part of its original design intent.
Edit: and of course OSX has already been ported to ARM, in the guise of iOS.
>Edit: and of course OSX has already been ported to ARM, in the guise of iOS.
Which still leaves the question, "Why do it?"
The previous chip moves provided obvious benefits. As much as I'd love to see this, it wouldn't be an upgrade.
- There aren't many people that need more battery life than we already have in the Airs.
- It would introduce an incompatibility in the laptop line, whereas everyone knows what iPads don't do.
They would be better off porting to ARM and releasing an iPad Pro-Pro with full macOS. Position it as a better iPad, not a disappointing Air.
What I would note is that MS's attempts at ARM laptops have not been a spectacular success, and the only place I've seen a lot of Google's devices is where they're dumped on kids in school. I'm rather hoping that would be aversion therapy, but I suspect that's a vain hope for the phone+Instagram generation.
They would be better off porting to ARM and releasing an iPad Pro-Pro with full macOS.
What I would note is that MS's attempts at ARM laptops have not been a spectacular success, and the only place I've seen a lot of Google's devices is where they're dumped on kids in school.
What about phones, which are pretty much mini PCs without a keyboard?
What about the Raspberry Pi and its clones?
What about ARM servers? (I know that ARM servers tend to be specific-purpose as opposed to general-purpose, but hey, it's there)
The shift to PowerPC was a lot better received, because Apple were already using a Motorola CPU, so using one developed by Motorola and Apple (and IBM) was seen as an upgrade. (And of course, it could emulate 68k code faster than any real 68k CPU could run it, so it was a big upgrade).
In contrast, when it was first announced that OSX would be running on Intel's x86 CPUs there was much wailing and gnashing of teeth, and lo, they did begin to doubt the word of the prophet Jobs and curse the name of Intel.
"And of course, it could emulate 68k code faster than any real 68k CPU could run it"
That is not completely accurate. The first releases of the PowerPC chip did not have enough cache to run 68k code faster. Power users with more immediate needs continued to buy 68K models, while more future-leaning customers bought PowerPC models, feeling OK about waiting for native code to arrive. Later releases of PowerPC bumped up the clock rate and added larger caches to gain parity.
Faster 68K chips were being released, but never bundled into Apple systems. Apple stopped third-party manufacturers from releasing processor upgrade cards by restricting licensing of the Apple ROMs when those vendors offered Motorola CPU upgrade cards that out-performed the PowerPC architecture on older systems. The 68060 was used in networking equipment, some workstation and gaming systems, and desktop accelerator cards, but was never used by Apple.
It is easy to compare newer machines using a later PowerPC release against older technology that never received an upgrade, but suggesting that PowerPC emulated 68k code faster is not quite accurate.
Intel has failed to deliver most of what they have promised in the last 5 years.
As the article stated, they have lots of ARM design experience and seem to think that they can take the overall performance of a MacBook to levels that Intel can't deliver.
It remains to be seen if they can make it happen. Intel should have seen this coming a mile off.
"Intel has failed to deliver most of what they have promised in the last 5 years."
- which is approximately why Apple moved from (what was basically IBM's) PPC architecture to Intel. Motorola's failure to keep up with the curve with the 68k architecture is why Apple shifted to PPC in the first place. Perhaps Intel took the line that keeping Apple happy wasn't a priority - Apple is after all a relatively small outlet for Intel products if you take the global view.
Nothing lasts forever, of course, and both Apple and Macs will become extinct in the end. But Apple has kept the Mac line going steady through three different CPU architectures so far. Predictions of Apple's imminent demise date back to the days of the Apple ][. It's not happened yet.
I used Macs through the 68k->PPC transition and the PPC->Intel transition. It was - well, okay. I'd be willing to bet something more than a tenner that if Apple switches Macs to ARM CPUs, the transition will work at least as well as the previous two CPU architecture transitions.
[Apple has no sentimentality regarding business or technical decisions - but one might bear in mind that Advanced RISC Machines Ltd. as spun off from Acorn was a joint venture which also involved Apple and VLSI Technology and produced the CPU that powered the Apple Newton PDA.]
There's no reason to think that Apple's going to ditch Macs, or that macOS won't be transitioned to ARM code if that's what Apple wants. The job's entirely do-able.
Having said all that, every guess I've ever made in the past about what Apple's going to do with the Mac line has been completely and utterly wrong. Wait and see.
"which is approximately why Apple moved from (what was basically IBM's) PPC architecture to Intel"
not sure if that was enough: one immediate result of Mac on Intel was that you could use Windows in virtual machines, allowing users to run Windows-only software seamlessly. With the switch away from Intel these virtual machines will stop working. Well, you might be able to use virtual machines to run Windows-on-Arm, but that won't let you run the legacy Windows programs.
Windows 10 on ARM64 bundles an emulation layer for x86 (not x64 yet) apps which runs fairly well considering the circumstances; however, although Apple’s chips pack some heft it probably wouldn’t be enough for most serious work.
Also doubtful that Apple would make the effort to work with MS and get support for their chipsets / virt platforms integrated into WoA, although with that said WoA does run on QEMU with a bit of elbow grease so never say never.
I don't think running Windows was one of their primary considerations. It was enough of one that they made Bootcamp, but they did that quite a while after making the transition. I think it was mostly about getting a faster laptop that neither ate through a battery in an hour nor caused burns, given the power requirements of the G5 PowerPC chips.
However, even if Windows was one of their primary considerations then, it doesn't necessarily mean that it is one now.

There's a discussion further down about whether an iPad is similarly capable to a laptop. While I've been arguing that it isn't, my arguments have been for specific use cases. For many users, the applications they need do function at a certain level on an iPad. Most of the time, that's not because the writers have decided IOS is great and they want people to switch to it, but because many companies either put resources into cross-platform applications or have switched to web ones. In either case, they will probably have something that works fine on Mac OS.

I bet Apple doesn't think their users care much about running Windows on their hardware, and they're probably right for quite a lot of their users. They'll be wrong about some, just as there will be some people who need an Intel-compiled binary which doesn't get emulated right or just never gets updated, but I have this feeling that Apple doesn't really care about those people.
"Intel has failed to deliver most of what they have promised in the last 5 years"
I would like to know what you mean by that statement. 'Intel has failed...in the last 5 years" to deliver what, exactly?
Are you referring to Apple's failure to properly implement, and therefore failure to properly utilize, Intel microprocessors due to Apple's utter incompetence at designing proper heat management systems into most of their recent product line?
Are you referring to the fact that, for at least the last two generations, MacBook Pros have physical thermal design limits below their CPU requirements? Meaning that MacBook Pros intentionally throttle the CPU during standard-use loads, thereby never allowing the CPU to achieve the processing level that the buyer paid for?
Do you mean the well-known utter failure of the thermal design of the trashcan Mac Pros? Meaning that Mac Pro buyers never achieved the full CPU processing power that they paid for?
Do you mean the utter failure of the thermal design of the latest MacBook Air, causing outright CPU failure due to a massive failure of proper airflow design?
The world would like to know.
No, only you want to know. The world at large has been keeping up with developments.
Intel's struggles to move to smaller processes has been well documented, as have the resulting delays to their roadmap.
People complained a couple of years back about the Macbook Pros only supporting 16GB RAM - this was because the Intel CPUs didn't support a type of low power RAM above that. Apple's later refresh gave buyers the option of more RAM at the cost of less than optimal RAM power consumption.
The Trash Can Mac Pro's cooling issues were Apple's, but based on their (erroneous) forecast that using two GPUs was the future; as it was, the industry adopted the doctrine of one very powerful GPU. In any case, it was a tool for video editing based around fast external IO (storage and dedicated codec hardware) rather than chiefly CPU grunt.
If Apple is so disappointed with Intel's performance then why don't they just switch to AMD?
AMD has an entire new series of chips out, and they are both well regarded and excellent performers. And doing so would avoid the entire new-platform/incompatible-application problem.
But fans would rather listen to the belief that it's all Intel's fault for 'making' Apple think about switching.
The only people not keeping up with developments, like AMD? Better look in that mirror.
AMD are doing good stuff now, but from about 2006 to 2018 they were shipping utter shite and they never really had good laptop chips until the last year or so. They also have a history of not being able to deliver their chips in sufficient volume, though that's less relevant now they're using TSMC. I can see why Apple might be less than keen on them as an option with their track record. It just doesn't inspire much confidence in their ability to keep delivering consistently.
That is true. But my point, proven by the downvotes, is that, as usual, Apple fans think that an Apple solution is the magic bullet...to just about anything.
Intel was somehow holding Apple back from creating more "Magic!" products...by Intel's inability to make [their] 10nm process work. So Dell, HP, Acer, Asus, Razer, Microsoft (Surface), and more, were using the technology that Intel was capable of producing...but poor Apple's dream of fantasy products was impeded by evil Intel's lack of skill in getting their technology advances to work.
That's saying that we'd be on Alpha Centauri...if only those idiots at NASA would get off their butts and get warp drive into production. Or if that evil cabal of Tesla/Panasonic wasn't constantly pushing back their super capacitor products, we'd have electric cars that went 900 miles and recharge in 5 minutes.
It may be *possible* that Intel was just biding their time on 10nm, to get maximum profits from 14nm when they had no competition; if Intel quickly rolls out 10nm to address AMD, then that's a reasonable accusation. But to say that Intel was blocking Apple from creating better products is the epitome of fanboy stupidity: not only does everyone have to deal with developing at the same technology level, Intel and any other supplier/manufacturer has to put products out at the best skill level they can. If they can't create better...that's it. They couldn't create carbon fibre for airplanes in World War II, and Intel couldn't get their 10nm process up to par. That's the way technology works.
These things aren't put together the week before they're launched. If your suppliers don't live up to their promises, you're stuck having to bodge around things like the laptop running hotter than expected and not supporting low-power memory in large enough sizes. Apple did that like everyone else did, but I can't imagine they're happy about having to do so. Is it really that difficult to believe that they would feel burned by Intel shipping basically the same chips as they were in 2013 but at higher clock speeds with greater power consumption for so long?
I'd be very surprised if Intel's failures didn't play a major role in the 12" MacBook being ditched, because it correlates with Intel failing to deliver improvements in that power envelope.
1) Apple already has an ARM powered laptop: the iPad Pro with the Smart Keyboard.
2) How to completely alienate your professional workstation customers: finally announce a long awaited successor to your workstation line, then effectively EOL it before it has even hit full availability.
Apple may well be working on a new 5nm higher performance ARM chip, but I somehow doubt they are planning to kill the Intel laptop and desktop lines. They can easily enough run both, given the slight split in their market between consumer and production devices.
The i*OSes are all based on the same kernel as macOS: Darwin. Do you imagine that some Apple engineer thought one day "hey, I'm going to cross-compile the toolchain and run it on my iPad", even if just for kicks? The iPad has enough RAM, CPU and storage to use as a dev machine; why wouldn't they?
Possible? Certainly. Do I think it happened? No. If it did, we'd have it. IOS is a functional OS for mobile devices, but there are tasks it doesn't handle well. One of those tasks is manipulating lots of files, keeping them organized, etc. One of those things you do a lot when writing software. Another is spinning up a terminal session to run multiple small tools on files. IOS doesn't do that either. For nondevelopment purposes, you don't need those things and most users won't notice their absence, but devs would.
You can call nearly anything portable a laptop. As long as it has a processor in it and can be carried with you, it qualifies. There's still a major difference between a traditional laptop that runs a desktop operating system and other devices that do less. Even when it's shaped like a laptop with a keyboard and everything, most tablets are still just tablets with a keyboard.
"One of those tasks is manipulating lots of files": https://en.wikipedia.org/wiki/Apple_File_System
"Terminals & shells": Are you sure IOS doesn't do that? I have applied at least one jailbreak that gave me a root shell (bash). It is pretty surprising what is in there. It isn't a simple embedded OS by any measure.
macOS and i*OS are just packaging options for the same system software. Apple can always add packaging options and could even make something ridiculous like an x86 iPhone. But an ARM laptop would substantially cut the BOM for the computer.
There's what the OS is capable of: nearly everything, and what the UI lets you do: much less. It has a file system. It is capable of creating directories, putting files in them, and moving or copying big sections at a time. Pretty much everything can do that. But until not that long ago, you couldn't do it manually on IOS, because they wouldn't let you at the file system. Individual apps could provide you with access to their own sandboxed sections of the file system, but to get anything in or out required going through IOS's transfer system which works on single files only. Now, they've slightly relaxed that and have a file browser on the device. It can do some things. But it can't do everything you typically do in seconds from any desktop OS, and some of the things it can do are significantly more painful.
As for the shell, it can run one. As you've pointed out, you had to jailbreak for that one. The point being that, as Apple has designed it, you can't have a shell. So you can't do certain things like writing a script to do some batch file changes, firing up python to use it as a calculator, or curl a file from the internet, which are all useful things for the more technically-minded of us. The device and OS are capable of it, but the layers above the kernel have been set up to make it hard to do so.
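To make the "batch file changes" point concrete, here's the kind of five-line throwaway script that is trivial on any desktop OS but has no sanctioned home on stock iOS (the directory name and extensions here are invented for illustration):

```python
from pathlib import Path

def batch_rename(directory: str) -> list[str]:
    """Rename every .jpeg file in `directory` to .jpg and return the
    new names - a typical one-off batch job for a scriptable OS."""
    renamed = []
    for path in sorted(Path(directory).glob("*.jpeg")):
        target = path.with_suffix(".jpg")
        path.rename(target)
        renamed.append(target.name)
    return renamed
```

Nothing clever: the whole argument is that a technical user reaches for this sort of thing daily, and a platform without a shell or open filesystem access makes it impossible rather than merely inconvenient.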
So, while IOS remains that way, I maintain that it is not fully featured for the uses to which desktops and laptops are put. Apple can fix this if they want, and they don't have to do much. Just add access to the filesystem (writing a good GUI file browser is optional, because if the access is available, someone will write one), give us full access to the utilities through a terminal, give us the ability to install code directly from the device (which currently requires a tether to a Mac), and we're done. They don't have to do lots of nice things, like give us root access or open the doors to unsigned code. But until they do those things, the OSes are not similar from the standpoint of a technical user. If they do those things, they've effectively just made IOS a slightly different version of Mac OS with a touch input layer.
So you're complaining that Apple consciously decided to omit the ability to connect to the terminal in iOS, and has restrictions on applications' file access, because you might want to do things on the CLI on an iOS device.
There are plenty of terminal apps you can use to SSH onto another device and do dev work on that if you want. Apple also provides, for free, the Xcode app for building iOS apps, if that's the kind of dev work you're interested in.
Not being able to do the things you mentioned means increased security for the overwhelming majority of Apple’s customers.
Maybe try Android for doing the things you want; hardware is much cheaper too.
"So you're complaining that Apple consciously decided to omit the ability to connect to the terminal in iOS, and has restrictions on applications' file access, because you might want to do things on the CLI on an iOS device."
Way to not connect posts in a thread. I was disagreeing with the contention that an iPad with a keyboard was similar in feature set to a laptop. The person who made that assumption was willing to argue that it should be possible to do dev work on one. I pointed out that it's very difficult at the moment and provided examples. Whether they choose to change that is not really relevant to me--if I want an Apple-made portable device to do dev work, I'll buy a mac.
"For nondevelopment purposes, you don't need those things and most users won't notice their absence, but devs would."
You've hit the nail on the head there... We are a way off iPads being developer machines. But the vast majority of laptop users aren't developers. Thus for the vast majority of people, what an iPad with a keyboard and touchpad can do will be enough. I don't think that for developers and other high-power users it ever will be - but ARM laptops - yes - that's happening.
You can bet that Apple has a desktop hidden away somewhere with an ARM processor running full macOS. At the time when they were preparing the switch to PowerPC and final decisions hadn't been made yet, they also had some computers running with a Motorola 88100 processor (well, the "flying toasters" screen saver was running on it; I don't know how much else). And of course they had some Macs with a Pentium 4 processor before the Intel line was released with Core CPUs.
For serious compiling and heavy work, in practice, there really is little evidence that ARM are going to offer any significant benefit over x86 once you build a full, high performance, platform around it. RAM is RAM, PCIe is still PCIe, GPUs remain unchanged, etc...
I would also question the assumption that Apple are magically going to be able to produce an ARM system (not core, an entire system, which is effectively going to be the requirements for an ARM based Mac Pro) given all the people who have tried and failed miserably. Are there any widely available current SoCs at sensible prices with 64GB or 128GB RAM?
I may be surprised, but for Apple, keeping the current split where iPad is ARM for consumption, and Mac is x86 for creation or production, leveraged the strengths of both platforms.
> Are there any widely available current SoCs at sensible prices with 64GB or 128GB RAM?
No, but Apple doesn't need them to be widely available, since it designs its own and has them made by TSMC.
In any case, nobody is talking about Apple replacing the Pro range with ARM versions any time soon.
Yep, some of our admin/clerical staff were issued them in the 80's. As I recall some of the "PC Compatible" mainline shrink-wrap either wouldn't run properly, or had to be ordered specially. They were soon replaced by vanilla PCs. It could have been worse, many of the technical staff were issued DEC Professionals running POS (There's an appropriate name), the people who could benefit really wanted PDP11s or MicroVAXen. Typists got DECmates running WPS, which were quickly replaced by PCs running DEC WPS Plus.
All of this was just in time for Windows to come along and replace almost everything. I thought that at the time this was a shame as the various versions of WPS and DEC ALL-IN-1 offered pretty good integration of file sharing, word processing, terminal emulation, timekeeping, and data access (For the time, particularly compared to Windows/DOS).
Asymmetric multiprocessing has been implemented for years; and if you look at a cheap x86 core as an x86 accelerator which you turn on and off as you need it, it doesn't clobber your power budget. From a technical perspective, the x86 doesn't need to run Darwin code itself -- it can run as a slave to the ARM cores.
That Z80 in the C128 is there to let the damn thing boot. It was an ungodly hack to fix some shoddy design flaw so they didn't have to respin a bunch of chips.
Amigas for a while were able to emulate Apple Macs and be faster than the equivalent M68K Mac. And you could get 8086 and 286 Zorro cards to let you run a full PC in there.
Motorola's plan was to abandon the CISC M68K design and go all in on the PowerPC as a replacement, with some sort of compatibility tool welded on. It didn't end well for Motorola.
> I wonder if it would be viable to place Intel *and* ARM processors in the same computer,
I don't know why someone downvoted you; it's not an outrageous idea (I've actually raised the concept here myself). In fact Apple already does it, with the T2 chip on Intel computers - an ARM chip that runs a verified microkernel, looking after boot, security, biometrics and guarding the webcam.
That said, I don't think Apple will take the approach of an ARM computer with an AMD64 coprocessor, because it's just a little... untidy. Their history suggests they'd rather get 3rd-party devs to compile to ARM.
A lot of the heavy workload of Pro software, especially in video and graphics, is already done by GPUs or other dedicated hardware.
ARM-based chips also have several key advantages, particularly on power efficiency.
ARM looks good only in apples-to-oranges comparisons. Extremely low performance ARM CPUs have very low power usage, but the higher performance ones use more power than an Intel Atom or some mobile ULV CPUs. Those touting the low power and high performance of ARM CPUs are either lying or are themselves falling for the lies of interested parties, and sadly propagating them.
There isn't a single ARM core out there which comes anywhere close to the performance of a single modern x86 core. Either Apple will have to wait for that to change, or they will have to force all developers of MacOS apps to continually maintain fat binaries for two architectures indefinitely, or Apple will just have to surrender the high performance PC market entirely, which won't make graphical designers and video editors very happy.
If Apple was smart, they'd only consider releasing an ARM laptop as an iPad-with-keyboard type device running IOS. Call it an iClam... That would avoid the risk of hamstringing their MacOS business in the process.
Geekbench disagrees with you. The A13 single-core beats the i9-9900 by about 7%. Multi-core is quite a bit different, but heat dissipation is a big issue; if you exclude the iMac and Mac Pro, the MacBook Pro (8-core) has about a 50% lead, but it also has a lot better heat dissipation than an iPad.
Do remember, Apple make in-house designed ARM CPUs, that are quite a bit different from the more generic line.
I don't see why Apple would bother with fat binaries. The concept was made obsolete years ago by the Internet. It only made sense when you bought your software on CD in boxes.
Linux manages to support multiple chip architectures without fat binaries. The OS has a standard installer and it knows what chip architecture it's running on. It just fetches the correct binary for that architecture from the correct repo. I run Ubuntu on x86 and Ubuntu on ARM without fat binaries and it's completely transparent to me as a user. I'm sure that Apple is capable of copying that idea if necessary.
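A rough sketch of what the package tooling does under the hood: detect the machine architecture and resolve the matching build from a per-architecture pool. The repo URL and layout below are invented for illustration; real apt/dnf metadata is more involved, but the principle is the same.

```python
import platform

# Hypothetical repo layout: one pool per architecture, identical package names.
REPO_POOLS = {
    "x86_64": "https://repo.example.com/pool/amd64",
    "aarch64": "https://repo.example.com/pool/arm64",
}

def package_url(name: str, version: str, arch: str = "") -> str:
    """Resolve the download URL for the build matching `arch`
    (defaults to the architecture this machine is running on)."""
    arch = arch or platform.machine()
    if arch not in REPO_POOLS:
        raise ValueError(f"no repo pool for architecture {arch!r}")
    return f"{REPO_POOLS[arch]}/{name}_{version}_{arch}.deb"

print(package_url("curl", "7.81.0", arch="aarch64"))
# https://repo.example.com/pool/arm64/curl_7.81.0_aarch64.deb
```

Because the selection happens at install time on the device, no single download ever needs to carry more than one architecture's code, which is exactly why the fat-binary approach is unnecessary here.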
As for performance, if you care about per core CPU performance above all else you're not likely to be running an Apple laptop anyway. I don't seriously see it as an issue for the bulk of their target market. Graphics performance is what will matter most, and that is down to the GPU.
For overall performance, they will rely on having a large number of cores for peak loads, and throttling down to a few lower-powered cores whenever possible. Have a look at all the work that Apple has spent the past few years putting into multi-core process dispatching. They wouldn't have been doing that work if they didn't intend to make use of it.
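As a toy model of that big-core/little-core dispatch idea: work tagged as interactive goes to a wide "performance" pool, background work to a narrow "efficiency" pool. The pool sizes mirror the rumoured 8+4 split and the QoS names are invented; the real mechanism on Apple platforms is Grand Central Dispatch's quality-of-service classes, not Python threads.

```python
from concurrent.futures import ThreadPoolExecutor

# Toy big.LITTLE-style split: 8 "performance" workers, 4 "efficiency" workers.
performance_pool = ThreadPoolExecutor(max_workers=8)
efficiency_pool = ThreadPoolExecutor(max_workers=4)

def dispatch(task, qos: str):
    """Route `task` by quality-of-service class, as a scheduler might:
    interactive work gets the fast pool, everything else the frugal one."""
    pool = performance_pool if qos == "interactive" else efficiency_pool
    return pool.submit(task)

result = dispatch(lambda: sum(range(1000)), qos="background").result()
print(result)  # 499500
```

The point of the sketch is only that once applications tag their work by priority, the OS is free to steer it onto whichever cores suit the power budget, which is the groundwork an 8+4 chip would exploit.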
"I don't see why Apple would bother with fat binaries. The concept was made obsolete years ago by the Internet. It only made sense when you bought your software on CD in boxes."
Fat binaries are useful to upload to the App Store (and then every device downloads the slim binary that is appropriate). On iOS, apps are not backed up, just the fact that you had them, so if you upgraded from a 32-bit to a 64-bit device and restored your backup, you would automatically get the 64-bit version.
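For the curious, a "fat" (universal) Mach-O binary is just several per-architecture slices behind a small header. A minimal sketch of detecting one by its magic number, using the constants from Apple's `<mach-o/fat.h>` (the header bytes below are constructed in memory rather than read from a real app):

```python
import struct

# Magic numbers from Apple's <mach-o/fat.h>.
FAT_MAGIC = 0xCAFEBABE     # multi-architecture ("fat"/universal) binary
FAT_MAGIC_64 = 0xCAFEBABF  # 64-bit fat header variant

def is_fat_binary(header: bytes) -> bool:
    """Return True if the first four bytes carry the fat-binary magic."""
    if len(header) < 4:
        return False
    (magic,) = struct.unpack(">I", header[:4])  # fat headers are big-endian
    return magic in (FAT_MAGIC, FAT_MAGIC_64)

# A fat header starts with the magic, then the number of architecture slices.
fat_header = struct.pack(">II", FAT_MAGIC, 2)  # e.g. two CPU slices
thin_header = struct.pack("<I", 0xFEEDFACF)    # a thin 64-bit Mach-O instead
print(is_fat_binary(fat_header), is_fat_binary(thin_header))  # True False
```

App Store thinning then means the server, not the device, picks the right slice, so users never pay the download cost of the architectures they don't have.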
"Linux manages to support multiple chip architectures without fat binaries. The OS has a standard installer and it knows what chip architecture it's running on. It just fetches the correct binary for that architecture from the correct repo. I run Ubuntu on x86 and Ubuntu on ARM without fat binaries and it's completely transparent to me as a user."
This is perfectly fine when you actually can use the OS provided package handler to download software.
It doesn't work if you want to use commercial software (including games) that is distributed by a 3rd party, such as Steam or the manufacturer's repository. The software may be available as an RPM, Flatpak or similar, but that doesn't mean it is available for your architecture.
There's "ARM based chips" and "ARM based chips designed by Apple".
Apple is hugely ahead in the area that interests them: low power but not extremely low power, a few very powerful cores but not 64 of them. There are companies making 64-core ARM servers. They use a lot of power. Apple doesn't have any chip right now that can do that. But they _can_ build a chip with eight high-speed and four low-power cores, as the article suggested, and per core they are on par with or ahead of Intel. The proposed 8+4 cores would make one hell of a laptop at lower power than using Intel chips.
I don't think that Apple wanting to move over to ARM for their laptops (if the rumour is true) is necessarily because they think they can make an Intel CPU killer, although the performance gap between Intel and ARM is a lot closer than it used to be. Rather, they can save costs by designing their own CPU in-house instead of buying expensive Intel chips, when Intel has not been great at getting decent yields at smaller die sizes. Apple can also optimize their own CPU design better for macOS, as Intel chips come with all those old x86 instructions that Intel has to include to support legacy code that macOS doesn't need. Obviously a move to ARM would mean dual-booting to Windows 10 would no longer be an option, unless MS were to release an ARM-compatible Windows 10 version that supported Apple hardware.
What proportion of Apple's user base are still dual-booting Windows? It's something that I read about as being possible but rarely hear about people actually making use of these days. I doubt that Apple are going to cripple the future development of their product line by maintaining a legacy use case that most of their users don't care about.
There's also nothing preventing Apple from keeping a few legacy x86 versions of their laptop line around for those customers who have a genuine use case for them, at least for a while. I say 'x86', because these could quite as easily be AMD and Intel.
And probably <1% of consumers. Maybe 0.1%
It's been painfully obvious for many years that Apple just want the professionals to go away. Yes, they buy the really expensive Macs, but there aren't many of them compared to the consumers buying Airs, iPads and the like.
A lot of professional software isn't available for macOS anymore because Apple removed the features it needs, and the cost of redevelopment far exceeds the likely return.
So most professionals switch over to Windows (or Linux, surprisingly enough) to get some of their work done. They complain of course, because they happen to like the OSX GUI.
But they're using the machine to do a job, and they'll buy and use the hardware and software needed. Dropping 32-bit forced many over, because Photoshop.
In many cases they've not bought a new macbook for years because the newer ones were terrible.
"It's been painfully obvious for many years that Apple just want the professionals to go away. "
If that were the case, then Apple would simply have stopped making the high end Mac lines. The fact that Apple still makes high-end Macs for "professionals" (which ones?) means that Apple wants to keep supplying that market.
I've been using Macs a long time and believe me, if Apple wants to ditch a market segment, it'll do so without so much as a backwards glance.
One thing to bear in mind is that Apple is still basically a hardware company. It makes its money selling actual physical objects. If it's still selling high-end Macs, then that's because it's making good money that way.
Apple managed to survive the catastrophe of Visicalc on Apple ][s having no Mac equivalent in the early days, and so losing the lucrative professional spreadsheet market to Lotus 1-2-3 on MS-DOS. Whatever decisions Apple's making on the future of Mac software support and development will most likely work out as well as in the past.
"It's been painfully obvious for many years that Apple just want the professionals to go away. Yes, they buy the really expensive Macs, but there aren't many of them compared to the consumers buying Airs, iPads and the like."
If only Linux had half the software I require running on it natively I'd be able to stop running a Hackintosh. Adobe makes some shite, but Lightroom and some of their other media related applications are unequalled. Those are the problem areas, and I'd never dual boot to get them.
I'm an edge case - I run (very occasionally) Windows 10 in a Parallels VM on an iMac with 16GB RAM and an SSD; it takes about 15 seconds from a cold start to get to a responsive desktop, which seems to be as good as many of the Intel PCs I see.
The main reason for me running Parallels is that I still have old stuff that I have written that runs in DOS, Win XP, and Win 2000 VMs, and "one of these days" in my retiree lockdown I might consider updating/porting some of it to something more modern.
Now that Microsoft is selling the Surface Pro X, they have plenty of incentive to make sure they support it well.
It includes built-in support for emulating x86 binaries. Obviously that incurs a performance penalty, but Microsoft developers will have more incentive to port their software if they want it to run on the Surface Pro X. If a third party starts selling Windows ARM laptops at a more reasonable price point and they start selling, look out: there will be a flood of Windows applications getting ported!
So the ARM Mac owners can still dual boot into Windows, they'll just boot into the ARM version of Windows.
If anybody can switch to ARM laptops, it's Apple. I am convinced that Steve Jobs' best work was convincing people that owning their product somehow makes you a better person. Apple could sell you iDirt and the fanboys and fangirls would buy it and swear to you that it is better dirt. That was the real genius of Steve Jobs. So if Apple went to an in-house ARM chip, their users would swear to you that the new laptop is better, and you could not convince them otherwise. Don't forget that Thunderbolt is no longer Intel exclusive.
I personally think it would make more sense for Apple to have AMD design a semi-custom chip for them. Supply won't be an issue because both use TSMC. Some of the reviews of AMD Ryzen 4000 laptops I saw had them outperforming a desktop Intel part, all at much lower power.
Still slightly dubious about this...
They've just launched an extremely expensive upgrade to the Mac Pro. Even Apple aren't going to want to piss off customers who will happily spend tens of thousands of pounds on each workstation.
Yes, Apple have changed CPU architectures several times. Generally not without providing some sort of emulation layer in the OS to enable compatibility with the previous architecture. Can any existing arm emulate the x86 architecture at the same speed as a core i5, i7 or i9? What about the Xeons? Bear in mind that the target markets for the Mac are likely to require tons of processing power (not so much the print and audio design, but video work and 3d work can bring most computers to their knees, pc or mac).
It'd only need to emulate part of amd64.
They already killed x86-32, and they control the only supported compiler, so they can ensure it uses only a specific subset of opcodes.
They've also forced every developer in the world to send them a copy of every piece of software for macOS, including alpha and beta builds, so they have a huge pot of applications (or IP to steal, if you're cynical) to test against a compatibility layer.
Plus arbitrarily refusing applications that happen to use opcodes they don't feel like bothering with, perhaps.
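The "specific subset of opcodes" idea is just a compiler flag away. A speculative sketch using standard clang flags (the policy itself is guesswork, not anything Apple has announced):

```shell
# Hypothetical: pin codegen to the baseline x86-64 ISA, so a compatibility
# layer would only ever have to emulate that subset of opcodes.
# -march=x86-64 excludes later extensions such as SSE4.x and AVX/AVX2.
clang -O2 -march=x86-64 -c app.c -o app.o
```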
They would need several years to complete the transition, they aren't going to do them all at once, they'll do the consumer stuff first. They will wait the longest to do the Mac Pro desktop, if they sell the first ARM Macs in 2021 I wouldn't expect to see an ARM Mac Pro any earlier than 2023.
Apple knows people spending that kind of money (where a well configured Pro costing tens of thousands may be dwarfed by the cost of software licenses costing hundreds of thousands in some cases) and depending on it for their business/livelihood are risk averse, and will let others debug the ARM Macs first. Plus those very complex software packages many Pro users rely on will take a long time to port and fully test, so even if an ARM Mac Pro was available from day 1 they'd have to wait a year or two for the software they depend on to be ready.
Even then it isn't as if the existing x86 Mac Pros will stop working, or Apple or the developers will stop supporting them. Anyone who bought a Mac Pro already or buys one in the next couple years will get its full life out of it, the ARM change won't affect them at all.
If Apple does this, it's because they've given up on the Mac.
Apple's earlier architecture changes made sense. They were painful to go through, but there were good reasons for them. The PPC was a huge speed win over the 68k. The G5, great chip that it was, just never got low-power enough to put in a laptop. There was the whole fat-binary problem, the slow and crashy emulators, and, worst of all, when the PPC emulation was finally really good and could have kept legacy software going for decades, Apple stripped it out with no warning. But this, this one is utter stupidity if you're trying to be a computer company. We're finally at the point where there's an industry standard architecture for CPUs, one that works well, and one that isn't even single-source unless you're Apple (the Intel exclusivity deal ended years ago, but we still have no AMD Macs, and no Nvidia GPUs in them). If Apple wants another chip, there's one right there at AMD. As far as that goes, Apple could buy AMD with cash on hand, and have enough left over to buy Adobe.
The only reason they'd try is because they know it'll kill the Mac, and that's what it seems like they ultimately want, a sealed box that's not even really a computer, just a content consumption tablet that can be used for some light word processing, and a few industry-specific apps, but spends most of its time as a toy.
What they don't realize is that killing the Mac WILL eventually kill Apple.
Hopefully Cook comes to his senses and kills this garbage before it ever gets close to a release.
I also like having a standard instruction set that is generally open, but your characterization of it as multi-supplier is a little strong. Basically, the only available options are Intel and AMD, with other companies not allowed to join the party. With ARM, there are many manufacturers and a few designers of processors implementing that instruction set: Qualcomm, Broadcom, Samsung, TI, Apple, ARM themselves, Huawei, and a couple of small places that don't make many chips. ARM has plenty of other problems, like not having a consistent method of booting firmware (I can take virtually any x86 chip and run arbitrary code on it, but not so with an ARM one), but in terms of suppliers and lock-in, ARM is probably better.
"We're finally at the point where there's an industry standard architecture for CPUs"
Really? So why are ARM CPUs so widespread? Why does IBM keep on developing its POWER architecture and z/Architecture? Why is there so much interest in RISC-V? I could go on.
The days when it looked like x86 might take over are long gone. We're at least twenty years past the point where it became obvious that the future lies in multiple CPU architectures, evolving without reference to one another. The world is all the better for such diversity.
The demise of Apple has been predicted regularly from (IIRC) the early 1980s. One day, it'll happen. But I'd bet it won't be for the reasons anyone on the outside predicts.
The situation today is different to 2006. Apple has been encouraging developers for years to use tools that will allow compiling to different architectures at the push of a button. Macs and iPhones already use Metal instead of OpenGL or Vulkan.
"What they don't realize is that killing the Mac WILL eventually kill Apple." You didn't get the memo? Most of the real world is using their phones and the Internet to get "real work" done (some have tablets too).
The PC model, that we are used to, is a vital tool for clerical support work/administrators/mid-level managers; but unfortunately their jobs are disappearing extremely rapidly. PCs might be left with the niche work of "developing" and serious multitasking between multiple windows (I don't see too many people doing that much these days either). However, I could be wrong...
I believe the A14 chip will be a total surprise, offering more performance than even the eager press expects. A "Rosetta" translator needs extra power (which the A14 can offer) to at least equal Intel's CPUs in performance. Intel has packed its suitcase and is heading far away; is this the beginning of the Intel eclipse?
"Apple have less than 10% (and falling) of the desktop market."
Apple has about 90% market share in laptops over $1,000. And that's where you want market share. That's where the money is made.
In the desktop market, they have about 10% of unit sales, but 30% of the revenue. And make more profit than Dell, who in turn makes more profit than anyone else.
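Those two shares alone imply a hefty price premium. A quick back-of-the-envelope with the rough figures quoted above:

```python
# Rough arithmetic from the shares quoted above: ~10% of desktop units
# but ~30% of desktop revenue (both approximate, as stated).
unit_share = 0.10
revenue_share = 0.30

# Apple's average selling price relative to the rest of the market:
# (revenue per Apple unit) / (revenue per non-Apple unit)
apple_asp_ratio = (revenue_share / unit_share) / ((1 - revenue_share) / (1 - unit_share))
print(f"Apple's average selling price is roughly {apple_asp_ratio:.1f}x everyone else's")
# prints roughly 3.9x
```

So on those numbers the average Mac sells for nearly four times the average non-Apple desktop, which is consistent with the revenue and profit claims.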
This article misses the whole point of virtualization.
The reason behind Jobs' decision to move to x86 was virtualization, and that is, to date, why Macs are still good systems for work and home stuff, even gaming. I spent an insane amount of time gaming on my Mac ...
Now, with this completely braindead decision from the morons Jobs left in charge, like this poor Cook dude, who apparently can't tell his ass from his elbow, Macs will only be focused on insane-margin, tablet-like systems (think $2k retail against $100 costs).
The result will probably be a very active second-hand market and luxury-priced new Macs ...
Congrats, Cook !
Are you trying to say that ARM doesn't support virtualisation? Because it does.
Are you trying to say that it needs to run virtualised x86 Windows? That makes a bit more sense, although it should be possible to do what QEMU does and run a virtual machine with an emulated processor to run x86 Windows. Probably not a great experience, but usable for minor tasks.
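For the "usable for minor tasks" route, that's roughly what QEMU's full-system emulation already does on an arm64 host today. A sketch of such an invocation (the disk image name and the memory/CPU sizes are placeholders):

```shell
# Full-system emulation of an x86-64 guest on an arm64 host via QEMU's
# TCG binary translator. No hardware acceleration is possible across
# architectures, hence the performance penalty mentioned above.
qemu-system-x86_64 \
    -m 4096 -smp 2 \
    -drive file=win10-x86.qcow2,format=qcow2 \
    -nic user,model=e1000
```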
If Jobs had wanted virtualization, Apple would still be using PPC (or even 68k) chips. The x86 was a bitch to virtualize, given the unprivileged read access to the processor status word. Kudos to VMware for actually figuring out how to get it done. I suspect it was AMD who fixed that too.
The A13 in the soon-to-be-replaced iPhone beats all Apple's laptops other than the highest-end 16" MacBook Pro. I can only imagine what the A14 in a thermally less constrained body can achieve, and I suspect the "low-end" laptops mentioned will actually be superior in all respects (speed and battery life) to the Intel ones. All app submissions to the App Store have been including an LLVM intermediate-language variant that can be retargeted to any architecture supported by LLVM, including arm64. Obviously, poor coding practices and assumptions can still cause the code to work incorrectly, but I would think the transition will be better than the PPC-to-x86 one was.
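With stock LLVM tools, the retargeting workflow looks roughly like this (file names are invented; and a caveat: real-world IR bakes in ABI and pointer-size assumptions, so cross-architecture retargeting is not quite as free as it sounds):

```shell
# Front end: C to LLVM bitcode, the nominally target-independent IR.
clang -O2 -emit-llvm -c hello.c -o hello.bc

# Back end: the same bitcode lowered for two different targets.
llc -mtriple=arm64-apple-macos  hello.bc -o hello-arm64.s
llc -mtriple=x86_64-apple-macos hello.bc -o hello-x86_64.s
```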
It creates a market for the software, especially the most popular apps, so that developers will develop native ARM apps, greatly reducing the need for legacy x86 software, and over a year or two that will make an ARM-based power laptop more viable. They might even offer a cloud service for some legacy x86 software.
Until then they can enjoy the renewed AMD competition in laptop chips.
Biting the hand that feeds IT © 1998–2020