Apple to keep Intel at Arm's length: macOS shifts from x86 to homegrown common CPU arch, will run iOS apps

Apple on Monday said it plans to shift its macOS line from Intel chips to its own homegrown Arm-compatible processors, an initiative called Apple Silicon that will put all Cupertino's products on a common hardware architecture. During its 2020 Worldwide Developer Conference keynote – a virtual event due to the coronavirus …

  1. Anonymous Coward
    Anonymous Coward

    Two years to work out the kinks and I am buying one

    Intel never thrilled me (and I know this may be an irrational decision).

    1. Anonymous Coward
      Anonymous Coward

      "Intel never thrilled me"

      Everything before the 80386 was horrible. x86-64 isn't at all bad. Who cares where the CPU comes from? How it performs is what's important. These days, security at the hardware level is a hot topic. Intel's doing things the old monolithic way. I suspect the ARM approach allows CPU designers to come up with more secure CPUs quicker than Intel's way of doing things - especially in the case of Apple, which can now design a complete package of CPU, full computer circuitry, OS, and software developer tools in a fully controlled ecosystem. Authoritarianism is a rotten way to run a country, but often works well in engineering.

      - my main point there being: since Apple's got control of everything from CPU design upwards, it can implement systems to mitigate security flaws at any given level by tweaking security systems at any other level.

      1. Charlie Clark Silver badge
        Stop

        Re: "Intel never thrilled me"

        There's no reason to suggest that Apple's engineers will make chips that are any more "secure" (for users) than Intel's.

        ARM designs are inherently more customisable, which means Apple can put more of the stuff it wants in silicon, whether that's video codecs, encryption algorithms or machine learning. This, in turn, should mean fewer demands on the CPU, which should be good for battery life and heat generation. Custom hardware also makes software even more Apple-specific, i.e. increasing lock-in for users.

        That said, I'm looking forward to the first devices to see how they stack up.

        1. cdrcat

          Re: "Intel never thrilled me"

          There are multiple hardware mitigations *already* in Apple processors. They are mostly aimed at preventing kernel level exploits, but it seems very likely Apple will continue putting in more security protections into the A* processors.

          Intel have repeatedly shown that they prioritise sales performance over security, sort of like the Microsoft of yore, and Intel is less likely to develop mitigations that require tight integration with the OS or deep modification of the OS.

          Scroll way down to the heading “iOS kernel exploit mitigations” in this link which details some of the hardware protections: https://googleprojectzero.blogspot.com/2020/06/a-survey-of-recent-ios-kernel-exploits.html

      2. maffski

        Re: "Intel never thrilled me" - "x86-64 isn't at all bad"

        x86-64 isn't Intel. It's AMD.

        1. PeeKay

          Re: "Intel never thrilled me" - "x86-64 isn't at all bad"

          "x86-64 isn't Intel. It's AMD."

          Which, for some reason, always tickled me.

          1. G.Y.

            AMD64 Re: "Intel never thrilled me" - "x86-64 isn't at all bad"

            I call it AMD64; makes my Intel friends happy ...

  2. aregross

    "Rosetta 2 will translate x86_64 code on installation or on-the-fly in the case of browsers using JIT-compiled JavaScript or Java."

    I lol'd

    1. Anonymous Coward
      Anonymous Coward

      Rosetta

      The original Rosetta from 2006 worked pretty well: PPC applications mostly did run on new Intel Macs without trouble.

      Prior to that transition, Apple implemented the "Classic" environment so that you could boot up the old-style MacOS 9 on a MacOS X PPC Mac, and run 68k code while you were about it. That worked well too.

      The software to run 68k code on a PPC Mac came from when Apple moved Macs from 68k CPUs to PPC CPUs. It mostly worked well - although back in the days of the 68k to PPC transition, I did notice a performance hit.

      As it happens, there's a thing called SheepShaver that lets you install MacOS 9 on an Intel Mac running the current Mac OS and run old PPC software on Intel Macs even now. It's not from Apple, but it works.

      Apple has form in this area: it's done well in the past. I'd bet more than a pint that Rosetta 2 will outperform its predecessors.

      1. P. Lee

        Re: Rosetta

        MacBook Pro?

        Surely a low power MacBook would be the best choice given the need to Rosetta most apps to start with?

        I’m not sure why you wouldn’t do an arm coprocessor, for a transition.

        1. Dave 126 Silver badge

          Re: Rosetta

          I think the transition is that Intel Macs will continue to be available for a couple of years. If by the end of the year the ARM versions of your productivity apps are ready and well-tested, then why not MacBook Pro? It's also a statement - by releasing MacBook Pros and not MacBooks first, Apple is going against the messy image of Windows RT.

        2. gnasher729 Silver badge

          Re: Rosetta

          "Surely a low power MacBook would be the best choice given the need to Rosetta most apps to start with?"

          On a new Mac, all your apps will be downloaded. And they get translated to ARM during download. If you copy an app in another way, it can be translated to ARM on the first launch. Rosetta is something that the user will not even notice.

          I would think the first released Macs would be something that leaves the 6 and 8 core MacBooks behind.

          1. Charlie Clark Silver badge

            Re: Rosetta

            Rosetta is something that the user will not even notice.

            Bollocks. The approach will be very similar to the one Android takes after every update, which tries as far as possible to put apps through a JIT to get native code. Intel was able to make use of this for a lot of Android-on-Intel stuff. Except it didn't work for everything, and for some things it definitely was noticeable.

            Most stuff using Apple's APIs should transpile pretty well but there will always be exceptions and anything making heavy use of x86 specific optimisations could be noticeably slower.

            1. Anonymous Coward
              Anonymous Coward

              Re: Rosetta

              "The approach will be very similar to that done on Android after every update which tries as much as possible to apps through a JIT to get native code. Intel was able to make use of this for a lot of stuff of Android for Intel. Except, it didn't work for everything and for some stuff it definitely was noticeable."

              Apple implemented pretty much that approach via Rosetta for the PPC->Intel transition. It applied only to application code and worked pretty well.

              In my experience, it was reliable and it didn't seem to slow anything down to an extent that I noticed - but then again, nothing I was using placed serious load on the CPU.

        3. Charlie Clark Silver badge

          Re: Rosetta

          The MacBooks have been replaced by the MacBook Airs and this is where ARM probably makes the most sense because new MBAs get hot pretty quickly and have to start throttling. But they'll have to be pretty careful not to cannibalise the market they've just refreshed.

        4. iron Silver badge

          Re: Rosetta

          So during a period when apps need to be translated from one architecture to another you think that extra overhead would be best handled by an underpowered, low spec machine?

          Can I interest you in a bridge? Just one careful owner...

        5. Anonymous Coward
          Anonymous Coward

          Re: Rosetta

          "I’m not sure why you wouldn’t do an arm coprocessor, for a transition."

          Two different CPUs increases cost, complexity, size, and power consumption for no great gains.

          Apple's previous CPU architecture transitions worked well enough without doing it that way.

      2. Charlie Clark Silver badge

        Re: Rosetta

        The original Rosetta from 2006 worked pretty well: PPC applications mostly did run on new Intel Macs without trouble.

        They did, but it was a big performance hit for anyone coming from the PowerPC.

        1. ThomH

          Re: Rosetta

          Depends which PowerPC they were coming from; my Core Duo MacBook Pro outperformed my 666MHz G4 of five years earlier for PowerPC code. And just when its age was starting to become an obstacle.

      3. Torben Mogensen

        Re: Rosetta

        One of the things that made Rosetta work (and will do so for Rosetta 2) is that OS calls were translated into calls to natively compiled OS functions rather than emulating OS code in the old instruction set. These days, many apps use dynamically loaded shared libraries, and these can be precompiled, so apps spend most of their time in precompiled code even if the apps themselves are x86 binaries. Also, with multicore CPUs, JIT translation can be done on some cores while other cores execute already-compiled code.

        But the main advantage is that nearly all software is written in high-level languages, so it can essentially be ported with just a recompilation. The main thing that hinders this is that programs in low-level languages like C (and Objective C) may make assumptions about memory alignment and layout that may not be preserved when recompiling to another platform. Swift is probably less problematical in this respect. And these days you don't need to buy a new CD to get a recompiled program from the vendor -- online updates can do this for you. So moving to an ARM-based Mac is much less of a hassle for the user than the move from PPC to x86 (and the earlier move from 68K to PPC).

        1. gnasher729 Silver badge

          Re: Rosetta

          The latest macOS only runs 64-bit code, and so does Apple's ARM hardware, so that simplifies things a lot. Memory alignment rules are the same, so no problems there.

          1. bobblestiltskin

            Re: Rosetta

            The A64 instruction set is used when executing in the AArch64 Execution state. It is a fixed-length 32-bit instruction set. The ‘64’ in the name refers to the use of this instruction by the AArch64 Execution state. It does not refer to the size of the instructions in memory.

            from https://developer.arm.com/architectures/learn-the-architecture/armv8-a-instruction-set-architecture/instruction-sets-in-the-armv8-a

            1. joeldillon

              Re: Rosetta

              While this is absolutely true... what's your point? Alignment rules and 64- versus 32-bitness apply to the size of loads and stores from memory, not the size of the instructions themselves.

        2. MacroRodent
          Boffin

          Re: Rosetta

          > in low-level languages like C (and Objective C) may make assumptions about memory alignment and layout that may not be preserved when recompiling to another platform.

          x86-64 and 64-bit ARM are very similar in this respect. Both are little-endian 64-bit architectures. Different endianness and differing pointer sizes are, in my experience, the biggest bugaboos when porting low-level C code. It is also customary to use natural alignment on both platforms, so structure layouts are similar. I expect even poorly written C code will in most cases work with just a recompilation.

          1. Richard 12 Silver badge

            Re: Rosetta

            Serialisation relies on exact alignment.

            This will certainly expose a whole host of crashes, data corruption and security failures as assumptions that are true for x86 and amd64 turn out to be false on Apple ARM.

            In many ways it's far worse than 32/64, because pointers are rarely serialised.

            1. MacroRodent

              Re: Rosetta

              > Serialisation relies on exact alignment

              If you serialize data relying on structure alignment, you are doing it completely wrong, unless the serialized data is guaranteed to be never read by anything else but the same program on a similar machine.
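
              To illustrate the point, here is a minimal C sketch of the difference between dumping a struct's in-memory bytes and serialising field by field into an explicitly sized, explicitly ordered layout. The struct and its 14-byte little-endian wire format are invented for illustration:

              ```c
              /* Dumping the struct ties the file format to this compiler's padding and
               * this CPU's endianness, so it only round-trips on "the same program on a
               * similar machine":
               *     memcpy(buf, &rec, sizeof rec);
               * Writing each field out a byte at a time avoids that entirely. */
              #include <stdint.h>
              #include <string.h>

              struct record {
                  uint32_t id;
                  uint16_t flags;
                  double   value;
              };

              /* Fixed 14-byte little-endian wire format: 4 + 2 + 8 bytes. */
              size_t record_serialise(const struct record *r, uint8_t buf[14]) {
                  uint64_t bits;
                  memcpy(&bits, &r->value, sizeof bits);  /* raw IEEE 754 bits of the double */
                  for (int i = 0; i < 4; i++) buf[i]     = (uint8_t)(r->id    >> (8 * i));
                  for (int i = 0; i < 2; i++) buf[4 + i] = (uint8_t)(r->flags >> (8 * i));
                  for (int i = 0; i < 8; i++) buf[6 + i] = (uint8_t)(bits     >> (8 * i));
                  return 14;
              }
              ```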

    2. TReko

      Intel has a patent wall

      Anything that emulates x86/64 architecture is patented very securely by Intel.

      This is one reason Windows 8 never worked well on the ARM chips.

      I wonder if Intel patent attorneys are preparing to sue Apple?

      1. Anonymous Coward
        Anonymous Coward

        Re: Intel has a patent wall

        Recompilation isn't the same as emulation though. Would be interesting to see how well that patent is worded.

        1. vtcodger Silver badge

          Re: Intel has a patent wall

          Would be interesting to see how well that patent is worded.

          It's a patent. It'll have been created by professionals to be both completely incomprehensible and incredibly broad. Probably could be interpreted to cover anything from putting detergent in soluble pods to ordering books alphabetically on bookshelves.

      2. Hans 1
        Facepalm

        Re: Intel has a patent wall

        Better tell AMD, then!

        When you do not know what you are talking about etc...

        1. Richard 12 Silver badge

          Re: Intel has a patent wall

          AMD and Intel have a long history of legal action against each other.

          When you don't know what you're talking about etc...

      3. Anonymous Coward
        Anonymous Coward

        Re: Intel has a patent wall

        Should be fine unless Oracle win the case against Google over software APIs. That could open a can of worms.

        In reality, is Intel really going to want to take on Apple in a patent dispute? I would suggest it would need an Intel legal team that is willing to gamble pretty heavily.

      4. DS999 Silver badge

        Intel's x86 patents are long expired

        AMD's x86-64 patents have probably just expired as well (I wonder if that's factored into releasing this year...)

        Newer stuff like AVX is still covered by patents, and according to AnandTech, Rosetta doesn't support AVX, so perhaps patents are the reason for that. Since only some CPUs support AVX, binaries must have a 'slow path' for CPUs without it, which I guess Rosetta will use when translating.
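
        As an aside, the 'slow path' amounts to runtime dispatch along these lines. This is a hedged C sketch, not Rosetta's actual mechanism: the function names are invented, and it assumes a GCC/Clang toolchain on x86-64, where __builtin_cpu_supports and the target attribute are available. A translator that reports AVX2 as absent would only ever exercise the fallback:

        ```c
        #include <stddef.h>

        /* Portable "slow path": works on any CPU the binary runs on. */
        static void sum_scalar(const float *a, const float *b, float *out, size_t n) {
            for (size_t i = 0; i < n; i++)
                out[i] = a[i] + b[i];
        }

        #if defined(__x86_64__)
        /* Same loop, but compiled with AVX2 enabled for this one function, so the
         * compiler is free to auto-vectorise it with 256-bit instructions. */
        __attribute__((target("avx2")))
        static void sum_avx2(const float *a, const float *b, float *out, size_t n) {
            for (size_t i = 0; i < n; i++)
                out[i] = a[i] + b[i];
        }
        #endif

        void sum(const float *a, const float *b, float *out, size_t n) {
        #if defined(__x86_64__)
            if (__builtin_cpu_supports("avx2")) {   /* runtime CPU feature check, x86 only */
                sum_avx2(a, b, out, n);
                return;
            }
        #endif
            sum_scalar(a, b, out, n);               /* the path an AVX-less translator takes */
        }
        ```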

        1. dharmOS

          Re: Intel's x86 patents are long expired

          AMD obviously owns its own AMD64 instruction set (x86-64), with MMX, SSE, SSE2, SSE3 etc. owned by Intel. I remember those things from the Opteron and P4 days, so it may well be that the patents are expiring. The x86 32-bit stuff from the i386, 486 and Pentium must definitely be out of patent protection, which probably explains why the Windows on Arm emulator is 32-bit only at present.

          However, which newer instructions still have patents protecting them: AVX, AVX2, DLNB etc.?

      5. Anonymous Coward
        Anonymous Coward

        Re: Intel has a patent wall

        Except the initial instruction set was made by AMD for the K8, not Intel. Plus, how long has it been since x86_64 was first introduced? Meaning, accounting for R&D time (most firms have the patent in hand before going all-in on a new tech idea to prevent it being stolen--they do that in medicine, I know), the clock on any fundamental patents on the tech is bound to be running out, no?

    3. captain veg Silver badge

      not sure which is worse

      That the reporter doesn't understand that JITed code doesn't need to pass through the x86 instruction set *at all*, or that the Reg commentards have understood this as an attack on Rosetta.

      Most odd.

      -A.

      1. Richard 12 Silver badge

        Re: not sure which is worse

        This commentard assumed it was a quote from a spokesweasel and therefore obviously wrong.

      2. gnasher729 Silver badge

        Re: not sure which is worse

        If you ran the unmodified Intel Safari browser on ARM, it would indeed translate JavaScript to Intel code, which would automatically be translated to ARM. It would be a small change to the source code to make it compile to ARM instead of Intel code (assuming it uses LLVM for compilation).

  3. tcmonkey

    Getting away from Chipzilla is all well and good, but to ditch the entire architecture instead of chatting up AMD seems like a bit of a stretch. Plenty have tried this whole mainstream ARM thing before, and all of them have failed, or at least not succeeded in the way they thought.

    1. Snake Silver badge

      ARM?

      Hasn't been successful before?

      Never underestimate the power of the Sheeple. Apple's attempt may prove to be the mouldbreaker.

      1. tcmonkey

        Re: ARM?

        I guess we'll see. IMO it will either work wonderfully, or flop spectacularly. I don't foresee much middle ground.

      2. Mage Silver badge

        Re: ARM?

        It will sell better than the first ARM based PC, the Archimedes. Was it 1987?

        Still, my ARM based Android tablet outperforms the Atom based Windows XP Netbook (now with Linux) and Atom Win10 tablet.

      3. Ian Joyner Bronze badge

        Re: ARM?

        Who are these sheeple? And what have they got to do with anything here?

        The term sheeple is judgemental and just used by people who think they are superior to everyone who doesn't see things their way.

        1. Snake Silver badge

          @Ian Joyner

          Sheeple. BAAAaaaaah...

          Sheeple. The people willingly spending lots of money on new kit because Ma Apple tells them to. The people willingly changing architectures - for the FOURTH time - and buying all new software to run on that new architecture because Herr Jobs / Cook tells them to. The people who trade useful kit in for newer, shinier, more expensive kit because they are expected to.

          The people who do not question these things, as long as the directions are coming from the right source.

          You know. Sheeple.

    2. David Austin

      I'll give Apple the benefit of the doubt on this one - They've managed the last two processor transitions reasonably well, and for a project that's supposedly been swirling around for nearly a decade, I get the feeling they wouldn't have announced it now if they weren't confident it's ready and they can pull it off.

    3. Anonymous Coward
      Anonymous Coward

      Really?

      ARM is mainstream if you're looking at smartphones. Current smartphones have more CPU power and more memory and faster communication than the most powerful Microsoft or Apple OS desktop PCs of the 1990s. Quite a lot of them have higher resolution screens, too.

      Tablets are another field where ARM has a strong presence. Desktop PCs are becoming less important; laptops more so. Laptops absolutely must have good electrical power efficiency: ARM scores highly there.

      And don't forget: Apple designed Macs with Motorola's 68k CPUs, switched to the Motorola/IBM PPC CPU architecture, then dropped that for Intel. This is simply the latest change of CPU architecture for the Mac line: hardly "a bit of a stretch", rather more "business as usual".

      1. Sorry that handle is already taken. Silver badge

        Re: Really?

        Current smartphones have more CPU power and more memory and faster communication than the most powerful Microsoft or Apple OS desktop PCs of the 1990s.
        I'd even go so far as to say the mid-late 2000s, and that's possibly being conservative.

        1. _andrew

          Re: Really?

          Recent benchmarks on some sites have put the A13 in the iPhone 11 as faster than the latest MacBook Pro 13-inch. Definitely in the ballpark.

          The Register has commented favourably about several server-grade Arm parts being installed by the rack-load in AWS, Azure and GWS, so the "Mac Pro" grade parts should be fine too (Apple's cores are stronger than the ones in those many-core parts).

          And for aspiration, Fujitsu just topped the supercomputer list by a cool factor of two and a half, entirely based on Arm cores with the new SVE vector instruction set, not a GPU in sight.

          1. Anonymous Coward
            Anonymous Coward

            Re: Really?

            Don't forget that phones are limited by power, size, and cooling requirements too.

            A desktop chip could quite happily have ten times as many cores - or they could even just drop a bunch of processors on the motherboard.

            The RISC-PC from Acorn shipped with a 33MHz ARM processor, which could be upgraded to a 200MHz Strong-ARM, which could be upgraded to a Hydra (iirc) daughter-board with six CPU sockets. No reason why Apple couldn't do the same.

            1. doublelayer Silver badge

              Re: Really?

              In many desktop use cases, having a lot of cores will be less important than having some really fast cores at single-threaded tasks. Not that ARM couldn't do that; with the freedom of higher power draw and extra space, they can probably do that rather well. Still, it'd be worth the time of chip designers to keep in mind that a lot of cores will at some point not be so useful as a moderate number of fast ones. I'm curious how the single-threaded benchmarks of Apple's new chips will compare to those of comparable X86 ones. Given their confidence and using them in the MacBook Pro at launch, I'm expecting impressive things.

              1. Anonymous Coward
                Anonymous Coward

                Re: Really?

                Not sure I really agree with that.

                The last single-threaded piece of software that I worked on was pre-Windows 95. And even then, I can't say for certain that that was single threaded. (It was the game Theme Park. It was a long time ago, and I didn't write any of the core systems.) Since then though, everything I've ever worked on was multi-threaded to some degree. Games I worked on in the mid 90s were multi-threaded by default (relying mostly on worker threads for audio, networking, and user input processing), and since the late 90s multi-threaded by design with significant parts of the processing offloaded to worker threads with asynchronous communication channels between different parts of the code.

                I even had a bug report in one of my applications a few years back about a problem when starting up on machines with more than 32 logical processors. (A problem related to setting the thread affinity correctly, iirc.)

                If you're still writing single threaded, performance critical software today, you deserve to run slowly.

                1. Richard 12 Silver badge
                  Boffin

                  Re: Really?

                  macOS has a technical limitation that the GUI can only be manipulated and painted by a single thread.

                  Trying to touch the GUI from any other thread will crash immediately.

                  There's a lot of things that fall into that GUI bucket which you wouldn't expect (font metrics).

                  So you are actually forced to make a lot of things single-threaded.

                  PS: Don't set thread affinity. That Windows API actually makes things slower. There's another, more useful API for hinting to the scheduler about multi-socket and shared cache if you've proven it makes it faster.

                  1. Anonymous Coward
                    Anonymous Coward

                    Re: Really?

                    Yeah, sadly most GUI systems are still depressingly single threaded by nature. It is possible to run windows in different threads in Windows, but usually it's not worth the effort.

                    That said though, pure processing and rendering of GUI should never be a performance bottleneck. As soon as you know you have to perform a lengthy operation in response to user input, that should be offloaded to a background thread. Your GUI thread should *only* be used for GUI, and that should be reasonably lightweight.

                    Rendering in general is a single threaded bottleneck on a lot of systems. OpenGL is restricted to simple uploading of resources, and even that can be broken depending on the platform. DX11 provides support for setting up command buffers in worker threads, but it's so horribly broken in pretty much all driver implementations it's not worth bothering with. DX12 is better in that respect, but it's still quite restricted. Metal is about the same, if I remember correctly (it's been a while). Vulkan, I have no idea, but I guess it's similar to DX12 in this respect. Thankfully, GPUs are a hell of a lot faster these days. :)

                    But then again, most graphics intensive applications keep the main thread just for rendering and push all other processing off into worker threads.

                    Re: Thread Affinity: That *can* make things slower, but if you have a large(ish) set of data that you know will be accessed in a certain manner, thrashing your L1 and L2 caches as worker threads are bounced all over the CPU *will* make things slower. There's a time and a place for setting thread affinity. The trick is knowing when and when not to. :)
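
                    For what it's worth, the "offload lengthy work off the GUI thread" pattern reduces to something like this minimal C/pthreads sketch. No particular GUI toolkit's API is implied; do_lengthy_work is a made-up stand-in for the slow operation, and a real app would signal completion back to the GUI thread rather than block in pthread_join:

                    ```c
                    #include <pthread.h>
                    #include <stdio.h>

                    /* Worker: runs the slow job so the main/GUI thread stays responsive. */
                    static void *do_lengthy_work(void *arg) {
                        long n = *(long *)arg;
                        long sum = 0;
                        for (long i = 0; i < n; i++)   /* stand-in for the lengthy operation */
                            sum += i;
                        *(long *)arg = sum;            /* hand the result back to the caller */
                        return NULL;
                    }

                    int main(void) {
                        static long job = 100000000L;
                        pthread_t worker;
                        pthread_create(&worker, NULL, do_lengthy_work, &job);

                        /* ...a GUI thread would keep pumping events here... */

                        pthread_join(worker, NULL);    /* illustration only: real code polls or signals */
                        printf("result: %ld\n", job);
                        return 0;
                    }
                    ```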

                2. doublelayer Silver badge

                  Re: Really?

                  The GUI thread mention above is a good one. There are some others you'll want to keep in mind. Here's a good one: lazy web scriptwriters. A lot of web JS is single-threaded and not all of it is efficient. If you've ever been on a large page with an inefficient JS system, you'll probably have noticed that certain operations are slow - for example, a page with a large table doing a filter and sort operation. Users will notice that, and will be happier if the core running that inefficient code is faster. Similar things are true of things written simply, such as spreadsheet formulas, which various types of users rely on. And of course there are operations that aren't parallelisable which code needs to perform; even when you have multiple threads, there may be a few using most of the CPU time while a lot of others wait for disk, network, or user input.

              2. gnasher729 Silver badge

                Re: Really?

                Look at the 3 fast cores in Apple's A12: 128 KB L1 cache each for data and code, 8 MB L2 cache, 16 MB unified memory cache, 30 integer registers, 32 vector registers, 7 instruction decodes per cycle, 9 execution units, 2.5 GHz, encryption operations, FMA.

                It's not a toy. It's very tough competition for any Intel processor. Actually, it beats all the laptop chips core for core.

      2. P. Lee

        Re: Really?

        Different times.

        Also, the CPU changes in the past were upgrades.

        I do notice security has been tightened in OS X so command line commands are blocked.

        Are we looking at an iPad Pro and keyboard?

        1. Dan 55 Silver badge

          Re: Really?

          Yes, the iOSification of macOS continues; the only real difference now with the hardware is the form factor. Not sure why Apple thinks Windows 8 was a good idea.

          1. Dan 55 Silver badge
            Happy

            Re: Really?

            The downvoters can take a look at where all the new features in Big Sur come from then vote me back up.

      3. gnasher729 Silver badge

        Re: Really?

        If you compare with the 1990s: Dongarra himself checked that an iPad 2 would have made it into the 1984 top 500 supercomputer list.

    4. Anonymous Coward
      Anonymous Coward

      Apple can't make AMD chips. They can make ARM chips as ARM just licences their architecture. Apple have already made their own ARM chips. It wouldn't make sense for Apple to switch to AMD and then be at the mercy of another supplier.

      Apple love control and vertical markets. This is just getting another piece of the jigsaw under their control. ARM is easily mainstream enough, probably the most widely used computing chip in the world - definitely more than Intel and AMD combined.

      The difference is, unlike Widows, Apple are going all in with ARM, not making a version that is perceived to be inferior for ARM and keeping x86-64 as their main platform.

      1. iron Silver badge

        Apple can't make ARM chips either. They don't own a fab.

        Apple can design ARM based chips and then TSMC can make them.

        1. gnasher729 Silver badge

          That’s ok. AMD can’t make AMD chips either then.

      2. DanceMan

        Re: The difference is, unlike Widows

        Widows, that may be prophetic.

    5. PerlyKing

      Re: Plenty have tried this whole mainstream ARM thing before, and all of them have failed

      This is what Apple does so well: watch other people fail a few times, figure out how to do it a bit better, put some rounded corners on it and start printing money ;-)

      1. Ragarath

        Re: Plenty have tried this whole mainstream ARM thing before, and all of them have failed

        I assume you got the downvotes for the "rounded corners" comment. I am no fan of Apple but what you said gave me a chuckle as it's exactly what they do.

        Although I would change the 'do it better' part to 'market it better' as their solutions are not always better but are sold to the public in the way that they understand.

        1. Anonymous Coward
          Anonymous Coward

          Re: Plenty have tried this whole mainstream ARM thing before, and all of them have failed

          Take a closer look at Apple's history: they've failed more than a few times before.

          That they are still around might be attributable to some of those failures.

          1. Michael Habel

            Re: Plenty have tried this whole mainstream ARM thing before, and all of them have failed

            Or having MicroSoft come in to save your bacon at the eleventh hour, pointing out to the magistrate that, see, if Apple "exists", then we (MicroSoft) can NOT be viewed as a monopoly in the computer market.

      2. Stork Silver badge

        Re: Plenty have tried this whole mainstream ARM thing before, and all of them have failed

        It is called second mover advantage. It is the second mouse that gets the cheese.

        1. Anonymous Coward
          Anonymous Coward

          Re: Plenty have tried this whole mainstream ARM thing before, and all of them have failed

          Not always. A REALLY savvy mouse can avoid the trap, or the trap may not be one-time-use.

    6. juice

      > Plenty have tried this whole mainstream ARM thing before, and all of them have failed, or at least not succeeded in the way they thought

      It's pretty hard to find a mobile phone which isn't running an ARM variant, and these days, the functional gap between a computer and a smartphone is rapidly narrowing.

      Hell, the odds are good that your smartphone has a significantly higher-resolution display than your laptop or PC!

      Beyond that, it's worth bearing in mind that Apple isn't using standard ARM chips. They're using their own heavily tweaked and tuned chips which are based on ARM. And since everything - including the GPU - is in-house, they're able to tweak and tune their software for their specific hardware, rather than having to cover all the bases in the way that Android/Microsoft/Linux has to.

      Then too, they've already got a tried, tested and proven hardware architecture, in the shape of the iPhone and iPad. The only real difference is that they're sticking a new OS atop - though even then, it's one that (at least originally) was based on FreeBSD and designed to be platform-agnostic...

  4. Anonymous Coward
    Anonymous Coward

    Apple baked chips

    Whatever.

    But, a great reason to increase pricing due to all that secret holy Apple-tastic circuitry.

  5. Anonymous Coward
    Angel

    more Apple marketing

    If it works as well as they claim it does, it won't make a bit of a difference to the end-user. It's just a different compiler target. And different opcodes. Does the end-user care about opcodes?

    If it doesn't work that well, what's the point?

    The only thing worse than being talked about is not being talked about.

    1. Michael Habel
      Facepalm

      Re: more Apple marketing

      No, I suspect the end user will only care about why their copy of Adobe CS is throwing hissy fits when it installed just fine on last year's (Intel) Mac. IIRC, wasn't this the reason why Windows RT (which was also an ARM-only solution that nobody asked for) was such a smashing success for MicroSoft? Because it was just able to run "normal" Windows software...

      Oh wait....

  6. This post has been deleted by its author

    1. _andrew

      Re: Compatibility is gonna be a problem.

      Might be a (small) problem for developers, but it won't be much of a problem for users. Apple has already trained developers that they need to keep up. Their cunning plan, over many years, has been to stop unmaintained code from working at all. Ergo: all code that runs at all on a modern Mac is actively maintained. And the next version will have Arm-supporting fat binaries.

    2. Anonymous Coward
      Anonymous Coward

      Re: Compatibility is gonna be a problem.

      I would sometimes prefer old stuff on Windows to break and fail. There are so many lazy software companies that still say "requires Internet Explorer x" or even Windows 7. Even under support contracts (niche software usually).

      If IE was withdrawn completely from Windows 10, it would not give them the excuse of "well, you just need to use IE to run it". Sure, you can consider moving away, but the cost, disruption, training etc. means it gets put on the 'for another day' pile, especially when resources are stretched. If their software was to stop working in 12 or 18 months and they were to risk losing their whole customer base, then they would definitely get on with re-writing it to modern standards.

      It's the same reason we will still be stuck with IPv4 for the next 20 years.

      1. Charles 9

        Re: Compatibility is gonna be a problem.

        "I would sometimes prefer old stuff on Windows to break and fail. There are so many lazy software companies that still say "requires Internet Explorer x" or even Windows 7. Even under support contracts (niche software usually)."

        Sometimes, it's a matter of the software company not existing anymore, meaning they're kinda stuck with it and lack the budget to contract a replacement.

    3. David Austin

      Re: Compatibility is gonna be a problem.

      Expecting Apple to do what they normally do: Have great compatibility until they get bored with it. As you say, the mac ecosystem is very "Evolve or die."

      Not saying it's any better or worse than Microsoft's commitment to compatibility, even at the cost of system architecture improvements and security: They're just coming at the problem from two different philosophical ends of the spectrum.

    4. Michael Habel

      Re: Compatibility is gonna be a problem.

      Say that as much as you like. But at least you are going into it with the foreknowledge that a) it's a new platform, and b) you're going to need new software. Kinda beats the notion of releasing RT and insisting it's Windows, when you quickly discover only after the fact that the wretched thing can't even run Photoshop. Or a ribbon-less version of Office.

  7. Anonymous Coward
    Anonymous Coward

    It'll work.

    Apple has done a Mac CPU switch twice before. Macs started out on 68k Motorola CPUs. 68k failed to keep up, so Apple went for PowerPC (Motorola/IBM). PPC could have kept up, only IBM wanted paying up front to develop updated CPUs, so Apple went for Intel because Intel could deliver adequate CPUs for less money.

    Each time, the transition worked well enough.

    Each time, Apple provided adequate backwards compatibility by enabling most old CPU code to run on the new machines well enough for long enough for most users, and made arrangements for updated applications to install with code appropriate for both old and new CPUs (back in the 68k/PPC era, one could select one or the other so as to save on disc space, even if that did sometimes require a third party utility to strip out the unwanted code). I used Macs through both transitions. Rosetta - the original version from 2006 which let you run PPC applications on Intel Macs - worked very well. I expect Rosetta 2 will work better

    And now, Apple has figured out how to roll its own CPU well enough that it beats Intel's offering for PC/laptop jobs. So for the third time, Apple's going to switch the Mac CPU to a new architecture. It's a well-trodden path for Apple. The firm knows how to do it, the user base knows what to expect, and the developers know what to expect.

    Really, it's not that big a deal. Mac users will in future have more power-efficient hardware which runs the code they want to use at good speed with good reliability and good security. Some Mac users will end up having to abandon ancient software that's not being maintained (that'll be me, then). The change in Intel's sales will be tiny.

    But perhaps this is a sign of what is to come.

    Intel's x86 architecture has been around for a long time. Then again, x86 is more of a marketing term than anything else these days, since 21st century x86-64 architecture has pretty much nothing to do with the original 8086 CPU from 1978. Given how long Intel's been in the CPU game, and how well it's done, I suspect the firm will work out how to keep its head above water - and maybe the term x86 architecture is on its way to the history books.

    1. Charlie Clark Silver badge

      Re: It'll work.

      Macs started out on 68k Motorola CPUs. 68k failed to keep up, so Apple went for PowerPC (Motorola/IBM)

      PowerPC was a planned replacement for the 68k series, and Apple was part of the alliance behind it. The move to x86 would never have happened without Intel's support.

      1. Anonymous Coward
        Anonymous Coward

        Re: It'll work.

        "PowerPC was a planned replacement for the 68k series and Apple. The move to x86 would never have happened without Intel's support."

        Mmm.

        PowerPC was a joint project involving Motorola, IBM, and Apple. So the move to PPC would never have happened without support from Motorola and IBM.

        Apple was running MacOS X on Intel from the start of the MacOS X project - Intel CPUs were planned by Apple as a possible CPU for the Mac line for many years.

        As I understand it, when IBM wanted up-front money to develop the next generation of PowerPC CPUs, Jobs decided the time was right to switch to Intel. I seem to recall that it took about five years from the first Intel Macs before the high end Macs managed to switch, apparently because it took that long for Intel to catch up with the beefiest PPCs for the type of number crunching needed for video work and the like.

        I'm not sure exactly how much "support" Intel needed to give Apple to switch CPU architecture beyond, one assumes, some heavy discounts on the standard CPU purchase price for buying in large quantities and bigging up Intel's public persona (hey, even Apple's using our CPUs now!)

        1. Charlie Clark Silver badge

          Re: It'll work.

          Intel provided a lot of support for the compiler. Everyone seems to forget that Intel has a large software department and considerable expertise in compilers.

    2. beerfuelled

      Re: It'll work.

      One of the main issues I can see is with virtualisation. What happens with products like Parallels, VMWare, VirtualBox, Docker and even WINE? Unless I'm missing something then they will surely need to move into emulation rather than virtualisation.

      This might be a blocker to adoption for a lot of developers, and may well mean devs moving across to running Linux on x86 laptops.

      This wasn't really an issue with the transition to x86 as virtualisation wasn't widespread back then.

      1. Wyrdness

        Re: It'll work.

        "One of the main issues I can see is with virtualisation. What happens with products like Parallels, VMWare, VirtualBox, Docker and even WINE? Unless I'm missing something then they will surely need to move into emulation rather than virtualisation."

        They showed Parallels running Linux in the keynote. They also said that Docker works too.

        I believe them about Docker. I just spent a few minutes experimentally building an Arm Docker image on an i7 Linux laptop. I could even run the Arm Docker image on x86 Linux (Docker transparently uses QEMU). I then installed Docker on a Raspberry Pi, copied the Arm image over, and it ran fine.

        I believe that Docker Desktop on Mac already supports building Arm images, but I haven't tried that.

        I expect that we'll see a lot more Docker on Arm as cloud providers (like AWS) increasingly move to Arm servers for their performance per watt ratio.

        1. beerfuelled

          Re: It'll work.

          Cool. With x86 being so widespread in cloud providers I'm not sure I'd want to run the risk of weird issues of developing against a different architecture to the deployment architecture. But if QEMU works well enough then that's great.

          As you say though, this may potentially increase the demand for ARM in the cloud (to remove that disparity).

          Also for most workloads (running on a JVM, V8, etc.) architecture makes little difference anyway tbh!

        2. Richard 12 Silver badge

          Re: It'll work.

          It will have been ARM Linux unless they explicitly said it was x86_64.

      2. doublelayer Silver badge

        Re: It'll work.

        With Linux, you're likely to see few if any problems. For most Linux applications and components, source is available with changes needed for ARM already implemented for several other devices including ARM-based servers, Raspberry Pis, and everything in between. A lot of binary-distribution package repositories already have ARM-compiled versions stored, and things you build manually will likely also work fine. A few old closed-source components exist, but a lot of those are drivers for things you won't have on the new machine. If you're using unusual legacy hardware or something, maybe you'll have some difficulty, but that's basically it.

        For Windows, things are probably less rosy. Windows on ARM should work fine, but I don't know if Microsoft has any limits on what you're allowed to run that on (I don't know that you can just go out and buy an ARM-Windows installation disk; I think it's all been preinstalled versions on specific machines but I might be wrong). Microsoft probably has an incentive to facilitate that as soon as the new machines get released. However, ARM Windows can only emulate X86 (32-bit) internally, so trying to run X86-64 on it is probably going to be tricky. There are three solutions to this problem, each with their drawbacks:

        1. Don't run ARM Windows. Instead, let Apple's emulation run an X86-64 VM host which runs X86-64 Windows and X86-64 applications. If there are bugs in Apple's emulation, you'll probably see them here. I'm guessing it will work, but it will be terribly slow.

        2. Wait for Microsoft to get emulation for X86-64 on Windows on ARM, then things should work well. That should be fine, but it will probably take a while.

        3. If you have tasks that require running X86-64 code on Windows, don't buy an ARM device. Wait for compatibility to be improved on Microsoft's end, and stick to an AMD or Intel processor for the time being. Since existing Macs will continue to work and new ones will be split between ARM and Intel, you can still use one of the Intel models if your preference is to use a Mac as the main machine. In a couple years, I'm guessing the situation will have improved for this use case, or at least if it hasn't you'll know not to expect an imminent change.

  8. Steve Davies 3 Silver badge
    Facepalm

    Wot?

    Cook perhaps is referring to an update for Apple's Mac Pro workstation, where power utilization and battery life top out around 902 watts for the 2019 model.

    What battery, other than a very tiny watch-sized one, does the Mac Pro have? The Mac Pro, like the iMac, has to be fed with 230V AC (or 115V AC). I agree about the MacBook Pro, and that it needs more battery life.

    My 2015 MBP (bought secondhand) is getting rather long in the tooth. As I mostly write fiction on it these days, I may well get one of the new devices in a year or so. I'll let all the Fanbois rush out and get them first. Then they can wail long and hard about this function or that function not working like it did in 2009. ROFL. Things change, people. I may even pick one up secondhand at a good Apple-Tax-free price.

    1. werdsmith Silver badge

      Re: Wot?

      I was waiting for the MacBook Pro to ship with a working keyboard before buying, then WSL2 appeared and covered off a particular need on Win10, so I don't need to change yet. I will look again at the MacBook Pro when the ARM machines ship and, as you say, they've been shaken down in the wild a bit.

      Apple is following Linux, which is already very comfortably available on ARM. It is inevitable that Win10 will follow.

    2. Anonymous Coward
      Anonymous Coward

      2015 MacBook Pro

      I also have a 2015 MacBook Pro, but it still works just as well as it ever did (unlike my work 2011 MacBook Pro, which finally started to become noticeably sluggish when upgraded to High Sierra, and which probably really does need replacing now, not least because it is no longer supported by newer macOS versions - but that's not a bad lifespan overall).

      I did take the gamble (thanks to the non-upgradeability of newer Macs) of speccing more RAM and as big an SSD as I could afford when buying my own MBP, which I'm sure has helped. Yes, it's a painful price gouge at Apple's inflated upgrade prices, but you can almost convince yourself that it's worth it if you regard it as an extra £1/day over a year or so. Almost. But given the number of browser tabs I tend to have open, if there is one thing I've learned, it's to not skimp on RAM if you possibly can.

  9. thames

    Probably not a big deal, if you already have reasonably solid code.

    I ported a load of C code from x86 to ARM last year, using a Raspberry Pi as the testing target. It was relatively easy and I can't recall any particular problems. The main effort that I can recall was figuring out which compiler switches to use to specify the ARM CPU model, something that I expect Apple will take care of for developers.

    Of course previously I had already ported the code from x86-64 to 32 bit, and had it using multiple compilers and operating systems. That had wrung most of the problems out before I got around to ARM, as this process exposes a lot of latent bugs which you might get away with if you only use one target.

    Going from 32 bit ARM to 64 bit ARM later was painless, again once I had figured out which compiler switches to use.

    My conclusions are:

    1) Have a good testing set up.

    2) If possible try doing some sort of porting on x86 first, such as using different compilers to wring out the latent bugs.

    3) Have a good set of benchmarks for the performance-dependent code.

    4) Check to see if you are using any in-line assembler or compiler built-ins and have a strategy for dealing with those in the same code base.
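
    On point 4, one common strategy for keeping compiler built-ins in a single code base is preprocessor dispatch with a portable fallback. A rough C sketch, illustrative only rather than the code described above; it assumes a GCC/Clang-style toolchain where __SSE__ and __ARM_NEON are the relevant feature macros:

    ```c
    #include <stddef.h>

    #if defined(__SSE__)
    #include <immintrin.h>          /* x86 SSE intrinsics */
    #elif defined(__ARM_NEON)
    #include <arm_neon.h>           /* ARM NEON intrinsics */
    #endif

    /* Add two 4-float vectors using whichever intrinsics the target offers. */
    void add4(const float *a, const float *b, float *out) {
    #if defined(__SSE__)
        _mm_storeu_ps(out, _mm_add_ps(_mm_loadu_ps(a), _mm_loadu_ps(b)));
    #elif defined(__ARM_NEON)
        vst1q_f32(out, vaddq_f32(vld1q_f32(a), vld1q_f32(b)));
    #else
        for (size_t i = 0; i < 4; i++)   /* portable fallback */
            out[i] = a[i] + b[i];
    #endif
    }
    ```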

    1. Lee D Silver badge

      Re: Probably not a big deal, if you already have reasonably solid code.

      Same.

      The coding won't really change much, except the compiler or cross-compiler in use.

      I spent a lot of time cross-compiling to ARM for a number of handheld consoles (weird Korean things like the GP2X which had two ARM processors and ran off AA batteries). The cross-compiling was the easiest part of porting anything. It "just worked".

      Contrast with when I tried to compile a piece of software that also worked on Windows, Linux, x86 and ARM, 32 and 64-bit to any Apple platform - which basically is almost impossible without owning an expensive up-to-date Mac, paying for a developer's licence, and using XCode on that machine as the way to compile it. It literally stopped me even trying to support Apple devices. Bear in mind that that same code ended up on Wii homebrew, PSP and all sorts of other platforms with little more than a recompile and a tweak.

      I'm not worried about the ARM side. That's probably the most sensible decision Apple has ever made. But the development side, and the ability for Apple to create their own, custom, bespoke chip and throw whatever they want into it... that's going to be pretty scary. They'll make their things incompatible with everything else, and you'll have to use XCode to compile for them, and only the latest XCode, and that will only be updated on the top generation of Macs, and no cross-platform compiler will produce working code. That's what I see happening.

      Literally the advice the last time I looked was to use cross-platform development / debugging tools like Eclipse on every platform "except on Mac, where you just have to use XCode, either directly or as the compiler in the background of the Eclipse IDE".

      Apple's monoculture is going to hurt developers a lot. And, now, they can't even have a Mac with Bootcamp so they can have all their platforms on one device - Intel->ARM translation is NEVER going to be fast enough to be comfortable.

      1. gnasher729 Silver badge

        Re: Probably not a big deal, if you already have reasonably solid code.

        "you'll have to use XCode to compile for them, and only the latest XCode, and that will only be updated on the top generation of Macs, and no cross-platform compiler will produce working code"

        The latest Xcode runs on my 2015 MacBook Pro. If you are afraid to pay for the hardware or $99 a year to be able to put apps on the AppStore, then you are frankly an amateur.

        1. Lee D Silver badge

          Re: Probably not a big deal, if you already have reasonably solid code.

          Your 2015 Macbook Pro is more than I've paid for every machine I've ever used professionally in the last ten years, even for a second-hand purchase of a 5-year-old unit.

          I never said I wasn't an amateur. I'm quite literally a homebrew / hobbyist coder who has dozens of active projects, on dozens of platforms. And not one of them includes Apple because of their policies. I've been doing that for 20 years, and no other platform have I flat-out refused to code on because of the restrictions.

          I work on Intel/Linux, and I want to cross-compile from one machine to all targets, and test on-platform when it comes to things not working on a particular platform. You pretty much cannot do that for only one platform. There's a reason coders were buying Macbooks - because that's the one platform from which you could compile for Apple and bootcamp/cross-compile to literally EVERYTHING else. And now Macs will be ARM, and so Bootcamp will be a stunned sloth.

          Sorry, but the idea of having to have a particular computer just because it's the only way to compile software for that target is completely alien to everything I've done in the last 20 years.

          I'm not "afraid" to pay it. I refuse to. Because not one other mainstream mobile/desktop platform in the world is asking me to do that, or even hinting that it'll make things easier if I did. I code, compile, test, submit, and I'm done. No money changes hands. And at no point is my development environment or working desktop setup determined by anything except "what I already have / like".

          1. DemeterLast

            Re: Probably not a big deal, if you already have reasonably solid code.

            You may rethink that. ARM has been making some pretty impressive in-roads into the server market for what is basically a roll-your-own processor. This move by Apple may shake MS from its stupor and get them really trying with their ARM Windows project.

            The near future won't be all ARM all the time, but it's going to be a significant player.

            I'm not sure why you're down on the Apple hardware either. They've made some real stinkers, but the 2015 MBP isn't one of them. I'm looking to upgrade my 2011 MBP to one. In the Year of Our God 2020 no less. Hell, I keep a $99/year Apple Developer account even though I don't use it just in case I want to. I pay more than that for a stupid Dropbox account because stupid Dropbox double-dips on shared folders and the people I work with love their stupid Dropbox. At least Apple actually provides something meaningful for your yearly Ben Franklin tithe.

            I mean, it's no skin off my nose if you want to ignore a significant market out of pique, but I think it's shortsighted.

          2. thames

            Re: Probably not a big deal, if you already have reasonably solid code.

            I suspect that developers who target the Apple Mac product line specifically will now have to buy two Apple PCs instead of just one. You don't know if something works until you've tested it, and testing means having a running system.

        2. phuzz Silver badge

          Re: Probably not a big deal, if you already have reasonably solid code.

          you are frankly an amateur

          Plenty of software used by millions of people is written by amateurs. Linus Torvalds wasn't paid for his work on Linux until 2003.

    2. gnasher729 Silver badge

      Re: Probably not a big deal, if you already have reasonably solid code.

      "Of course previously I had already ported the code from x86-64 to 32 bit"

      The current macOS doesn't support 32-bit code. Newer iPhone ARM processors don't even have the capability of running 32-bit code. So everything is 64-bit - no problems.

  10. P. Lee

    Why?

    What’s the consumer benefit?

    I’m not convinced more battery life is enough motivation to break compatibility.

    Has Apple run out of ideas? $3bn is not a big saving for them. As people say, no one cares about the CPU.

    1. Mike 137 Silver badge

      Re: Why?

      Consumer benefit? Who gives a fetid dingo's kidneys?

      The only folks that matter here are the vendors. The fact that "It's been estimated that Apple's slow-motion abandonment of Intel silicon [...] could cost Intel about [...] 4 per cent of its revenue" is mentioned of course, but not the cost to the user. The whole idea is that you'll have to scrap all your applications and buy new versions from Apple.

    2. Dave 126 Silver badge

      Re: Why?

      1. Price of Intel CPUs

      2. Security issues with Intel CPUs

      3. Issues Intel has had with its stated roadmaps to new CPUs

      4. Integration with other components. Apple's SoC isn't just a CPU, it has a secure enclave running a verified microkernel controlling access to webcam and mic, storage controller, neural net accelerator, Apple's own GPU, video codecs, modem, low power always-on if desired, etc etc

      5. Battery life

      6. Compatibility with old stuff isn't an issue for maintained software. For unmaintained software, there are several compatibility approaches. And for users for whom even these steps are insufficient, Intel Macs will be sold for a couple more years.

    3. A Non e-mouse Silver badge

      Re: Why?

      I think Apple's argument is that they feel Intel's CPUs have stagnated whereas their own ARM-based SoCs have grown (in performance) in leaps and bounds. Also, by owning the entire stack (CPU all the way up to software) they can design for just one customer with well-defined use cases, rather than producing a more general design.

      This is a brave move by Apple (in my opinion far riskier than the other two CPU transitions they've performed). Only time will tell if it's a good idea or not.

    4. Anonymous Coward
      Anonymous Coward

      Re: Why?

      It is mainly about control and vertical integration.

      Apple don't like to be beholden to anyone, especially not a company the size of Intel.

      1. Anonymous Coward
        Anonymous Coward

        Re: Why?

        Re: "It is mainly about control and vertical integration."

        I suspect the motivation may also be to blur the line between iPad and MacBooks.

        And/or to get desktop apps on an iPad Pro if you attach a keyboard, on the same CPU as iPads.

    5. DJV Silver badge

      Re: Why?

      They'd run out of room on "10/X" stuff and had to do a Spinal Tap!

      Oh, and they needed to let Craig Federighi out so he could publicly salivate over their new icons.

      1. Dave559 Silver badge

        Re: Why?

        If there won't at some point be a release of MacOS 11 nicknamed "Stonehenge", I'll be very disappointed…

    6. Anonymous Coward
      Anonymous Coward

      Re: Why?

      "What’s the consumer benefit?

      I’m not convinced more battery life is enough motivation to break compatibility."

      When Apple switches Macs to a new CPU architecture, it makes sure that most things don't break. It's done it twice before.

      As for why: Apple won't tell us the real reasons, but my guess is that this way, Apple gets cheaper CPUs which work better for Apple's specific applications, because Apple can optimize them for the jobs Apple wants to do.

      From Apple's point of view, simply not being dependent on Intel's ideas on what's best for a PC CPU has to be attractive. It's got a lot of experience in designing ARM systems by now, and one assumes that Apple's engineers have demonstrated that Apple's approach does in fact provide Apple with the benefits it was aiming at.

      For us end users? <shrug> As ever, we can only get what's made available. If we don't want what Apple's offering, we don't have to buy it.

    7. gnasher729 Silver badge

      Re: Why?

      I expect Apple to have a laptop/desktop chip with 6 or 8 fast cores at 3 GHz or higher very soon. With that you get a significant speed gain for every Mac up to 8 cores. At a lower cost.

  11. SecretSonOfHG

    Keyword here is "maintained"

    Compare Apple's approach to Windows and the differences are clear: Windows runs code from 30 years ago, and will likely continue to run it unchanged. Thus, businesses with legacy apps (many of which no longer have source code, or cannot be rebuilt on a modern system) can continue purchasing and paying for Windows, as the cost of upgrading those apps is several orders of magnitude bigger than paying that small Windows license tax.

    So Windows thrives on keeping old hardware and software up and running. Apple thrives on selling hardware with obscene markups, so they need you to change your gear periodically. Line-of-business apps be damned - I'm not sure how many of those exist for iOS, but I suppose the number is close to zero.

    1. Anonymous Coward
      Anonymous Coward

      Re: Keyword here is "maintained"

      Bang on. Microsoft knows it's providing a platform for other software to run on, across a wide range of hardware, whereas the Mac is an ecosystem that evolves relatively quickly, leaving a trail of extinction in its path.

      Mac users can be split into 2 groups:

      1: "I always use Apple, and everything I own is Apple because all my Apple stuff runs well together."

      2. graphics/video folks, whose high-end software often relies on manual assembly optimisations for performance. Even if they can recompile perfectly for similarly "fast" processors, they're still going to see a performance hit until the developers understand the "tricks" of Apple's ARM processors.

      1. Maventi

        Re: Keyword here is "maintained"

        > Mac users can be split into 2 groups.

        I know many Mac users, but the majority of them don't fit into either of the groups you list.

        1. doublelayer Silver badge

          Re: Keyword here is "maintained"

          I'd add a few other groups:

          3. People who have used Macs before and gotten used to them. Now they'll keep doing so because they don't like change. This applies to Windows too.

          4. People who use Macs because they already have one and it works fine. They might decide to switch if theirs breaks, but they'll think about it then.

          5. People who have some specific application that's Mac-only. Similar to people who have some application that's Windows-only, these people are mostly attached to their device because the program runs. If an update doesn't work, they'll probably just stick to the old version.

          1. Anonymous Coward
            Anonymous Coward

            Re: Keyword here is "maintained"

            6. People who use Macs because they appreciate the build quality, power management, resale value and like the workflow of macOS.

            The majority of users I know fit into 6.

            1. doublelayer Silver badge

              Re: Keyword here is "maintained"

              You basically have to fit into 6 in order to get into any of the other groups. If you don't like Macs or Mac OS, you likely don't get one in the first place. The question these numbered groups answer is why people stick to Macs when they could do something else, and it usually comes down to liking them more, having a speedbump to moving to something else, or a dislike for change. While I don't have a problem with that characterization, I think the people you know fall into it because it's basically a superset of all the other groups.

              Consider me, for instance. I fall into group 6, because Mac OS gives me a nice set of supported applications with Unix tools. For this reason, I run a Mac alongside various Linux machines. However, I'm also in group 4. When my current Mac dies, I will consider the available options for replacing it, including not using a Mac for a while. Depending on what the current lineup looks like, I will either be interested or I won't, and that will be the main factor in the decision. There are some who would not be making a decision at all, and those are the people who don't fall into group 4.

              1. Anonymous Coward
                Anonymous Coward

                Re: Keyword here is "maintained"

                Right, I hear you. I'm the same - group 6 normally, but it was a tough call on my last upgrade to not throw in the towel and get an XPS running Ubuntu. Once Linux folks improve the power management I may be ready to jump ship.

    2. Torben Mogensen

      Re: Keyword here is "maintained"

      "Compare Apple's approach to Windows and the differences are clear: Windows runs code from 30 years ago, and likely will continue to run it unchanged."

      30 year old (or even 10 year old) software should run fast enough even with a naive cross-compilation to modern ARM CPUs. I occasionally run really old PC games on my ARM-based phone using DosBox, and that AFAIK uses emulation rather than JIT compilation.

      And running really old Windows programs (XP or earlier) is not really that easy on Windows 10.

      1. Kristian Walsh Silver badge

        Re: Keyword here is "maintained"

        "30 year old (or even 10 year old) software should run fast enough even with a naive cross-compilation to modern ARM CPUs"

        Cross-compilation is fine, but what exactly are you planning to link against?

        Apple entirely broke backward compatibility in 2001 when MacOS X was launched. Code written for the previous OS's API, the Macintosh Toolbox, will not run without an application host ("Classic") which has not been supported for about 15 years.

        Newer code that ran on OSX may also fail to run on the latest release of OS X due to dependencies on CarbonLib, the API that allowed easy porting of Macintosh Toolbox applications to OSX. CarbonLib was finally discontinued this year (the writing was on the wall back in 2011 when Apple announced that they would not port it to 64 bit).

        In short, getting very old Mac software to run on new Macs generally isn't possible: even if you have the source code, chances are the old software links against an API library from Apple that is no longer supported (one runtime workaround is sketched below).

        (Windows isn't perfect either: 16-bit software has not been supported for a very long time, and 32-bit is going the same way, but Microsoft does provide better support for getting old source code to compile and run on a new OS than Apple does.)
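
        One partial workaround for the "links against an API that is no longer supported" problem - where the dependency is a handful of legacy calls rather than a whole framework - is to stop hard-linking and probe for the symbol at runtime instead, falling back to a modern code path when it's gone. A minimal sketch in C; "SomeLegacyCall" is a made-up placeholder, not a real Apple API:

          /* Sketch: probe for a possibly-absent symbol at runtime instead of
           * hard-linking against a library that may no longer ship.
           * "SomeLegacyCall" is a made-up placeholder, not a real Apple API. */
          #include <dlfcn.h>
          #include <stdio.h>

          typedef int (*legacy_fn_t)(void);

          int main(void)
          {
              /* RTLD_DEFAULT searches every image already loaded into the process. */
              legacy_fn_t legacy = (legacy_fn_t)dlsym(RTLD_DEFAULT, "SomeLegacyCall");

              if (legacy != NULL) {
                  printf("Legacy symbol present, result = %d\n", legacy());
              } else {
                  puts("Legacy symbol missing - using the modern code path instead.");
              }
              return 0;
          }

        That only helps where a replacement code path exists, of course; it does nothing for binaries whose source is long gone, which is the wider point above.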

        1. Charles 9

          Re: Keyword here is "maintained"

          If the software is THAT old, it was probably built for the old Power Macs and so on, and should be put in a virtual machine in any event. There's no way to run '90s DOS programs and 16-bit Windows apps natively on a modern Windows machine anyway, given the lack of things like legacy ports, so why can't virtualisation be used? The exception is custom hardware, like that lathe I read about here years back (which HAD to stay on XP because the custom controller board was ISA, and ISA was dropped in Vista).

  12. Oh Matron!

    Patrick Moorhead

    Obviously didn't watch the keynote: "changes in any of their code...." If you've done everything properly and not used poorly maintained or unmaintained libraries, then you "should" be okay with just a recompile (see the sketch below).

    However, I realise that third-party libraries are both the saviour and the bane of a developer's life.

    But - and I see this so much in Agile - developers use libraries because they are either lazy or under too much pressure to meet those user stories.
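
    For code that really is architecture-neutral, that "just a recompile" largely amounts to asking Clang for both slices and checking the result. A minimal sketch - the file name is made up, while the clang, lipo and arch invocations in the comments are standard Apple toolchain usage:

      /* fatcheck.c - a trivial program to illustrate a universal ("fat") build.
       *
       * Build both slices in one step, then inspect and exercise them:
       *
       *   clang -arch x86_64 -arch arm64 -o fatcheck fatcheck.c
       *   lipo -archs fatcheck        # expected output: x86_64 arm64
       *   ./fatcheck                  # runs the native slice
       *   arch -x86_64 ./fatcheck     # on Apple silicon, runs the Intel slice
       *                               # via Rosetta 2 (if Rosetta is installed)
       */
      #include <stdio.h>

      int main(void)
      {
      #if defined(__arm64__)
          puts("Running the arm64 slice.");
      #elif defined(__x86_64__)
          puts("Running the x86_64 slice.");
      #else
          puts("Built for an unexpected architecture.");
      #endif
          return 0;
      }

    Where the recompile isn't that simple, it's usually for exactly the reasons above: third-party libraries that don't yet ship arm64 slices, or hand-written x86 intrinsics and assembly that need a NEON or plain-C equivalent.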

    1. Mage Silver badge
      Coffee/keyboard

      Re: Then you "should" be okay with just a recompile

      OK for devs, but not much good for Users. But the likes of Adobe etc only want to rent software anyway.

      1. wjake

        Re: Then you "should" be okay with just a recompile

        "But the likes of Adobe etc only want YOU to rent software anyway."

        FTFY

        1. Charles 9

          Re: Then you "should" be okay with just a recompile

          Renting can refer to both directions. For Adobe's side, it's usually termed "renting out".

  13. trevorde Silver badge

    Cock of the walk one week, feather duster the next

    My ex-boss splurged about £30k on a top-of-the-line Mac Pro, complete with 768GB of RAM, plus two £5k Apple display monitors *and* £1k stands. He also spanked about £3k on some NVMe RAID drives. Bet he's really p!553d off that it's all obsolete now!

    1. Anonymous Coward
      Anonymous Coward

      Re: Cock of the walk one week, feather duster the next

      If he splashed out £30k for a Mac with a (probably) pointless amount of RAM, then I would presume he'll be excited about having a new toy to splash another load of cash on. I'm sure the monitors and RAM will still work in the new machine as well - although with Apple, you can never be sure.

    2. Wyrdness

      Re: Cock of the walk one week, feather duster the next

      How's it obsolete? Apple haven't replaced it with anything yet and we don't know when they will. They've even said that they've got new Intel Macs in the pipeline, and will be supporting Intel Macs for years to come.

    3. gnasher729 Silver badge

      Re: Cock of the walk one week, feather duster the next

      I think if your boss was clever enough to make enough money to buy a £30k Mac, then he will have the mental capacity to realise that this Mac will continue running just fine for many years to come. ARM processors that can replace a 28 core Intel processor will be the last ones to arrive anyway.

      1. timrowledge

        Re: Cock of the walk one week, feather duster the next

        Already here.

        https://store.avantek.co.uk/avantek-384-core-cavium-thunderx-arm-server-h270-t70.html

    4. Lee D Silver badge

      Re: Cock of the walk one week, feather duster the next

      Serious question:

      What does he do with it?

      Because that's more money than I've ever spent in any single transaction, even financed, unless you count a house purchase (and even that, the deposit was cheaper).

      If I spent £30k on a PC, it would literally blow anything Apple out of the water so far they'd be in orbit.

      It barely cost £6k for a 768GB RAM machine from Dell, with serious server processors. I can't justify that amount of RAM in my professional life running entire networks, so what the hell justifies that amount of RAM in a single machine for a single user?

      I know you'll say video-editing or something, but there are far cheaper ways to get a machine capable of that kind of spec than a Mac.

      1. Anonymous Coward
        Anonymous Coward

        Re: Cock of the walk one week, feather duster the next

        Agreed - same reason that, years ago, the lab where I worked changed from Sun and SGI workstations to Dell Precision workstations running Linux. You could get something as powerful, but not quite as shiny-shiny as the Sun or SGI, for about a third of the cost. And where are Sun and SGI now?

      2. Charles 9

        Re: Cock of the walk one week, feather duster the next

        Well, what's the state of the art of video editing on both platforms these days? I know professional video studios commonly swore by Apple machines in the past.

      3. Matt_payne666

        Re: Cock of the walk one week, feather duster the next

        well, if we are willy waving....

        My SGI Octane2 was about £30,000 new

        My older Silicon Graphics (not SGI) Onyx InfiniteReality workstation was closer to £300,000

        Apple, schmapple... far-out design? That was the late '90s!!!!

  14. Lee D Silver badge

    So this means that Macs will be pretty much useless for running Windows via Bootcamp, one presumes?

    1. gnasher729 Silver badge

      "So this means that Macs will be pretty much useless for running Windows via Bootcamp, one presumes?"

      It was shown to run Intel apps in a Linux VM. No technical reason not to run Windows in a VM. Bootcamp will likely be gone, but then a VM allows you to run MacOS and Windows at the same time. The only ones with a problem are people buying a Mac exclusively to run Windows.

      1. Torben Mogensen

        "The only ones with a problem are people buying a Mac exclusively to run Windows."

        That used to be a thing, when Macs were known for superior hardware and design, but these days you can get Wintel laptops with similar design and build quality for less than equivalent Macs. So while a few people may still do it, it is not as widespread as it was 15 years ago. It is certainly not enough to make a dent in Apple's earnings if they switch to other brands.

        Besides, if you have already bought a Mac to run Windows, you should not have any problems: Windows will continue to run just fine. Just don't buy one of the new ARM-based Macs to run Windows, buy Asus, Lenovo, or some of the other better PC brands.

      2. Lee D Silver badge

        An Intel Windows image running in an ARM OS virtualisation hypervisor is *not* going to be virtualisation. It's emulation, effectively.

        You can only virtualise when the underlying architectures are the same, and even with any amount of clever tricks, it's going to be dog-slow in comparison.
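
        Related to the native-versus-translated distinction: on an Apple silicon Mac a process can at least ask the OS which situation it is in, via the sysctl.proc_translated sysctl that macOS exposes for Rosetta 2. A minimal sketch in C:

          /* Sketch: ask macOS whether this process is running natively or as a
           * translated (Rosetta 2) x86_64 binary. sysctl.proc_translated is 1 when
           * translated, 0 when native; the call fails with ENOENT on systems that
           * have no concept of translation. */
          #include <sys/types.h>
          #include <sys/sysctl.h>
          #include <errno.h>
          #include <stdio.h>
          #include <string.h>

          int main(void)
          {
              int translated = 0;
              size_t size = sizeof(translated);

              if (sysctlbyname("sysctl.proc_translated", &translated, &size, NULL, 0) == -1) {
                  if (errno == ENOENT)
                      puts("Translation status unavailable on this system.");
                  else
                      printf("sysctlbyname failed: %s\n", strerror(errno));
              } else {
                  puts(translated ? "Running under Rosetta 2 translation."
                                  : "Running natively.");
              }
              return 0;
          }

        That only covers Apple translating Mac apps, of course; a full Intel Windows guest on an ARM host would indeed need instruction-set emulation, with the performance cost that implies.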

    2. werdsmith Silver badge

      "So this means that Macs will be pretty much useless for running Windows via Bootcamp, one presumes?"

      It is only a matter of time before Windows follows Linux and Apple onto ARM.

      1. phuzz Silver badge

        Windows CE has supported ARM for years, and there's also Windows 10 on ARM, which as far as I can tell is pretty much Microsoft's equivalent of everything Apple has just announced (x86 emulation and all).

    3. Charlie Clark Silver badge

      Probably, for the two who do it. It was fashionable when the Intel Macs came out for people who wanted the status of Apple hardware to run their Windows apps but virtualisation was generally good enough for most things - I was certainly using Windows XP on Parallels to remote control InDesign in 2008 without many problems.

  15. davenewman

    Safari still bad at video conferencing

    One reason fewer people use Safari on the desktop is the trouble they often hit when running video-conferencing systems on it - particularly the ones that don't have their own client but use WebRTC - as I found out when running an 800-person conference in Hopin.

  16. 45RPM Silver badge

    When Apple announced that it was going to transition to PowerPC, I was nervous. I didn’t like it. And yet, when it happened, nearly all of my old software ran perfectly - and that which didn’t hadn’t worked properly since the move away from System 6 anyway, so it wasn’t the new CPU at fault.

    When Apple ditched Nubus for PCI I was similarly concerned. Why? I have no idea. It’s not as if my Mac was stuffed full of Nubus anyway. But Nubus was a familiar old friend - and PCI came from that other place.

    The move to OS X didn’t worry me too much. I’d been using A/UX. From my perspective, it was just the resurrection of a great OS which had been dead for a few years - not that OS X, being derived from NeXT, had any A/UX in it anyway - but the concepts are remarkably similar.

    The move to Intel only worried me insofar as I thought that developers might concentrate on Windows only, and just wrap their software in a translation layer to support MacOS, resulting in a drop in quality. This happened with games, and not much else. As for buying a Mac just to run Windows - I still think that's barking mad. There's some very nice Wintel hardware out there - you don't need to buy Apple to get great hardware - you just need to be prepared to pay a hefty chunk of money (which you were going to do anyway if you buy Apple).

    From what I’ve seen, the new Macs will run macOS software just fine - and that includes virtualisation and emulation software (so you can still have Windows if you really must). Docker is present, correct, and fully supported by Apple. You can have Linux too (although, and much as I like Linux - it’s on all my servers, and also on my media centre, why would you want to? MacOS gives you Unix and a nice GUI so, there at least, Linux isn’t necessary.)

    My only concern now is that Apple might prevent users from installing software from whatever source they choose. As long as macOS hasn’t been locked down in this manner I think that all is still well.

    It’s human to fear change. The intelligent thing to do is to put those fears behind you, make a rational judgement on whether the change is beneficial or not - and then embrace it if it is. In an age of climate change, anything that provides more CPU power per watt has to be a good thing. Intel has been as relevant as PPC for a little while now - I’m only surprised that this didn’t happen sooner.

    1. DemeterLast

      Often forgotten is the freedom that Apple has, as a near permanent also-ran in the PC world, to run around and break from tradition. People lost their damn minds when the iMac came without a floppy drive and was all USB. "Madness!" they cried from behind their Windows PC. "Apple is going to destroy itself!"

      In the meantime, Apple the company has put together a string of very public and very successful fundamental processor changes with very little disruption, using very clever software hacks to move from instruction set to instruction set. If it wasn't smug, cash-rich, hipster-adored Apple doing it, tech people would be in awe.

      This move is a good one for Apple, if they learn something from the success of their iPhone SE line. An ARM Macbook Air with decent RAM/disk specs priced around $800 would utterly own the education market. They will be able to run the educational iOS apps already on the market with the provisioning by Jamf or whatever, and Apple will brainwas... I mean build a new customer base for the next 20 years.

      1. 45RPM Silver badge

        It is fun and fashionable to laugh at hipstery Apple - and, certainly, their keynotes are risible (but not in a good way - to my mind they feel very contrived). I also object to the way that people buy a product for reasons of fashion, rather than function. It's the same reason that so many people buy BMWs, Audis, Mercedes etc. Sure, for one or two people they might be the perfect car - for most though, it's just the cool thing to do. Blegh.

        But your use case and mine are different. For some people, gamers mostly as far as I can see, Windows is genuinely the best option. Linux is the best option in many cases. But, for what I need to do, macOS is absolutely the right OS - and, for my use case, it has been for thirty years, through the good times and the bad. Fashion has nothing to do with it.

        Apple has been guilty of nicking its fair share of ideas over the years. Superclock! and Watson to name two. It's also been accused of nicking things that were, at best, only inspirations - like the Xerox GUI (Apple's use of icons, menus and overlapping windows hadn't been seen before Apple came up with them). It's also been first with other personal computer technology - modern multimedia (with Quicktime), password managers (with Keychain, first seen back in 1993 in System 7.1 Pro), multiple monitor support (some time before 1987), metadata filing system (even before the Mac, back in 1982). I'm sure if I thought about it I could probably come up with some other examples too, but those are just the ones that occur to me at short notice.

        So, whilst fashion followers are always slightly irritating, it's unfair to characterise what Apple has done as brainwashing. Sometimes their technology is just better for a given use case - and, if enough people have that use case and if the competition can't copy the technology fast enough, brainwashing isn't necessary.

        1. gnasher729 Silver badge

          “ Apple's use of icons, menus and overlapping windows hadn't been seen before Apple came up with them”.

          The overlapping windows story is funny. When Bill Atkinson returned to Xerox for a second visit, he remarked how difficult overlapping windows had been to implement. “What overlapping windows?” “The overlapping windows on your desktop”. “We don’t have overlapping windows”. “I’m sure I saw overlapping windows when I was here the last time”. “We’re sure you didn’t.”

      2. Charles 9

        "People lost their damn minds when the iMac came without a floppy drive and was all USB. "Madness!" they cried from behind their Windows PC. "Apple is going to destroy itself!""

        Apple may have been a little ahead of the curve there, but not by much. By the turn of the century, USB sticks in sizes of 8MB and more showed the way forward, given they weren't as constrained by physics as magnetic disks and, later, optical discs.

    2. MrBanana

      "From what I’ve seen, the new Macs will run macOS software just fine"

      The new Macs will run macOS software just as badly as before - FTFY

  17. Jason Hindle

    New Intel Macs in the pipeline?

    "In fact, we have some new Intel-based Macs in the pipeline that we're really excited about."

    I wish Apple the best of luck selling those!

    1. Anonymous Coward
      Anonymous Coward

      Re: New Intel Macs in the pipeline?

      "In fact, we have some new Intel-based Macs in the pipeline that we're really excited about."

      If I recall correctly, it took Apple about five years to fully switch to Intel CPUs - during the transition period, the firm was selling lots of high end G5 PowerPC Macs alongside cheaper Intel Macs, because the Intel Macs just weren't fast enough.

      1. doublelayer Silver badge

        Re: New Intel Macs in the pipeline?

        I'm not sure that's true. The earliest Intel models were made available in January 2006 (iMac and MacBook Pro). By May 2006, just four months later, the last PowerPC laptop was discontinued. By August, the PowerMac line was discontinued. The only remaining models were rack servers, and they only made it until November, just ten months overlap.

        The only way your figure might work is with software support--Apple didn't drop support for PowerPC machines until they released Snow Leopard in August of 2009. Still, that's only 3.5 years after the first public availability of an Intel model, and I think that's not the right figure to use. I think the seven to ten month number is closer to what we're discussing.

        Source: I used the database from the Mactracker app.

  18. Mage Silver badge
    Angel

    Yes and No.

    I said years ago that Apple would migrate the Mac to ARM. When it suited them. People laughed at me.

    Perhaps big companies over use branding, like MS calling things "Windows" that are very different and may or may not be compatible for existing applications.

    Apple "Mac" 68000, Power PC, x86 and recently x86-64 with 32 bit forbidden. Now ARM. Maybe all these families should have used different branding after the Mac 68K.

    Also, Mac OS X is based on BSD via NeXTSTEP - a totally different (and better) thing to Mac OS 9 and earlier. At least with the iPhone they didn't make the mistake MS made with PDAs and phones: they called the OS iOS rather than "pocket Mac OS".

    1. 45RPM Silver badge

      Re: Yes and No.

      In fact, IIRC, the first iPhone was described as running OS X - before they renamed it iPhone OS and then iOS.

  19. andy 103
    Thumb Up

    No different to changing ports

    This whole thing reminds me of the debates that have gone on about Apple dropping certain ports from their hardware. Whether it's a 3.5mm jack on a phone, or a USB 3 port on a Macbook.

    Things move on.

    Years ago I had an AMD based laptop that had a parallel port, serial port, infrared (yes, infrared) and CD drive. Is any of that something I miss now? No, not at all. Same goes with the 3.5mm jack on my previous iPhone compared to the one I have now. Or that I can't plug in that 13 year old USB 2 printer to my Macbook.

    The clear reason they've done this is that they feel it's a genuine improvement over any of the Intel offerings - particularly given how little control they have over the production and design of those chips. Given Apple's propensity to make money, they must be pretty confident consumers and end users won't miss an Intel processor. Whatever your view, it's not a step backwards, even though at the time it might seem that way - in the same way that nobody will care that their phone doesn't have a 3.5mm jack, once they accept that things naturally progress and that you don't carry on doing things a certain way just because "it's what we know".

    Also remember that ~80% of users of their hardware are non-tech people who don't even know - or care - what processor their device has, as long as it works, and works well. Their target market isn't "person who gives a shit what architecture the CPU is". It's, "person who will spend money on this device". Interestingly, Apple has a very good track record at the latter.

  20. Anonymous Coward
    Anonymous Coward

    RIP Hackintosh

    I can't see people being able to get newer versions of macOS working on non-Apple hardware a few years down the line. A shame, because my kid's chuffed with macOS running on his 8-year-old Dell.

    I do feel like Apple are consistently missing a trick here - not everyone can afford to buy their expensive hardware, but we could do with a real alternative to Windows that can run the likes of Office and Creative Cloud without having to faff with dual boot or VMs.

    1. Torben Mogensen

      Re: RIP Hackintosh

      "I do feel like Apple are consistently missing a trick here - not everyone can afford to buy their expensive hardware"

      Apple is primarily a hardware company, and they actively oppose running their software on other hardware, as the software is mainly there to sell the hardware. Same reason they don't license iOS to other phone makers.

      1. andy 103
        Facepalm

        Re: RIP Hackintosh

        "Apple is primarily a hardware company, and they actively oppose running their software on other hardware, as the software is mainly there to sell the hardware."

        There's also a bigger reason. Consider all of the variants of Android, running on a huge number of different devices comprising hardware from different manufacturers.

        There's a reason people say Apple's offerings "just work": in terms of iOS, it's literally one OS with no variations (aside from version releases), running on hardware that doesn't vary much. They know their software works on their hardware, so supporting it isn't as expensive as it would be with a load of variations thrown in.

        That's why it "just works". That's why it sells. The last part is what Apple (a business) care about.

        "Not everyone can afford to buy their expensive hardware" - but there are plenty who can, and they are Apple's target market. They don't care about people who won't give them money. Because they are, you know, a business.

        1. Anonymous Coward
          Anonymous Coward

          Re: RIP Hackintosh

          All fair points. But given Apple make a big cut on App Store purchases, there's a decent chunk of the market there that they're potentially missing out on too.

      2. Lee D Silver badge

        Re: RIP Hackintosh

        Got nothing to do with people discovering that MacOS is just shiny interface over poor hardware, then?

        I ran a MacOS VM inside VMware for years - allocating it all the resources of the Mac hardware that people were running it on, it looked very slick. But it's all looks. Sure, the slidey bar at the bottom was smooth as silk (pre-generated cached bitmaps in a range of sizes to make it look like it was shrinking/growing the icons with real-time distortion). But under the hood it was pathetic in performance.

        In reality, the hypervisor running that VM had something like 3-4 times more resources, and laughed at running MacOS in a VMWare box that out-specced anything Apple were selling at the time. And it was running on a laptop. A second-hand laptop. A nice one, no doubt, but not some £3000 monster. And it could virtualise MacOS with equivalent specs while I encoded video and played games in the background. It laughed at what MacOS required and the MacOS VM still worked faster than a real Mac.

        MacOS was all show, and it wasn't that long ago that I did it.

        I'm not sure that there is a serious hardware person out there who thinks Mac hardware is well-specced, certainly given the price. Going to Intel showed just how far behind they were.

    2. Havin_it

      Re: RIP Hackintosh

      I think you overlook the amount of resources MICROS~1 have to sink into making Windows compatible across every rando whitebox PC in all creation. (Never enough, but hey at least they try.)

      That's their choice and the ubiquity it's earned them has made it worthwhile I guess, but I can't really blame Apple or anyone else for not wanting the hassle.

  21. Anonymous Coward
    Anonymous Coward

    Dec Rainbow

    Some of the comments remind me of my first desktop PC at work - a DEC Rainbow that had two CPUs. There was a Z80 to run CP/M and an 8086 (or 8088 - I can't recall which) to run the new-fangled DOS. It had two 5.25" floppy drives and a 10MB HD.

    I never booted up CP/M (other than as an experiment) as all work was done in DOS - either a terminal emulator to access the corporate network (and the telex system), or Lotus Symphony, the latter providing a spreadsheet (plus integrated database and word processor) that was central to our team's work.

    Memories!

    1. Warm Braw

      Re: Dec Rainbow

      The Rainbow was a rather half-hearted hybrid. Although it ran DOS it was neither hardware nor BIOS compatible with IBM PCs and initially software that used anything other than DOS system calls wouldn't work. Later updates made it more compatible, though never totally.

      It demonstrated that "partly-compatible" isn't really a way forward against another vendor's product. Apple, of course, will be able to optimise the level of compatibility to placate existing customers whilst still encouraging them to buy new hardware.

    2. NetBlackOps

      Re: Dec Rainbow

      Which reminds me of my Amiga 2000, equipped with a 16 MHz 68030/68882, 4 MB of 32-bit RAM and a modified Bridgeboard with an 80286, hardware-compatible with an IBM PC. I even ran Mac and BSD software. There wasn't anything, as I recall, that it couldn't run - natively, virtually or under emulation. A really nice Swiss Army knife at the time.

      1. Charles 9

        Re: Dec Rainbow

        But I'd have to wonder about the price tag. I still remember eyeing those computers in the late '80s/early '90s, but I also recall the price listings. Plus the fact that PC tech had moved on by then.

  22. Anonymous Coward
    Anonymous Coward

    Another win for ARM

    There's a good chance it will soon rival Intel in revenue. Sadly our politicians in the UK are a bunch of techno-illiterate cretins (demonstrated only last week by the failure of their "world-beating" COVID tracing app) and allowed this crown jewel of the UK tech industry to be sold to Japanese company SoftBank without even a cursory check of whether it was a good idea for the UK economy and security. It makes me weep.

    1. Anonymous Coward
      Anonymous Coward

      Re: Another win for ARM

      Depends on your point of view. You can look at it as a Japanese company injecting nearly £30billion into the British economy, without moving the jobs or tax base out of the country. That's probably a whole lot better than it being bought by Apple, Intel, or some other American corporation. And whilst it's a UK company it could in extremis be nationalised if that truly were in the national interest.

      In the meantime it's a significant tie-up between two tech companies in two countries that, very soon, will really want close trading relationships (albeit due to some other "courageous" (Applebian interpretation) though democratic decision), and what better to help cement a political deal than a nice, juicy, friendly commercial deal? And if you want to get really abstract about it, letting one company be bought by another when there are no global competition objections is a good way of saying "Britain is a good, no-nonsense place to do business", which, especially in today's world, is really rather important to the health of the economy and all our future job prospects. Britain being a good place to do business has to date been the only reason why there are so many large foreign car manufacturers set up in this country.

      If you want to look at what happens when you have an interventionist government in the mould of Comrade Corbyn, look no further than communist Russia, present-day China, or North Korea. That might be the way you want to vote, but clearly not many people here agree with you.

      1. Anonymous Coward
        Anonymous Coward

        Re: Another win for ARM

        Well, that's certainly the free-market way of looking at it.

        "You can look at it as a Japanese company injecting nearly £30billion into the British economy"

        I'm not sure buying out shareholders counts as injecting money into the economy when many of the shareholders were/are foreign businesses.

        "without moving the jobs or tax base out of the country"

        And that could change at any time, depending on how the economic winds blow. Plenty of large British science, tech and engineering companies have been bought up in the past (Westland, Marconi, ICI - the list goes on) where initially they kept a presence in the UK, but eventually that was downsized and disappeared altogether. And "tax base" is a weasel term - sure, the employees still pay tax, but company profits disappear abroad and the exchequer sees very little corporation tax.

        "If you want to look at what happens if you have an interventionist government in the mold of Comrade Corbyn"

        Would you call France interventionist Marxists? They have much stricter rules on who can take over economically vital companies. You might think the free market should be left to its own devices, but all you get if you do that is a race to the bottom, with companies being asset-stripped for their IP and the remains thrown on the bonfire along with a lot of their employees and their former economic stimulus.

  23. Wade Burchette

    One doubt

    I can see many reasons for this move. And if anybody can pull it off, it would be Apple. I do have one doubt, however: for some reason, I doubt the prices of Apple computers will drop with this move.

  24. OldSoCalCoder

    hand-wringing

    No comments on the last three paragraphs about the automatic handwashing detection? Are you kidding me? First of all, what's watchOS doing listening in on my visit to the bathroom? Gee, I'm really looking forward to '...get a little coaching to do a good job' from the geniuses at Apple regarding my handwashing skills. Maybe this isn't designed for adults, but you're supposed to strap $500 Apple Watches to your children instead of teaching them simple hygiene yourself? I would sincerely like to tell the VP of technology et al at Apple to come up with an ass-wiping machine-learning app that starts with them all shoving their Apple Watches up their collective asses. Sorry, but this just sounds like too many smart people with too much time on their hands, or absolutely clueless about the meaning of 'unnecessary intrusive technology'.

    1. Anonymous Coward
      Anonymous Coward

      Re: hand-wringing

      It makes sense, I always knew people with Apple watches were a bunch of automatic handwashers, if that's what they're calling it now.

    2. anthonyhegedus Silver badge

      Re: hand-wringing

      I wear my watch on my left hand, and I wipe my bottom with my right hand. Will I need to buy TWO Apple Watches now to make sure I'm doing it right?

  25. Peter Christy

    I wonder what they'll call the new iMac?

    Archimedes, perhaps?

  26. Dwarf

    Two minds

    I'm in two minds about this.

    I like Intel and I like ARM, but I don't want to be strong-ARM'ed into one of them being my daily workhorse, so how about offering both types of CPU and allowing me, as the customer, to choose what works best for me?

    If you can pull it off and give me equivalent performance to my current MacBook running the collection of workloads I need to make my day rate - remote-access software for customer sites (VMs, standard Windows and Linux installs, Citrix, Zoom, that sort of stuff) - then perhaps I might consider it. But if the products aren't there, and aren't supported by the third parties, then it's no deal.

    I really enjoy my current MacBook, as it's convenient and it just works, but I enjoy being able to pay the bills and buy a pint a lot more than having a new CPU that lasts longer on a single charge or is cheaper than the existing model - and on that score, I strongly doubt that a new machine will end up being cheaper for the end customer.

  27. Torben Mogensen

    Acorn welcoming Apple to the RISC club

    When the PPC-based Macs first came out in 1994, Apple claimed to be the first to use a RISC processor in a personal computer. Acorn, which had done so since 1987, ran an ad welcoming Apple to the club (while pushing its own RISC PCs).

    If Acorn still existed, it could run a similar ad welcoming Apple to the ARM-based PC club.

  28. Kevin McMurtrie Silver badge

    Curse of the LC

    In Apple's history, moves to new processors have gone extremely well... until the LC model comes out to recover R&D expenses. They were 1/5 the performance at 4/5 the selling price. The ARM migration can go well if Apple can resist hitting the "quick money" button.
