Apple's M1: the fastest and bestest ever silicon = revolution? Nah, there's far more interesting stuff happening in tech that matters to everyone

Apple Silicon has been the autumn’s hottest news in cool chips. Giving Intel two years’ notice, the first laptop and desktop with the new Arm-based M1 chip have shipped and the benchmarks run. If you believe some of the more febrile headlines, Apple has upended the industry, sparked a revolution and changed the face of computing …

  1. MatsSvensson

    Headline-whoring

    "there's far more interesting stuff happening in tech that matters to everyone"

    OK...

    Like what?

    Maybe actually write an article about that?

    1. el kabong

      I fear that too much shiny is taking a toll on some people's attention span.

      He just wrote that article; look at this:

      "The real problems - the interesting problems - in computing are never solved by an SoC. The real problems - the interesting problems - are moving data, not doing sums in a CPU."

      See? I know, it's shocking, isn't it?

      1. Ciaran McHale

        Re: I fear that too much shiny is taking a toll on some people's attention span.

        One of the reasons Apple silicon is so fast is that the RAM embedded in the SoC is shared between the CPU and GPU, and this removes the need to move/copy data between the two.

        1. Tomislav

          Re: I fear that too much shiny is taking a toll on some people's attention span.

          He is not talking about moving data to the GPU. He means moving data between computers and/or data storage. For example, having the fastest CPU in the world will not give you faster Netflix streaming, network DB access, better-looking TikTok videos etc. It might reduce your wait to render said video, but uploading it will still take the same time as it would on a 10-year-old Core Duo.

          1. Ciaran McHale

            Re: I fear that too much shiny is taking a toll on some people's attention span.

            I agree that "moving data" can refer to many different use cases. But one of those use cases is moving data between the CPU and GPU. Apple silicon provides a very impressive way of dealing with that specific use case. The author of the article dismissed Apple silicon as being uninteresting because he claimed it offered no improvement for moving data, yet he completely ignored its impressive moving-data accomplishment. This suggests that (independent of the merits or otherwise of Apple silicon) the article was not well researched/written.

            1. Joe W Silver badge

              Re: I fear that too much shiny is taking a toll on some people's attention span.

              This is actually mentioned in the article, in one of the opening paragraphs.

              (To paraphrase your comment: this suggests that the article was not read/understood.) I don't actually want it to sound that nasty, but it is pretty much the same sentence as yours.

          2. David Lawton

            Re: I fear that too much shiny is taking a toll on some people's attention span.

            Might not make Netflix any faster, but it's not just the CPU speed that's impressive. It's the up to 20-hour battery life in the 13" Pro, or the fact that you just never hear fans on these things.

            On my desk right now I've got three laptops, two Macs and one Windows. One Mac is a 16" Pro with an Intel i9, and the Windows machine is an i5. I do not have to do much on either of these to make the fans kick in and the battery drain quicker.

            The other Mac is a 13" M1 Pro. Everything seems to open extremely fast, things just feel instant, and as hard as I have tried, the battery just will not drain fast, and I still have not heard the fans. I was even playing Cities: Skylines on this thing, smoothly, on an iGPU, with no fans blasting, and it was not even an ARM app; it was a translated Intel app. My i9 16" Pro, even with a dGPU and running natively, goes bonkers on fans with this game.

            The Apple Silicon is exciting because it absolutely obliterates its competition. This M1 is really the MacBook Air chip, which would normally have a Y-series Intel in it. It not only wipes the floor with the Y series, it can put up a good fight against processors not even in its class, all while not sucking up much power. What happens when Apple releases the bigger, faster M1s for devices that would traditionally have had a U- or H-series processor in them? Then the following year Apple increases the performance of them all by 20 to 30% with the M2s? Then 20 to 30% again over that with the M3s? Exciting.

            1. big_D Silver badge

              Re: I fear that too much shiny is taking a toll on some people's attention span.

              I never hear the fans on my desktop either. And battery life? It doesn't have any.

              My ThinkPad only revs up the fan at lunch time, when the AV software does its scan. It spends 99.9% of its time tethered to a docking station, either at work or at home, with large external displays, and the dock charges the ThinkPad, so, again, battery life is not relevant to me.

              What they have achieved is very impressive, if you need what they have produced. It certainly gives Intel a kick up the butt, but if you are locked into applications that are Windows only, the Apple processors could be twice as fast as their Intel equivalents and it still wouldn't make a difference.

            2. NeilPost Silver badge

              Re: I fear that too much shiny is taking a toll on some people's attention span.

              “Might not make Netflix faster.”

              It does not take a genius to migrate the M1 Mac Mini to a new market segment: convert it to a datacentre-oriented blade and develop a blade chassis and supporting ecosystem around it.

              macOS is already just a pretty face and API set sitting on top of a branched version of Unix ... certified to Unix 03 level.

              1. anonymous boring coward Silver badge

                Re: I fear that too much shiny is taking a toll on some people's attention span.

                "macOS is already just a pretty face and API set sitting on top of a branched version of Unix"

                Yes, and that's a good thing.

            3. imanidiot Silver badge

              Re: I fear that too much shiny is taking a toll on some people's attention span.

              "or the fact the you just never hear fans on these things"

              That's because Apple prefers to start throttling the processor early and turn the fans on only as an absolute last resort, instead of giving you as much compute power as possible for as long as possible.

          3. Anonymous Coward
            Boffin

            @Tomislav Re: I fear that too much shiny is taking a toll on some people's attention span.

            Actually, he was talking about things like adding memory, which would sit outside the SoC, and thus about moving data along a bus, negating any advantage of the SoC.

            Nothing to do with network traffic, which also has more factors that occur outside your PC.

          4. StargateSg7

            Re: I fear that too much shiny is taking a toll on some people's attention span.

            ".....having the fastest CPU in the world will not give you faster Netflix streaming, network DB access, better looking TikTok videos etc. It might reduce your wait to render said video, but uploading it will still take the same time as it would on a 10 year old Core Duo. ..."

            --

            I think we can solve THAT ISSUE!

            Quantum Teleportation aka Spooky Action At A Distance which allows a bunch of Xenon atoms trapped in a quantum well to be read in and written out at full-duplex PETABYTES PER SECOND data rates without being "trapped" by the quantum decoherence issue is one innovation coming from our Vancouver, British Columbia, Canada-based company soon enough.

            Since we can print quantum wells into BOTH CMOS and GaAs substrates directly, we can embed high-speed communications to peripherals and external RAM right onto the processing chips themselves.

            Since we have BOTH an in-house designed and built 575 TeraFLOPs per chip 60 GHz GaAs general purpose CPU/GPU/DSP and a TWO THz fully opto-electronic DSP-and-Math-oriented array processor (19.2 PetaFLOPS per chip!), we can (and DO!) embed external access to memory and storage via quantum teleportation means. You only need 8192 bits of trapped Xenon quantum wells to make a PETABYTES per second pipeline and that takes up a bare few square millimetres of on-chip real estate. The quantum decoherence issue caused by reading/writing quantum bits is SOLVED by letting the accumulation of errors BECOME the data transfer mechanism itself!

            It's also the world's FASTEST wireless network system which basically has unlimited range (i.e. Quantum Teleportation propagates at 50000x the speed of light!) no matter how far apart the chips are and no matter the in-between terrain!

            This will all be coming out soon enough for public sale along with a few of our other inventions such as our ultra high-end non-aerodynamic principles-based aerospace propulsion system. The fellow Black Budget World has had quite a few computing tricks up its sleeve for quite a while and VERY SOON NOW, we will come up from under-the-radar and go all public sale and disclosure with our in-house products.

            Other companies in similar vein to ours ALSO have nearly the same types of products we do and they too are going "White Budget World" soon enough because of us. We'll see who shows their public cards FIRST!

            V

            1. very angry man

              Re: I fear that too much shiny is taking a toll on some people's attention span.

              WHEN ? WHERE ? and HOW MUCH ?

              Can it play Crysis?

              1. StargateSg7

                Re: I fear that too much shiny is taking a toll on some people's attention span.

                YES! It can play Crysis --- That was ONE OF OUR VERY FIRST TESTS by the way!

                The RISC chips emulate ENTIRELY IN SOFTWARE, the entire x86 instruction set in real time and we can run Crysis at max-everything settings at ridiculous frame rates. The internal bandwidth tests we just stopped at 10,000 fps at 16,384 by 8640 at 64-bit RGBA colour resolution (we modded the game internally to create new resolution and FPS limits!)

                The super-cpu-chips are normally used for aerospace applications and are used in our "Infinite-Phyre" supercomputing system which is the REAL WORLD'S FASTEST SUPERCOMPUTER at 119 ExaFLOPS sustained using 128-bit floating point operations.

                The yield and cost of production on the actual GaAs chips in 2020 is now at such great levels that we can sell the 60 GHz, 575 TeraFLOP 128-bits wide Combined CPU/GPU/DSP chip with onboard 3D stacked Petabyte Non-Volatile RAM memory for less than $3000 USD and STILL make a fantastic profit.

                These chips are ENTIRELY "Designed and Made In Canada by Canadians" to ensure ITAR-free sales capability so EVERYONE in the world will be able to buy them. An original holding company in a foreign non-treaty aligned country (i.e. being the owner who subcontracted us to design and build) originated and holds all the Intellectual Property so we can export finished chips everywhere in the world without issue! We will OPEN-SOURCE the layout and tape-out files (i.e. the chips designs) and let ANYONE produce the chips if they have the expertise to do so. No royalties needed! We will then produce our own chips on our own lines for sale worldwide and competition is certainly welcome!

                THIS TIME, no-one will be able to hog the technology and/or fortune as we WILL be giving away the IP away FREE AND OPEN SOURCE!

                The supercomputing systems which we are running in both Vancouver and Northern British Columbia are running a WBE (Whole Brain Emulation) which simulates all the Sodium, Potassium and Phosphorous electro-chemical gating done in human neural tissue at a VERY HIGH FIDELITY.

                We basically digitally "Grew" a human brain in a computer and "trained" it like we would a child 24/7/365 with synthetic inputs including vision, auditory, physically sensory (i.e. touch) and let it learn by itself. We even instilled EQ (Emotional Intelligence) to simulate various human emotional traits including empathy, sympathy, cooperativeness, etc.

                We estimate its current IQ at about 160 Human equivalent which makes it a super-intelligence. Then we put it to work on basic physics, quantum mechanics and chromodynamics, materials engineering, electrical power production systems, aerospace propulsion and medical systems research and development at NOBEL Laureate-levels of inquiry and end-results.

                It has profited us GREATLY with new insights and breakthroughs that are STUNNING to say the least. Because of these new breakthroughs, we can now afford to introduce new products and systems that will pretty much OBSOLETE EVERYTHING already out there!

                Most of the major stuff we will give away FOR FREE AS OPEN SOURCE designs and instructions.

                The medical and scientific-oriented "Star Trek Tricorder" device is now coming sooner rather than later!

                Coming soon to an online and real-world store near you!

                P.S. Lithium-Ion and Aluminum-Air batteries ARE DEAD IN THE WATER !!! We have something MUCH MORE POWERFUL and much longer-lasting!

                V

                1. GrumpenKraut
                  Boffin

                  Re: I fear that too much shiny is taking a toll on some people's attention span.

                  Error: your engineering is imaginary. Please rotate by 90 degrees and try again.

                2. Anonymous Coward
                  Anonymous Coward

                  Re: I fear that too much shiny is taking a toll on some people's attention span.

                  Upvoted both your posts. Why the downvotes? It's obviously humour!

                  PS. If it isn't humour, please use your SUPERBRAIN to INTELLECTUALLY DEDUCT an upvote ON THE FLY !!!

                  1. Anonymous Coward
                    Anonymous Coward

                    Re: I fear that too much shiny is taking a toll on some people's attention span.

                    This is not the first such "humor" posted by this user. If it is intended as humor, repeatedly claiming to have done impossible things with a bunch of technobabble thrown in, I for one am getting tired of it.

            2. The Bam

              Re: I fear that too much shiny is taking a toll on some people's attention span.

              Utter BS.

          5. danbi

            Re: I fear that too much shiny is taking a toll on some people's attention span.

            And while Apple is insistent on having Thunderbolt 3 (of Intel fame) and USB4 on their computers for connectivity with the outside world, the rest of the industry is happy with USB3. The bandwidths of these are hugely different.

            Granted, for the user they look the same :)

        2. Anonymous Coward
          Anonymous Coward

          Re: I fear that too much shiny is taking a toll on some people's attention span.

          How is that any different from what AMD and Intel CPUs with integrated GPUs have been doing for many years already?

        3. big_D Silver badge

          Re: I fear that too much shiny is taking a toll on some people's attention span.

          But you still have to get those terabytes of data into that pifflingly small amount of RAM and back out again...

          Look at SAP HANA (oh, God, did I actually bring that up as an example? The shame!). That runs on huge machines, often with more than a terabyte of RAM. Our "relatively" small new servers, which replaced the old 128GB-RAM ones, have 512GB each. They also have SAS SSD SANs for storage, because what is on the storage and how quickly it can be retrieved into memory and written back out again are almost more important than the actual speed of the processor and the cache RAM.

          Our clients still mainly have Core i3 processors and 4GB RAM; that is enough for Outlook and RDP, as the "real" work is done on terminal servers and backend servers. And we spend a lot of time fine-tuning that environment to get the most out of it. Moving to an SoC with onboard RAM isn't going to be useful, especially if we suddenly need to increase the RAM to cope with new loads.

          With the servers, you chuck another couple of ECC DIMMs at the problem. If it was an Apple SoC, you'd have to "throw away" the whole thing and hope they have a bigger SoC that meets your requirements.

          Don't get me wrong, I'm very impressed with what Apple have achieved, but in its current form it is irrelevant to what I do on a daily basis, because I am stuck in a Windows and Linux world that needs heavyweight processors with lots of RAM.

          I'll keep an eye on what Apple is doing, and for the user who can find all their software under macOS and will never need more than 16GB RAM, they are a great option. I will be very interested to see what they do for professional level devices and not just entry level devices. I think that is when we will see how this move is really going to pan out.

          1. snoopy1710

            Re: I fear that too much shiny is taking a toll on some people's attention span.

            Apple is just one licensee of Arm IP, but there are many others who focus on server workloads. Just imagine one of them "pulling an Apple" on the server side.....

            1. doublelayer Silver badge

              Re: I fear that too much shiny is taking a toll on some people's attention span.

              That's been tried, and it hasn't sold very much. Not really for any defect in ARM; if you throw 96 ARM cores into a single server, you're going to get some pretty good performance, provided your task can easily run on that many CPUs but couldn't be, or wasn't, ported to GPUs. However, it didn't reach the speeds the M1 achieves, for many of the reasons stated in the article. Server ARM chipsets don't have memory inside the SoCs, so they don't get the very fast transfers to and from memory. They are also able to handle more memory because it's kept separate, so everything's a tradeoff.

              It really depends what you care about. I do some compute-heavy things on a local machine, so a processor that runs very fast is quite useful. Simultaneously, I don't need a lot of memory for those things, so an M1 with 16 GB of on-chip memory would probably be quite nice, and I'll have to consider it if my current machines need replacement (they don't yet). That said, many of my compute-heavy tasks aren't time sensitive, so although the M1 could probably do them faster, I don't need them to go faster right now. There are others for whom these advantages are less important. I don't really see much benefit in giving Apple's chip designs blanket praise for revolutionizing everybody or dismissing them as unimportant; both views are limited.

              1. big_D Silver badge

                Re: I fear that too much shiny is taking a toll on some people's attention span.

                Exactly, the right tool for the right job.

                1. zuckzuckgo Silver badge
                  Coat

                  Re: I fear that too much shiny is taking a toll on some people's attention span.

                  Are you talking about the author or the product?

          2. katrinab Silver badge
            Meh

            Re: I fear that too much shiny is taking a toll on some people's attention span.

            Remember that this is the chip for a couple of ultra-portable laptops and a small form factor desktop, not for a server running SAP HANA. Apple pulled out of the server market many years ago, and I'm pretty sure they will have something a lot more powerful for their workstation offering.

            Yes, the M1 beats the pants off the 10900K in some workloads, and yes it does fall behind in others, but the 10900K is not competition for the MacBook Air or Mac Mini. The fact that you can make sensible comparisons between the two shows just how much they've done.

          3. danbi

            Re: I fear that too much shiny is taking a toll on some people's attention span.

            "When all you have is a hammer"

            There is always more than one way to solve a task. In your example, one way is what you do: throw a "bigger" computer at the problem. Another approach is throw "more computers" at the problem.

            The latter is what "supercomputers" do, for the very practical reason you just can't build a "very big" computer that can compete. It is also what "cloud computing" does - it runs on a lot of "smaller" computers.

            Until now, the tools they had were power-hungry Intel CPUs (and let's not give Intel too much slack: not many years ago an entry-level Intel "server" could not have more than 32GB RAM, while a cheaper AMD server could be fitted with, say, 512GB). We also had ARM server chips, trying to emulate what the Intel chips were doing and trying to compete on cost (less profit).

            Now Apple has demonstrated that a high-performance, highly integrated SoC can be done.

            SoCs are not new. Pretty much all microcontrollers around are of this kind. There are microcontrollers with wildly varying amounts of RAM/flash, I/O, CPU cores etc., every one highly optimized for its task. We take this for granted in the embedded world.

            So what Apple have demonstrated is that you can have the same choice in the "desktop" and likely soon in the "server" world.

            Remember, once upon a time, the cache SRAM was a separate part you could replace in a DIMM slot. Today nobody argues that cache SRAM should be user replaceable, because having it integrated in the processor provides so many benefits and resolves so many issues.

            Now back to your SAP example. I am sure whoever writes SAP code might one day experiment on a "server" that, instead of two-socket 28-core Xeons and 512GB of RAM, uses say eight 16-core M1-like SoCs with 64GB RAM each (the same 512GB in total), with fast interconnects (we haven't seen this yet, as there is no use for it in a notebook), and with a total power consumption (SoCs with integrated RAM) of say 100W.

            Do you think you will prefer such server to your current one?

          4. tip pc Silver badge

            Re: I fear that too much shiny is taking a toll on some people's attention span.

            If you can read from disk, process and write back to disk before the user notices any hesitation, do you need more than 16GB of RAM?

            Don't forget the M1s are reading ~3GB/s from their long-term storage (SSD/NVMe).

            On all the tests I've seen, the 8GB of RAM has proved adequate; I'd still get 16GB though.

        4. Anonymous Coward
          Anonymous Coward

          Re: I fear that too much shiny is taking a toll on some people's attention span.

          One of the reasons Apple silicon is so fast is that the RAM embedded in the SoC is shared between the CPU and GPU, and this removes the need to move/copy data between the two.

          The CPU and GPU have been sharing memory on iOS for years. It wasn't something that was visible in OpenGL, because that has its own shit design issues to deal with, but in Metal, as long as you obeyed certain alignment requirements, memory was memory regardless of who accessed it.
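
          For what it's worth, here's roughly what that looks like from the CPU side in Metal. This is just a minimal sketch (the buffer size, the Float layout and the encoder call in the comment are illustrative, not anyone's production code), assuming a unified-memory device such as an M1 or an iOS GPU:

          ```swift
          import Metal

          // A .storageModeShared buffer is visible to both the CPU and the GPU with no copying;
          // on unified-memory hardware (Apple silicon, iOS devices) it is literally one allocation.
          guard let device = MTLCreateSystemDefaultDevice(),
                let buffer = device.makeBuffer(length: 4096, options: .storageModeShared) else {
              fatalError("No Metal device or buffer available")
          }

          // The CPU writes straight into the buffer's contents...
          let floats = buffer.contents().bindMemory(to: Float.self, capacity: 1024)
          for i in 0..<1024 { floats[i] = Float(i) }

          // ...and the GPU reads the same bytes, e.g. via
          //   computeEncoder.setBuffer(buffer, offset: 0, index: 0)
          // with no blit or upload step in between. The alignment caveat above still applies
          // once you start carving your own structures out of the buffer.
          ```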

          1. ThomH

            Re: I fear that too much shiny is taking a toll on some people's attention span.

            To be fair, `glMapBufferRange` was introduced with OpenGL 3.0 and permits a GPU buffer to be exposed within the CPU's address space for direct access where hardware supports it. Though `GL_MAP_PERSISTENT_BIT` which asks for a persistent mapping — i.e. one that isn't invalidated the next time you issue a draw command — arrived only in OpenGL 4.4 and therefore has never been available on the Mac. The OpenGL 4.4 specification was announced in 2013, so marginally before Metal but after Apple stopped putting any effort into OpenGL.

            But, yeah, it's another OpenGL-style workaround for a workaround.

            As someone who has recently converted a pile of OpenGL code to Metal, the big wins for me were formalised pipelines, resolving the threading question, and getting to be fully overt about which buffers are ephemeral at the point at which they're either loaded or unloaded from GPU cache.

            Apple's tooling for Metal is also leaps and bounds ahead of where its macOS tooling for OpenGL ever was, especially with regard to profiling, debugging, etc, so it's nice to have that supported in a first-party capacity but I think that's probably just a comment on Apple's lackadaisical approach to OpenGL over the years. Other OpenGL-supporting environments do a much better job here — even iOS had pretty cool frame capture/edit/replay facilities for GL ES back when iOS still supported that.

            1. Richard 12 Silver badge

              Re: I fear that too much shiny is taking a toll on some people's attention span.

              If one is going to compare APIs, better to compare like with like.

              That means Vulkan vs Metal.

              Metal is different to Vulkan for the sake of it, purely to lock you into Apple. There's no technical reason for Metal to exist whatsoever - Apple stayed in the working group just long enough to steal most of the ideas ATI had put forward, then withdrew and spent the next few years creating Metal.

              1. ThomH

                Re: I fear that too much shiny is taking a toll on some people's attention span.

                I was refuting the claim that shared buffers are not something you can do with OpenGL. Though I neglected to include one important caveat: in Metal you can share texture storage and even compel the GPU to use linear ordering to avoid swizzling costs, if you know that that's the correct trade-off for you. I do not believe you can do this in OpenGL.
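
                To illustrate the linear-texture point, here is a rough sketch of what I mean. The sizes, pixel format and read-only usage are purely illustrative, and it assumes a unified-memory GPU (Apple silicon or iOS); older discrete-GPU Macs have stricter storage-mode rules:

                ```swift
                import Metal

                guard let device = MTLCreateSystemDefaultDevice() else { fatalError("No Metal device") }

                // Build a linear (non-swizzled) texture whose storage *is* a shared buffer:
                // the CPU scribbles into the buffer, the GPU samples exactly the same bytes.
                let width = 256, height = 256
                let bytesPerPixel = 4  // .bgra8Unorm

                // Linear textures must respect the device's row-alignment rule.
                let align = device.minimumLinearTextureAlignment(for: .bgra8Unorm)
                let bytesPerRow = ((width * bytesPerPixel + align - 1) / align) * align

                guard let backing = device.makeBuffer(length: bytesPerRow * height,
                                                      options: .storageModeShared) else {
                    fatalError("Buffer allocation failed")
                }

                let descriptor = MTLTextureDescriptor.texture2DDescriptor(pixelFormat: .bgra8Unorm,
                                                                          width: width,
                                                                          height: height,
                                                                          mipmapped: false)
                descriptor.storageMode = .shared   // must match the backing buffer
                descriptor.usage = .shaderRead     // linear textures can't be render targets

                // The texture is just a view over the buffer's memory: no copy, no swizzle.
                let texture = backing.makeTexture(descriptor: descriptor, offset: 0, bytesPerRow: bytesPerRow)
                ```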

                Your theory about Apple is trivial to discount, however:

                The first meeting ever to discuss Vulkan happened in July 2014, and the call to form a working group happened in August. Metal was first released in June 2014.

                So it is trivially false that "Apple stayed in the working group just long enough to steal most of the ideas ATI had put forward" — there was no Vulkan working group until after Metal had launched and Apple was never a member. For that reason one can also immediately discount the claim that "Metal is different to Vulkan for the sake of it".

                Metal takes AMD's ideas from Mantle and adapts them to something that works across AMD, Intel and Apple's homespun GPUs. Wishing that Apple wouldn't be so quick to go it alone and so reticent to adopt a later standard is valid; alleging a weird conspiracy doesn't really stand up.

                1. Anonymous Coward
                  Anonymous Coward

                  Re: I fear that too much shiny is taking a toll on some people's attention span.

                  in Metal you can share texture storage

                  So that's probably why I had never had any interest in glMapBufferRange before. 99.9% of the data I was pushing around every frame on iOS was texture data. :)

                  1. ThomH

                    Re: I fear that too much shiny is taking a toll on some people's attention span.

                    Oh, well on iOS you could have used the wilfully obscure CVOpenGLESTextureCache to share CoreVideo buffers (which can just be BGRA) between CPU and GPU without copying. But it doesn't guarantee no conversions or reorderings, it just minimises them.

                    ... and it's not even Mac/iOS cross-platform. It's iOS only.

                    1. Anonymous Coward
                      Anonymous Coward

                      Re: I fear that too much shiny is taking a toll on some people's attention span.

                      Iirc we avoided that simply by only having to deal with data coming out of the video decoder. By setting up the config correctly the decoder would output Metal-friendly buffers to which we could map textures directly.

      2. Electronics'R'Us
        Holmes

        Re: I fear that too much shiny is taking a toll on some people's attention span.

        In really high performance computing, which is massively parallel, the speed of moving the data between nodes is a key performance metric, quite apart from the speed of the cores.

        You can see the state of the art at www.top500.org

        There are a number of things involved (latency is a key issue) but in those environments there is a huge amount of data being shovelled around.

        1. danbi

          Re: I fear that too much shiny is taking a toll on some people's attention span.

          Very much true. But what is preventing the Apple SoCs from being part of this?

          It is just an efficient packaging of the typical "compute node" components on one die. Even better for massively parallel computing.

          1. Anonymous Coward
            Go

            Re: I fear that too much shiny is taking a toll on some people's attention span.

            It's not a huge leap to imagine the 16GB of on die memory being treated as L3 cache.

    2. Anonymous Coward
      Anonymous Coward

      Re: Headline-whoring

      It's pretty hilarious that there is literally nothing Apple can do that won't get somebody's panties in a bunch. They could literally create a cure for cancer and somebody would get mad about it. It is obvious to everybody + dog that what they are doing now with these chips is the most exciting thing that's happened in this space for ages, and it will probably push the entire industry ahead when others have to compete with them. It's a win-win situation for everybody, even if you don't want to use a Mac.

      1. iron Silver badge

        Re: Headline-whoring

        > They could literally create a cure for cancer and somebody would get mad about it.

        Yeah because their cure for cancer would involve visiting an overly expensive iDoctor, could only be administered using an iNeedle and require you to take an iTablet for the rest of your life (which would turn out to be basically candy corn & Zima).

      2. MrReynolds2U

        Re: Headline-whoring

        Pardon my ignorance, but haven't they just taken what leading mobile manufacturers (including themselves) have been doing for years and updated it with some high-end specs? It's an SoC, you know, like a Raspberry Pi or my mobile phone. It's nice but it's not revolutionary.

        1. Martin an gof Silver badge
          Happy

          Re: Headline-whoring

          You are right; it's something that people have been predicting for many years. At some point, "mobile" technology was bound to be sufficient to run a "desktop" computer. The difference - as is often the case with Apple - is that nobody else has quite managed to line all their ducks up and get it all right at the same time. My own opinion is that the main thing holding this development back is that Windows has proven time and time again to be totally unsuited to running on this kind of system. MacOS or whatever it's called these days has been fine tuned over the last few years - probably using experience from iOS and definitely using Apple's experience of previous architecture switches (68k -> PPC -> x86) - and was anyway a much better base to start with. I suppose you could compare it with the optimised Linux systems that have been crafted to run on low-resource computers such as the Raspberry Pi, though the M1 is hardly "low resource" by that standard!

          The only thing that Apple never gets "right" is the price - but that's from my point of view, and there are plenty of people out there willing to pay Apple prices. Sometime in the not-too-distant future some other manufacturer will come along with a similar device, cheaper. It won't be running MacOS, so the question is what will it run, and would the great unwashed buy something that doesn't run Windows?

          It really doesn't matter if Apple's new computers are "non-upgradable". No, I wouldn't buy one on those grounds alone (let alone the price) but at the moment they give all the desktop computer that most people will need. The really interesting question is whether someone can make an affordable "M1 clone" computer that is upgradable. Maybe have 4GB in-package RAM and an external memory bus for expansion?

          Reminds me somewhat of the 1980s again - silly things like the 256 bytes of "page zero" on the 6502 which could be accessed much more quickly than the rest of the memory map, or even the differences between "Chip RAM" and "Fast RAM" on the Amiga.

          Interesting times.

          M.

          1. danbi

            Re: Headline-whoring

            Apple's notebooks have had their RAM chips soldered to the motherboard for years.

            So for their users there is no difference now that the RAM chips are soldered inside the SoC package.

            When vendors can supply Apple with denser RAM parts, we will surely see 32GB, 64GB RAM SoCs etc.

            It is already trivial to run open source OSes on the M1, provided Apple permits bare metal loading.

            On the other hand, current MacOS has hypervisor calls built in, so you can create VMs and "boot" pretty much any ARM OS. There are people who have already made ARM Windows run on it, and that in turn emulates x86 code by itself. So those who need to run Windows on the new M1 Macs can already (technically) do it... if and when Microsoft decides to sell licenses for it, that is.
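
            To give an idea of how thin that layer is, here is a minimal sketch of the kind of "Linux bootloader" app I mean, using Apple's Virtualization framework (the Swift layer that sits on top of those hypervisor calls, macOS 11+). The kernel/initrd paths and sizing are placeholders, the app also needs the virtualization entitlement, and proper error and device handling is left out:

            ```swift
            import Foundation
            import Virtualization

            // Boot an ARM64 Linux kernel in a lightweight VM. Paths are placeholders.
            let bootLoader = VZLinuxBootLoader(kernelURL: URL(fileURLWithPath: "/path/to/vmlinuz"))
            bootLoader.initialRamdiskURL = URL(fileURLWithPath: "/path/to/initrd")
            bootLoader.commandLine = "console=hvc0"

            let configuration = VZVirtualMachineConfiguration()
            configuration.bootLoader = bootLoader
            configuration.cpuCount = 4
            configuration.memorySize = 4 * 1024 * 1024 * 1024   // 4 GB
            configuration.serialPorts = [VZVirtioConsoleDeviceSerialPortConfiguration()]

            // Throws if the configuration is invalid for this host.
            try configuration.validate()

            let vm = VZVirtualMachine(configuration: configuration)
            vm.start { result in
                if case .failure(let error) = result {
                    print("VM failed to start: \(error)")
                }
            }

            // Keep the process alive so the VM keeps running (a real app would use its own run loop).
            RunLoop.main.run()
            ```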

          2. Morat

            Re: Headline-whoring

            The day has come for Linux on the Des<HEADSHOT>

            No-one needs to hear that again...

        2. gnasher729 Silver badge

          Re: Headline-whoring

          The point is, they have done it. Nobody else has.

      3. krf

        Re: Headline-whoring

        Fanboyism is not something new. Youngsters here can't recall, but I can remember the last half of the 1970's - you know, the time of the start of the "Microcomputer Revolution." The 8080 processor from Intel had its fanboys, who laughed at the simplified 6502 from Motorola, and both were looked at with disdain by dudes using the later Z80 from Zilog. Just as today, no opinions, however reasonable or rabid, ever convinced anyone in the other groups.

        Enjoy the ruckus from the sidelines while you are using your own favorite gear.

        1. genghis_uk

          Re: Headline-whoring

          The old RISC vs. CISC wars of the 80's...

          Tech types will always find something to be evangelical about! (guilty as charged!)

          1. eldakka

            Re: Headline-whoring

            The old RISC vs. CISC wars of the 80's...
            .. and 90's.

            But the RISC fanbois actually won that one. As the article alluded to:

            no processor has run x86 code natively for decades, there’s always a much more efficient inner core chewing through microinstructions after the x86 code has been decoded and stripped down.
            That inner core on x86 is a RISC core.

            ARM is RISC, RISC-V is RISC (duh), Power is RISC, x86 is RISC (cores). There isn't much outside special-purpose limited run custom logic that isn't RISC.

            1. Michael Wojcik Silver badge

              Re: Headline-whoring

              I recall a bit of common wisdom from circa 1990: The 80486 was the best CISC CPU ever, and the i860 was the worst RISC design ever, but the 860 still outperformed the 486. (I said it was common; I didn't say it was right. But there was a grain of truth in it: despite its design flaws, the 860 managed 5-10 times the MFLOPS of the 486, so if floating-point was what you wanted...)

              I believe the IBM z10 was still true CISC, dispatching the actual zArchitecture CISC instructions to the cores (based in part on this IJRD article).

              That was 2009, though. The current z CPU is the z15, and this writeup mentions "CISC instruction cracking" in one of the illustrations, which certainly sounds like the pipeline is decoding CISC instructions into simpler ones.

              That would also make sense because z10 is superscalar but in-order, while z15 is out-of-order. It's generally easier to reorder RISCy instructions.

              z has over a thousand opcodes, between the public instructions and the special ones used in microcode. Going to RISC cores was probably inevitable. z10 cores were big - a thousand opcodes means a lot of gates.

        2. ThomH

          Re: Headline-whoring

          Pfft. Multiplexed bus on the 8080 plus two-phase clock input = instant fail. And I don't care what anybody says.

        3. drgeoff

          Re: Headline-whoring

          The 6502 was from MOS Technology. Motorola's offering was the 6800.

          1. Dan 55 Silver badge
            Windows

            Re: Headline-whoring

            The 6809 was where Motorola got it right and allowed OSes like OS-9 and UniFLEX. Shame it wasn't used in very many 8-bit home computers (Dragon, Coco, and that's about it).

        4. Anonymous Coward
          Anonymous Coward

          Re: Headline-whoring

          I distinctly remember the 6502 v Z80 arguments, but I don't think I ever remember anyone having a good word to say about the 8080A. Which even Intel found faintly embarrassing. Hence the 8085. Or even the i432 (disaster) for that matter.

          Now the Z80 had a weird afterlife. I remember almost 15 years later people approaching me about GameBoy titles - Hey, you know Z80 asm? Yes. Like to fix some bugs in our code? Eh, No...

          Not as weird as the 6502 though. When Apple shipped the Mac IIfx, the trick question was to ask people which was the faster processor in the box. It wasn't the 68030; it was the very fast custom 6502-based I/O controller, which, due to the RAM wait states for the main processor, could run code faster than the main processor. Although you only had a small scratch RAM to run in.

          Was never much of an 8086/88 man myself; it was 68K all the way for me. Although I must say I feel quite at home in ARM land. MIPS was quite nice too, but SPARC? Just bloody bizarre, reinventing ideas that Burroughs mainframes proved did not work back in the 1960s.

          Why hasn't anyone revived the Transputer, that other processor of pure genius from the 1980s? That and the ARM.

          1. Michael Wojcik Silver badge

            Re: Headline-whoring

            The 432 had some good ideas - a capability CPU could have made a huge difference for security and reliability of PCs. But then as now the market was more than willing to sacrifice security for performance, and the instruction-set ship had sailed. Firms like IBM and Apple with captive markets could still get away with changing architectures; Intel and most of its customers couldn't.

            1. jmcc

              Re: Headline-whoring - the i432 train wreck..

              The big problem with the i432 was it was the poster child for how not to design a processor. It was huge, multi chip / slow and the software never really worked (in Ada). At the time the architecture was just bizarre. The 68000 was clean and elegant. The NS32000 series had some really nice ideas. Even the Z80000 was reasonable. Then you had the 8086 shantytown shack with random bits bolted on everywhere released as a stopgap processor until the "proper" processor came out. The i432. Those kind of projects always end in fiasco.

              Although the first iteration of the M68k MMU chip had issues, by the time the 68851 MMU chip shipped you had a rock-solid hardware protection architecture with clean software support. The CPU coprocessor architecture was so clean you could implement a coprocessor in either software or hardware, or a mixture of both. I know of at least one completely locked down system that had system security implemented/enforced by a custom soft/hard coprocessor. This would have been the mid-1980s. Try doing that on an x86.

              In fact I once had a very funny conversation where I threw out as an aside the comment - of course if we were running on 68K we could just roll our own coprocessor - and got completely blank stares from the assembled x86 types. Not only was the concept unknown to them, but the very thought that you could easily extend the CPU architecture on the fly by rolling your own coprocessor instructions was something they could not get their heads around. Had to break out a 68K hardware manual to show them I was not making it up.

              At least for CPUs, the mid-80s to mid-90s is about as good as it gets and it's been all downhill ever since. As someone once pointed out, the original 6502 was a RISC. And the first PowerPC Mac that was as fast as the M68060 motherboard Apple had internally in '93 was the PPC604-based 8500 that shipped almost two years later. Because Moto did with the '060 what AMD did with the x86: keep the instruction set but use RISC execution cores. The PowerPC was a complete dead end with a weird instruction set. EIEIO...

        5. Francis King

          Re: Headline-whoring

          Just as the Atari ST ran at a higher clock speed than a Commodore Amiga, for another example.

        6. MrNigel

          Re: Headline-whoring

          I remember purchasing both the 6502 and Z80 add-on modules for my BBC Model B in the early 80's. Back in the day when I had more money than sense via working in Saudi on the TEP4 contract.

      4. gnasher729 Silver badge

        Re: Headline-whoring

        Well, Apple solved the track & trace problem for COVID (adding Google because an iPhone-only app wasn't going to work). That got lots of people's panties in a bunch, especially in the UK.

      5. gnasher729 Silver badge

        Re: Headline-whoring

        The reality is that they created a chip to be used in their low-end Macs, which happens to sit, in performance, right between Intel's six- and eight-core chips. No more, no less. Which is pretty astonishing. And they are working hard on better chips.

    3. amanfromMars 1 Silver badge

      Re: Headline-whoring

      Howdy, MatsSVensson

      Re: "there's far more interesting stuff happening in tech that matters to everyone".... Like what?

      Like this, and all the other stuff shared in that below, which is following and worth following and actually writing articles about. And it's just the tip of a titanic iceberg. :-)Real Cool Source/Hot Core Force:-)

      This environment is "inhabited" by knowledge, including incorrect ideas, existing in electronic form. It is connected to the physical environment by portals which allow people to see what's inside, to put knowledge in, to alter it, and to take knowledge out. Some of these portals are one-way (e.g. television receivers and television transmitters); others are two-way (e.g. telephones, computer modems). ...... http://www.pff.org/issues-pubs/futureinsights/fi1.2magnacarta.html

      The world as IT is, but not as you may know or have known it .... and experienced it ‽ .

      Here's more essential reading to help you towards a greater understanding for ACTive Advanced Astute Engagements in Mutually Beneficial, Positively Reinforcing Agreement with Practically Novel Field AIgents....... Here's our Future

      Or are you planning on heading, without a moment or two's deeper thought, to somewhere else with everybody infested and infected with rabid tales of COVID-19, following the info and intel fed to media to present to the masses and further realise for the pleasure of a very selective few?

      That's not a pleasant place to be dragged and/or drugged into.

    4. twenex1978

      Re: Headline-whoring

      Agreed. Scant on detail, and it makes it sound like Apple are going back to writing their operating system in machine code. Maybe I'm just an Apple fanboi - despite writing this on KDE Neon on a Lenovo ThinkPad - but data mining how many people are buying grey socks won't ever be interesting to me, even in holiday season. Whether my apps run faster than they did before, and whether I'll notice the architecture changing underneath them (if I do, that's bad; and yes, I'm aware that some of the command-line Unixy stuff doesn't work on M1 yet): that's what's interesting to me.

    5. Anonymous Coward
      Boffin

      @MatsSvensson Re: Headline-whoring

      I have to agree with your post. To a point.

      What makes this important is that this CPU (SoC) really helps out in the front-end space.

      You're not asking your laptop or desktop to really do a lot of things except present data and surf the web.

      Maybe a power point / keynote presentation....

      The author inelegantly points out that most of the time, you're developing or running code off your desktop and either on a local server or in the 'cloud'...

      So if all you're doing is presenting data... how much horsepower do you really need?

      It seems more that the author hates Apple than is willing to actually be objective on his topic.

    6. Anonymous Coward
      Anonymous Coward

      Re: Headline-whoring

      Yeah, I've noticed that whoever is doing the headlines lately has, on a few occasions - and this is one of them - been wide of the mark.

      It's a shame, because the article itself is relatively insightful (regardless of whether you agree with the author's conclusion) but you wouldn't know that if you went by the headline.

  2. Snapper

    You forgot 'Apple is doomed'.

    You got paid for this drivel!

  3. IGotOut Silver badge

    Wow.

    I'm no fanboi, but bloody hell, that just smacked of "I know better than the richest company in the world".

    1. Ciaran McHale

      Re: Wow.

      The Register has a history of mocking the first few generations of Apple products in a new line. This included the iPod and iPhone. So it is unsurprising that this trend continues with Apple silicon.

      1. Anonymous Coward
        Anonymous Coward

        Re: Wow.

        ... and all the while, secretly, El Reg staff are using Apple-badged kit for their work (just perhaps, nudge nudge, wink wink).

        Don't forget folks, Apple is nothing but a Foxconn re-brander.

        Apple didn't make the M1, TSMC did. Apple innovation?

        Being serious for a moment, there are moves to make Linux run on the M1 bare metal. That will get rid of the walled garden, but how many people will take advantage of that? Very few, or will 2022 finally be the year of the Linux desktop?

        Will Microsoft rise to this challenge or stick their head in the sand? That will be something to watch out for.

        I'm sure that Samsung and Qualcomm are working on SoCs to rival and even beat the M1.

        The big losers here will be Intel and AMD.

        The CPU games are going to get interesting again and despite the Apple Haters out there, we will have to thank them for that.

        1. Dan 55 Silver badge

          Re: Wow.

          Being serious for a moment, there are moves to make Linux run on the M1 bare metal

          Being serious for a moment, I don't see Apple fully documenting the M1 and the products which use it and making that documentation freely available for open source projects.

          1. ThomH

            Re: Wow.

            To agree with you emphatically: it took two years from the release of the Raspberry Pi to persuading Broadcom to release specs for its VideoCore GPU and that was with heavy vendor pressure, so the probability of Apple ever releasing much about its GPUs or the AI stuff must be negligible.

            So bare-metal Linux is likely always to be a second-class citizen, even if it comes to exist at all.

          2. danbi

            Re: Wow.

            For some reason, Apple does not live up to your expectations :)

            https://developer.apple.com/documentation/hypervisor

            1. Dan 55 Silver badge
        2. danbi

          Re: Wow.

          You don't have to "run Linux on the bare metal", because MacOS already includes the VM creation/management hooks. Just create a "Linux bootloader" app and you are done. Today. Not only does Apple not prevent this, they actually encourage it.

          People have already managed to boot ARM Windows this way. Now Microsoft indeed has to decide whether it will play along by licensing its use.

          In my opinion the bigger losers are the component manufacturers, because people will severely reduce their purchases of RAM sticks as more and more computers move those inside the SoCs. But the part manufacturers will benefit if they can secure a contract with the likes of Apple.

        3. Dominic Sweetman

          Re: Wow.

          Well, TSMC built the M1 silicon.

          ARM provide a CPU-core-kit: "compile it", add silicon and it will run really well -- that's what Qualcomm do. But it won't make an M1. Apple design their own cores, probably with ARM's help, and with some tricks to turn compiled designs into faster silicon: including some they bought.

      2. Dan 55 Silver badge

        Re: Wow.

        Apple has a history of the first generations being a beta test paid for by customers. Witness first generation iPads which didn't receive as many OS updates as later ones, Apple Watch retconned into "Series 0"...

        1. ThomH

          Re: Wow.

          Luckily Apple's first generation ARM processor was the A4, which was in those first-generation iPads.

          So Apple now has a decade's experience at ARM-based SoCs, plus a couple of successful examples of architecture moves for the Mac. If even 1990s-era Apple managed not to screw something up then there's probably not too much cause for concern.

      3. Martin an gof Silver badge

        Re: Wow.

        To be fair, the first generation of a new Apple product often is a bit underwhelming - hardware-wise anyway. I suspect they use the "feedback" from that to improve future generations (iMac, iPod, iPhone - didn't the first iPhone come without a camera?), or ditch the product altogether (Lisa, Newton...).

        It's the software and the "ecosystem" they tend to get more-or-less right at the outset. The first iPhone wasn't brilliant hardware-wise, didn't do anything that other companies hadn't already done, but it was probably the first to put it all in one package and certainly that first release of iOS was a game-changer.

        Maybe they've broken that trend with these new devices. We all know the OS is pretty much together, and from the early reviews it looks as if they've nailed the hardware too - for your average Apple consumer at any rate.

        Speaking here as someone who has only ever owned one Mac (a 32-bit only Intel "Core" MacMini - the classic example of didn't-get-the-hardware-right-first-time), though I have used quite a lot over the years. It would take a lot to persuade me to buy anything else Apple-branded, though a 75% reduction in price wouldn't hurt :-)

        M.

    2. Anonymous Coward
      Anonymous Coward

      Re: Wow.

      The article points out that there's nothing revolutionary in what Apple has done. They have just made the hardware equivalent of a monolithic kernel - everything in one blob, with some performance advantages but at the cost of extensibility. The lack of a memory bus external to the SoC means the machine you buy will have no upgrade options, instead you'll need to buy a complete new machine. Great for Apple, sucks to be you.

      1. hammarbtyp

        Re: Wow.

        One thing the article also forgot to mention was the limitations of the I/O channel on the SoC

        As a workday laptop, it is at a great price point. Problem is, the days when laptops were standalone items are long gone. Nowadays they tend to be desktop replacements, hooked up to all sorts of external gear.

        The legacy of the ARM architecture and the SoC approach means that you will probably have far smaller I/O bandwidth than a similar Intel/AMD laptop. This limitation probably explains why there are only two Thunderbolt ports on the M1. For many this will not matter, but you may find the limitations when you try to hook it up to your two 6K monitors and your external SSD.

        Today PCs are not measured solely by raw performance; they form part of an ecosystem. The SoC is well executed but, as the article said, it does not represent a new highway, rather a relatively limited cul-de-sac.

        1. Dave 126 Silver badge

          Re: Wow.

          You attribute the low number of Thunderbolt ports on the M1 Macs to some inherent limitations of ARM.

          Given that an ARM Mac Pro is part of Apple's roadmap in the next two years, and the Mac Pro range since the Trashcan has been more about blisteringly fast IO than performance, it would perhaps be a safer bet that the current M1 Macs don't have more Thunderbolt ports because their intended use-case doesn't require them.

          Apple were leaders in the adoption of Thunderbolt (just as they once were of FireWire), the image of Macs is in part based on their use by video editors - are you really suggesting that Apple have jumped ship to an architecture that for some unspecified reason is unsuitable for their traditional power users?

          1. MJB7

            Re: Low number of Thunderbolt ports

            The OP doesn't attribute the low number of Thunderbolt ports to an inherent limitation of Arm (we know they can do IO). He attributes it to a limitation of the M1 SoC. If that is so, it is probably going to be a problem quite quickly.

      2. Charlie Clark Silver badge

        Re: Wow.

        You're right, nothing revolutionary, but it's still bold and good luck to them. Google is the real loser here. If they weren't so obsessed with Chrome OS, we'd probably have had some form of Android on ARM for a couple of years. For many developers Apple has produced the most compelling notebook of the last ten years.

        1. Dan 55 Silver badge

          Re: Wow.

          Google aren't thinking of doing anything special with ARM laptops; they're thinking of hooking the next generation of children into Google's services. That's what a Chromebook is for, and many countries' education systems have blindly walked right into it.

          1. DS999 Silver badge

            Re: Wow.

            It isn't to Google's advantage to have Chromebooks that perform as well as an M1 anyway. They want everyone reliant on Google's Cloud, so performance beyond what is necessary to accomplish that goal is wasteful - better to save money on a lower spec CPU and make the Chromebook cheaper. The more affordable it is the more schools buy it and force kids into the Google ecosystem, where Google hopes they will remain for life.

  4. Dan 55 Silver badge

    Remember that Apple once touted racks of Minis as a replacement for XServe

    But I don't think any BOFH would consider a device with fixed storage, fixed RAM, and no gigabit Ethernet as a server device.

    Also people who use an eGPU with a 2018 Mini are also a bit hosed when it comes to an upgrade path at the moment.

    1. Dave 126 Silver badge

      Re: Remember that Apple once touted racks of Minis as a replacement for XServe

      > Also people who use an eGPU with a 2018 Mini are also a bit hosed when it comes to an upgrade path at the moment.

      They are a group of people who didn't need the absolute fastest GPU performance (else they would have bought a Mac Pro). They are either budget conscious, or else their workload is heavily skewed towards CPU-light, GPU-heavy work.

      If they are budget conscious, then they will be watching with interest the GPU performance of the M1X or M2 - whatever chip it is safe to assume will be in the next iMac (Pro) and bigger MacBook Pros - and whether an M2 Mini (or a keenly priced, entry-level ARM Mac Pro) arrives.

      1. Dan 55 Silver badge

        Re: Remember that Apple once touted racks of Minis as a replacement for XServe

        If they were budget conscious they wouldn't be using a Mini.

        The only reason to have a Mini + eGPU is probably because you depend on Mac-only rendering software, but there are many businesses which do. Any software would need to be ported to ARM and the future product would have to support eGPUs before they could consider upgrading. They're probably looking at buying a few refurbished 2018 Minis if they haven't already in case next year's line-up is no different.

  5. Ross 12

    That was an article?

    You may as well have added 'just do your research!!!' to the assertion that there's 'far more interesting stuff happening in tech that matters to everyone'

    1. Dave 126 Silver badge

      Re: That was an article?

      The individual sentences and paragraphs of the article all made sense, but perhaps its narrative arc could have been more clearly signposted.

  6. chivo243 Silver badge
    Windows

    Butt Hurt?

    "...and if that’s all you want from life then congratulations." Did you buy MacPro wheels? Pretty bitter...

    Why do you write about Apple if you don't use it? Seems a bit biased?

    1. Dan 55 Silver badge
      Happy

      Re: Butt Hurt?

      > Why do you write about Apple if you don't use it? Seems a bit biased?

      Even Apple things can be compared with not-Apple things.

      I guess the bias creeps in when the Apple things are found wanting, right?

    2. Anonymous Coward
      Anonymous Coward

      Re: Butt Hurt?

      So if he used / owned Apple products it would be unbiased?

      1. chivo243 Silver badge
        Thumb Up

        Re: Butt Hurt?

        At least he could be speaking from experience then, and not parroting what other non-Apple users are saying? I get the walled garden push back. I resist as much of it as I can, and when faced with it being a problem, I don't use that service or feature of the walled garden. I use Apple gear, but don't drink the kool-aid...

        1. Anonymous Coward
          Anonymous Coward

          Re: Butt Hurt?

          But this 'article' wasn't about usability, it was about an architecture created by Apple and its effect on the industry. That requires no use of a product, only knowledge of said architecture, the industry and said company's influence on architectures.

    3. el kabong

      Why does it hurt you so much, you're not an apple shareholder are you?

      Every time I see someone making a big emotional investment in an industrial corporation that only exists to turn a profit I wonder why. Why did it all have to go so wrong?

      1. Anonymous Coward
        Anonymous Coward

        Re: Why does it hurt you so much, you're not an apple shareholder are you?

        I think it's the tribal instinct built into all of us. Some people identify with a football club, some with a computer company. At least it's generally[1] safer than identifying too closely with a religion or nation state.

        [1] I've been to a Millwall vs Arsenal match, hence the "generally".

      2. Anonymous Coward
        Anonymous Coward

        Re: Why does it hurt you so much, you're not an apple shareholder are you?

        > Every time I see someone making a big emotional investment in an industrial corporation that only exists to turn a profit I wonder why

        Throughout history people's relationship with the world around them has been more than just intellectual. We use tools with our hands and bodies. It's not unusual for people to feel something akin to fondness for, or to praise, a knife that cuts well and feels well made, or a favourite hammer. Objects that do their job better make us feel better, and so perhaps this appreciation for the knife is extended to the blacksmith who forged it.

        In any case, I suspect that what you perceive as people's emotional attachment to a company might often just be people expressing an interest. And why shouldn't they? Features or absences first found on Macs often come to PCs, and those instances when that doesn't happen can be instructive. And it's not as if anything else very interesting is happening with PCs at the moment (and that's fine, it's usually a good sign when this year's tool is much the same as last year's - it means someone has probably had time to refine the smaller details).

      3. Anal Leakage

        Re: Why does it hurt you so much, you're not an apple shareholder are you?

        You mean like a decades-old emotional investment in Psion? Perhaps the one that El Rag keeps close to its heart, likely until death?

  7. Fruit and Nutcase Silver badge

    Tim Cook's Granny

    At the time of writing, she's not listed on ebay

    https://www.ebay.com/sch/i.html?_nkw=Tim+Cook+granny

    Right now that search brings up 3 Hammer Horror cards - at #4, #5 and #6, and much cheaper than the top 3 results which are Tim Cook signed memorabilia

  8. Rosie Davies

    Well...OK

    I take your point. It's Apple doing what Apple does and doing it very well. I'm not convinced that writing off architectural changes to produce something that is very fast and very efficient is a complete goer though. Energy usage by a single consumer device may be inconsequential compared to energy use by a screamingly fast 4U beast but (guessing TBH, I've not checked the numbers) there are a lot more consumer devices than there are 4U beasts. Less energy used at the consumer end is less energy used overall and that's mostly a good thing.

    The other bit that doesn't seem quite fair when writing this off is that it's a different approach that has been shown to give good results. That by itself is likely to cause some thinking outside of Cupertino about similar architectural changes; if only to get up the noses of Fanbois bleating on about their phone going 15 weeks between charges (not the greatest of reasons for doing anything, mind). Even if it is just phones and laptops at the minute, there's little historical precedent within IT for anyone respecting artificial market segment boundaries when it comes to implementing good ideas. The aforementioned 4U beast, only with 1/10th of the power consumption for the same performance, would be a big saving for anyone's energy bill.

    Yes, I know that it's more about shunting data around as quickly as possible rather than chewing through it at the moment. That's likely to change; storage, bandwidth and compute tend to jockey around playing pass-the-bottleneck, so it'd be surprising if at some point in the next couple of decades people weren't complaining about not having enough fast-enough cores to chew through the bits they're delivering. It'd be nice to have the compute side of things going screamingly fast when quantum-entangled transfer interfaces (that's made up BTW) are drowning them under a tsunami of data.

    Rosie

    1. Anonymous Coward
      Anonymous Coward

      Re: Well...OK

      I agree with all you said.

      But for the 4U example, if we go with AMD and their 64-core CPUs (7702P), they currently draw max ~4.4W per core (from the TDP), compared to Apple's Firestorm cores at ~3.45W.

      But AMD is currently using 7nm (14nm for IO), Apple is built on 5nm, so AMD switching to 5nm should drop that power consumption.

      So on the high end, high core count Apple haven't really done much better, but they may be able to reduce their power draw on the higher counts like AMD have done. But so will AMD going to 5nm and probably with their next design. Which will be before Apple even consider such a CPU.
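
      A back-of-the-envelope check of those figures (an illustrative sketch only: per-core power is taken as package TDP divided by core count, which ignores boost behaviour and everything outside the cores; the 200W figure is AMD's published TDP for the EPYC 7702P):

      ```latex
      % Rough per-core power under the stated assumptions
      P_{\text{core}} \approx \frac{P_{\text{package}}}{N_{\text{cores}}}

      % The quoted ~4.4 W/core implies a package figure of roughly
      4.4\,\mathrm{W} \times 64 \approx 282\,\mathrm{W}

      % whereas AMD's published 200 W TDP for the 7702P works out at
      \frac{200\,\mathrm{W}}{64} \approx 3.1\,\mathrm{W/core}
      ```

      Either way, the per-core figure excludes the separate 14nm I/O die, the DRAM and the VRM losses, which is the gap pointed out further down the thread.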

      1. Charlie Clark Silver badge

        Re: Well...OK

        Notebook vs 4U comparison will fail not least because of the display in the notebook.

        1. Richard 12 Silver badge

          Re: Well...OK

          AC is only comparing the CPU complexes.

          If it makes you feel better, put the screen to sleep and then compare.

      2. Sr. Handle

        Re: Well...OK

        Ryzen processors are really good, better than Intel's and perhaps even faster than Apple's, but if you use them unplugged the M1 is by far the winner. Don't get me wrong, I know Apple is in it for the money, they are a corporation not a charity. Maybe I wouldn't call it revolutionary, but at least they will shake up the industry.

      3. janusng
        FAIL

        Re: Well...OK

        > But the 4U example, if we go with AMD and their 64 core CPUs (7702P), they currently draw max ~4.4W per core (from the TDP) compared to Apples cores (firestorm) of ~3.45W.

        You have missed the power drawn by the chipset, RAM and so on in the AMD system when comparing to the M1. Hence your figures miss reality by miles.

  9. Mike 137 Silver badge

    Natural selection

    "It’s not much of a future, though. Extreme efficiency in an ecosystem invariably leads to extinction when that ecosystem changes."

    Having studied ecology, I would say "extreme specialisation" rather than "extreme efficiency" and I might be a bit cautious about "invariably" . Nevertheless Rupert Goodwins makes an important general point. The fundamental advantage of the personal microcomputer that led it to dominance was versatility. The original IBM machine had practically no peripherals on board - you could select the ones you wanted (from an ultimately massive range of alternatives) and plug them in to create a system to your own specification. Now, that's much less possible, and certainly not possible at all with Apple kit.

    1. TimMaher Silver badge
      Windows

      Re: Natural selection

      Which is one of the reasons that I still use a 2012 cheese grater.

      1. Neil Barnes Silver badge

        Re: Natural selection

        Works with cheese from any year.

    2. Anonymous Coward
      Anonymous Coward

      Re: Natural selection

      But realistically, most people don't want to do that.

      People who buy Macbook Airs - or any other ultralight laptop - don't generally want to do that.

      They don't want an external GPU or a huge external storage drive, and they don't want to plug into an industrial machine. At most they want to be able to plug into a monitor.

      They want to browse the internet, edit documents and media, log into a Citrix session at work, maybe even write and compile some code.

      All Apple have done is deliver a new MacBook that is massively faster and more efficient at doing the things that people who buy MacBooks - or other ultralight laptops - want to do. The global market for ultralight laptop computers is not all of computing, but neither is it insignificant.

      1. John Robson Silver badge

        Re: Natural selection

        Yep - Whilst it's a deal breaker for me at the moment (current 5 screen layout is OK) I know that I am in a distinct minority.

        Given the capability of running sufficient monitors (which looks pretty possible: YT video), it would suit me rather nicely. I rarely need the laptop to last long, but occasionally need it to deal with a 10-hour flight, so that's ~14 hours of potential usage.

    3. mevets

      Re: Natural selection

      The 2020s edition of the ISA bus is Thunderbolt, which really is just an external PCI bus; you can connect up a chassis with all the cards your heart desires without having to burden the vast majority of owners who will never do such a thing.

      Expensive? Those IBM PCs were $3000+ in the early 80s, about half the price of an efficient vehicle.

    4. Dave 126 Silver badge

      Re: Natural selection

      > Now, that's [flexibility, adaption] much less possible, and certainly not possible at all with Apple kit.

      Well, if we do a quick 80/20, we'd expect that 1/5 of the possible adaptations would give 4/5ths of the benefit. What that means is that even a limited range of adaptability can be sufficient for most of the people most of the time.

      So on Mac expandability: RAM upgrades and eGPUs are no longer an option, but Thunderbolt allows the display and storage to be expanded.

      iPhones / iPads are very adaptable for their class ( of hand held computers) - they are well supported by 3rd party hardware such as microphones, camera gimbals, laser scanners, point of sale systems etc. The iPhone remains unaltered, but it is still adaptable through peripheral hardware.

      As I'm sure you know from ecological studies, evolution doesn't happen at an even pace. There are periods when very little change occurs, interrupted by periods of explosive change. It could be argued that for the last decade or so what people actually use their computers for has remained fairly unchanged. Compared to when the IBM PC was released, we all today know what we want from our computers.

    5. amanfromMars 1 Silver badge

      Re: Natural selection/extraterrestrial assumption/exceptional presumption

      The original IBM machine had practically no peripherals on board - you could select the ones you wanted (from an ultimately massive range of alternatives) and plug them in to create a system to your own specification. Now, that's much less possible, and certainly not possible at all with Apple kit. ..... Mike 137

      That is as may be, and have been, Mike 137, but Apple appears to have gone down the root of providing all the peripherals for you to utilise as best you can. I suppose they imagine it aids systems compatibility and program uptake of whatever you would be servering and testing Proprietary Core Operating Systems Markets with. Once accepted therein it's akin to having a license to print money if your programming is up to surpassing Apple standards.

  10. Ciaran McHale

    Shortsighted analysis

    If the only laptop/desktop products Apple were to make with its new silicon were the three already released, then the author of the article might have a point. But it seems likely that Apple will release desktop/workstation machines with even faster silicon and more RAM. Some people will appreciate faster webpage loading or faster games, but the big win for Apple is likely to come from niche application areas in which an Apple silicon-based computer is as fast as other computers costing, say, five times as much. Machine learning (ML) might be such a niche area: https://www.zdnet.com/article/the-new-m1-macs-make-cutting-edge-machine-learning-workstations/

    Although ML is niche, it is a large and quickly growing niche. Perhaps Apple will even produce blades containing multiple SOCs for use in data centres. If Apple can compete successfully in the ML/data-centre niches, expect its stock price to rise significantly. And also expect other CPU manufacturers to start designing CPU+GPU+RAM SOCs to compete against Apple silicon. So, yes, I would say Apple Silicon has a good chance of being revolutionary.

    For what it's worth, I don't own any Apple products (I prefer Linux and use a dumb phone).

    1. Anonymous Coward
      Anonymous Coward

      Re: Shortsighted analysis

      For ML, there are far faster ASICs out there than what Apple Silicon can do.

      1. Anonymous Coward
        FAIL

        Re: Shortsighted analysis

        @A/C.

        Quote" For ML, there are far faster ASICs out there than what Apple Silicon can do" unquote.

        So where do I buy one of these far faster asic systems for the same price as a Mac Mini?

        Just asking

        Cheers… Ishy

    2. big_D Silver badge

      Re: Shortsighted analysis

      Yes, and that is the big question: how will they deal with that extra RAM? It is a squeeze to get 16GB on there and you get a lot of waste, because if the RAM doesn't work, or a CPU core, GPU core or ML core doesn't work, you have to bin the whole thing, or sell it as a reduced-spec device (hence the 8GB and fewer-GPU-core variants).

      Upscale that to a professional workstation with 64 cores and 128GB, 256GB or more of RAM and you have one huge die! I doubt you'd get many SoCs from a whole wafer, or even that such a beast would fit on a wafer - my understanding of wafer sizes and die sizes gets a bit hazy when trying to imagine an SoC at that scale.

      But, if Apple are going to offer something to compete with Xeon-based workstations, including the current Mac Pro, they are going to have to go outside of the SoC model of the M1, at least for the RAM and probably for external GPUs as well, let alone Fibre Channel and 10Gbps network adapters etc.

      That is when we will see what Apple's real vision is. That is when things will be really interesting.

      1. Anonymous Coward
        Anonymous Coward

        Re: Shortsighted analysis

        It's not all on one die.

        Each part will be tested separately before mounting on the SiP interposer. Think: mini PCB. You don't just test the final assembled product. That would be lunacy.

    3. Pascal Monett Silver badge
      WTF?

      Re: Shortsighted analysis

      Machine Learning on an Apple ?

      If you have a reference, please share.

      In the mean time, that's just fanboi daydreaming.

      1. Ciaran McHale

        Re: Shortsighted analysis

        https://www.zdnet.com/article/the-new-m1-macs-make-cutting-edge-machine-learning-workstations/

        1. Pascal Monett Silver badge

          Thank you for the link. I've learned something.

          You could have referenced that to start with, though.

  11. phy445

    The games console model

    Isn't what Apple are doing essentially the same as Sony, Microsoft and Nintendo do with their games consoles? A lot of people like this approach – a machine you can just turn on and it does pretty much what you expect, and you will probably want to upgrade to a new model every 3-5 years. Come to think of it, that sounds like most consumer products, from cookers through to large-format TVs. It's almost as though Apple want to appeal to consumers...

    1. big_D Silver badge

      Re: The games console model

      Yes, but all the games on those consoles are written to those specs and you can't buy a higher spec version of those devices, until 5 - 10 years later, when the next generation is released. (There are often mid-life revisions to reduce power and manufacturing costs, but they still have the same performance as the original.)

      With a general purpose PC, whether it be Windows, Linux, macOS etc. and whether it is Intel, AMD or ARM, the software expects more power each year and the hardware manufacturers bring out faster devices each year. If you bought an Xbox One on release in 2013, games written for that platform today will still expect that level of hardware. But buy an M1-based Mac today and in seven years the software of the day will expect something much more powerful in order to run smoothly.

      With my old laptop from 2010, I just threw out the HDD and put in an SSD and added more memory and it was "fast enough" again to run a current Linux distribution smoothly. Good luck doing that on a 10 year old M1 based Mac.

      My current work ThinkPad has 8GB RAM, but that is getting a bit tight, running multiple test VMs. I just need to release 2 screws and I can clip in another 8GB, or I can take the existing 8GB out and plug in up to 32GB. You don't have that option with an M1 based Mac.

      1. Anonymous Coward
        Anonymous Coward

        Re: The games console model

        > But buy an M1-based Mac today and in seven years the software of the day will expect something much more powerful in order to run smoothly...

        My wife has an 8-year-old MacBook Air which still runs like a dream and isn't much better or worse than last year's MacBook Air. Due to Apple's amazing closed-garden hardware, their machines actually get faster over time, especially with the macOS performance update every two years. This problem you talk of is a Windows problem, which I really don't get, because as far as I can see Windows has been stagnant for about 10 years now.

        1. Dan 55 Silver badge

          Re: The games console model

          > their machines actually get faster over time, especially with the macOS performance update every two years

          Please tell everyone which machines they are because my two machines have done the exact opposite. I've had to chuck RAM and SSDs at my Macs to counteract the bloat in MacOS.

    2. Dave 126 Silver badge

      Re: The games console model

      The games console model is to initially sell the machine at less than cost, because of an expectation that each buyer will buy X number of games over the console's lifespan. A chunk of each game's retail value goes back to the console vendor. The cost of components tends to fall over time in a predictable manner, too.

      The Apple model is to sell at greater than cost, and not charge for the OS and several applications (after all, the costs of both hardware design and software can be shared across all the units they sell).

      1. Dan 55 Silver badge

        Re: The games console model

        Apple has the App Store. They've got it both ways, expensive hardware and a cut of software/film/music sales.

  12. Fazal Majid

    Except the Cloud is going in the same direction

    Amazon is not-so-slowly but surely moving towards its own Graviton2 SoCs for AWS, even if they are not yet as tightly integrated, and I would be very surprised if Apple didn't have a data center variant of the M1 (or perhaps the M2 that will inevitably follow for higher-end MacBook Pros, iMacs and Mac Pros) for its own extensive data center operations (running Linux, BTW).

    1. druck Silver badge

      Re: Except the Cloud is going in the same direction

      Exactly, the M1 makes Apple products better, but for something that is going to change the world, chips such as the Graviton 2 are the place to start looking.

  13. DrBobK

    ...leads to extinction when that ecosystem changes.

    This is the company that switched from Motorola 680x0 to PowerPC to Intel to Apple's own take on ARM pretty painlessly*. They seem quite good at dealing with major changes to me. Pretty ridiculous article if you ask me.

    * I'm not counting the 6502 - switching from that also involved the user switching to a quite different OS/Interface.

    1. Dave 126 Silver badge

      Re: ...leads to extinction when that ecosystem changes.

      Yes, it would appear that Apple have in the past benefitted from being flexible. And there are many clues that they know this.

      For example, FireWire was originally fitted to Macs to use high-resolution scanners, but this IO flexibility later made them ideal machines to use with digital camcorders, and later made the first (Mac-only) iPod possible (because the then-common USB 1 was far too slow). FireWire aided Macs in music studios, and so years later the Mk1 iPhone had wireless MIDI and low-latency human input baked into its OS.

      Apple's flexibility comes from fitting parts as standard that many existing users have no use for - at least initially. You can only get away with the extra cost of fitting parts no user asked for if those users are accustomed to spending a little bit more money on your products, or see a value in doing so.

      As a consequence, developers are more likely to support a hardware feature since it has a large user base.

  14. Throgmorton Horatio III

    What I do locally matters too.

    Not everyone works 'in the cloud' or uses AWS etc. other than as an end storage point. There may be far more interesting problems that need solving, but the guys processing pictures held on a local drive in Adobe Lightroom won't give a wet slap about those compared to how smoothly and precisely their brush applies a mask. Granted, if you have to work over a network (as I do) then the performance of the individual machine is almost irrelevant, but many do not do that.

    I'm no Apple lover, but there are plenty of people around who would rather have a fast, power-efficient and cost effective SoC based computer, even if it means buying an Apple machine. Don't get so tied up with 'bigger picture' stuff that you overlook the importance of the smaller one.

    1. Pascal Monett Silver badge

      Re: What I do locally matters too.

      You have basically just justified the entire premise of the article: Apple is painting itself into the smaller-picture corner.

      1. Dave 126 Silver badge

        Re: What I do locally matters too.

        A smaller picture that, for Apple, has far greater scope for maintaining and maximising margins than the so-called 'bigger' picture.

        Apple is a greater expert on Mac users' requirements of an SoC than it is on the data centre market. What killer advantage would Apple have over Amazon or nVidia if they designed chips for data centres? Remember, it's not enough to be merely competitive, because competition reduces profits for all players.

        That's not to say Apple are blind to possibilities. Who knows, would it be crazy if Apple offered a ARM cloud compute service that worked seamlessly with MacOS and with software from Apple and partners? Maybe, maybe not - that's what Due Process is for, taking into account such factors as possible cannibalism of higher spec Macs.

        1. amanfromMars 1 Silver badge

          Re: What is done locally matters too and can easily be terrifying/groundbreaking/earth-shattering

          That's not to say Apple are blind to possibilities. Who knows, would it be crazy if Apple offered a ARM cloud compute service that worked seamlessly with MacOS and with software from Apple and partners? Maybe, maybe not - that's what Due Process is for, taking into account such factors as possible cannibalism of higher spec Macs....... Dave 126

          Are Apple blind to the possibility that certain MacOS users already are offering ARM cloud compute services that work seamlessly with MacOS and with software from Apple and partners and their erstwhile competitors and opposition too ...... and in so doing render to the mothership, secure and overwhelmingly powerful home and forward operating base advantages/direct virtually remote sensitive leverage?

          It would be very surprising and some would even say unusually disappointing if they are basically unaware of that utility ...... although that itself would be extremely exciting as it opens up Apple to a whole new wave of novel enterprise to engage and entertain, exploit and exhaust, export and encourage, which both have and are enabled to grant access to core root kernel processes.

          1. Dave 126 Silver badge

            Re: What is done locally matters too and can easily be terrifying/groundbreaking/earth-shattering

            > a whole new wave of novel enterprise to engage and entertain, exploit and exhaust, export and encourage

            I thought that was Civilisation VIII, going beyond the four Xs of explore, exploit, expand and exterminate. Or was that Rollercoaster Tycoon?

      2. Throgmorton Horatio III

        Re: What I do locally matters too.

        Apple sell hardware to end users. The article appeared to be snippy about the way they were failing to address areas of computing outside their core business. However, although the basic approach they've taken (SoC with all parts closely linked for very high data transfer speeds while using less power) seems to be working out very well in that scenario, it doesn't require much imagination to see that a similar approach might usefully be deployed in those other areas too.

        Normally El Reg's articles about Apple are spot on, but for once this was a bit of a miss.

  15. 45RPM Silver badge

    I dispute that making RAM expandable would leave the M1 buggered by physics. I'm still hoping for an expandable Mac Pro, where the 16GB of onboard memory effectively acts as a bloody enormous cache - but where I can still plonk terabytes of memory in if I want to. It's not as if CPUs haven't had on-die memory before - it's just that they haven't had gigabytes of on-die memory.

    I think that Apple understands that a Mac Pro needs to be expandable - or it's not Pro. Let's just see what they do.

    As for the M1 being in some way less significant because it's tightly tied to the OS, this is a good thing from the user's perspective - more performance for less power consumed - but it's also a non-argument. Testing with (virtualised) ARM Windows hacked onto the M1 also demonstrates that the M1 is a screamer in this use case too (and, let's face it, Microsoft, Google, Amazon, Huawei - all of these companies are big enough and rich enough to develop their own M1 equivalents if they have a mind to, and now that Apple has shown what's possible perhaps they will).

    For my own geeky interest I’d like to see Linux running on it - but, realistically, this doesn’t go as far as being a ‘use case’ for me since I only ever install Linux on my PCs. Since my Macs are already running Unix, I can’t quite see the point of installing a Unix clone on them.

    1. Anonymous Coward
      Anonymous Coward

      I've just built a new home PC - first time I've done so since moving to laptops about twenty years ago. The AMD Ryzen based machine I built outperforms Apple's M1 based Mac Mini and cost me about the same.

      1. 45RPM Silver badge

        I don't dispute that - I have a Ryzen 5 powered machine myself, and very nice it is too. But don't forget that the M1 is many times more power efficient than Ryzen, and is more akin to AMD's APUs. It's Apple's cheapo* power-sipping chip for mobile use.

        If I'm right about this, Apple's next chip will make your argument rather like comparing Zacate to Ryzen. No contest.

        Besides, power consumption is a critical part of the argument. We live in a world of global warming and rampant power consumption. This is an untenable situation. We need to get to a world where our computing devices are as power efficient as they are powerful**.

        * cheap being relative in Apple's case. A Mac Mini is certainly cheaper than Apple's high end machines!

        ** for this statement alone I deserve a downvoting. Not because I'm wrong (I'm not), but because I'm a massive hypocrite, with my Ryzens and Xeons and power guzzling monsters.

        1. SuperGeek

          "** for this statement alone I deserve a downvoting. Not because I'm wrong (I'm not), but because I'm a massive hypocrite, with my Ryzens and Xeons and power guzzling monsters."

          Consider yourself Thunberg'd! "How dare you!" She's a bit young to know about 10cc.......

        2. John Robson Silver badge

          "** for this statement alone I deserve a downvoting. Not because I'm wrong (I'm not), but because I'm a massive hypocrite, with my Ryzens and Xeons and power guzzling monsters."

          Nah - you're aware of it, but use what is available.

          I have a strong temptation to get an M1 Mini to replace some of my servers, but I'd need an external drive array as well... So not yet. The relatively low-power microserver will continue to do its job for a few years yet.

          To the person you were talking to...

          Saying that a Ryzen based machine is *as powerful* for a similar cost really does illustrate how good the Apple silicon is.

    2. Doctor Syntax Silver badge

      "I’m still hoping for an expandable Mac Pro, where the 16GB of onboard memory effectively acts as a bloody enormous cache"

      I think this is the obvious next step or next step but one for somebody. The next step for everyone else will be the SOC including fast RAM.

      1. John Robson Silver badge

        What *is* the latency on USB4?

        Given that it can push 40Gbps it's not completely unreasonable to consider it as a "slow ram" connection interface if the latency can be low enough. Can't see any information about PCIe or TB latency, just their speed...

  16. JohnMurray

    ....guess the reg won't be getting any shiny test offers from Apple....as usual.....

    1. amanfromMars 1 Silver badge

      Re:....guess the reg won't be getting any shiny test offers from Apple....as usual.....

      The developments at Apple are something for El Reg to test for future desirability and practical realisation ..... JOINT Engagement with Joint Operations Internetworking Novel Technologies.

      And to paraphrase Vice Admiral Lord Horatio Nelson ...... El Regers expect that to be the case ...... baiting the hands which feed it.

      One would have thought that Prime Register Territory and an Almighty EMPowering raison d'être.

      And there's bound to others with a similar view and opinions ....... Fellow Travellers on the Same Journey on Parallel Tracks with Other Base Elemental Drives ....... and which now discover they can ACT in Consort with the sharing and presentation of what is known, and what they know regarding the Realisation of the Results of Immaculate Dreams ..... aka Future Building.

  17. Elledan

    ASARM matters

    Apple Silicon (ASARM) matters a great deal. Not only does this firmly push ARM into the desktop space for the first time since the early 1990s, it also allows us to see what the cost of CISC is today. Even if modern-day x86 CPUs use a RISC architecture inside, all the transistors being used for the CISC-opcode-to-microcode decoding process do not come for free.

    As a result of ASARM, developers and publishers can no longer ignore AArch64 as a target for desktop applications. Suddenly everything from browsers to productivity applications has to work on AArch64, and so do smaller projects. macOS is still a relevant desktop target, and interestingly also for smaller open source projects.

    Previously the best-known AArch64 (along with ARMv7) targets were SBCs like the Raspberry Pi, which do not even remotely cover the same market as ASARM does. Now suddenly people see that ARM is suitable for desktops and laptops, after years of laughing at Windows-on-ARM. From the looks of it, an ASARM-based laptop has the potential to be more efficient than an x86-based one, if only because of the very mature big.LITTLE architecture that is so common with ARM SoCs.

    I'm not a betting person, but I'd wager a bet that ASARM is making a certain blue chip manufacturer feel quite nervous right about now, with AMD also looking worriedly over their shoulder.

  18. Vulture@C64

    What ? Another blatantly anti-Apple article from the Reg, shocker !

    What Apple said they did, if you listened to the presentation, was take a number of selected workflows and design an SoC which would handle those specific tasks very well. They have not designed a data centre CPU, they have not designed a system which Disney will use to create the next blockbuster. They took workflows which the likes of me and you use - working on documents, editing photos and simple video from action cameras or vlogging cameras - and created a system which is very low power, very fast and specifically designed for these types of work.

    And Apple has been spectacularly successful in doing just that !

    What will come next are chips for heavier workloads, where more memory is needed, or faster GPU . . . the M1 is the first chip, the first of its kind. There is bound to be an M1x and maybe an M2 . . .

  19. Remontado

    Am I lost?

    Are we talking about two different topics here? Personal computing and the cloud?

    1. Anonymous Coward
      Anonymous Coward

      Re: Am I lost?

      You are quite right. The problem is when it comes to cloud, blockchain, or AI/ML these fanatics have tunnel vision and think everyone must follow their respective cult. Even worse huge numbers of them don’t even understand the pros/cons of their religion.

  20. Dave Null

    Apple may have killed Intel here

    To be clear, what Apple have done is ship an entry-level fanless laptop that can emulate Intel code and outperform an i9, whilst offering all-day battery life.

    Intel can't pivot to SoC, and consumers *will* like this.

    Intel still don't have their 7nm process working, and are now looking at 2022 or further out, and at contingency plans with third-party foundries.

    AMD aren't constrained on the fab side so will likely take the lead in PC for the next couple of years.

    Nvidia are buying Arm, so will go after the DC market, and that leaves Intel in a very bad position.

    To the consumer, these are going to be very appealing...

  21. mindprint

    Maybe I'm looking at this very simplistically, but I need to edit and render a video in Final Cut. The new M1-based notebook will allow me to do that faster, and for longer away from a power outlet, for much less money than any other comparable solution. And that's just one of the real-world examples for a typical Mac user.

    So what is the "far more interesting stuff happening in tech that matters to everyone"? Emphasis on EVERYONE.

    1. Strahd Ivarius Silver badge

      Could you elaborate on the alternate solutions?

      (btw I doubt that a typical Mac user renders video every day... typical use case would be as usual web browsing, mail, word processing and calculations)

      1. ThomH

        The power savings are the main benefit to a typical use case, I think: 20 hours on a charge for the 13" 'Pro'. But for people like me who develop native code, there's a huge reduction in build times, and for video and image editors there seems to be quite a bounce via the GPU.

        So, benefits for people who just want to carry a laptop around and browse, for developers, and for media production. Isn't that essentially Apple's entire user base?

        That comment is made while acknowledging the article's point, of course: many, many people do work that is entirely disjoint from Apple's user base, and this change will matter not one jot to them.

  22. Anonymous Coward
    Anonymous Coward

    just the opinion of a server guy

    I think the article is a little harsh, but people DO tend to forget the data moving.

    Whether this is on a network, with directors with the shiny new kit still complaining that the horoscope site they're visiting is slow, or, especially in the virtualised world, people buying the latest greatest server kit and then slapping a 5-year-old SAN on the end of it and wondering why their databases are still a nightmare.

    Apple is a consumer company, that's all they want to be. They've been dragged kicking and screaming to provide a modicum of enterprise management capability but why would they? You can see the clusterfuck that is HPE's business model, Dell isn't making a huge margin.

    Why bother doing stuff outside the walled garden for Enterprise and the geek squad when you're literally able to print your own money doing shiny consumer kit.

  23. Howard Sway Silver badge

    OS on a chip

    That's where I see this going. Think about the advantages for the likes of Apple and MS. All user files will still be stored on an SSD/hard drive, but the OS will run straight off the chip. A computer will be an instant-on device again, like home computers were back in the 80s. No more piracy/licensing/OS file corruption issues. Upgrading will require purchasing a new chip, so will be even more lucrative, even though some limited flashing of fixes onto on-chip ROM might also be deemed possible. And total control by the giants. All that lovely total control and $$$$.

    This will all have many many downsides, but the civilian population who just want appliances will not care a jot.

    1. Dave 126 Silver badge

      Re: OS on a chip

      There was a meeting of Apple engineers and Steve Jobs walked in. He pressed the power button of an iPad and it was instantly ready for use. He then turned on a MacBook, and there was a delay. "This..." pointing at the MacBook "doesn't do this." pointing at the iPad. "Make this... do this!"

  24. Gene Cash Silver badge

    "sell anything other than finished computers"

    I'm not an Apple user by any means, but the last time Apple did sell bits, they nearly lost their shirt. That may explain the attitude.

  25. Joe Gurman

    Yet another in decades' worth of Reg articles....

    .... complaining about Apple being good at what they do.

    Why should Apple, or anyone else, be everything for every computing need? Attempting that sort of thing almost always produces the lowest common denominator. Should you be criticizing Microsoft for never producing their own silicon to change the world, or Intel for failing twice in a row to get smaller chip processes to work reliably?

    Bit of a double standard, much?

    No, it's all right, you're just permanently shirty about Apple giving a lot of people what they want, rather than what _you_ want. Got it.

  26. John Robson Silver badge

    So they won't sell the M1...

    But they've shown what is possible, and I'll wager that Amazon and Google are going to be getting very interested in running ARM chips rather than x86. The savings in power usage will be substantial.

    I wouldn't bet against Apple deciding that selling to the hyperscalers with a slightly customised chip (losing the GPU primarily, but given the power consumption - who cares) might make them an even larger pile of cash than just selling to the public, and a few corporates.

    The M1 may well spawn the D1 (desktop) and the C1 (cloud) in turn. The first is more than just likely, the second is a distinct possibility.

    1. David Webb

      Re: So they won't sell the M1...

      Amazon are already using Arm, and have been for quite a while.

      El Reg Article Circa 2018

      1. John Robson Silver badge

        Re: So they won't sell the M1...

        Yes - though I don't know quite how much of it they use... this might be a significant driver - high performance cores running at substantially the same compute capability as x86 hardware for significantly less power.

        I can see them pushing people to ARM even if they keep half of the power savings for themselves.

  27. trevorde Silver badge

    Saturated market

    Apple have less than 10% of the desktop market and falling. There is little to no native software and all the good software is on Windows anyway. Most people will buy these toys to run Chrome, i.e. an overpriced Chromebook.

    1. Vulture@C64

      Re: Saturated market

      More like 16% and growing. People are sick and tired of Windows and all the issues that go with it. Dell make some nice laptops but then spoil it with Windows.

      As for software on Mac, even Microsoft have had MacOS versions for years and are building M1 versions of Office 365, as are Adobe et al . . . it's popular and getting more popular or Apple wouldn't have bothered investing in the M1.

      Then there's iOS . . . again increasing in use given the stats from my sites.

      1. ThomH

        Re: Saturated market

        I wasn't sure who to believe on this, so I checked StatCounter, which attempts to monitor trends through web traffic analysis. Make of that methodology what you will.

        Worldwide it does indeed look like a ~16% share for macOS, on a gradual upswing, with Windows very slightly fading. In Europe macOS is nipping at 20%, and in North America it's more like 27.5%, but apparently the continent that likes Macs the most proportionally is Oceania where Apple gets almost to a third. I did not see that one coming.

        But it's easy to oversell: in the worldwide all-OS chart, both Android and Windows are basically as important as each other, both hanging around just below the 40% total share mark; and in Apple's world, iOS appears to be about twice as used as macOS.

        1. DS999 Silver badge

          Re: Saturated market

          I wonder how they are measuring that...

          While I don't doubt that the macOS share has been growing, 17% WORLDWIDE seems pretty unlikely to me simply given the entry price of a Mac vs the entry price of a Windows PC, and how many people in the world for whom even an entry level Windows PC is more than they can afford.

          1. John Robson Silver badge

            Re: Saturated market

            " how many people in the world for whom even an entry level Windows PC is more than they can afford."

            They're running an android phone, not a PC at all.

          2. Anonymous Coward
            Anonymous Coward

            Re: Saturated market

            I think the thing is, the people in the past who would buy a shitty low end Windows laptop to work on no longer need to and can get away with tablets or phones so we should be seeing the Windows numbers declining. Certainly in the home space.

  28. mark l 2 Silver badge

    I personally don't see the lack of expandability on the current M1-based Apple kit as an issue for most end users. People have sort of got used to many electronic gadgets not being upgradable; I mean, there is no way to add more RAM to your phone or iPad. And even with my current laptop, which can have the RAM upgraded, the manufacturer wasn't expecting many to bother going down this route when designing it, as you have to pull the whole laptop apart to get to the DIMM sockets. Unless they are tech savvy, not many end users would be comfortable doing it.

    1. DS999 Silver badge

      Yeah, this is just an argument people can use to hate on what Apple has done without recognizing that they replaced the low-end option on several low-end products. That the performance is competitive (in single-thread/low-multithread) with Intel and AMD's fastest shouldn't distract people into thinking Apple is trying to push these at high-end customers who need upgradeable RAM etc.

      I'll bet if it were possible to measure the percentage of new Windows PCs sold to consumers (the whole market, across all price points) the percentage that EVER get an upgrade of RAM or storage is probably low single digits. The lack of upgradeability the tech writers and the kind of people who read The Reg think is terrible is something the average person would simply not care about.

      1. John Robson Silver badge

        Funny, I didn't see a Mac Pro - the MBP has been rather less than Pro for a while (I have a 2019 one on my laptop stand at the moment).

        These are firmly consumer oriented.

        The markup on the increased memory isn't even that much, it's substantially less than MS charge for memory on their Surface line up... so if you need 16GB, just buy it. If you need more, then this isn't the machine you are waiting for - give it a couple of years (or buy one of these for your desk and a proper server).

        1. DS999 Silver badge

          Apple only replaced the low end version of the Mini, Air, and MBP13" this time around. The "high end" version is still x86 (slower, but with more memory/storage options)

          When the faster / more core CPU/GPU comes out in H1 2021 they'll fill the remainder of those product lines (plus maybe the low end of the MBP16" and iMac) and offer larger memory configurations. Maybe using DIMMs in some cases, we'll see. If they weren't going to support DIMMs or m.2 storage the Mini could be a lot more "mini" so I have to assume they have something in mind for it in higher end configs that allows for some expandability.

          1. John Robson Silver badge

            I am slightly surprised they didn't do the mac-micro.

            An Apple TV-sized M1-powered box with HDMI, power, Ethernet, dual USB-C, and probably a single/dual USB-A port as well. Make it limited: 8GB RAM only, the 7-GPU-core version... but that would suit a lot of people, and the amount of space in the current Mini suggests it's easily possible.

            It's not as if the M1 needs huge amounts of cooling; just stick it to the aluminium shell of an Apple TV-sized box and be done with it.

      2. Anonymous Coward
        Anonymous Coward

        Hell even my desktops never receive a RAM upgrade.

        I buy enough to start with, and in 5 years' time I'm probably being held back by something else, in which case it's a new mobo and new CPU anyway; and what are the chances the new mobo is even using the same type of DDR any more, let alone that I can match it exactly to increase capacity without ditching my old modules?

        Real people upgrade storage and that's about it.

  29. martinusher Silver badge

    The values of a great corporate PR department

    This is just another strand to the RISC versus CISC tradeoff. One thing I learned very early on in my career was that in terms of instruction flow RISC will always run rings around CISC provided there is effectively infinite memory bandwidth, because the vast majority of instructions are straightforward RISC-type orders rather than the multifunction complex instructions that a CISC processor might issue (instructions which are likely to be implemented in microcode anyway). There have been more or less successful attempts to match CISC orders to the needs of programming languages -- the PowerPC seems to be the best example -- but you start trading flexibility for performance.

    So Apple has made a RISC that's closely coupled with its memory. Great for them, especially as they've always been in the "computing appliance" business (just like a phone: you buy it and use it as a complete unit). For the rest of us who need high-performance computing that's relatively low power, there's the MIPS architecture. MIPS has never registered much in desktop computing but it's the go-to processor for heavy lifting, especially moving a lot of network data. This type of architecture turns up in soft processors and is also similar to the open RISC-V standard, so I'd expect to see it around for some time.

  30. Lorribot

    Apple have produced a highly optimised processor design that works extremely well in their vertical integration business model.

    There are two devices, probably the best-selling PC devices, the PlayStation and the Xbox, that have a similar hardware model; these are currently using AMD designs. AMD has a licence for ARM but has yet to produce any hardware. I can see the sense in an SoC in these devices as they are effectively sealed boxes, and integrating AMD graphics, ARM cores and other specialist processors that support specific gaming-related processes into an SoC would save costs and improve performance. 4-5 years is long enough for them to actually spin something up in this field, and with MS moving the Xbox over, Windows would go too, though it could be the other way round; x86 emulation is a bit of a sore point between Intel and MS if I recall correctly. Oh, and Nvidia are buying ARM, so that could be interesting too.

  31. Richard 12 Silver badge

    There's going to be a shedload of buggy apps

    The ARM architecture is very different to x86/amd64 in terms of memory ordering.

    A lot of places are going to believe Apple's "Just recompile!" hype and produce software that randomly fails in ways that a naive examination of the source code (and even the machine code) would say is totally impossible, because of the way ARM re-orders memory accesses.

    Apple know this, because their amd64 translation layer puts the M1 into a special memory-ordering mode (x86-style TSO) that papers over the cracks. "Native" apps don't get that special treatment because it would slow them down.

    So the next few years are going to see a slow, quiet wave of utterly "impossible" bugs in "native" Apple Silicon apps that are unrepeatable but persistent.
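
    To make the failure mode concrete, here is a minimal sketch in C++ (hypothetical, not from any particular codebase) of the classic publish-then-flag pattern. Real offenders usually use plain variables, which is already a data race; relaxed atomics are used here so the program is well defined while still showing the ordering problem:

    ```cpp
    #include <atomic>
    #include <cassert>
    #include <thread>

    std::atomic<int>  payload{0};
    std::atomic<bool> ready{false};

    void producer() {
        payload.store(42, std::memory_order_relaxed);  // write the data...
        ready.store(true, std::memory_order_relaxed);  // ...then raise the flag
    }

    void consumer() {
        while (!ready.load(std::memory_order_relaxed)) { /* spin */ }
        // On x86 this assert is effectively never seen to fire in practice,
        // because the hardware (TSO) keeps stores visible in program order.
        // On AArch64 the relaxed stores/loads may be reordered, so 0 can be
        // observed here even though 'ready' was already true.
        assert(payload.load(std::memory_order_relaxed) == 42);
    }

    int main() {
        std::thread t1(producer), t2(consumer);
        t1.join();
        t2.join();
    }
    ```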

    1. Hi Wreck

      Re: There's going to be a shedload of buggy apps

      Good grief, what on earth are you talking about anyway? If your application breaks due to cache-coherence, I dare say your application is borked to begin with because it is playing foot-loose with critical sections.

      1. Richard 12 Silver badge

        Re: There's going to be a shedload of buggy apps

        Memory access ordering is not cache coherence.

        You are one of the developers who will make - and likely already have made - these mistakes, because x86 simply has a different memory model to ARM.

        Codebases that have had decades of real-world usage on x86 fail on ARM because of the different memory model. No software house is going to find all of those cases.
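
        For completeness, a sketch of the usual repair for that kind of publish-and-flag code: make the publishing store a release and the flag read an acquire (or use a mutex or another higher-level primitive). On x86 these compile to the same plain loads and stores, which is exactly why the bug hides there; on AArch64 they emit LDAR/STLR, restoring the ordering the code was silently relying on:

        ```cpp
        #include <atomic>
        #include <cassert>
        #include <thread>

        std::atomic<int>  payload{0};
        std::atomic<bool> ready{false};

        void producer() {
            payload.store(42, std::memory_order_relaxed);
            // release: everything written before this store becomes visible to a
            // thread whose acquire load later reads 'true'
            ready.store(true, std::memory_order_release);
        }

        void consumer() {
            while (!ready.load(std::memory_order_acquire)) { /* spin */ }  // pairs with the release
            assert(payload.load(std::memory_order_relaxed) == 42);         // now guaranteed
        }

        int main() {
            std::thread t1(producer), t2(consumer);
            t1.join();
            t2.join();
        }
        ```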

  32. bigtreeman

    gradually making new paradigms

    Early Intel processors had external memory, interrupt, DMA, NPU, I/O, etc. units. The CPU itself wasn't much: registers, ALU, branch/next-instruction fetch and decode, and so on.

    The M1 has shown the efficiency of bringing the main RAM onboard. Big deal? No. Progression? Yes. First to do it? Hell no.

    Rupert is correct in that Apple has tailored the M1 to a specific task, and this is where the ARM ecosystem has been headed for a long time. A designer has a need and finds an MCU to fit the requirements, or, if big enough, rolls their own MCU.

  33. This post has been deleted by its author

  34. The Sprocket

    * Yawwwwn *

    Heard this whining in the transition of '040 to PPC to Intel, and now M1. Here we go again.

    My M1/Intel Affinity Software updates run smooth as silk.

  35. Hi Wreck

    Good grief...

    Kudos to Apple for shrinking the PCB down to a system-on-a-chip. Welcome to the world where physical chips are becoming obsolete and where companies can now stick whatever bits and pieces they need into a single device and then get TSMC or whoever to churn out countless copies. Seymour Cray recognized memory bandwidth as an issue during the Jurassic era of computing - his machines were a marvel at the time. Shedding an external memory bus allows a lot of innovation in the memory system. As for bemoaning the loss of the ability to plug other stuff onto a bus: you just need new stuff to plug into the new Universal Serial Bus.

    1. anonymous boring coward Silver badge

      Re: Good grief...

      "Welcome to the world where physical chips are becoming obsolete"

      A single chip is a "physical chip". Also, varying power requirements for various functions will probably always create a demand for some additional chips.

  36. ZeiXi

    You can’t add RAM to your Mac Mini M1

    Neither can you hitch a trailer to a Lamborghini.

    1. John Robson Silver badge

      Re: You can’t add RAM to your Mac Mini M1

      "Neither can you hitch a trailer to a Lamborghini"

      Couldn't you have chosen a brand that doesn't make machines designed to pull serious loads?

      Lamborghini

    2. Anonymous Coward
      Anonymous Coward

      Re: You can’t add RAM to your Mac Mini M1

      Of course you can, Lamborghini makes a fine range of tractors

  37. Sr. Handle

    Just another Apple is doomed article

    Since Microsoft launched Windows for ARM a long time ago, I can agree it isn't that revolutionary; the difference is that Apple is doing it the right way.

  38. anonymous boring coward Silver badge

    Tedious moaning as usual.

    People are generally idiots, and even those who aren't can still be fooled.

    For the moment a walled garden is what's needed. OK?

  39. dave 93
    WTF?

    Because Apple controls its entire hardware and software stack...

    ...it can make the whole thing work extremely well.

    As you say, nothing at this price point comes close in performance and low power use.

    You didn't mention access to millions of new apps from the iPhone and iPad.

    Because you choose to eschew all things Apple, the M1 will not make a difference to your world...

  40. Mike Friedman

    Wait. What? Apple wants to continue to make oodles of money by selling computers and phones?

    <GASP> HOW! DARE! THEY!?

    I use Apple products because I like them. They're well made and they last a long time (although Windows is better at this now) so they tend to be a good use of my limited money. I'm not an apologist for them and they sometimes do silly things. But expecting a well established, phenomenally profitable company to change its business model is also incredibly silly. In 1997, no one expected Apple to survive, let alone become the largest corporation in history. They're apparently doing something right.

    1. The Sprocket

      Indeed they are. Perhaps to the chagrin of some.

      I used Macs through the 1990s, 2000s, etc., because I was working in the ad agency world, so PostScript was standard. Today I still use Macs, and my MacBook Air is smooth. But every now and again I will fire up my . . . PowerBook 3400c and it runs wonderfully as well (800 x 600). With 16MB of RAM. Yes, they CAN last a long time.

    2. Sr. Handle

      I know, a company trying to make a profit, shocking, right?

      Thank God Microsoft does its crappy software as a charity.

  41. KimJongDeux

    I've been trying to move to Apple since about 2005. Every time I or someone in my family takes a tentative step, they get clobbered by unreliability. Usually absurdly short hardware lives. The current (but not first) problem is trying to get a 400GB iTunes library to play on a phone. The Apple Music people are very sweet but as ignorant as I am. I doubt whether Apple is really any less reliable than anyone else, but their reputation and pricing say they're really good.
