114 billion transistors, one big meh. Apple's M1 Ultra wake-up call

On March 9th, Apple had its spring reveal. The stars of the show were a nice monitor, a new budget iPhone, and the Mac Studio, a Mac Mini stretched in Photoshop. Reaction was muted. There'd been some very accurate pre-launch leaks, sure, but nobody had cared about those either. If you're over five years old, you'll remember …

  1. AlanSh

    I was there

    I wrote my first Fortran program in 1971. I "grew up" (in the computing sense) alongside IBM 360s, DEC PDPs, CP/M, DOS, Windows, Unix... all that from 1975 to 1990. Those were the brilliant, changing years. Who remembers the book "Undocumented DOS"? Using that, you could do anything with your PC. Who worked on VAX VMS? You could do magic with DCL alone, never mind all the other built-in tools.

    I designed and wrote some crazy s/w in those days. If I'd had my commercial hat on, I might have been rich - but it was far too much fun to share it all out for free.

    Since then, it's really been more of the same but better. More PCs, faster chips, better Windows (although some would doubt that with Windows 8 and now Windows 11).

    So, yes, the Mac Ultra is nice - but sorry, there's nothing "new" - it's just more of the same but faster, better and sometimes cheaper.

    It's still fun these days but it's not Exciting, Innovative, New.


    1. Peter D

      Re: I was there

      I first programmed using punched cards on a system running the George 3 OS. Productivity has certainly gone up since those heady days.

      1. Roland6 Silver badge

        Re: I was there

        >Productivity has certainly gone up since those heady days.


        Whilst we can sit at our desks and write code etc. and not have to mess around with cards, do we actually produce finished code significantly faster?

        1. Anonymous Coward
          Anonymous Coward

          Re: I was there

          If only they came up with a computer that could watch streamed video for us and remove the ads...

      2. smot

        Re: I was there

        ICL 1901A with George 1 then 2.

        Even a flirt with Auto Operator in the very early days....

      3. MX9000

        Re: I was there

        1) I was lucky enough to code IBM assembly with 32-bit registers.

        2) and Atari 800 programs in Forth.

        That was exciting.

        This high-performance Apple computer is nice, but it doesn't run FoldingAtHome, which would possibly take advantage of those GPUs.

        And the exciting thing for me today is 6 Virtual Machines, set up for specific purposes, that I can load and run when needed. 4 of those running Windows 10, 64bit, or Windows Server 2019.

        You can't run 64bit Intel code on this machine.

        So, Apple was smart, to Cut Intel Out of the Profits, but, they've cut off a lot of software too.

        And it's another dump your game purchases, they're not compatible.

        And Apple is dropping a lot of their own software support.

        1. Bruce Hoult

          Re: I was there

          And I programmed PDP-11s and VAXes and other similar machines ... PR1ME, Data General Eclipse.

          >This high performance Apple computer is nice, but, it doesn't run FoldingAtHome, which would possibly take advantage of those GPUs.

          It certainly does. I just downloaded and tried FoldingAtHome on my original M1 Mac Mini. It runs. The performance stats seem to be 800 points per day, 3.00 days ETA for a work unit. I have no idea whether that is good or bad, but it runs.

          >You can't run 64bit Intel code on this machine.

          Of course you can. I do it every single day. Apple includes an x86_64 emulator which runs x86 code about as fast as the Intel machines they previously sold.

          Here's one benchmark I wrote many years ago and run on everything:

          Note that the Rosetta (x86 emulation) result is faster than a ThreadRipper, a Ryzen 5, or a Xeon Platinum.

          You appear to be misinformed.

      4. cdilla
        Thumb Up

        Re: I was there

        Marvelous to see George still being remembered.

        Can't say I'd want to write COBOL programs any more and then hand punch the cards because I couldn't wait for a gap in the Data Prep team's schedule.

        But the times are remembered fondly from afar.

        Nice article.

        1. ICL1900-G3

          Re: I was there

          George was wonderful... after working for ICL, I had a spell on 370s; I could not believe how primitive the OS was in comparison.

      5. ICL1900-G3

        Re: I was there

        I am very proud to say that I wrote bits of GIII. It was the most fun (apart from sex and drinking) I've ever had in my life.

      6. NoneSuch Silver badge
        Thumb Down

        Re: I was there

        Productivity has gone way down.

        Back then, all you dealt with was pure coding. Now you have MS continually interrupting you with pop-ups to sign into their (unnecessary) services, an OS that spies on you and multiple features you cannot turn off or uninstall.

        The minimal install size for a program generated using MS coding tools today is eye-wateringly ridiculous.

    2. JDX Gold badge

      >the Mac Ultra is nice - but sorry, there's nothing "new" - it's just more of the same but faster,

      So that's basically the last 30 years of computing; huge increases in performance are quite exciting to some.

      However, the M1 IS pretty innovative in the sense that it is SUCH an increase in performance and decrease in power consumption. Look at the benchmarks of Intel/M1 Macs online; this isn't an up-tick, it's a huge change.

      1. batfink

        Re: It's just more of the same but faster,

        Agreed - but the whole point of the article is that even though it is a great step forward, nobody really cares anymore.

        1. fattybacon

          Re: It's just more of the same but faster,

          It's like Google Stadia. It's a staggering achievement, you are playing top end video games without any lag, without a console, using any old controller and browser screen (laptop, tablet, phone, fridge, microwave, pc) but people just shrug their shoulders and say it won't work. It does work. It's incredible.

          1. Anonymous Coward
            Anonymous Coward

            Re: It's just more of the same but faster,

            Stadia works so well Google are shutting it down. It does not work without the best internet connections that are also really close to Google servers.

            1. Not Yb Bronze badge

              Re: It's just more of the same but faster,

              I was curious, so checked the news... they shut down the Stadia game dev studios in February of last year, but not the rest of it.

              Can see why you went "Full anonymous" on that post.

            2. Rooster Brooster

              Re: It's just more of the same but faster,

              Works for me over a basic BT 50Mbps service - It's amazing.

          2. Anonymous Coward
            Anonymous Coward

            Re: It's just more of the same but faster,

            That's odd; what you describe is what you *would* expect from a cloud gaming service. But I heard the complete opposite: that (IIRC) it's horribly flaky unless you have a ludicrously fast net connection - even when running at moderate resolution - and a great setup.

            Completely defeating the whole point of having it "in the cloud" rather than having to run it yourself.

            Not to mention other nonsense like having to buy the games outright to be able to play them at prices the same as the desktop version (or greater, when the latter is discounted and the Stadia version isn't). Rather than them being "on demand" even though you have to keep paying your Stadia subscription.

            This is all from memory a few months back, so correct me if I'm wrong, but I got the impression it was a complete clusterf**k.

          3. Alex Stuart

            Re: It's just more of the same but faster,

            Stadia is an impressive achievement, yeah.

            But it's for casual, or relatively casual, gamers only.

            It's unusable on a standard ADSL connection when I tried it recently. Both lag and graphical quality were abysmal.

            Even if I had a better connection, there's no way it can replicate the fidelity of, say, my PS5, given that HDMI 2.1 bandwidth from the console to the TV is 48Gbps. There's simply no way to funnel that through any remotely normal internet connection without massive compression loss.
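            The mismatch described above can be put in rough numbers. A minimal sketch: the 48 Gbps figure is the HDMI 2.1 link from the post; the 50 Mbps broadband speed is an assumed, illustrative figure, not from anyone's actual setup.

            ```c
            /* Rough sketch of the bandwidth mismatch described above. The 48 Gbps
               figure is HDMI 2.1 from the post; 50 Mbps is an assumed, illustrative
               home broadband speed. */
            #include <stdio.h>

            int main(void)
            {
                double hdmi_bps      = 48e9;  /* console-to-TV link, bits/second */
                double broadband_bps = 50e6;  /* illustrative home connection    */

                /* Compression ratio a stream would need just to fit the pipe. */
                printf("required ratio: about %.0f:1\n", hdmi_bps / broadband_bps);
                return 0;
            }
            ```

            That works out to a ratio of roughly 960:1 before the stream fits down the pipe, which is why heavy lossy compression is unavoidable.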

        2. Anonymous Coward
          Anonymous Coward

          Re: It's just more of the same but faster,

          The question though is why does no one care?

          I don't care either, and there are two big reasons for it:

          1. I don't use proprietary software or walled gardens. Since there is presently no open source system software that makes good use of this technology, its hypothetical raw performance isn't useful to me. Even if I were willing to do so, I wouldn't trust Apple's direction because I believe they want to move to a model where all software must be signed by them, distributed by them, and sold on a subscription basis with a 30% markup. I prefer to own my software outright and obtain it either free of charge or for a one-time fee without paying extra markups that don't add value.

          2. It's basically desktop technology. While I appreciate that there are people for whom faster desktops would be of great value and importance, I'm not one of them. Most computation is now done on headless servers; the core count and memory size of Apple's M1 family are not designed for that application. I don't need nor ask much from my desktop (or laptops); its function is to display large windows full of text and a couple of browsers. The work happens elsewhere, on machines far larger and more powerful than any Apple have made.

          There's a third reason that applies in my case but may or may not be relevant to others: it's ARM. That doesn't matter at all when it comes to the software I execute, but it means that I need cross-compilers when I go to actually do my job, because my job includes building software to run on large-scale amd64 machines. Those machines can't be, and more to the point haven't been, replaced by ARM processors (yet), and while LLVM and some of the technology built on it have made cross-building much easier than it was in the past, it's still a hassle and variable I don't need. I *like* ARM, but if I'm using my desktop as a dumb X-terminal I don't need this SoC, and if I'm doing real work on it then it needs to be amd64. Hopefully this will change someday.

          The bottom line is that for what I use computers to do, this SoC isn't useful to me. It might be in future, but not now. And while I am fully supportive of vertically-integrated, co-designed systems, I don't particularly care for the choices Apple have made in theirs. Worse, I assume that Apple will eventually make it impossible to run any software on macs that didn't come from their app store (complete with 30% markup that I'll be stuck paying year after year). It's difficult to make such a large long-term investment in something I can't trust to remain even partially open.

          I'm willing to bet that fear/dislike of Apple's business model and the giant price tag account for a lot of the disinterest. If you've already bought big into Apple's ecosystem, this is exciting and you're going to buy it. If you haven't, it's probably for one of these reasons and there's probably no level of performance that's likely to get you excited. I think it's fair to say that if AMD were making something like this for desktops/laptops that could run open source OSs out of the box with full capabilities, there'd be considerably more interest. But the sort of people who tend to geek out over computer architecture also tend to be pretty cool toward proprietary software and walled gardens. It's a great SoC, it's just not exciting to me because of the other choices Apple have made.

      2. MachDiamond Silver badge

        Re: the MAC Ultra is nice - but sorry, there's nothing "new"

        The power efficiency is huge. I was looking at a new cheese grater to replace my old cheese grater with the small holes, but the cost and need to wire a dedicated circuit was a big road block. I'd really like to run the house most of the time from battery power and a solar system so the less power I need to draw, the cheaper that system will be to configure and install.

        It used to be that many things I'd do on the computer would give me time to go brew up a cup of tea while the program showed me a spinning beach ball. Now most of the processes are done quickly enough that I don't have much more time than it takes to have a nice stretch. The speed of the computer isn't raising my work efficiency by doubling any more. Costing me less in leccy and not heating up the office in the summer is a big benefit. Not having to pull permits and have a sparky come over and tear open the walls to run more circuits is also a big savings.

        The Mac Studio is on my list to get this year. Perhaps in another couple of years it will be time to upgrade the windoze box. Linux, for the things I do with it, works perfectly well on my older hardware as there isn't as much required OS bloat to deal with.

    3. big_D Silver badge

      Re: I was there

      I agree and disagree. I think what Apple has done here is a huge leap forward in chip design.

      Yes, it is more of the same, only faster... But the technology behind it is well worth looking at.

      I love the old DCL days. I wrote a parameterised menu system for DCL that could use either normal menus (vertical) or 1-2-3 menus (horizontal). Written in DCL, with about 5 lines of C to enable it to poll the keyboard for a keypress and, if it was an escape sequence, to return the name of the key pressed (e.g. Up, Down, Gold, PF2, etc.).
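      The escape-sequence decoding described above can be sketched in C. This is a hypothetical reconstruction, not the original helper; the sequences and key names are illustrative VT100/VT220 values.

      ```c
      /* Hypothetical sketch of the keypress helper described above: decode a
         terminal escape sequence into a key name a DCL menu script could test.
         Sequences and names are illustrative VT100/VT220 values. */
      #include <stdio.h>
      #include <string.h>

      static const char *decode_escape(const char *seq)
      {
          static const struct { const char *seq; const char *name; } keys[] = {
              { "\x1b[A", "UP"   },
              { "\x1b[B", "DOWN" },
              { "\x1bOP", "GOLD" },  /* PF1 doubles as Gold on a VT keypad */
              { "\x1bOQ", "PF2"  },
          };
          for (size_t i = 0; i < sizeof keys / sizeof keys[0]; i++)
              if (strcmp(seq, keys[i].seq) == 0)
                  return keys[i].name;
          return NULL;  /* not a sequence we know */
      }

      int main(void)
      {
          printf("%s\n", decode_escape("\x1b[A"));  /* prints UP */
          printf("%s\n", decode_escape("\x1bOP"));  /* prints GOLD */
          return 0;
      }
      ```

      The real thing would also put the terminal in raw mode and read bytes one at a time; the lookup above is just the "return the name of the key pressed" part.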

      A friend of mine did a fantastic VMS job scheduling system, that allowed you to view the jobs on all of the VAXes in the computer room (around a dozen) and copy and paste jobs between machines, using ANSI graphics to make the interface easy to use.

      1. Smeagolberg

        Re: I was there

        >I agree and disagree. I think what Apple has done here is a huge leap forward in chip design.

        Once again, very well done with your world-leading design, A...


        1. hoagy_ytfc

          Re: I was there

          Except that they're not ARM designs. They are Apple's own designs. You either pay attention and know that, or you don't pay attention yet think your view is worth posting anyway.

    4. Anonymous Coward
      Anonymous Coward

      Re: I was there

      I love your articles Rupert, but there's one big elephant in the room in all of this.

      Dido's Test n' Trace, and its cost. £41bn and climbing. That astronomical cost says something. If you put all that technology in the wrong (incompetent) hands, they'll spend (our) money producing hot air until the cows come home. However good the computing hardware, on the software front - coding-wise - £9K-a-day invoiced fees really don't go very far in the hands of incompetent people dishing out orders.

      I still can't help feeling a bunch of rough and ready Synology NAS boxes sync'd together, region by region, storing the absolute bare minimum of data, would have done a better job of providing accurate data on test and trace in a timely manner than the convoluted bullshit they dreamt up.

      Whatever the technical advancements, the golden rule always has to be, keep it simple, make sure it can do the bare minimum first and foremost.

      1. werdsmith Silver badge

        Re: I was there

        >Dido's Test n' Trace, and its cost. £41bn and climbing. That astronomical cost says something. If you put all that technology in the wrong (incompetent) hands, they'll spend (our) money producing hot air until the cows come home.

        You've said it yourself. TEST and trace. The very largest proportion of that money is to give out free test kits and process the results. The IT part of it is a small fraction. The Wetherspoons pub bore still claims that the app on the phone cost all that money.

        1. BurnedOut

          Re: I was there

          Well done for pointing this out. However, I don't feel it is fair on Wetherspoons customers to suggest that they are the main ones to propagate the idea that all of the money has been spent on ineffective IT.

        2. Ian Johnston Silver badge

          Re: I was there

          According to Google, 691 million LFTs have been sent out in the UK. You can buy them for about five quid each, so that's £3.5bn. The only figure I can quickly find for PCR tests is Scotland, where 15m have been carried out. Assuming roughly the same rate in rUK, that's 150m at £26 - £69 (also the Scottish figure). Let's say £50 - that's another £7.5bn.

          So that's a total of £11bn on all the LFTs and PCR tests carried out in the UK. Would you say that's the largest proportion of £41bn?
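          The sums above, reproduced as a quick check (all figures are the poster's own estimates, not verified numbers):

          ```c
          /* Reproducing the back-of-envelope sums above. All figures are the
             poster's own estimates, not verified numbers. */
          #include <stdio.h>

          int main(void)
          {
              double lft_total = 691e6 * 5.0;       /* 691m LFTs at ~five quid each */
              double pcr_count = 15e6 * 10.0;       /* Scotland's 15m scaled to UK  */
              double pcr_total = pcr_count * 50.0;  /* at the assumed £50 per test  */

              printf("LFTs:  GBP %.2fbn\n", lft_total / 1e9);
              printf("PCRs:  GBP %.2fbn\n", pcr_total / 1e9);
              printf("Total: GBP %.2fbn\n", (lft_total + pcr_total) / 1e9);
              return 0;
          }
          ```

          The total comes out just under £11bn, as stated - roughly a quarter of the £41bn headline figure.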

          1. veti Silver badge

            Re: I was there

            Okay, that's testing. Now let's discuss tracing.

            For every person who tests positive, someone has to phone them up and spend probably a considerable amount of time finding out where they have been and who they've been in contact with, then follow up tracking down those people and telling them to get tested. You could pay a private detective to do work like that for hundreds of pounds per day - per person to be traced.

            1. AJ MacLeod

              Re: I was there

              The tracing part has been a colossal waste of time and money right from the beginning. By the time I ended up on the government's radar as having tested positive I was already on day 7 of symptoms (and had isolated from day one anyway.) There was zero point in either locating or telling anyone I'd been in contact with a week previously as by now they either had the virus or they hadn't... all that would be gained would be a bit of unnecessary anxiety and for the government an insight into my movements and contacts.

              Needless to say I didn't return the calls from the contact tracers, and those who I'd visited in the period before becoming symptomatic (including an elderly lady on immunosuppressants) thankfully hadn't picked it up from me anyway.

          2. NightFox

            Re: I was there

            You're making a massively flawed assumption that what you can buy an LFT for is what HMG would have paid. HMG would have had to absorb the true costs of these.

      2. Disgusted Of Tunbridge Wells Silver badge

        Re: I was there

        You are very misinformed. Stop reading Twitter.

        Testing more than any country in the world ( per capita per day ) was expensive. It was a hugely successful project.

        That's where almost all of the money went to.

        1. BurnedOut

          Re: I was there

          True, although I dread to think what number of Lateral Flow test kits will have been ordered and exported to be sold in countries where they are not available free of charge!

        2. Anonymous Coward
          Anonymous Coward

          Re: I was there

          Nothing to do with Twitter, I watch it direct from the horse's mouth. Select Committee hearings.

          1. Disgusted Of Tunbridge Wells Silver badge

            Re: I was there

            You heard what from the Select Committee hearings?

            It wasn't that £37bn went to Serco, or whatever nonsense it is that you're peddling.

        3. Dave 126

          Re: I was there

          >You are very misinformed. Stop reading Twitter.

          The UK gov IT system for test and trace - and whose mates got the contracts - has been covered by Private Eye magazine.

          1. Disgusted Of Tunbridge Wells Silver badge

            Re: I was there

            I can only assume then that you don't read Private Eye.

            The IT system didn't cost £37bn. Almost all of that money went on testing. Testing is expensive.

            Do you even believe your own rhetoric?

            How can you spend £37bn on a software project?

            Really. Do you think (even if we ignore the fact that it didn't happen) that that happened? Because if you do, that severely questions your capacity for common sense.

            1. Alligator

              Re: I was there

              Haven't you read anything about NHS IT?

              It's a long-running British farce, even more notorious than British Leyland.

        4. Ian Johnston Silver badge

          Re: I was there

          That's where almost all of the money went to.

          See my back of envelope calculation above. At standard rates, all the tests done come to about £11bn, which is not "almost all" of £41bn.

          1. Disgusted Of Tunbridge Wells Silver badge

            Re: I was there

            "Almost all" of the money was "allocated" to testing.

            The £37b or £41b figure, or whichever it is, that was how much was *allocated* to test & trace.

            Almost every penny being sub-allocated to the "test" part.

            If you don't believe me ( or just simple common sense ), the major fact checking services have completely debunked your nonsense. Feel free to look it up.

            I get the impression that you don't really believe this but you really really want to because of how much you hate the government. That's a poor reflection on you, don't you think?

        5. Ken G Silver badge
          Paris Hilton

          "Testing more than any country in the world ( per capita per day )"

          Can you cite your sources for that? I've seen the same claim made for the US, Israel, Malta, Iceland and Luxembourg. I've probably missed some.

          1. Disgusted Of Tunbridge Wells Silver badge

            Re: "Testing more than any country in the world ( per capita per day )"

            Not off the top of my head. It's something I have looked up in the past, but I can't remember where from. You'd be best Googling. Possibly "our world in data"?

            As an example, a few months ago when there was a shortage, over a million tests per day were being done. That's an absolutely incredible figure.

            And at the same time people were complaining about both the slight shortage of tests and the cost of the testing system (having believed the intentional lie about Serco being given £37bn for contact tracing).

            In about October 2020 I looked it up and Britain were 6th in the world for testing per day per capita - after five much smaller countries like Luxembourg, Singapore, etc. ( which obviously have an advantage in per-capita figures because logistically it's just a different prospect ). From memory the figure was something like 700k/day.

            Britain might not literally be the highest in the world, but very close - and only beaten by minuscule countries like Luxembourg and Singapore.

            1. Lars Silver badge

              Re: "Testing more than any country in the world ( per capita per day )"

              @Disgusted Of Tunbridge Wells

              If the testing is that good, what went wrong when the result is so bad - too late, wrong persons, or what?

              Britain's deaths per 100K is bad, worse than so many EU countries like France, Germany, the Netherlands, Portugal, Spain and all the Nordic countries and more.

              1. Disgusted Of Tunbridge Wells Silver badge

                Re: "Testing more than any country in the world ( per capita per day )"

                That was debunked just a few days ago in a study in the Lancet (reported in The Times).

                Britain's Covid excess deaths (which are the most accurate method for international comparison) are below the European average and on par with Germany and France.

                Also, England is the same as the devolved nations - before you claim that the SNP or Welsh Labour did a better job (England is lower, but within the margin of error).

                1. Lars Silver badge

                  Re: "Testing more than any country in the world ( per capita per day )"

                  @Disgusted Of Tunbridge Wells

                  Proud about being average is of course something.

                  But these are some of the numbers, and claiming that Britain is on par with Germany and France is simply not true; you are more on par with the East European countries.

                  Deaths per 100K and total deaths:

                  Finland      49.3    2,719
                  Norway       32.8    1,753
                  Denmark      90.7    5,280
                  Sweden      173.8   17,874
                  Estonia     178.4    2,367
                  France      210.7  141,321
                  Germany     151.5  125,912
                  Netherlands 128.9   22,339
                  Portugal    207.8   21,342
                  Spain       215.4  101,416
                  Italy       260.4  156,997
                  Ireland     134.0    6,624
                  Britain     244.6  163,454

                  These numbers change but they will not drop.

                  Please don't take after Boris and Trump.


                  1. veti Silver badge

                    Re: "Testing more than any country in the world ( per capita per day )"

                    You're talking about the reported COVID-related deaths. GP was quite explicitly talking about "excess mortality", which is something quite different. It's a way of getting around the subjectivity and variability about how different countries count their "COVID-related" deaths.

                    See here for a detailed discussion.

                    I haven't crunched the numbers, I don't know if GP is right. But please, at least argue with the point being made, not with a completely different set of numbers. We're not politicians here.

                  2. Disgusted Of Tunbridge Wells Silver badge

                    Re: "Testing more than any country in the world ( per capita per day )"

                    As the other poster says, I'm talking about the scientific study that compared excess deaths.

                    "Proud about being average"

                    In the face of people like you lying about Britain being amongst the worst in Europe, being below the European average and on par with France and Germany is at least different.

            2. Ken G Silver badge

              Re: "Testing more than any country in the world ( per capita per day )"

              I live in Luxembourg.

        6. ICL1900-G3

          Re: I was there

          Stop reading the Daily Heil. The Economist would disagree with you.

          1. Disgusted Of Tunbridge Wells Silver badge

            Re: I was there

            Then the Economist is lying.

            These aren't opinions. These are facts. Easily verifiable facts.

            How about Channel 4's Fact Checker:


            Stop spreading lies please.

            From FullFact:

            "£37 billion is the two-year budget for Test and Trace. The vast majority is spent on testing not tracing and does not go to Serco."

      3. Roland6 Silver badge

        Re: I was there

        >Dido's Test n' Trace, and its cost. £41bn and climbing.

        A few months back there was a radio piece on this, it compared the UK with Germany.

        What was discovered was that once you took into account how Germany financed their test and trace programme, and thus joined up the funding pots, things got very similar - until you took into consideration that the UK bulk-purchased lab time and so was able to lower its costs by pushing as many tests as it could through the labs. Germany, on the other hand, paid the labs per test; if they had conducted as many tests as the UK, their costs would have been significantly higher...

      4. Anonymous Coward
        Anonymous Coward

        Re: I was there

        We know where it went: on outsourcers.

        I'm sure people here are well aware that for every project outsourced, the amount of money thrown away is enormous. Try asking for a techie to be hired onto your project and see how much stress that is. They'll literally THROW project managers and business managers your way.

        If you have a dev or infrastructure team of 5 people, you'll have an SDM or 2, a PM or 2, several management types in the tree, and an entire structure of admin people, comms people and customer relationship managers. Each outsourcer involved will have their own PM and Architect. The customer will need their own PM and Architect to have any chance of keeping the outsourcing firms in check.

        Just go on a call with BT (Bring Ten to each meeting) or the likes of Accenshite, Krappy MG or even PissPoor WC. You'll have 1 techie (if you're lucky) on their side and a host of business types screaming things like "commercials" at you. You can't have meetings quickly because every person on the meeting has to negotiate their diary.

        The sheer amount of time wasted; the number of people in IT who aren't technical, who have no idea of the technical side, and who are literally there to spend their entire day in meetings to avoid doing real work.

        THAT'S where the money goes.

        1. Disgusted Of Tunbridge Wells Silver badge

          Re: I was there

          No it didn't. Read FullFact.

          The vast, vast, vast majority of the money went on buying and processing tests.

          Tests are expensive and Britain performed more tests per capita than pretty much anybody else.

          It almost all went on testing.

    5. I Am Spartacus

      Re: I was there

      Fortran on an Elliott 903, using paper tape. Feed in the compiler program, feed in your program, and it spewed out a paper tape of compiled code. You fed this into a second machine that would execute the program. And it was exciting!

      Then on to the University's IBM 360/158 running MTS. Which was great, because we found out how to work around the security (that's what happens when you give the students the source code to the OS!). Again, it was fun.

      Then on to Vaxen. My all-time favorite operating system. The joys of DCL. The power of properly written system service calls. Hey, being able to add your own system service calls. Macro-32 and Bliss.

      So a faster version of what we already have is, yeah, a bit meh.

      1. Lars Silver badge

        Re: I was there

        "So a faster version of what we already have is, yeah, a bit meh."

        Yes, I can understand that feeling too, but the reality is that there is a hell of a lot that can only be done when there is that speed. And there are still things we cannot do because we don't have the speed to get them done.

        Computers will always remain slow. Looking back, there is not one fast computer, and that will not change.

        1. jmch Silver badge

          Re: I was there

          "So a faster version of what we already have is, yeah, a bit meh."

          There is that, yes, but I think the article author discounts too much the natural age cycle of these things. Simply put, young people are more excitable. Experienced IT pros who have seen it all are a bit more cautious / skeptical.

          That doesn't make this chip any more or less exciting than, say, the Pentium way back when. It's mostly in the eye of the beholder.

  2. Ian Johnston Silver badge

    It's very nice and impressive that the hardware is so fast, but does it actually do much more than a 2001 iMac? Because generally software just seems to bloat to fill the hardware available with no particular net improvement.

    1. AMBxx Silver badge

      Most users need less than ever. 90% of workload is in a browser. Even with the crappy JS that's being written, you don't need much.

      For developers, so much is cloud based that you're not doing much on the client.

      I now have a really powerful main PC to run lots of VMs.

      For a laptop, I'm using a five-year-old i5. It's a nice machine, but far more than I need to use RDP and a browser.

      Even casual gaming is moving towards streaming.

      1. Korev Silver badge

        Most users need less than ever. 90% of workload is in a browser. Even with the crappy JS that's being written, you don't need much.

        Not much CPU, but plenty of RAM (cold hard stare at Chrome...)

        1. DevOpsTimothyC

          Not much CPU

          Just try running the crappy JavaScript on a CPU more than five years old with more than one or two tabs open and you'll see it can consume quite a bit of CPU.

          1. nintendoeats Silver badge

            Yeah, I connect to work using an i5-2400S. I have to be very careful about tab management, and YouTube is a constant struggle.

          2. Ramis101

            Or alternatively, block the vast majority of the crappy JavaScript and this EliteBook 8540w I'm typing on now still cuts the mustard nicely.

            As others have commented, the guff just fills out to use the resource. I remember Windows 3.11; it was such a marked difference over 3.1 for speed, but M$ soon learned the error of their ways and went back to shovelling in tons of "pretty" guff.

            1. veti Silver badge

              The whole point of improving performance is to shovel in more guff.

              To be sure, you could have one without the other. But you'd have to pay through the nose for it. If you're on a budget, your guff will always slightly overfill the available capacity.

              1. Charles 9

                So, basically, a take on Parkinson's Law, then?

          3. Ian Johnston Silver badge

            I'm writing this on my desktop computer, a ten year old Lenovo ThinkCentre USFF M58 with a ... pauses to run dmidecode ... Intel(R) Core(TM)2 Duo CPU E7400 @ 2.80GHz which ... pauses to check the Intel web site ... came out in 2008. Cost me £35 including delivery from eBay, souped up with 8GB of RAM and an SSD, runs Xubuntu.

            And it does everything I need. LibreOffice, Teams, Zoom, Audacity, ripping DVDs. I have YouTube running as I write and it's taking 17.6% of one processor to do that. The only time I have run out of RAM was when Firefox leaked memory and ate the lot.

            So while I have no doubt that a modern PC could run much, much faster, for me there is absolutely no point.
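The "pauses to run dmidecode" step above can be done without root on Linux by reading /proc/cpuinfo instead. A minimal sketch; `cpu_model` is a hypothetical helper of mine, not a standard API, and the field names vary by architecture:

```python
# Identify the CPU without opening the case. dmidecode needs root;
# /proc/cpuinfo doesn't. cpu_model() is a made-up helper name.
def cpu_model(path="/proc/cpuinfo"):
    try:
        with open(path) as f:
            for line in f:
                # x86 kernels expose "model name"; some ARM kernels use "Hardware"
                if line.lower().startswith(("model name", "hardware")):
                    return line.split(":", 1)[1].strip()
    except OSError:
        pass
    return "unknown"

print(cpu_model())  # e.g. "Intel(R) Core(TM)2 Duo CPU E7400 @ 2.80GHz"
```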

            1. fidodogbreath

              "Harumph, all these fancy-schmancy Workstation PCs do is add up a few numbers and move some pixels around on a screen, exactly the same as 386SX."

              1. Sudosu Bronze badge

                The "fastest" machine I ever remember playing with was a P90 at the computer store, with Win95 or 98.

                Its response was "instant": you clicked on something (e.g. Word) and it was open before you lifted your finger off the mouse button.

                For some reason my modern decently high spec windows machines never feel that fast.

            2. Sudosu Bronze badge

              I still have my E8500 with an SSD repurposed to a living room gaming machine and it works well enough for me on most of my games other than current AAA shooters.

      2. big_D Silver badge

        If you are editing multiple 8K source streams into a finished film, you probably aren't doing that "in the cloud". There are a lot of tasks that do need plenty of local power.

        But the average user is probably well catered for with a Core i5 or an Apple M1 chip.

        But it is great to have the power there for those that do need it.

        1. Arthur the cat Silver badge

          But the average user is probably well catered for with a Core i5 or an Apple M1 chip.

          I suspect the average user is well catered for by a tablet. Most average users don't "use a computer" these days, they run apps(*).

          (*) Anyone wanting to nitpick that you need a computer to run apps is missing my point. The average user doesn't think of themselves as a computer user, computers are for nerds. They're watching films/cat videos or chatting with family and friends.

          1. big_D Silver badge

            It depends on where you look, of course.

            We have over 500 PCs, over 20 ESXi clusters, 30 company smartphones and 15 tablets at the company I work for. Most users have dual-screen set-ups to enable them to do their jobs efficiently.

            Yes, home users can probably often get away with a tablet, but then again, these high-end chips that Apple is currently releasing (Pro, Max and Ultra) are not aimed at home users. They are aimed at media professionals, designers and scientists, among other high-end specialists who can make use of the graphics and neural cores.

          2. SundogUK Silver badge

            People keep making this claim but I'm not convinced. Where I work we have about 40 people, every one of whom works on a full-spec laptop. I don't think we are unusual.

            1. Arthur the cat Silver badge

              Where I work isn't the location of most computer users.

              There's at least as much computer use at home and other places outside work as there is in work, which was my point. Computers are now ubiquitous and becoming invisible. Someone watching football on their mobile down the pub is technically a computer user but would never describe themself that way.

              1. big_D Silver badge

                Yes, but we are talking specifically, with this article, about high end professional chips that will be used almost purely in niche business situations.

              2. SundogUK Silver badge

                "Where I work isn't the location of most computer users."

                I suspect a lot of this is semantics.

                I don't believe a lot of productive 'work' is carried out on mobiles. Mostly it's still 'communications', which were previously carried out by phone/fax/whatever anyway. Nobody designs a car or runs their month-end finances on a mobile phone.

                What the article is talking about is actual PC users, and I think that is still (and probably will be for a while yet) dominated by gaming and work related users.

        2. Roland6 Silver badge


          If you are editing multiple 8K source streams into a finished film, you probably aren't doing that "in the cloud". There are a lot of tasks that do need plenty of local power.

          However, for that you need GPU power and bus capacity, not CPU power, i.e. an AMD Ryzen 5 on an X570 motherboard coupled with a couple of decent DaVinci Resolve-compatible graphics cards is more than adequate for the task.

          1. big_D Silver badge

            No, if you are doing that, you need memory throughput and IO, so you are talking AMD Threadrippers, Intel Xeon Platinum or M1 Max or M1 Ultra.

            1. Roland6 Silver badge

              Well, given after I posted I realised that I was thinking of a 4K video workstation (it's what a client uses for small scale video production with 4 x 4K streams), your comments are kind.

              However, looking at the supplier, their current top-end Pro 8K PC workstation systems are all AMD Threadrippers with relatively low core counts but top-end per-core speeds, as it is generally accepted that video editing suites don't scale well, so they do better with fewer, faster CPUs. However, when it comes to the GPUs, it's top-end NVIDIA PNY. Interestingly, they only install 4x32GB of system RAM (max. 8x32GB), but combine it with a significant number of fast SSDs.

              I think a few years back Xeon was the best because it meant you had to use a server motherboard, which provided the IO throughput; the top-end AMD motherboards now seem more than equal in non-server applications.

        3. katrinab Silver badge

          It is called a Mac Studio, so it is intended for people who edit multiple 8k source streams.

          Previously that market was served by the likes of SGI, who sold machines that cost many times more than your house cost.

          1. Anonymous Coward
            Anonymous Coward

            Not to disagree too strongly, but those SGI folks were making Real Money while making Real Movies. They weren't just editing videos of themselves eating lunch or putting on makeup or lipsyncing. Meaning - yeah, some actual professionals may use a Mac Studio, but I gotta think some of the core audience that Apple is aiming for is going to be the "social influencers" crowd who get chunks of spending money and need something to spend it on.

        4. Anonymous Coward
          Anonymous Coward

          Given how much more powerful an M1 is than an i5 that is quite a spread you’ve given.

        5. Loyal Commenter Silver badge

          If you're editing multiple 8K sources into a finished film, the odds are you're doing most of that heavy lifting on NVidia silicon, not Apple silicon.

      3. Snapper

        The hint is in the name, Mac (not MAC, it's 2022! WHEN are you going to notice?) Studio.

        This is for people in studios.

        Graphic Design.

        3D Animation.

        Think of it as a very powerful iMac with screens you can choose yourself. Oh, and 10GbE.

        1. Roland6 Silver badge

          >This is for people in Studio's.

          Looking at the Mac Studio specs, these are the ad and creative design studios, where image probably rates slightly higher than delivery capability, rather than film/TV studios.

          Which, given Apple's past, is probably its target demographic, and it probably delivers more than sufficient punch to this market.

    2. big_D Silver badge

      Yes, it can. Some of the software may have bloated, but for technical stuff, the software is still often well written and economical.

      In addition, you have those neural cores, which didn't exist back then.

      So, yes, it can do more than a 2001 iMac; it can also do the same things the 2001 iMac could do, so much faster. It ploughs through terabytes of data where the iMac would have struggled with megabytes. Machine learning was beyond an iMac, or most other machines, back then.

      The world of research has moved on and the new M1 Ultra, and Intel Xeon Platinums, AMD Threadrippers and nVidia Tensor, Fujitsu A64FX etc. provide power that leaves the supercomputers of that age in the dust.

      Yes, consumer applications haven't moved forward at the same scale as research models etc. and they don't seem that different, but consider the number of pixels a photo editor has to push around nowadays, compared with then, the amount of automatic processing and filtering that they can do in real time, instead of you having to wait for the low-res image to be drawn on the screen.

      1. Ian Johnston Silver badge

        So, yes, it can do more than a 2001 iMac; it can also do the same things the 2001 iMac could do, so much faster. It ploughs through terabytes of data where the iMac would have struggled with megabytes. Machine learning was beyond an iMac, or most other machines, back then.

        Which is great for the ... what, 0.1%? 0.5% ... of users who need to plough through terabytes of data.

        1. Doctor Syntax Silver badge

          Those who don't ensure there are processors for those who do by buying the same processors to run bloatware.

        2. Adrian 4

          > Which is great for the ... what, 0.1%? 0.5% ... of users who need to plough through terabytes of data.

          If 0.1% or 0.5% of users buy one, Apple will be very, very happy. And will probably put the price up.

        3. big_D Silver badge

          Well, yes, the Studio is aimed at those professionals, who are probably a little more than 0.5% of Apple's customers. Like a Xeon or Threadripper workstation, it isn't aimed at the average user; it has serious amounts of performance (and in this case, at very low power draw compared to Intel and AMD) and costs serious amounts of money.

    3. neilo

      You asked: but does it actually do much more than a 2001 iMac?

      More? Hard to say. Does it do it better and faster? Absolutely. Video editing is a breeze. Audio editing is a breeze. In every aspect, the M1 Macs are simply a better experience than a 2001 PowerPC iMac.

    4. gotes

      I expect the 2001 iMac would struggle to do what most people take for granted these days: stream high-definition video.

    5. fidodogbreath

      does it actually do much more than a 2001 iMac?

      How many 8K streams can you run on a 2001 iMac? How many high-end DSLR RAW files could it have open in Photoshop?

      1. Ian Johnston Silver badge

        Niche markets. Big deal.

    6. katrinab Silver badge

      More than a 2001 iMac, yes. I don't think you would be able to watch Netflix on it.

      Compared to a late 2012 iMac with Ivy Bridge, the new one will certainly be faster, but for most tasks, the Ivy Bridge is fast enough, provided you replace the Fusion Drive with an SSD.

      By fast enough, I mean that you are measuring response times in ms, and while the new one will take fewer ms, you will perceive both as being instant.

    7. Dave 126

      > It's very nice and impressive that the hardware is so fast, but does it actually do much more than a 2001 iMac?

      Yes and no.

      No it doesn't. It is suitable for editing video that meets the standard distribution format of the day (Netflicks HDR Turbo or whatever), just as the 2001 iMac could edit the video of its day.

      Yes it does. Of course it does more - if you're throwing it at the right tasks.

      1. Ian Johnston Silver badge

        Yes it does. Of course it does more - if you're throwing it at the right tasks.

        What I meant was "In real-life usage, are these things actually doing more than a 2001 iMac?" rather than "... can it do more than ...". I've no doubt that some people find them useful, but I suspect that most purchasers will never, ever use the full power available to them. In much the same way that 99% of what most Porsches actually do could be done just as well by a Vauxhall Corsa.

        1. Lars Silver badge

          True, but a Porsche beats a Vauxhall Corsa just doing nothing, just sitting there.

  3. Stuart Castle Silver badge

    I do like the Apple processors. I'm sitting here looking at a Mac Mini that is both a lovely machine and, in terms of raw processing speed, probably the most powerful PC I have (and I do have a Ryzen 5 based PC). But I have a concern about the processors.

    The unified memory. The Mac Studio can have up to 128GB, but that's 128GB in total. The kinds of industries Apple is marketing to could very well already have workstations with 128GB of RAM and a beefy GPU with an extra 24GB. Perhaps several.

    1. Kristian Walsh Silver badge

      Those industries are supposed to buy the Mac Pro, which can accommodate 1.5 Tbyte of RAM.

      I say “supposed to” because pricing on the Mac Pro lives in its own private fantasy-land: anyone paying €6k+ for a quad-core Xeon 3223 and 32 Gbytes of RAM would need their head, and/or budgetary approval privileges, examined.

      1. Anonymous Coward
        Anonymous Coward

        I'd love to know how many Apple sold. I doubt it is above 20K units, maybe not even near that.

        1. teknopaul

          It would be interesting

          I wouldn't be surprised if there are plenty of rich fanbois out there spending money so they can show off how much money they are spending.

          I'm sure the business case for these machines is along the lines of "so we look professional/posh". I don't think it works.

          I worked for a company that purchased Apple stuff and put it next to the front door to "impress" customers.

          Needless to say they went bust. (and a lot of equipment went missing)

      2. Yet Another Anonymous coward Silver badge

        Compared to the cost of the people in these industries, that's the daily cost of a color mixer or editor.

    2. Joe Gurman

      Not an Apples to apples comparison?

      Apple claims that its Apple silicon-based machines don’t need as much RAM as, say, Intel-based ones because (1) the memory access is so much faster than in conventional, off-the-SoC designs, where the memory bandwidth is so much less, and (2) their CPUs just move the data in and out faster, too.

      I haven’t used such a machine yet, so I can’t say whether I’d agree in practice, but people doing light and medium video creation loads appear to agree Apple’s done it right. The good news? Intel may catch up in two to three years…. with where Apple is now.

      1. Snake Silver badge

        Re: access speed and bandwidth

        ..and both statistics are irrelevant for the discussion at hand, that being "How much RAM is necessary to work on, and process, large *data* structures?".

        It doesn't matter if your 128GB of RAM is claimed to be "faster than X!" if you are trying to work on a 192GB video build that doesn't fit.

  4. Anonymous Coward
    Anonymous Coward

    Interesting but....

    It is a fairly sad state of affairs when, rather than comparing Apple's latest offerings to its current competitors, they need to be compared to 20-year-old technology to make them sound good.

    Plus, 20 years ago people would have laughed at anyone buying Apple computers, and the iPod was only just emerging as a mainstream device. They are heading full circle.

    1. fidodogbreath

      Re: Interesting but....

      It is a fairly sad state of affairs when, rather than comparing Apple's latest offerings to its current competitors, they need to be compared to 20-year-old technology to make them sound good.

      The M1 Ultra didn't win all of the benchmarks vs current Intel and AMD chips, but it won a lot of them.

      1. Anonymous Coward
        Anonymous Coward

        Re: Interesting but....

        And won them at a fraction of the power usage? How’s your latest bill?

  5. Neil Barnes Silver badge

    An argument I've made before...

    Mr or Ms Average doesn't care about the numbers. They care about whether they can watch cat videos. They don't particularly care about the label on the box, though some may decide that paying extra to have an apple on the side is a good thing. As long as they can still watch cat videos. Or prOn. Or both.

    The average user needs a browser and *possibly* a minimal office suite. The average office worker needs little more. And that minimum is available on a three hundred quid box from a pile-it-high retailer and has been for years. Why would they pay more? That's only going to happen to that minimal subset of users who *need* the performance...

    1. DJV Silver badge

      Re: "they can still watch cat videos. Or prOn. Or both"

      Hmmm, cat pr0n - I didn't realise there was a market for that!

      1. Anonymous Coward
        Anonymous Coward

        Re: "they can still watch cat videos. Or prOn. Or both"

        I've just been suitably informed by a DM reader, that Kate Beckinsale seems to be leading the field in soft cat prOn, right now on the DM hatefest website.

        1. Doctor Syntax Silver badge

          Re: "they can still watch cat videos. Or prOn. Or both"

          It's good that you quoted the URL in full. It ensures I'm adequately informed without needing to follow it. Thanks.

          1. Anonymous Coward
            Anonymous Coward

            Re: "they can still watch cat videos. Or prOn. Or both"

            In fairness, I didn't even make it to the hatefest DM comments below on receiving the link either. In essence, it's a couple of Cox's Orange Pippins (apples, that is) in a mask around a cat's neck, forming a crop top, two apples protruding.

            So, in some ways, still relevant to the article, we've come full circle, so to speak.

            Speaking of relevance. Celebs and what they do to attempt to remain relevant...

  6. Anonymous Coward
    Anonymous Coward

    I saw the reveal presentation, and, while I'm no fanboy, I was amazed

    You can't have enough performance for DaVinci Resolve when using Fusion, so the more the better.

    Programming with multiple specification documents open, running renderers and still being able to use the computer: the more the better! I use many applications at the same time, so I'm quite excited.

    1. Kristian Walsh Silver badge

      Re: I saw the reveal presentation, and, while I'm no fanboy, I was amazed

      You’ll hit that RAM barrier pretty quickly, though, and then you’ll have nowhere to go except sell and replace. If you’re a professional, your time is your income, and the downtime of buying and setting up a new system compared with unscrewing a panel and adding DIMMs is significant.

      That’s my big problem with these: you’re basically given a choice of grossly overpaying for the memory you’ll need over the unit’s lifetime, or ending up at a dead-end. Also, at 128 Gbyte, that maximum memory purchase isn’t even that high.

      That inability to cost-effectively expand means that this isn’t a professional system; and the specifications are more than any regular user could ever find a use for - the only mainstream high-performance computing application is gaming, and there’s almost no Mac gaming market. Even if there were, games require GPU rather than CPU muscle, and the M1 Ultra is nowhere near as competitive as a GPU.

      I think that’s why the response to the M1 Ultra has been so muted. It’s great, in theory, but who would actually buy one? At the other end of the range, I also feel that there’s too much of a gap between the Mac mini and the entry-model Studio: the entry Studio model is nearly double the price of the top mini.

      1. oiseau

        Re: I saw the reveal presentation, and, while I'm no fanboy, I was amazed

        ... basically given a choice of grossly overpaying for the memory you’ll need ...

        ... inability to cost-effectively expand means that this isn’t a professional system ...

        This is and was so from the very start: by design and, to a lesser extent, as the result of corporate greed.

        And the very reason IBM PC boxes are widespread all over the world and Apple boxes are not.


      2. Wellyboot Silver badge

        Re: I saw the reveal presentation, and, while I'm no fanboy, I was amazed

        >>>at 128 Gbyte, that maximum memory purchase isn’t even that high<<<

        I don't think there was that much memory on the planet when I started in computers!

        A real indication of the effect exponential growth can achieve if you just wait for a while.

        1. nintendoeats Silver badge

          Re: I saw the reveal presentation, and, while I'm no fanboy, I was amazed

          There is a mental exercise I like: in what year was there more memory in the world than there is in my GPU right now? When I had a 1060 6GB I worked out that it was probably around the second year (1983-1984) of the C64, given the huge amount of memory in that machine combined with its sales.

          Now I have 16GB...actually, it's probably still the same year. Exponential growth!
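The back-of-envelope behind that game is simple enough to sketch. Illustrative arithmetic only, assuming a stock C64's 64 KiB; the installed-base conclusion is the commenter's, not a measured figure:

```python
# How many 64 KiB C64s match one 6 GB GPU? Illustrative numbers only.
C64_RAM = 64 * 1024        # bytes in a stock C64
GPU_RAM = 6 * 1024**3      # bytes on a GTX 1060 6GB
machines = GPU_RAM // C64_RAM
print(machines)  # -> 98304, i.e. roughly a hundred thousand C64s,
                 # comfortably within the machine's early sales figures
```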

      3. Anonymous Coward
        Anonymous Coward

        Re: I saw the reveal presentation, and, while I'm no fanboy, I was amazed

        Outside of my 486 SX, I can't remember a period in history where I've ever needed to upgrade my RAM past what I initially specced and purchased.

        I've been using 32GB in my home desktop since about 2014 and that's still more than acceptable and has been overkill for most of that time.

        The 16GB on my M1 Pro MacBook Pro doesn't seem to sweat at any point and I can't imagine needing more within the life of this machine.

        1. mathew42

          Re: I saw the reveal presentation, and, while I'm no fanboy, I was amazed

          It depends on what you do. I frequently have multiple VMs running and memory is the most critical resource. Of course that doesn't count firefox which appears to have an instance for every tab, some of which are consuming 400MiB+.

          1. H in The Hague

            Re: I saw the reveal presentation, and, while I'm no fanboy, I was amazed

            "Of course that doesn't count firefox which appears to have an instance for every tab, some of which are consuming 400MiB+."

            Interesting. Right now I've got around 25 tabs open in Firefox and it's using around 1.5 GB. I might try and open the same tabs in Vivaldi and compare its memory usage if I get a moment.

        2. nintendoeats Silver badge

          Re: I saw the reveal presentation, and, while I'm no fanboy, I was amazed

          I have done this many times, though I agree that 32 GB has done me for a long time. If I did more than casual media work, I would have upgraded beyond that and would probably be at 128 GB right now.

        3. Kristian Walsh Silver badge

          Re: I saw the reveal presentation, and, while I'm no fanboy, I was amazed

          My experience is different: I have upgraded RAM in every system I’ve ever owned, both Mac and PC (my last Mac laptop, a 2015 MacBook, had a RAM upgrade in its lifetime too). It’s the one thing that I do tend to run out of in my workloads, but then, I had a period of developing systems that required a lot of containers and virtualised systems, and they tend to consume lots of RAM.

          Storage is another thing that I have always ended up upgrading, but then it has been for speed (SATA 3 to SATA 6 to NVMe), rather than capacity. I don’t think I’ll need more than 1 Tbyte onboard for a long time to come.

          The issue of expandability came to my mind because I have just replaced my own desktop system, and I fitted it with 32G of DDR5. I may end up needing 64G in a year or two, but the price of DDR5 is so crazy right now that it just wasn’t financially sound to buy that much until I really needed it. By then, the usual pricing curve for electronics will make it much cheaper for me to upgrade. Apple’s design decisions with the M1 chip take that option away from me, and I find that position difficult to swallow in something that claims to be a professional tool.

          1. Androgynous Cow Herd

            Re: I saw the reveal presentation, and, while I'm no fanboy, I was amazed

            If you understand your workload, why don't you just max out the RAM when you buy the system?

            Since time and efficiency are important, how much time is saved by never unscrewing that panel at all because the RAM is already maxed out? You also don't risk an "oopsie" while you have your system open that could take it down for longer than the setup time for your new system.

            BTW, I deployed a new studio portable recently. Since data is on the SAN or the NAS, total setup time was about an hour... and changing discrete components on an old system is the worst money pit of all.

            1. Kristian Walsh Silver badge

              Re: I saw the reveal presentation, and, while I'm no fanboy, I was amazed

              If you understand your workload, why don't you just max out the RAM when you buy the system?

              It’s precisely because I do understand my workloads that I am able to defer the cost of any upgrades. “Maxing out” is a clear sign that a buyer doesn’t have a handle on their real requirements. You can get away with it for one-off purchases, but those mistakes multiply quickly.

              I make my living writing software systems of various sizes, operating from a few kilobytes to a terabyte in data-size. Right now, I know the work I’m doing has low RAM requirements, but as soon as I shift to anything cloud-based again, with lots of containers, I’ll buy the extra RAM. By then it will also be cheaper, both nominally, and in terms of spending power. But then, maybe I’ll still be writing small systems through the life of the system. If so, also great, I haven’t wasted a cent.

              But that’s just my single PC. My point was that if you’re making that decision for thirty to a hundred workstations, the consequences of such a cavalier approach to provisioning as you’re suggesting will add up quickly, even before we consider Apple’s steep price ramp.

              As for downtime, it’s twenty minutes, including the memory check (five, if I do what you did and don’t self-test the system hardware before putting it into use). I’m fairly confident in my ability to safely fit a pair of DIMMs without damaging my computer. But don’t imagine that’s any kind of a boast: computers are designed to be assembled by low-skilled labour - there’s nothing magical or “elite” about putting components into one yourself.

            2. Anonymous Coward
              Anonymous Coward

              Re: I saw the reveal presentation, and, while I'm no fanboy, I was amazed

              Exactly. You get what you need from the beginning if you're a pro. The cost of the hardware can be recouped in five days in my case, 10 days if considering all I need for my work.

        4. Vestas

          Re: I saw the reveal presentation, and, while I'm no fanboy, I was amazed

          These days many PCs come with a single DIMM fitted.

          On laptops it's pretty much across the board too: from Asus gaming laptops to HP business laptops, if you buy from the manufacturer.

          That means you're not using dual-channel memory and hence the CPU is (almost) literally half as fast in terms of memory access as it should be. You might not need another 8GB/whatever but you need another DIMM to enable dual-channel memory on the CPU.

          This isn't really relevant to the article but is probably worth mentioning anyway.

          1. Roland6 Silver badge

            Re: I saw the reveal presentation, and, while I'm no fanboy, I was amazed

            >These days many PCs come with a single DIMM fitted.

            And you have to dig to find out this information and how many slots the PC actually has.

            You do need to be careful, as OEMs (e.g. HP) can be crafty and decide that 8GB is cheaper as 2x4GB rather than 1x8GB. This really messes up the mid-life upgrade (Windows 7/8 -> 10), as you can't just buy 2x8GB DIMMs and upgrade two machines to 16GB of dual-channel memory.

        5. Arthur the cat Silver badge

          Re: I saw the reveal presentation, and, while I'm no fanboy, I was amazed

          Back in 1988 I had 4 MB of memory in the Sun 3 I used for developing the core software for our startup. Over the next 3-4 years I loaded it up with spare memory from other machines as they were retired until I had the maximum humongous 24 MB of memory. Everybody else in the company was running on RISC machines of various architectures with much bigger memories.

          Why did I have such a small underpowered machine? Because I was writing core software - if I could get it running acceptably fast on my box, it ran like a vindaloo washed down with Guinness and Ex-Lax(*) on everybody else's (and customers') machines.

          (*) Those of a nervous disposition may substitute "very fast thing" if they wish.

        6. MachDiamond Silver badge

          Re: I saw the reveal presentation, and, while I'm no fanboy, I was amazed

          "Outside of my 486 SX I can't remember a period in history where I've ever needed to upgrade my RAM past what I initially speced and purchased."

          I'm mostly the same way. If I'm making money with the computer, chances are that it's worth spending the money to get a new one next year. An upside is the old one still has some resale value. Gone are the days when I pinched pennies on computers other than a few dedicated low-use machines. If I can shave off 10 minutes on a photo editing job and more on processing a video, over the course of a year even an "expensive" machine is worth it. As I get older, I have a healthier respect for my time as I have less of it left. Getting jobs done faster means I can do more work or I can do more other things. Why do I want to go through all of the bother of researching parts for a DIY windoze box that saves me a few hundred, when I can save that time and just purchase something fairly quick with little consideration?

          I tend to do my configuration upfront. I rarely add memory and I've never changed CPUs after an initial setup. Many manufacturers are soldering in RAM, and the upgrade path for other components is so limited that there's likely a bottleneck somewhere I can't do anything about, which makes upgrades dubious anyway.

          1. Charles 9

            Re: I saw the reveal presentation, and, while I'm no fanboy, I was amazed

            But if there are two things I always like to have more of in any given machine, they're RAM and storage. So if a machine I obtain isn't maxed out in the RAM department, that tends to be the first thing changed on it.

      4. Jay 2

        Re: I saw the reveal presentation, and, while I'm no fanboy, I was amazed

        I think you've hit the nail on the head with your last paragraph.

        I currently have a 2013 21" iMac which won't be getting another OS update and with all the WFH I need to rejig my setup a bit. Not that I particularly like iMacs, but the 27" is no longer available and I'm not really interested in the 24". Similarly I'm not much of a laptop person.

        So over the weekend I was looking at options and for me my choices are:

        Mac Mini M1 with 2 x 24" 1080p = ~£1800 with 16GB RAM, 1TB storage and various new cables/stands/etc

        Mac Studio M1 Max with 2 x 28" 4k = ~£3000 with 1TB storage various new cables/stands/etc (4k as I can and it will give a bit more screen real estate).

        I'll probably go with the Studio even though it's slight overkill, but it'll be future-proofed enough and I do usually run my Macs for ~8 years or so. Plus I've currently got stuff like a 2009 Samsung monitor and an even older MS mouse in play, so I'm due a refresh!

      5. Ian Johnston Silver badge

        Re: I saw the reveal presentation, and, while I'm no fanboy, I was amazed

        I think it's generally accepted now that Apple have lost their traditional market in the professional AV world (the BBC went to Windows ten years ago), so machines like this are aimed at the amateur who wants to feel that s/he has something really posh on which to make their YouTube videos. The Range Rover Evoque of the desktop computing world, basically.

        1. razzaDazza1234

          Re: I saw the reveal presentation, and, while I'm no fanboy, I was amazed

          I know loads of AV houses that will love these.

          As mentioned above, when earning with the gear, buy nice or buy twice. Plus, I have lost count of the hours and pounds lost when using cheaper computers/OSes in the past.

          Apple are coming good again now that Ive and his Product Design w++k+r team aren't sailing the ship.

          The BBC HAD to shift to reduce their luxurious spending, and I bet it was a nightmare and will end up costing as much as the Macs.

        2. katrinab Silver badge

          Re: I saw the reveal presentation, and, while I'm no fanboy, I was amazed

          That is because 10 years ago, the only thing Apple had to offer them was a trashcan.

      6. Snapper

        Re: I saw the reveal presentation, and, while I'm no fanboy, I was amazed

        The Unified RAM, effectively part of the CPU, is far more efficient, which is why people are able to do professional-level video editing with 'only' 8GB of RAM in the M1 Mac Mini, with far lower CPU temperatures and power consumption. If you have an M1 laptop, that's a big bonus compared to a high-end PC unhitched from the grid.

        Apple have said a Mac Pro with M1 or M2 CPUs will be with us by the end of this year. I expect to see a cut-down version of the current Intel Mac Pro, plus a slightly redesigned Super Mac Pro that will last the user 15 years.

        1. Smirnov

          Re: I saw the reveal presentation, and, while I'm no fanboy, I was amazed

          "The Unified RAM, effectively part of the CPU, is far more efficient and why people are able to do professional level video editing with 'only' 8GB of RAM in the M1 Mac Mini with far lower CPU temperatures and power consumption. If you have an M1 laptop that's a big bonus compared to a high-end PC unhitched from the grid.."

          Unified memory isn't more efficient (it's also not new; UMA was already done by SGI 26 years ago!). Data doesn't magically become smaller just because the memory is now closer to the CPU.

          "Apple have said a Mac Pro with M1 or M2 CPU's will be with us by the end of this year. I expect to see a cut down version of the current Intel Mac Pro, plus a slightly redesigned Super Mac Pro that will last the user 15 years."

          There will never, ever be a Mac Pro that lasts 15 years, especially when you consider that during the Intel phase Microsoft supported Macs for much longer than Apple themselves did.

          I still have one of the last classic cheese grater Mac Pros, which is now 10 years old. And while it's still OK for most less-demanding tasks, it's slow compared to everything else I have here. And the only reason I have been holding onto it is because Apple fucked up the successor (aptly nicknamed 'trashcan') and priced the current one into the extremes.

          I'm not sure I even want to power it on when it reaches 15 years.

          1. Ace2 Silver badge

            Re: I saw the reveal presentation, and, while I'm no fanboy, I was amazed

            I don’t think you’re giving unified memory enough credit. It’s not more efficient as in “letting you pack in more data.”

            In DDR something already in L2 is quick to access, but with HBM *all* of memory is only as far away as an L1 or L2 access in a standard DDR system. The difference is dramatic.

            Also - and I haven’t seen anyone mention this anywhere - the HBM is only going to require 1/4-1/3 as much controller area as two channels of DDR. Using HBM allows them to put that silicon area to use for more cores, encoders, etc.
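Latency claims like the one above are usually backed by a pointer-chasing microbenchmark: walking a random permutation means every load depends on the previous one, so the average time per hop approximates memory latency. A minimal sketch of the technique in Python (interpreter overhead swamps the absolute numbers here, and `chase_latency` is an illustrative name, not a real API):

```python
import random
import time

def chase_latency(n, iters=100_000):
    """Estimate average time per dependent load by walking a random
    single-cycle permutation: each slot stores the index of the next
    hop, so no load can begin before the previous one completes."""
    order = list(range(n))
    random.shuffle(order)
    nxt = [0] * n
    for i in range(n - 1):
        nxt[order[i]] = order[i + 1]
    nxt[order[-1]] = order[0]      # close the cycle

    p = 0
    t0 = time.perf_counter()
    for _ in range(iters):
        p = nxt[p]                 # dependent load: can't be overlapped
    elapsed = time.perf_counter() - t0
    return elapsed / iters * 1e9   # rough nanoseconds per hop
```

The same walk written in a compiled language, over a buffer larger than the last-level cache versus a tiny one, is how published DDR-versus-on-package latency figures are typically produced.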

  7. Omnipresent Bronze badge

    News Flash

    It's the price tag. Single creatives are spending $4,000 a month on rent, and the family Joe has kit that already works, as long as he/she turns off the internet. Tech has failed us, and continues to do so. Early adopters are wishing they had held out. You are going to have to become much more user-friendly. Forcing us onto the cloud and constantly stealing from us has made us re-evaluate the actual value of tech at all. It seems we are being "forced" more than given an opportunity, and that makes us very, very wary right now.

    1. FatGerman Silver badge

      Re: News Flash

      This. A massive increase in performance that most people don't need coupled with a massive price tag that most people can't afford. Plus we all know that in a couple of years there'll be another one, and another one, as the tech industry only survives now on people replacing perfectly good tech with new tech they don't actually need.

      It's an impressive piece of kit for sure, but expecting the average user, even the average IT bod, to be excited about it is like expecting the average driver to be excited about some new F1 engine that's never going to make it into a real car.

  8. Pascal Monett Silver badge

    The honeymoon is over

    Our love affair with IT is over, we're in slippers now in front of the stream and it's all humdrum from now on.

    I remember buying my first VooDoo 2, my first GeForce 2, my first Athlon Thunderbird.

    Heady times.

    Now? Yeah, I upgraded my PC for the first time in ten years. Yeah, it works fine. No, I'm not going to benchmark it.

    That's where we're at: IT is now a tool, IT is no longer anything special.


    1. ShadowSystems

      Re: The honeymoon is over

      ^Seconded. I'm running a 4th-gen i3 at 1.8GHz, 16GB DDR4, & a 240GB SSD. Do I want a faster rig? Yes. Do I need one? Not really. Can I afford to drop thousands of dollars to upgrade? No.

      I was intrigued by what I read about the new Mini's, right up until I heard how much the basic model cost, then started howling in derisive laughter at the price of the high end ones.

      I know I'm not the target market, I've got more brains than bucks. =-j

      1. simonlb Silver badge

        Re: The honeymoon is over

        It also doesn't help when the cost of even a relatively low-end GPU is half the cost of all the other components combined.

    2. localzuk Silver badge

      Re: The honeymoon is over

      Yup. I feel that.

      Each new generation of kit seems as humdrum as the last now. Gone are the days of replacing kit every 3 years because you see huge benefits doing so. It took me 7 years to upgrade from my i5-2550k, and I only did that because modded Minecraft had become sluggish. Doubt I'll upgrade from my current CPU for a decade, unless something amazing happens in tech.

      Only reason we're replacing laptops at work is because they are physically starting to fall apart.

      1. razzaDazza1234

        Re: The honeymoon is over

        Nah, it's you.

        Move over grandad

        1. Pascal Monett Silver badge

          Let's have this conversation again when you've grown up . . .

    3. Korev Silver badge

      Re: The honeymoon is over

      Yeah, the last transformative upgrade to PCs was affordable SSDs.

      My PC is nine years old, I swapped the discs[0] and GPU[1] but I don't feel the urge to spend a load of cash replacing it until I have to.

      [0] I work in IT so have seen more than my share of dead discs

      [1] I thought we'd get a full COVID lockdown so I figured I'd need something to entertain myself.

      1. oiseau
        Thumb Up

        Re: The honeymoon is over

        ... last transformative upgrade to PCs was ...

        Late 2015, went from a rather oldish P4 3.0GHz/2GB, Matrox 650 and Adaptec SCSI3 card to a practically new basic Sun Ultra 24, which I upgraded with a Q9550 + 8GB + 2x NVidia FX580 cards and an IBM SAS 3.0 controller with 4x 80GB unused spares.

        It's been a few years now and I can still use my Umax S-5E without any issues. 8^D

        Save for replacing the drives with 4x 300GB unused spares I got last year for ~US$100, everything has run perfectly well, and I don't expect to do much else with the exception of replacing the two oldest monitors, a pair of Samsung SyncMaster 940n.


    4. mathew42

      Re: The honeymoon is over

      With the prevalence of laptops, the biggest reason for upgrading is that the battery no longer retains sufficient charge for a day at school. Memory is the second reason.

      1. nintendoeats Silver badge

        Re: The honeymoon is over

        Yup. I have a perfectly good laptop but the battery is really bad. I even replaced the battery, so it is upgraded from useless to poor. This thing has a 1070 in it, so it's not even that old :/

  9. Andy 73 Silver badge

    Hmmm..


    It's still the case (was always the case) that many people in IT were more enamoured with the technical aspect than the delivered outcome. Last week I sat through a startup pitch where the founder had built something very clever and was convinced that non-technical companies would buy his very expensive proposition *because it was clever*. He didn't once stop to ask what benefit anyone using the tech would receive.

    Sure, faster, smaller, prettier - but what does it *do*?

    1. eldakka

      Re: Hmmm..

      > but what does it *do*?

      It goes "binngg!". Gotta have the machine that goes "binngg", all the cool kids have one.

      1. Korev Silver badge

        Re: Hmmm..

        It goes "binngg!". Gotta have the machine that goes "binngg", all the cool kids have one.

        Doesn't everyone use Google or DDG and not Bing to search these days?

    2. Doctor Syntax Silver badge

      Re: Hmmm..

      "He didn't once stop to ask what benefit anyone using the tech would receive."

      Didn't you ask him?

    3. H in The Hague

      Re: Hmmm..

      "He didn't once stop to ask what benefit anyone using the tech would receive."

      Hmm, most of my customers (engineering, etc.) figured out a decade or two ago that they really had to emphasise the benefits to the customer rather than technical cleverness.

  10. Mishak Silver badge

    "it's too small for your cat to sit on"

    Written by someone who doesn't own a cat? Mine can quite happily sit on a postage stamp.

    1. DJV Silver badge

      Re: "it's too small for your cat to sit on"

      My thoughts exactly - I came here to post something similar.

      Back around 2006 my cats used to sleep on top of my two CRT monitors (yeah, I know, cat fur down the back of monitors is never a good thing). It was hilarious the day I swapped the CRTs out for flatscreens, as the cats tried to jump on top of them as usual but fell down the back instead. The disgruntled expressions on their faces were worth it!

    2. Tessier-Ashpool

      Re: "it's too small for your cat to sit on"

      Slightly off-topic, but I used to have an expensive AV amplifier that was the favourite resting place of my cat. One day he wasn’t feeling too good, and puked up into the device, which promptly stopped working. Grrr!

      1. Sp1z

        Re: "it's too small for your cat to sit on"

        I lost a Draytek Vigor 2862 the very same way. Sitting at work, remoting into home machine and it suddenly disconnects. Get home to see crusted cat vomit in the vents on top.

        I've wall-mounted all my routers ever since...

        1. Yet Another Hierachial Anonynmous Coward

          Re: "it's too small for your cat to sit on"

          I prefer to wall-mount the cats......

          Routers should have free run of the house. Anything else is cruel.

          1. Arthur the cat Silver badge

            Re: "it's too small for your cat to sit on"

            I prefer to wall-mount the cats......

            Hard stare.

    3. JDX Gold badge

      Re: "it's too small for your cat to sit on"

      Mine used to sit on my Mac Mini so clearly it can sit on this.

      1. KimJongDeux

        Re: "it's too small for your cat to sit on"

        If it can sit on a computer it can s#*t on it too.

    4. Lars Silver badge

      Re: "it's too small for your cat to sit on"

      I would guess Apple kit isn't cat-piss-proof either. Beware, I didn't.

  11. Fazal Majid

    I'm holding off

    After a blistering pace of speed improvements, Apple's Silicon team seems to have hit a wall, the same one Intel hit a decade ago. The M1 Pro, Max and Ultra are just refinements on the M1 design from 1.5 years ago, and I suspect most unoptimized apps will not see any performance improvement from a lowly M1 MacBook Air, apart from the faster SSD and increased RAM. Let's see what the M2 looks like.

    1. Anonymous Coward
      Anonymous Coward

      Re: I'm holding off

      We already know what the M2 will be like.

      The M1 is based on the old A14 chip, with the latest mobile devices being powered by the A15.

      It's more than likely the M2 will be based on the A15 which has about a 15% uplift over the A14 in benchmarks like Geekbench multithreaded.

      Ergo I would expect the M2 to be about 15% faster than the M1 unless there are other significant changes.

      1. katrinab Silver badge

        Re: I'm holding off

        I think the M2 will be based on the A16.

        The M1 was effectively an A14X with a Thunderbolt controller. The previous actual A(n)X chip was the A12X, and then the A12Z, which just had an extra GPU core.

    2. juice

      Re: I'm holding off

      The problem is that we've hit the law of diminishing returns when it comes to making use of all that processing power, especially now that we've moved a lot of the more complex stuff to either parallel-processing algorithms or pre-trained AI models.

      Because the parallel-processing stuff will generally be Fast Enough on any halfway decent CPU produced in the last decade; I'm running a 2012-vintage dual-CPU Xeon (16 cores total) with 24GB of RAM, and that's more than capable of churning out 1080p video encoding in realtime.

      And the AI models require little or no processing power, since all the hard work was done at the training stage.

      For the most part, unless you're some sort of power user running half a dozen docker containers, while also rendering 4K live video streams, all those architectural improvements are arguably just reducing the machine's power usage and thereby improving the battery life.

      Because 99% of the time, those machines will be ticking along and using less than 10% of their CPU's capabilities.

      1. Doctor Syntax Silver badge

        Re: I'm holding off

        Don't worry. The next generation of S/W will be along soon to soak up the performance of the next generation of H/W.

        1. juice

          Re: I'm holding off

          ... maybe.

          Personally, I'd say that we've been seeing diminishing returns on new hardware for the best part of the last decade. Once we got past 4 cores and more than 8GB of ram - on both computers and mobile phones - then we hit the point where 99% of tasks can be done in memory, and on a secondary process/thread, thereby keeping the OS responsive.
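That "on a secondary process/thread" pattern is exactly what Python's standard-library `concurrent.futures` packages up; a minimal sketch, with `heavy` as a made-up stand-in for real work:

```python
from concurrent.futures import ThreadPoolExecutor

def heavy(n):
    # Stand-in for a chunky job (an export, a filter render, a re-encode...)
    return sum(i * i for i in range(n))

# Push the work onto a secondary thread so the main/UI loop stays
# responsive; collect the result whenever it's ready.
with ThreadPoolExecutor(max_workers=1) as pool:
    future = pool.submit(heavy, 100_000)
    # ...the main loop would keep servicing events here...
    result = future.result()
```

For CPU-bound work in CPython a ProcessPoolExecutor sidesteps the GIL, but the shape is the same.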

          Which isn't to say that there aren't things which can bring computers to their knees, even in the consumer market; video encoding, video games and VR are all power and memory hogs. But they're very much a tiny percentage of what people use their machines for.

          And at the risk of being hoist by my own "640kb should be enough for anyone" petard, it's difficult to think of what might come along to change that. Not least because our network infrastructure has also improved in both bandwidth and latency, to the point where most tasks (barring video encoding) can potentially be offloaded to the cloud. Voice processing for Siri, Google Assistant and Alexa is a prime example of this.

          Still, that's the good - and sometimes bad - thing about the future. It always has ways to surprise us ;)

      2. Roland6 Silver badge

        Re: I'm holding off

        >I'm running an 2012-vintage dual-CPU Xeon (16 cores total) with 24GB of ram

        I've not run my equally vintage dual Xeon (2x8 cores) workstation for some time, mainly because with all the fans it generates a lot of noise so everyone in the house can hear it...

        1. juice

          Re: I'm holding off

          It's not too bad for noise. Otoh, between the dual CPUs and the many sata drives crammed into it, it does add about 50p/day to the electricity bill when switched on.

          Which is pretty much going to double come April.

          So I am thinking about upgrading to something newer! Since some £200 jobbie off Ebay will pay for itself within a year or so...

          1. Roland6 Silver badge

            Re: I'm holding off

            >So I am thinking about upgrading to something newer! Since some £200 jobbie off Ebay will pay for itself within a year or so...

            There was a frequent contributor a few years back on ElReg who made a case for buying new as opposed to reusing old servers because of the energy efficiency improvements and so the price difference would tend to pay for itself through energy bill savings.

  12. Anonymous Coward
    Anonymous Coward

    I'd be more impressed.....

    ........if the BT broadband here was faster than 30Mbits/sec (Down) and 8Mbits/sec (Up)!!!

    My four-year-old WiFi router (802.11ac) is running the LAN at around 800Mbits/sec......sometimes as low (!) as 400Mbits/sec.



    To get to the point.........this fantastic Apple Mac Studio would make for absolutely fantastic peer-to-peer applications across broadband......if the broadband speeds matched the processor!!!!



    (1) ......but I suppose there would be a huge outcry from politicians about E2EE going peer-to-peer, with no servers to leverage, and it being much harder for the Cheltenham snoops to listen in!

    (2) "...114 billion transistors..." So what (exactly) are all those transistors and all that on-chip ROM and all that on-chip RAM and all that on-chip code.......what are they actually doing?....and who are they doing it for? How much history could be stored in there? I seem to recall that Snowden told us that CISCO were shipping functionality designed specifically for folk in Fort Meade, MD. And what sorts of things might Apple be hiding in 114 billion transistors?

  13. JDX Gold badge

    Rambling


    The author seemed to just like the sound of their own voice, rambling on like a red-faced pub bore telling everyone who'd listen about Gen X this, millennials that, the problem with the world today....

    Studio is impressive and one imagines for people doing 4k video editing, worth the extreme price. How does it compare with the Mac Pro though? And is there going to BE an M1-based Mac Pro or is that entire concept dead now?

    1. nintendoeats Silver badge

      Re: Rambling

      Ok, but I've been rambling about the same thing for a few years now. So...he's got a choir to preach to.

    2. Snapper

      Re: Rambling

      Yes, Tim said so.

      1. katrinab Silver badge

        Re: Rambling

        There will be an Apple Silicon Mac Pro.

        He also said that the M1 Ultra was the final chip in the M1 range.

        So, the options are:

        Multiple M1 Ultras

        Something based on the M2 architecture

        Something completely different.

        I don't think it will be something completely different because the market is too small.

    3. gnasher729 Silver badge

      Re: Rambling

      The Mac Pro supports 1,536GB of RAM - at eye-watering prices. But with SSDs running at 7.4GByte/sec, I don't know how many apps would actually benefit from this. It seems the cheap 8GB M1s swap like crazy, but the SSD is so fast you never notice. And CPU-wise, the M1 Ultra is ahead.

  14. 3arn0wl

    Hyperbole...


    ... the device the tech industry has used to get us to keep forking out for the latest tech, to fund their R&D and their livelihood.

    But actually, the RaspberryPi marketers have it right... all that we really need is Sufficiency: enough capability to run the apps we want to use, on a lightweight OS. Sure, there'll be occasions when we need some real grunt to do something more than documents or spreadsheets, surfing or e-mailing, but most of the time, most people don't need it.

    Of course that's not what manufacturers want to hear at all. Still... there'll always be gamers, I guess.

    1. Doctor Syntax Silver badge

      Re: Hyperbole...

      RaspberryPi marketers have it right... all that we really need is Sufficiency products in stock, unfortunately.

  15. Robert Grant

    There are Fisher Price-style laptops, Raspberry Pis, endless free, interactive Web tutorials on programming, excitable and exciting people who want to help and programming languages that are much easier to learn than before. Computers are cheap, bandwidth is massive and there's great excitement and wealth to be had.

    Saying it's harder to learn now is just silly. It's much easier to learn. It might well be true that it's harder to pick what to learn, but that's a different problem.

    1. Freddie

      I'm not sure the author is suggesting it's harder to learn, but that it's harder to excite potential learners.

      We've grown up with clunky systems that we could hack - this got a lot of us interested, as we could achieve exciting/interesting things. The breadth and level of sophistication in modern s/w is such that achieving exciting/interesting things that haven't been done before is harder, and so fewer people will become enthused.

      1. juice

        > I'm not sure the author is suggesting it's harder to learn, but that it's harder to excite potential learners.

        People are excited about doing things - it's just all that extra memory and processing power has enabled us to create tools which abstract away the underlying hardware and processes.

        And the newer generations are excited about using those tools and abstractions to create new things - as well as new tools and layers which will then be used in turn by the generations which follow them.

        Admittedly, this also means that it's much harder for each new generation to peel back the layers of the onion. And that could mean that finding - and training - people to do things with those deeper layers is going to get more difficult.

        On the other hand, it also means that the older generation of tekkies will generally always be able to find work. COBOL 4 LIFE, or somesuch ;)

      2. Robert Grant

        I disagree with that as well. I doubt the percentage of the population that's excited about that sort of thing has gone up either. There are plenty of fun things to try. E.g. write a game in the totally free, cross-platform Godot, or set up IoT sensors of the weather, or do home automation... endless things.

  16. Fenton

    The interesting bits

    The CPU count and performance is not actually interesting.

    What is interesting is all of the other accelerators, which may mean more people move over to the Apple ecosystem. This may worry Microsoft.

    Once we get some real-world benchmarks and these machines prove to outperform more expensive Intel equivalents, they may well cause a shift to Apple which will hit Microsoft/Intel/Nvidia/AMD.

    Hopefully this will spur the traditional makers to start implementing accelerators/extensions into x86.


    1. 3arn0wl

      Re: The interesting bits

      I've been thinking about the Microsoft situation too...

      I guess that as soon as Apple announced their M1, the phone was ringing off the hook at Arm / Qualcomm / MediaTek et al, demanding a Windows equivalent and, just a year and a half later, Lenovo is the first third-party OEM to produce the goods.

      My guess, though, is that this is much to Microsoft's chagrin. Isn't it their vision that everyone does their computing in their cloud? They've been reticent about using ARM chips, presumably because it's more work to modify (and maintain) the OS for each chip, plus it fragments the product.

    2. Steve Davies 3 Silver badge

      Re: This may worry Microsoft.

      As long as those who move over still pay their Orifice 365 subscription, why would they worry?

      They make more profit from software than they ever did from hardware.

      It would not surprise me if MS floated off their OS business and kept the rest.

      It is mature and will only ever suck up more profits. If they don't float it then they will have to either:-

      - Introduce hefty end-user subscriptions. I'm talking about £20/month + VAT per device.

      - or deluge the user with endless ads that are impossible to block.

      OR both.

  17. ThatOne Silver badge


    > Serving up wow has to be a top priority

    Sorry, but you're barking up a non-existent tree. Times have changed, and the "wow" is gone, definitely and definitively. Initially computers were some crazy adventure you could partake in, but by now they have become a bland and gray industry among others. Don't you see it?

    Think aviation: First there were the enthusiasts, the crazy people and the tinkerers, and really anybody could hop on the bandwagon. Some would fall off again (and break their bones), some would become famous and even rich. Where is aviation now? It's a restricted and heavily regulated industry where fantasy and adventure are dirty words. It has become an industry, and industries don't make you dream, except about quarterly profits and growth.

    I did live through those enthusiast and tinkerer days. That's why I know they are long gone. Nowadays [Big Corp] releases its latest model every year, trying to optimize profit (meaning one shiny marketing-worthy component in a slew of inferior components, making for a bad rig that looks good on paper). People who need a new computer will buy one, but they definitely won't do it because they dream of doing so.

    Everything starts as an adventure where your imagination is the only limit, and ends up as a big industry where the only goal is profit.

    1. Paul Kinsler

      Re: Everything starts as an adventure ...

      I think there is scope for something a bit more nuanced here. At a simple level, the "big industry" result includes making available robust (we hope) components or tools which might be combined in new ways to open up some new kind of "adventure".

      But perhaps the main message might be that there is probably some benefit in working out how to promote ways of thinking that look to construct such new "adventures" by using & combining the new (and old) industry-made tools that are available.

      1. ThatOne Silver badge

        Re: Everything starts as an adventure ...

        > components or tools which might be combined in new ways to open up some new kind of "adventure".

        Done. The pieces might not have cool playful colors, but you can already build just about anything you want, if you have the time, the money and the passion. And sorry, no amount of bling will reduce the amount of time, money and passion you'll need to get anywhere; the entry fee will always remain the same (if anything, it will increase if marketing starts thinking there is money to be made with this).

        I'm afraid that this is, as I initially said, just an attempt to bring back the heady days of computer pioneers, clouded by the good old veteran bias: Objectively those days weren't heady, you would pay a lot of money for some strange and useless device, read through cinder block-sized manuals, and when you eventually managed to make "Hello World" happen, it was more a triumph of man over manual than a revolution in computer science. All while your friends and family exchanged worried looks and shook their heads over that new hobby. Why can't he take up golf? At least he would get some fresh air.

        YMMV of course, for some it might have been the start of a successful career, but then again, if they had been born 40 years later, wouldn't they just pick the same profession all the same? If you like working with computers you will, if there are any around. If you don't, you won't, sometimes even if you need to (I have names!).

  18. Gene Cash Silver badge

    Things to hack on now

    So I've gotten bored of my Rasp Pis... one runs my garage and one monitors the charging on my bike. My latest fad for the past 2 years is my 3D printer. I have a Prusa Mk3 (not the cheap shit) and it's still breathtaking to whack something up in OpenSCAD and have it physically appear on a metal plate.

    Don't get a cheap 3D printer... you'll spend more time trying to get it to work than actually using it.

  19. juice

    Not just for the skilled or experienced...

    > What the personal supercomputer has become is a divinely powerful construction set for ideas in any medium, technical and artistic, but only for the skilled and the experienced. You have to push it: golden age IT pulled you along behind it, if you just had the wit to hold on.

    I think that's completely wrong.

    Modern computers - backed by the cloud - have made it incredibly easy for the "unskilled and inexperienced" to do stuff. Because they're powerful enough that we've been able to build tools to help people do what they want.

    Whether that's programming, video editing, audio mixing, writing, digital painting, live streaming, 3d modelling or any of the millions of other things which people are using their machines for.

    Whatever you want to do, there's a tool for it. And thousands - if not millions - of examples, tutorials and prior art to use while learning.

    Fundamentally, more people are doing more things than ever. And they don't need to get down to the metal to do it, either.

    (and that has a few knock-on effects, particularly when it comes to discoverability and the perceived/actual value of said activities. But that's a separate topic...)

  20. Matthew 17

    I think this was always going to happen with technology: it would plateau once it became so incredibly powerful that few could find an application for the extra capability. TV screens are at 4K; soon they'll try to sell you 8K screens even though many people are happy with 1080p and can't tell the difference between it and 4K. Assuming they're able to sell us 8K screens, they'll then have the task of selling us 16K.

    OK, so I'll confess I'm ordering an M1 Ultra Studio, but only because my 2013 Trashcan is finally getting too long in the tooth to run my studio. It's had some decent mileage, so I don't mind the cost. But even though my Trashcan wasn't hugely upgradeable when I got it, it's very modular by comparison, and I can't help but feel that this new uber M1 isn't going to get anything like the same mileage before it's deemed a paperweight.

  21. Anonymous Coward
    Anonymous Coward

    Blame the web for that...

    The web got us back to cumbersome ways of accomplishing things that a native UI would perform better and faster. And it requires you to be tethered to a remote server that may be at the end of a still-slow connection - often for tasks you could perform locally on your SSD. Developers went back to using plain editors and a bunch of tools badly connected together, usually with script-tape and other ill-conceived contraptions.

    And everything is cobbled together using some code found on GitHub, npm, or worse, Stack Overflow. Most tools and frameworks, moreover, are side-effects of the needs of advertising companies who need to keep people stuck in the web, because that's where they can exploit you.

    So your powerful PC is stuck running a browser that, while compiling the same JavaScript over and over and redrawing an ugly UI designed for fat fingers on a mobe, is also busy collecting and shipping your data to the Data Hoarders.

    Because almost everything has to be free or nearly so, there's really no push to write good applications. Whatever you get has to be paid for by some other business, so you get the breadcrumbs. Only a few niche applications - where users are still willing to pay - get some attention. But since companies switched to subscriptions, and their applications become mostly useless (or stop working altogether) as soon as you stop paying, there is little incentive to look for and implement "WOW" features in the next release. And maybe you won't sell anyway, as too many people got used to the reduced feature set of a phone and can't use a PC properly - after all, all they ask for is "content to consume", and that's what most devices and software are designed to deliver.

    You see operating systems that attempt to look like a browser on a phone. Because that's where they believe they can make money.

    1. Doctor Syntax Silver badge

      Re: Blame the web for that...

      "requires you to be tethered to a remote server that may be at the end of a still slow connection"

      Too true. A colleague in our Civic Soc. usually sends our lecture posters as PDFs to somebody's farcebook page. She was recently told to use some web page to convert it to a JPEG. Really? Somebody thinks you need a remote server to convert a PDF page to a JPEG?

      1. Anonymous Coward
        Anonymous Coward

        Re: Blame the web for that...

        I’m more wondering why they would want to change a pdf to a jpg and lose all the nice vector scaling. Given browsers render pdfs these days it isn’t as if someone wouldn’t have a viewer.

  22. wilhoit

    For some time now, and for the foreseeable, the quality of the typical end-user experience has been determined by the fact that TCP did not envision semi-connected networks. Hardware makes no difference to this.

  23. Jim Hill

    The last four-five-six paragraphs there? Just, ... wow. So utterly dead on the mark, and so well written. I get the need for the lead-in, but that ending needs setting in a showcase where more people will see it somehow.

  24. Doctor Syntax Silver badge

    "a new budget iPhone"

    This must be some meaning of the word "budget" with which I was previously unfamiliar.

  25. Doctor Syntax Silver badge

    "Above it all, the stratospheric mysteries of supercomputing hung like the PCs the gods themselves ordered from the Mount Olympus edition of Byte."

    My recollection is that by 2001 Byte's glory days were well behind it. That in itself was an indication of things to come.

  26. MuTru

    Well, I think you're illustrating the real problem without realizing it. You just spent all your time talking about the Mac Studio Ultra and how blasé you are about the top end of innovation. The Ultra is vastly more computer than all but a few people need or can afford. The real news, IMHO, is the Studio Max, which is half the cost and still more power than nearly all creators will ever need. That's the wow moment, and you seem to have missed it. So ask yourself: why are you talking exclusively about the Ultra, which is inaccessible to most of us, and not at all about the Max?

    1. Dave 126

      Because he's saying that even the Ultra, let alone the Max, has not excited him, because computers have long since done what he's wanted them to do.

  27. Lars Silver badge

    Apple on a super computer

    So when will we see Apple on the list? Last I looked, the entries were still all Linux.

    This just because I think one has to mention linux when talking about super computers.

    1. Steve Davies 3 Silver badge

      Re: Apple on a super computer

      You'll probably see Linux on an M1-Ultra based system on the list pretty soon.

      A win for Linus and Apple.

      1. Lars Silver badge

        Re: Apple on a super computer

        "Linux on an M1-Ultra based system on the list".

        I was thinking about that too, but supercomputing is very much a software question (what isn't?), and on the TOP500 list there is not one single Apple processor mentioned.

        So who knows.

        I actually had an Apple II+, very early on too, and I was impressed by it even while working professionally with better equipment; I was blown away by VisiCalc, the first spreadsheet.

        Somebody mentioned the Amiga and also for me it was the first really good micro computer.

        On Visicalc and Apple this.

        "VisiCalc (for "visible calculator")[1] is the first spreadsheet computer program for personal computers,[2] originally released for Apple II by VisiCorp in 1979. It is often considered the application that turned the microcomputer from a hobby for computer enthusiasts into a serious business tool, prompting IBM to introduce the IBM PC two years later.[3] VisiCalc is considered to be Apple II's killer app.[4] It sold over 700,000 copies in six years, and as many as 1 million copies over its history.

        Initially developed for the Apple II computer using a 6502 assembler running on the Multics time sharing system,".

      2. doublelayer Silver badge

        Re: Apple on a super computer

        "You'll probably see Linux on an M1-Ultra based system on the list pretty soon."

        No, you won't. For one thing, the top 500 are really powerful distributed computers. Building one out of individual Macs that weren't intended for it is possible, but anyone who needs that much power will build it from nodes designed for industrial-scale operation instead.

        For another, the existing methods of running Linux on M1 are... well, I don't want to be unintentionally insulting about them, because the people doing the work are quite intelligent and have solved lots of hard problems. The problem is that there have been so many hard problems to work on. Apple isn't actively blocking Linux (yet), but they didn't do anything to make it work. Those working on it have had to reverse engineer lots of components, and they do this because it's fun for them. The type of person who is going to build a supercomputer, with the budget that requires, doesn't want to run their system on something experimental that had to hack through undocumented hardware. They can get lots of server-class hardware that was designed by people who expected Linux on it. They can buy chips from people aiming for high-performance applications who also designed for Linux. They will do that.

  28. Paul Smith

    When I were a lad...

    A task that took 18 hours twenty-five years ago, and whose runtime could be halved every 18 months for a near-constant £1,500 thanks to Moore's law, made upgrading worth every penny. Twenty-five years later, the same Moore's law means the same task takes under a second, so it is much harder to justify an upgrade that reduces it to half a second.
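    The commenter's arithmetic holds up; here is a quick back-of-the-envelope sketch (the 18-hour task and 18-month doubling period are the comment's own hypothetical figures):

    ```python
    # Back-of-the-envelope check: an 18-hour task, with runtime halved
    # every 18 months, across 25 years of Moore's-law upgrades.
    hours_then = 18          # task duration 25 years ago
    years = 25
    doubling_period = 1.5    # performance doubles every 18 months

    halvings = years / doubling_period            # ~16.7 halvings
    seconds_now = hours_then * 3600 / 2 ** halvings

    print(f"{halvings:.1f} halvings -> {seconds_now:.2f} s")  # well under a second
    ```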

  29. Blackjack Silver badge

    The big problem is that we already have small computers we can carry in our pockets; after smartphones, small computers as powerful as or more powerful than a desktop PC don't seem impressive.

  30. ecofeco Silver badge

    The lede was buried

    I did not tarry.

  31. Loyal Commenter Silver badge

    I remember the Macs we had in 2001

    Beige boxes, with proprietary and closed technology (ever manage to royally screw one up by plugging a parallel-port printer into the SCSI port, for which they had oh-so-cleverly used the same connector?), and those boiled-sweet commodity "iMacs". At a time when a PC would commonly have an ergonomic mouse (with a scroll wheel) compared to the Mac's cuboid thing with one button, I struggled to understand why "designers" thought Macs were better.

    Looking back, it's obvious now - they were essentially given away free to educational establishments, and Apple paid big bucks for them to be featured in popular TV and films. They were a product for the gullible then, who couldn't realise they were being locked into a closed ecosystem (the first one's free!), and I don't see what's different now.

    Apple have only just started to acknowledge that everyone else is putting USB-C connectors on their hardware, and given up the proprietary connector cash printing-press, but it's not for want of trying. Look back on the various mutually incompatible ones they've had in the past, ye mighty, and despair.

    So, yes, Apple's star is fading - what is their USP these days? That they are more expensive? That you have to go to a "genius bar" when they go wrong? That their OS is somehow more flexible or superior to any other (I think not)? Practices like the need to own an Apple device in order to compile software for an Apple device put me off (I can happily compile something to run on an android phone from a PC, and even test it on a virtual device, something that Apple won't allow).

    I'm not saying that the business practices of other megacorps aren't just as predatory, but even Microsoft have abandoned "embrace, extend, extinguish". At the end of the day, Apple computers are Veblen goods, and some of us just aren't into showing off to plump our own egos and assuage our self-doubt.

    1. Dave 126

      Re: I remember the Macs we had in 2001

      > I struggled to understand why "designers" thought Macs were better.

      At that time, colour management on Windows PCs wasn't great, so they weren't really an option. And Macs all had FireWire, because USB 1.0 wasn't suitable for high-res scanners. Designers used Macs because print houses used Macs.

      Your struggle did little to advance your understanding.

      1. Loyal Commenter Silver badge

        Re: I remember the Macs we had in 2001

        Surely "colour management" is entirely down to the software you use (and your monitor). There's nothing inherently special about Macs in this regard.

        I recall, back in the day, that I had a PC motherboard with built-in FireWire (it never got used, other than to attract dust), so the idea that this was somehow special to Macs is moot.

        I also recall using both USB and parallel port scanners that were quite capable of scanning at anything up to 1200 DPI. Yes, you were limited by how fast the interface could push those bytes across onto your PCI bus, but "wasnt [sic] suitable" is a strong turn of phrase. If you were desperate to use a firewire scanner for that faster throughput, then you could buy a PC that had it. The fact that a commodity motherboard included it, pretty much as standard (and we're talking about the turn of the century here), puts the lie to the claim that you need a Mac.

        As for actually needing to scan things at high resolution; I guess you like seeing dust particles, little bits of fluff and hair, and really really need to see the halftone pattern on printed materials.

        As for "Your struggle did little to advance your understanding." - well what can I say, that sounds like the sort of self important bullshit a Macolyte would come up with.

    2. PriorKnowledge

      Apple USPs in 2022

      * As an OEM, Apple provides 7 years of full support

      * Assumes personal not corporate computing by default

      * Willing to dump insecure/broken protocols ruthlessly

      * One source of support for hardware and OS software issues

      * Ships decent software with the computer, not bloated trialware

      * Includes decent physical theft deterrence features for free

      * Still supports ‘00s iPods natively, including acquiring media

      * Natively supports cloud and trad services fairly and equally

      * Sticks to one set of native APIs to keep things lean and efficient

      * Doesn’t have gaping holes in its trusted computing implementation

      * MDM can be fully serverless and implemented entirely offline

      Apple has a boatload of USPs to go with their boatload of down sides. We already know what the down sides are so I can’t be bothered to list them!

    3. oiseau

      Re: I remember the Macs we had in 2001

      ... but even Microsoft have abandoned "embrace, extend, extinguish".

      I think not, not by a long shot.

      They are just doing it differently, but it is just more of the same.


  32. gnasher729 Silver badge

    Top 500 Supercomputer list

    Does anyone know what would have been the last year where this computer would have been in the top 500 supercomputer list? (Leading in 2001 probably means still hanging around #300 in 2007 or so.)

    1. gnasher729 Silver badge

      Re: Top 500 Supercomputer list

      I shouldn't be so lazy... #500 in November 2009 was 20.1 TFlops.

  33. sreynolds

    Back in 2001....

    It would probably have had a huge export-restriction sticker on it, forbidding its sale to Iran, North Korea and perhaps Iraq (potentially useful in the chemical weapons development Tony Blair would lie to us about), and there would have been an international version with only 56 bits of encryption.

  34. TM™

    Sinclair ZX Spectrum (Plus): First PC that really did stuff (I had a ZX81 but that was all about getting the keyboard to register key presses).

    Commodore Amiga: Biggest wow moment I've ever experienced in a computer (and I've worked in VFX). Never to be surpassed experience of leap in computing power. I guess the matrix might do it, but I'm probably already in that (things are crazy enough).

    PlayStation 3: Another quantum leap in performance per price. Massive respect for CELL.

    Everything else has been more of the same.

    The Apple Studio is impressive, but the price! I can't help but wonder if there's a missed opportunity here for a top-notch living-room entertainment / games console at a competition-beating price. I'm guessing they can churn out a 32GB, 1TB Ultra pretty cheaply. If it came in at 60% of a PS5 / Xbox it would be really disruptive.

    1. Dave 126

      Apple wouldn't want to chase the games console market because of that market's spikes in demand and unpredictability. Established players Sony and Microsoft are unable to meet demand due to silicon supply shortages. Apple doesn't have unlimited TSMC production capacity for its silicon, so it will stick to its existing product categories.

  35. T. F. M. Reader

    Apples, Oranges

    I spent a part of my career doing massively parallel supercomputers, though it was many years ago (I started a bit after ASCI White, actually). I will be grateful to the commentariat if anyone can point to M1 Ultra's LINPACK results which is how supers, including ASCI White, are measured (and what they are designed for - not for LINPACK per se but for solving partial differential equations and such).

    ASCI White actually never achieved 12 TFLOPS in practice, IIRC. Nvidia GPUs did a few years ago, but again on different benchmarks - again, IIRC.

    All I saw in my admittedly perfunctory search was that M1 Ultra beats Intel and AMD. It is not even clear to me how far behind (in TFLOPS) a many-thousand-core GPU augmented with several hundred tensor processors it is. From what I saw, M1 Ultra has 20 CPU cores, 64 GPU cores, and a 32-core "neural engine", whatever that is.

    The current fastest super is ~45,000× faster than ASCI White (single CPUs evolved a lot faster indeed), has 7,630,848 cores, more than 5PB of memory, and is, by my estimate, about 40% as efficient (in MFLOPS/W) as M1 Ultra, for its LINPACK MFLOPS. That's not bad for a system that is ~26,860× faster than the M1 Ultra. I am much more excited by this than by the new M1.

    All of this is not to diss M1 at all. I play around with both (older) Intel and (newer) M1 Macs at work every day and M1s are noticeably cooler and faster than Intels when running long and heavy parallelized compilations. All I am saying is that if you want to compare to a massively parallel super you must count oranges, namely LINPACK, for fairness. The supers probably will not even be able to run Netflix... On the other hand, it is not clear if an M1 Ultra can run this in 12 hours.

    Maybe M1 Ultra will be great at LINPACK, too. I have not yet seen the numbers though.

  36. Oliver Knill


    While I agree that the appreciation for newer and better machines has decreased (I was in awe already when attaching a Tandy PC clone to a TV, which was so much better than a programmable TI-59; then came Atari, the first Macs, the NeXT, and nice affordable workstations running Linux), I still like to buy the most powerful machine I can afford. The reason is productivity: the micro-lag, the searching for files, the syncing of files, LaTeXing files, the conversion of files, not to speak of what happens when editing large pictures with the GIMP or handling larger video projects, where you have to wait a few seconds until the machine has rendered some stuff. I'm not a gamer, but I want high performance when flying around in Google Earth, and I like to work on 49-inch monitors with lots of different things open at the same time (documents, books, computer algebra systems, etc.). By the way, even reading high-resolution PDF files on Linux on a high-end workstation can lag. We need more computing power, and this is only going to become more urgent.

    I can understand the article, but it confirms that we have all become spoiled brats who trash new innovation as not fast enough. I sometimes have to fire up one of these old machines and wonder: how the heck could I ever have worked on such crappy technology? Take a machine from 10 years ago and try to do anything decent. My small Raspi is faster, and it fits into a matchbox. I would love that new M1 Ultra. Unfortunately I don't have the cash right now.

  37. Daniel von Asmuth


    The Apple Studio is a toy. It supports up to 1 concurrent user, who will be able to run Safari, Angry Birds, Word and Excel. It has a paltry 20 cores, 128 GB of RAM and an SSD. No PCIe slots.

    ASCI White was a supercomputer and ran all manner of state of the art software. ASCI White had 8,192 processors, 6 terabytes (TB) of memory, and 160 TB of disk storage [Wikipedia].

  38. KimJongDeux

    This chip sounds lovely. But computers transferring data inside themselves is very 1990s. What I'd really like is an internet connection that was at least a tenth as fast. Preferably one that worked as I walk between buildings or in the lee of a wooded hillside.

    1. ThatOne Silver badge

      > What I'd really like is an internet connection that was at least a tenth as fast.

      So you can get bigger and better ads?...

  39. Anonymous Coward
    Anonymous Coward

    We have chips that do 4 billion cycles per second, but fing Windows still takes seconds to delete a file.

    Moore's law of regular performance doubling is long dead and the few enhancements are sucked away by Spectre, Meltdown and co.

    No wonder the wow is gone.

  40. Anonymous Coward
    Anonymous Coward

    Single threaded performance

    The reason improvements in performance made such a big difference in our lives back then was that the speedup applied to sequential, single-threaded performance. That made all classes of algorithms faster. Today that type of improvement is very incremental between CPU generations, and the real gains come from parallel compute, which doesn't help nearly as many algorithms. Hence graphics people and modellers care, but the average Joe doesn't really notice.
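    One way to make the point concrete is Amdahl's law (my framing, not the commenter's): if only a fraction p of a program parallelises, extra cores run into a hard ceiling of 1/(1-p), whereas a single-thread speedup accelerates the whole program.

    ```python
    def amdahl_speedup(p: float, n: int) -> float:
        """Overall speedup when a fraction p of the work runs on n cores."""
        return 1 / ((1 - p) + p / n)

    # A half-parallel program never even doubles, no matter how many cores:
    for cores in (2, 8, 64):
        print(cores, round(amdahl_speedup(0.5, cores), 2))
    ```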

    1. Kevin McMurtrie Silver badge

      Re: Single threaded performance

      Lazy coders. Swift, Rust, C++, Objective-C, Java, Scala, Go, and Ruby all have refined tools for multiprocessing. JavaScript and shell scripts are the only active languages that can't support elegant multiprocessing. Even Python does if you don't use CPython.
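      For what it's worth, even stock CPython can spread CPU-bound work across cores if you use processes rather than threads; a minimal stdlib sketch (the function and workload are my own illustration):

      ```python
      from multiprocessing import Pool

      def sum_of_squares(n: int) -> int:
          """Deliberately CPU-bound work."""
          return sum(i * i for i in range(n))

      if __name__ == "__main__":
          # One worker process per core by default; each job sidesteps the
          # GIL because it runs in its own interpreter.
          with Pool() as pool:
              results = pool.map(sum_of_squares, [10, 100, 1000])
          print(results)
      ```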

  41. Michael Brian Bentley


    Why would El Reg publish a review of a powerful new computer by someone who wouldn't know what to do with it?

    1. Anonymous Coward
      Anonymous Coward

      Re: Skeptical

      Isn't the whole basis of Apple's marketing on YouTube, by the usual Apple influencers, showing off the powerful new M1 processor by people who don't know what to do with it other than seeing if it can play (or at best edit) a 4K YouTube video? Certainly the curated Google content feed seems to indicate this.

  42. naive

    People lost interest since Big Tech is morally bankrupt

    Lets compare IT to cars.

    People are very much interested in comparing the Nürburgring times of Corvettes, Ferraris and Porsches.

    Those cars are not made in factories where they have to install safety nets to prevent workers from committing suicide.

    Technology is open, one can choose bolts, gas, tires, and other parts to improve and maintain the car.

    Cars are made so they can be fixed, not to become scrap metal in the shortest amount of time without attracting class-action lawsuits, unlike Apple stuff that is engineered to become polluting e-waste within 5 years since it can hardly be repaired or upgraded.

    Cars manufacturers care about their image, the safety of their product and the environmental impact.

    Car manufacturers don't fight legal changes with armies of lawyers like Big Tech does.

    Car manufacturers tend to contribute to the society where they sell their products, factories are located where the buyers are. They don't combine tax evasion with production in third world countries.

    Car bosses don't collude with power hungry politicians, agreeing to design products with back doors enabling spying.

    Car bosses don't buy newspapers and TV stations to influence public opinion, unlike MS and Amazon.

    Compare this to Big Tech in IT. They don't care about anything but the next quarter's bottom line. Workers' well-being, avoiding tax evasion, the environment, openness in competition, and privacy are just insignificant parameters in the way they conduct their business.

    When one buys a graphics card, there is good reason to feel shafted by getting the performance of 8 years ago with a price tag increased by 150%.

    Someone who wants a fast Unix workstation can only choose x86, since Apple decided not to sell M1 CPUs to others.

    Buying IT Tech nowadays feels like handing over ones hard earned money to a gang of bullies terrorizing the street.

    1. ThatOne Silver badge

      Re: People lost interest since Big Tech is morally bankrupt

      > Car bosses don't


      The writing is on the wall, they are already about to change their old-fashioned business model to be more "disruptive" and to tap into new sources of profit (ie. shafting the customer).

    2. mantavani

      Re: People lost interest since Big Tech is morally bankrupt

      Car companies & their heavy-machinery compatriots do, sadly, do a lot of those things. Google 'right to repair' together with John Deere, or Tesla, or the Alliance for Automotive Innovation (an industry group backed by Ford, Honda, Hyundai, and GM that spent $26 million in Massachusetts alone trying to gerrymander that state's proposed RtR legislation). It's not just an IT problem; it's a problem of short-sighted, narrowly-focussed corporates who know the price of everything and the value of nothing, jealously guarding their fiefdoms within regulatory environments that do little or nothing to contain them, let alone protect the consumer.

  43. breakfast Silver badge


    All this talk of nostalgia for 2001 and yet nowhere in the comments did someone recommend we 'imagine a Beowulf Cluster of these things.' Truly we are a fallen civilisation.

  44. Binraider Silver badge

    Software sure hasn't advanced anywhere near as rapidly as hardware has. When was the last time you saw a truly new application?

    More power is always good of course, if you deal with processor intensive tasks like video encoding or FEA. Some of the methods I use today used to be done by "calculators" on mechanical adding machines - literally taking months to resolve one problem. Today, I can lob the same problem into COMSOL and solve it in half an hour. Specifying the problem is another matter of course - that takes just as long regardless of the tooling used.

    It is remarkable how the design houses of the 1960s could turn out working machines using such tooling on tight timescales.

    Today, "solutions" end up taking way longer because capability has gone up; and design margins have gone down. Sacrificing the latter puts a lot of trust in your modelling margins. And I'm not sure it's well placed tbh. See the lead time on Boeing Starliner versus the iterative hardware development done by SpaceX. One was a hell of a lot cheaper and got results a lot more quickly while being better understood too.

    When I'm able to explain to a computer what I want to achieve rather than have to define it all in n'th degree detail then the benefit of all this extra CPU power comes home. Cue Star Trek Computers.

  45. Ross 12

    It still blows my mind that I have a handheld battery powered internet terminal in my hand with an 8-core 64-bit processor, 8gb of ram, and 64gb of storage that I can do most of my day-to-day computing on. When I was a kid writing BASIC on my Speccy, those stats would have been absolutely incomprehensible.

    It's hard to get excited about advances now because most of our needs are already met. Pretty much any device made in the past 10 years is 'good enough' to do your every-day web stuff, and most obstacles are in the software - abandoned or deprecated systems.

    The gains we get now are really only significant for less common use cases, like video editing, graphics rendering, audio production, VR, etc. Gaming is probably the most mainstream use case that's still pushing hardware forward, but even then, the visible gains are becoming less and less significant, with the latest-gen consoles moving to SSDs providing the most noticeable jump in capability in recent years.

  46. Rooster Brooster

    It's unfair to think that we haven't envisioned a use for so much parallel processing power. Imagine all those unneeded CPU cycles soaked up with SETI screen savers 25 years ago. Now we could use 'spare' power on 1000's of Ultra Mega Macs and still find nothing (even faster!)

  47. Anonymous Coward
    Anonymous Coward

    It can have all the power in the world, but when it's mostly taken up by looking good and displaying ads, no one is going to care.

    Congrats! You made the ad load 50ns faster. Get yourself a cookie.

    1. ThatOne Silver badge

      A 3rd party cookie I gather?

  48. Anonymous Coward
    Anonymous Coward

    Programming supercomputers is hard! You need to be able to write (hopefully!) optimal and cache-oblivious algorithms, manage domain decomposition (between nodes, between GPUs per node, between cores per GPU) and message passing on a substrate that hopefully reaches peak theoretical speeds without any bottlenecks, with the CPU being used as little as possible on busywork or in latency-spiking sleep states. You need a good mental model of EVERYTHING, a complete mastery of the entire stack and exactly how every message, interrupt, memory copy, packet, and bus read/write happens and when and why :D
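    A toy, single-machine illustration of the domain decomposition the comment describes (a 1-D three-point stencil; the names and the reflecting boundary are my own simplifications): split the grid across worker processes, hand each chunk a one-cell "halo" from its neighbours, then stitch the results back together.

    ```python
    from concurrent.futures import ProcessPoolExecutor

    def stencil_chunk(args):
        """Apply one 3-point averaging step to an owned chunk plus its halo."""
        left_halo, chunk, right_halo = args
        padded = [left_halo] + chunk + [right_halo]
        return [(padded[i - 1] + padded[i] + padded[i + 1]) / 3
                for i in range(1, len(padded) - 1)]

    def step(grid, workers=4):
        """One time step, decomposed across worker processes."""
        n = len(grid)
        size = n // workers
        tasks = []
        for w in range(workers):
            lo = w * size
            hi = n if w == workers - 1 else lo + size
            left = grid[lo - 1] if lo > 0 else grid[lo]      # reflecting boundary
            right = grid[hi] if hi < n else grid[hi - 1]
            tasks.append((left, grid[lo:hi], right))
        with ProcessPoolExecutor(max_workers=workers) as ex:
            parts = ex.map(stencil_chunk, tasks)             # halo exchange + compute
        return [x for part in parts for x in part]
    ```

    On a real super the halo exchange would be explicit MPI messages between nodes rather than pickled arguments, but the decomposition has the same shape, and everything the comment lists (who owns which cells, what crosses the boundary, when) still has to be reasoned about explicitly.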

  49. Samsara

    Q: "Where is the Fisher-Price digital audio workstation? Where, in short, is the Toys-R-Us of the digital world?"

    A: Roblox

  50. Anal Leakage

    Jesus… even for el Rag, this screed was weak.

  51. John Savard

    Not so fast

    I had to hunt around. My web search got a figure of 21 TFLOPS for the M1 Ultra. But it took several results before I found one that noted this was single precision floating-point power, not 64-bit floating-point power. And it is double-precision that is usually used for scientific supercomputing.

  52. Morat

    Desktops are fast enough for mere humans.

    There's really no need for all the power you can now buy for a desktop, unless you're into something niche. For me, that niche is VR gaming. Even a top-line Intel/AMD processor and a 3090 won't let you max out the capabilities of a good VR headset at the moment.

  53. Foster

    QuarkXPress 3 on a Mac IIci circa 1990. Make a 4-page newsletter with simple line-art graphics and send it to print (pre-PDF). Go and cook dinner, mow the lawn, do the shopping, whatever. And it's still scrolling through, even though the laser printer can take it much faster. OK, so the Star Trek screen saver may be using a few little thinking seconds on the Mac.

    QuarkXPress 17 on a late 2015 iMac, nearly obsolete and still running latest 64-bit OS. Set a 200 page book full of high res images to make a PDF for printing. None of the images have been optimised, some maybe from enormous images supplied. Turn your back for a moment and then wonder whether it's scrolled through or not. I always click on 'Open PDF after export' so I know it has actually finished in such a tiny amount of time.

    Pixelmator Pro on the same Mac. Click on 'Remove Background' on a huge, high-resolution image and it does it brilliantly and almost instantaneously, discerning what is the figure in the foreground and what is to be dumped. While at the same time recording live TV, getting email, web browsing, watching Netflix, blah blah blah.

    I've never stopped getting the 'Wow' moments, especially as my next iPad will be a gazillion times faster than my iMac. I cannot wait to see what my next Mac will be capable of, and I hope Apple do another 27" version.
