Geekbench stats show Apple Silicon MacBook Air trouncing pricey 16-inch MacBook Pro

Benchmarks highlighting the performance of Apple’s homegrown M1 chip have surfaced, showing Cupertino’s latest (unreleased) MacBook Air handily trouncing the previous-generation 16-inch MacBook Pro in both single-core and multi-core tasks. On the Geekbench 5 test, the MacBook Air 10,1 with 8GB RAM scored 1687 for single-core …

  1. Gordon 10
    Boffin

    Clock speed

    AFAIK Apple haven't published clock speeds for the M1 Air vs the M1 Pro, but the addition of the fan suggests to me that there is rather a lot more going on than one additional GPU core.

    I'm betting there are, or will be, M1 SKUs with significantly higher clock speeds for the Pro than the one used in the Air. Obviously it depends on the yields TSMC are currently getting on the M1 die.

    1. ThomH

      Re: Clock speed

      It's only the base model of the Air that has one GPU core fewer, so there's definitely something more going on — the more expensive Air has the same stated quantity of CPU and GPU cores but is still passively cooled.

      That all being said, how thoroughly does Geekbench measure sustained versus peak performance? If all active cooling does is allow the CPUs to maintain peak performance for longer then the benchmark metrics might not look any different. Though that kind of cuts to the meaningfulness of benchmarks more than anything else.
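
      For what it's worth, the usual way to see the difference is to run the same workload on a loop and watch whether the rate sags once the chassis heats up. A minimal single-core sketch in C, assuming nothing about what Geekbench itself does (the kernel, the reporting window and the run length here are all arbitrary choices):

        /* Sustained-vs-peak probe: repeat a fixed kernel and report the rate
           every few seconds. On a machine that throttles, the kernels/s figure
           drops after the first minute or two; a short benchmark only ever sees
           the opening (peak) numbers. Single-core only; a real probe would also
           load every core. */
        #include <stdio.h>
        #include <time.h>

        static double now(void) {
            struct timespec ts;
            clock_gettime(CLOCK_MONOTONIC, &ts);
            return ts.tv_sec + ts.tv_nsec / 1e9;
        }

        /* A busy little kernel: enough floating-point work to keep a core warm. */
        static double kernel(void) {
            double x = 0.0;
            for (int i = 1; i <= 5 * 1000 * 1000; i++)
                x += 1.0 / (double)i;
            return x;
        }

        int main(void) {
            const double window = 5.0;     /* report every 5 seconds */
            const double total  = 300.0;   /* run for 5 minutes */
            double start = now(), last = start, sink = 0.0;
            long reps = 0;

            while (now() - start < total) {
                sink += kernel();
                reps++;
                double t = now();
                if (t - last >= window) {
                    printf("t=%5.0fs  %.1f kernels/s\n", t - start, reps / (t - last));
                    reps = 0;
                    last = t;
                }
            }
            printf("checksum %.3f\n", sink);   /* keep the work observable */
            return 0;
        }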

    2. DS999 Silver badge

      Re: Clock speed

      All evidence is that the clock rate is the same between the Air and Pro. I think the missing GPU core is a binning thing (the same difference as between the A12X and A12Z) and Apple is relying on non-CPU differences between the Air and Pro to differentiate them.

      That may change in the future once Apple has several generations of M* SoCs at their disposal, and they might have the M1 in the Air while the 13" MBP gets the M2 and the 16" MBP gets the M3. Or maybe they'll do some clock based binning, or maybe they're just going to rely on other things to differentiate and have really fast CPUs in their cheapest laptop.

      Geekbench isn't that great at determining CPU speed but other sources have indicated the M1 (and A14X, which is likely the same chip with a fuse or two blown to change its identity for use in iPads) runs at 3.1 GHz.
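
      For anyone wanting a sanity check on the clock claims once they have the hardware, timing a long chain of dependent integer adds gets you close: on most recent big cores a dependent add retires in one cycle, so adds-per-second roughly equals the clock. That one-cycle figure is an assumption, not a guarantee. Rough C sketch:

        /* Rough clock-rate estimate: time a chain of dependent adds.
           Assumes ~1 cycle per dependent integer add, which holds on most
           recent big cores but is an assumption, not a guarantee. */
        #include <stdio.h>
        #include <stdint.h>
        #include <time.h>

        int main(void) {
            const uint64_t iters = 1000000000ULL;    /* 1e9 dependent adds */
            uint64_t x = 1;
            struct timespec t0, t1;

            clock_gettime(CLOCK_MONOTONIC, &t0);
            for (uint64_t i = 0; i < iters; i++) {
                __asm__ volatile("" : "+r"(x));      /* keep the chain honest */
                x += i;                              /* each add depends on the last */
            }
            clock_gettime(CLOCK_MONOTONIC, &t1);

            double secs = (t1.tv_sec - t0.tv_sec) + (t1.tv_nsec - t0.tv_nsec) / 1e9;
            printf("x=%llu, ~%.2f GHz (if 1 add/cycle)\n",
                   (unsigned long long)x, iters / secs / 1e9);
            return 0;
        }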

  2. FIA Silver badge

    I'm curious to know if there's any multi CPU support in these?

    Their use of stacked memory limits the expandability, hence we're only seeing 16GB as the max for now.

    But if you're going to have a 'pro' version of this that's not going to cut it, so it'll be interesting to see how they solve that. Could you for example put 4 M1s together to get 64GB in some kind of NUMA style layout?

    Interesting times ahead.

    1. This post has been deleted by its author

      1. ThomH

        I remember the 2015 Retina MacBook because I still use it.

        That said, it's on its third logic board because they kept melting. Luckily only while it was still under warranty and not for the last couple of years, but nevertheless I'd be a little cautious.

        If the reviews check out, I'll probably buy one of the Mac Minis. I guess we'll see.

        1. Blackjack Silver badge

          No wonder Apple keeps saying they lose money on repairs. 3 logic boards melted? Wow.

    2. Anonymous Coward
      Anonymous Coward

      guessing

      I'm guessing they're stacking 16Gb base LP DRAM dies... there is talk of 24Gb before we get to 32Gb in two-ish years' time, so unless they increase the stack count we're at least two years off a 32GB offering... famous last words.
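
      Back-of-the-envelope version of that in C, to save the mental arithmetic. The two-package count matches the M1 board shots, but the dies-per-package figure is purely an assumption:

        /* Back-of-the-envelope capacity maths: die density (Gbit) x dies per
           package x packages = total GB. The dies-per-package figure is an
           assumption, not a teardown fact. */
        #include <stdio.h>

        int main(void) {
            const int packages = 2;            /* two LPDDR packages sit next to the M1 */
            const int dies_per_package = 4;    /* assumed stack height */
            const int densities_gbit[] = {16, 24, 32};

            for (int i = 0; i < 3; i++) {
                int total_gb = densities_gbit[i] * dies_per_package * packages / 8;
                printf("%2d Gbit dies -> %3d GB total\n", densities_gbit[i], total_gb);
            }
            return 0;
        }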

      1. _andrew

        Re: guessing

        Pictures suggest that they're side-by-side on a carrier, like the HBM GPU memory of a few years ago, or the chiplets on AMD processors. I've read at least one comment that that is at least partly to avoid the differential-expansion problems that stacked combinations seem to suffer.

    3. ThomH

      Anandtech speculates that the entire die size of an M1 is 120mm^2; having a quick look around I found that an i9-9900K is 180mm^2 and that obviously doesn't include RAM. Apple will be benefitting hugely from the hard work of TSMC here and that 5nm node (even if TSMC's nanometres don't correlate exactly with Intel's), but it suggests that — purely in terms of fabrication, anyway — they could go to higher core counts and RAM sizes while still being sensible sizes and without going multi-CPU.

      All armchair speculation, of course.

    4. Dave 126 Silver badge

      > I'm curious to know if there's any multi CPU support in these?

      And others will want to know the GPU roadmap. Okay, right now Apple making a Mac Mini, MacBook Pro and MacBook Air with the same chip is smart. The MacBook Pro sends the message that Apple Silicon can handle 'pro' tasks; the lower-end machines mean that more developers can afford one.

      Now, Apple *could* have announced an ARM chip with more PCIe lanes, Thunderbolt, and RAM - but why would they? The market for these pricier mission-critical machines is more cautious and wouldn't buy on day one anyway, since it'll take a bit of time for those M1 MacBooks to demonstrate their prowess, assuage compatibility fears and attract developers. In the meantime, Apple can spend the next year continuing to develop the M2, M2X Super Turbo etc, plus their own discrete GPU.

      Yep, a GPU. It's not been mentioned much, but their roadmap slides at WWDC showed no AMD GPUs alongside Apple CPUs.

      1. tip pc Silver badge

        Also, Apple stated there’s no eGPU support for the Apple Silicon systems. Don’t know when or if that’ll return.

    5. DS999 Silver badge

      There wouldn't be any multi-chip support in the M1

      Because Apple doesn't need it for these Macs.

      There have been reports of another chip that has 8 big cores and 4 little cores, which would presumably provide the "high end" CPU option for the MacBook Pro. They have said the full transition will take around two years, so two years from now they will have a solution for the Mac Pro. I wouldn't be surprised if they designed the 8+4 chips in 2022 to be able to gang up like AMD's chiplets and form a 32-core beast (and the GPU would similarly gang up).

      The current stuff wouldn't have that capability, other than maybe for some early testing/debugging of their future potential "chiplet" type strategy.

    6. tip pc Silver badge

      “ Could you for example put 4 M1s together to get 64GB in some kind of NUMA style layout?”

      For some reason I have this feeling that RAM won’t be as much of an issue as it normally is: 8GB could be enough, with 16GB perhaps being overkill.

      I’d run loads of Docker containers, for which extra RAM would be useful, but for different, more linear tasks with fast NVMe, 8GB could be enough, especially given how iDevices seem to cope well with process- and memory-heavy tasks on far less RAM.

      One of these M1s with 64/128GB RAM running as a server in a Mini would suit me well.

  3. Wyrdness

    13" Pro scores the same

    The 13" Pro is showing identical Geekbench numbers to the Air. People are speculating that the passively cooled Air won't be able to sustain this level of performance for long, whilst the fan-equipped Pro should be able to. Anyone who was considering one of the faster Intel MacBooks can now get more performance for far less money. Of course, Apple being Apple, something has to give, and these new machines are limited to 16GB of memory and only a single external display - not really 'Pro' specs IMO.

    1. DS999 Silver badge

      Re: 13" Pro scores the same

      Yeah, the 16GB limit is pretty lame, but this just replaces the low-end 13" MBP option. They supposedly have another chip coming with 8 big cores and 4 little cores that would be the high end, and presumably handle the low-end 16" MBP. Obviously they would need a higher memory limit for that, though they would probably keep the same strategy of soldering LPDDR4x (or maybe LPDDR5, depending on when it comes out) onto the MCM alongside their SoC. I think LPDDR4x can go up to either 64 or 96 GB, and LPDDR5 can go up to 128 GB, so other than the "no user upgrades" aspect their scheme won't be as limited down the road.

  4. Peter X

    Finger nail biting time for some

    Like, Intel for example?

  5. StrangerHereMyself Silver badge

    On chip DRAM

    The speed boost can mainly be attributed to the fact that the DRAM is on the SoC die. If it were connected via a northbridge, the results would probably have been a lot less to write home about.

    1. Dave 126 Silver badge

      Re: On chip DRAM

      Yeah, the interconnects and transports are interesting.

      It has a bearing on Apple's plans for extending the M range to support more PCIe lanes and Apple's own discrete GPUs.

    2. Anonymous Coward
      Anonymous Coward

      Re: On chip DRAM

      > the DRAM is on the SoC die

      Not actually on the SoC die, but adjacent to it and connected via an LPDDR4X interface.

      According to the published die photos the SoC has 12MB of L2 cache shared between the four "performance" cores, and 4MB of L2 cache shared between the four "efficiency" cores.
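
      Once the machines are out, the cache arrangement should be visible from userland via sysctlbyname(); a rough C sketch is below. hw.l2cachesize is a long-standing key, but the per-cluster hw.perflevel* names are an assumption about what newer macOS builds expose:

        /* Query cache sizes from a running Mac via sysctlbyname().
           hw.l2cachesize is a long-standing key; the per-cluster
           hw.perflevel* keys are an assumption about newer macOS builds. */
        #include <stdio.h>
        #include <stdint.h>
        #include <sys/types.h>
        #include <sys/sysctl.h>

        static void show(const char *key) {
            int64_t value = 0;
            size_t len = sizeof(value);
            if (sysctlbyname(key, &value, &len, NULL, 0) == 0)
                printf("%-32s %lld bytes\n", key, (long long)value);
            else
                printf("%-32s (not available)\n", key);
        }

        int main(void) {
            show("hw.l2cachesize");
            show("hw.perflevel0.l2cachesize");   /* performance cluster (assumed key) */
            show("hw.perflevel1.l2cachesize");   /* efficiency cluster (assumed key) */
            return 0;
        }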

    3. DS999 Silver badge

      Re: On chip DRAM

      Sorry, but you're wrong.

      LPDDR4x/LPDDR5 actually have higher latencies and lower bandwidth than socketed DDR4/DDR5. You can look at the figures in Anandtech's testing to see that for the iPhone 12, and presumably for the M1 Macs when they arrive.

      While closely coupled DRAM actually CAN be used to reduce latencies and increase bandwidth, the LPDDR standards use it to reduce power. The standard you want for closely coupled DRAM for max performance is HBM/HBM2, as seen on Nvidia Titan GPUs.
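
      The latency side of that is easy enough to check once the machines land: chase randomly shuffled pointers through a buffer much bigger than the caches, so each hop has to go out to DRAM. A rough C sketch (buffer size and hop count are arbitrary choices):

        /* Rough main-memory latency probe: chase randomly shuffled pointers
           through a buffer much larger than the caches, so each hop is
           (mostly) a DRAM access. */
        #include <stdio.h>
        #include <stdlib.h>
        #include <time.h>

        int main(void) {
            const size_t n = 64 * 1024 * 1024 / sizeof(size_t);   /* 64 MB of indices */
            const size_t hops = 20 * 1000 * 1000;
            size_t *next = malloc(n * sizeof(size_t));
            if (!next) return 1;

            /* Build a random single-cycle permutation (Sattolo's algorithm). */
            for (size_t i = 0; i < n; i++) next[i] = i;
            srand(1);
            for (size_t i = n - 1; i > 0; i--) {
                size_t j = (size_t)rand() % i;
                size_t t = next[i]; next[i] = next[j]; next[j] = t;
            }

            struct timespec t0, t1;
            size_t p = 0;
            clock_gettime(CLOCK_MONOTONIC, &t0);
            for (size_t i = 0; i < hops; i++) p = next[p];   /* each load depends on the last */
            clock_gettime(CLOCK_MONOTONIC, &t1);

            double ns = (t1.tv_sec - t0.tv_sec) * 1e9 + (t1.tv_nsec - t0.tv_nsec);
            printf("p=%zu  ~%.1f ns per dependent load\n", p, ns / hops);
            free(next);
            return 0;
        }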

    4. Proton_badger

      Re: On chip DRAM

      This chip is an 8-wide design with an instruction reorder buffer around 630 instructions deep. The only other design this advanced is the IBM POWER10. The 12MB cache also helps. I think we can safely say there's a lot more than RAM placement responsible for its outstanding performance.
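
      A crude way to feel what "wide and deeply out-of-order" buys you is to compare one long dependency chain against several independent chains doing the same total work; the wide core overlaps the independent ones. Minimal C sketch (the exact ratio will vary with compiler flags and core):

        /* Crude ILP demo: one dependent floating-point add chain vs. eight
           independent chains doing the same total number of adds. A wide
           out-of-order core overlaps the independent chains, so the second
           loop should finish several times sooner. */
        #include <stdio.h>
        #include <time.h>

        static double now(void) {
            struct timespec ts;
            clock_gettime(CLOCK_MONOTONIC, &ts);
            return ts.tv_sec + ts.tv_nsec / 1e9;
        }

        int main(int argc, char **argv) {
            (void)argv;
            const long iters = 200 * 1000 * 1000;
            const double seed = 1.0 + argc * 1e-9;   /* unknown at compile time */

            double t = now();
            double x = seed;
            for (long i = 0; i < iters * 8; i++)     /* one chain, 8x the adds */
                x += seed;
            printf("1 chain:  %.3f  %.3f s\n", x, now() - t);

            double a0 = seed, a1 = seed, a2 = seed, a3 = seed;
            double a4 = seed, a5 = seed, a6 = seed, a7 = seed;
            t = now();
            for (long i = 0; i < iters; i++) {       /* eight independent chains */
                a0 += seed; a1 += seed; a2 += seed; a3 += seed;
                a4 += seed; a5 += seed; a6 += seed; a7 += seed;
            }
            printf("8 chains: %.3f  %.3f s\n",
                   a0 + a1 + a2 + a3 + a4 + a5 + a6 + a7, now() - t);
            return 0;
        }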

    5. Anonymous Coward
      Anonymous Coward

      Re: On chip DRAM

      "to the fact that the DRAM is on the SoC die"

      You posted this in the other article too. You are flat wrong. Stop posting this guff.

  6. trevorde Silver badge
    Trollface

    Fastest nag in a one horse race

    Just a pity there's no software for these new Macs. All the best software is on Windows or Linux these days.

    /troll

    1. Dave 126 Silver badge

      Re: Fastest nag in a one horse race

      Rule one of trolling is not admitting to being a troll. Also, your spelling is correct, and your caps-lock key appears to be broken.

      I'm sorry, but I can't award you more than two troll points. Out of ten.

  7. ecarlseen
    Alien

    Unsurprising results

    Apple's A-series CPUs have been knocking on Intel's laptop performance door for a few years now, and that was from a position of dealing with the thermal and power constraints of a cell phone form factor. That they are blowing Intel away in a laptop should be expected.

    Let's start at the beginning - Apple's CPU team was recruited / purchased from companies specializing in highly-optimized CPUs. They got the right people. Their CPUs are now effectively two process nodes ahead of Intel (TSMC 5nm vs. Intel 14+++++++++++nm for their volume parts, which is most comparable to TSMC 10nm). They also have a simpler instruction set to optimize for, no 32-bit backwards compatibility issues, and they can hyper-optimize for one OS.

    Not to knock Apple - this is amazing work - but it's the amazing work we've been expecting when a company gets the best people and gives them a gigantic pile of money to work with.

    1. NeilPost

      Re: Unsurprising results

      Shirley Intel, IBM and AMD also have the best people and a ton of cash??

      Apple Silicon all built on ARM licenced technology.

      ... reminds me of Intel XScale 20 years ago. Schadenfreude writ large. I’m sure my Dell Axim x51 is in the attic somewhere.

      1. Dave 126 Silver badge

        Re: Unsurprising results

        > Shirley Intel, IBM and AMD also have the best people and a ton of cash??

        I think @ecarlseen's point is that Intel, IBM, and AMD (and Qualcomm, Nvidia with ARM server CPUs, and Samsung, who have wound up their custom Exynos designs) aren't in the same position as Apple, since they aren't designing CPUs / SoCs to run a single operating system family that they control. It's not just that Apple have a shit ton of cash, but that they are better placed to reap a return on their investment in this area. We've seen hints of it elsewhere though - Google have an in-house silicon design team (for accelerators in their Pixel range), and Microsoft worked with Qualcomm on an ARM chip for their Surface range.

        It's interesting to note that Apple were stung by the Trash Can Mac Pro a few years back. Its whole design was based on using two graphics cards, as was the industry norm at the time. What Apple didn't expect was for AMD (and Nvidia) to shift to developing single very powerful GPU cards instead - and the Trash Can's thermal design couldn't cope with that. The Trash Can wasn't seriously updated, and Apple's reputation was bruised - for reasons they couldn't have predicted unless they knew as much about GPU design as AMD did.

        Similarly, MacBook Pro buyers complained for several years that new models didn't bring any great speed gains - because Intel weren't focusing on that at the time.

        And people knocked Apple a few years back for releasing a MacBook Pro range that only supported 16GB RAM - because the Intel CPUs didn't support a certain type of power efficient RAM above that. Six months later, Apple listened and gave the option for more RAM, but it came at the cost of disproportionate RAM power consumption - because Intel didn't make the chip they wanted.

      2. FIA Silver badge

        Re: Unsurprising results

        Shirley Intel, IBM and AMD also have the best people and a ton of cash??

        They do (although Apple have nearly 4 times the cash of Intel, IBM and AMD combined), but they also have the baggage of x86, which is very CISC, and arguably not particularly great. AMD simply extended this when creating x64.

        Arm, on the other hand, was initially designed by a small team based on lessons learnt from the MOS 6502; this led to a simple and lean architecture which (possibly due to some other work Acorn had at the time) also happened to have nice low interrupt latency. The initial ARM was so power-efficient it could run without power. :)

        Also, when it came to the 64-bit transition, Intel dropped the ball with IA64, which left AMD to capitalise with its 64-bit extensions to IA32. However, this still leaves the legacy of the old ISA, whereas Arm, coming to the party a lot later, were able to design a 64-bit instruction set that learnt from the lessons of others, and also removed some features that, whilst useful at the time, were less helpful when designing a modern out-of-order superscalar CPU.

        Apple Silicon all built on ARM licenced technology.

        It's built on an Arm-licensed architecture. Apple have an architectural Arm licence; they design the CPU and GPU themselves (as opposed to a core licensee, who uses an Arm core design with their own surrounding IP). There's speculation that they have a particularly special licence too, which, considering they were one of the co-founders of Arm, is not entirely unlikely.

        Apple have been slowly building up to this for some time now: they bought a chip company (P.A. Semi) years ago to design the (increasingly powerful) CPUs in the iThings, and Arm on the Mac has been one of those things that has been about to happen for years.

    2. Billius

      Re: Unsurprising results

      >it's the amazing work we've been expecting when a company gets the best people and gives them a gigantic pile of money to work with.

      I dunno, when you take that approach you always have the distinct possibility of getting people with egos trying to take a project in their direction and people pissing money away with very little to show for it.

      1. Dave 126 Silver badge

        Re: Unsurprising results

        > I dunno, when you take that approach you always have the distinct possibility of getting people with egos trying to take a project in their direction

        That is a danger, yes, but there are ways of mitigating it at the hiring stage and through management. There's a long probation period for new employees. One tool Apple management has is a clear vision of where the company is going and what they want the team to achieve. Prospective employees also have a good idea of what Apple is about before they apply. Management and team members are aided by employing very good analysts and supplying them with expensive data to turn into good intelligence.

        An egotistical team member is less likely to go his own way if he sees his manager rationally explain - with high-quality data - why things should be done, and knows that he himself is listened to. If that doesn't work, Apple has skunkworks teams exploring niche areas; perhaps Mr Maverick would be happier there.

        As it is now, I'd imagine the teams responsible for the A series and M1 SoCs are giving themselves a deserved pat on the back - that's got to be good for team cohesion.

  8. theOtherJT Silver badge

    Back to the 90s?

    I'm really interested to see if this is going to take us back to a time when we genuinely didn't know which way to jump when buying a computer. x86 won SO HARD in the 90s. MOS Technology was long dead. The Motorola 68xxx chips died off in the Atari and Amiga, and Apple jumped ship. Acorn struggled along in the British classroom, but finally withered away... Are we going to have another architecture war? Do we need to start caring about what the underlying hardware is again?

    Interesting times.

    1. NeilPost

      Re: Back to the 90s?

      ‘Acorn withered away’... but gave birth to ARM. Don’t forget the key bit: Hermann Hauser, Roger/Sophie Wilson, Steve Furber etc.

      With the M1 Macs, do we have a new world’s fastest desktop computer, as my BBC A3000 (briefly) was?

      1. NeilPost

        Re: Back to the 90s?

        ... on a parallel thought?? What happened to the INMOS Transputer?? The so-called future at the time?!

        Did Thomson piss it up the wall in the end??

        —-

        ... and the biggest laugh is that Apple owned a 1/3 share in ARM Ltd until they had to flog it to stave off bankruptcy, after binning Jobs, getting the Pepsi bloke in and then tossing him, and buying NeXT/Jobs :-)

        1. Rolf Howarth

          Re: Back to the 90s?

          I remember the Transputer... I think I even had a job offer from Inmos once but ended up going somewhere else.

    2. Dave 126 Silver badge

      Re: Back to the 90s?

      >Do we need to start caring about what the underlying hardware is again?

      Probably not in the same way. In the nineties, a PC wouldn't even read a Mac floppy disk without faffing. Documents weren't as portable between operating systems. You were more tied to the operating system you were using. Okay, that's OS stuff, not CPU architecture per se.

      Today, a lot of useful work can be done in a web browser - emailing, word processing, printing, etc. - and most of us switch between OSes (Android, Windows, iOS, macOS, Linux) many times a day. We expect to edit (or at least view) documents across them.

      In addition, we're already all using computers with CPUs, GPUs, DSPs - different architectures for different jobs - inside them.

      I've heard it said that the Motorola CPUs (Atari, Mac) were better suited to music production in the '90s. And it was music and DTP that allowed Apple to survive that decade (Windows didn't have any means for system-wide document colour management). Again, not an architecture thing. The inclusion of FireWire as standard (originally for high-res scanners) in Macs helped them win the audio and video editing market in the early 2000s.

      1. NeilPost

        Re: Back to the 90s?

        The Atari ST was more suited to music as it had a GUI and, crucially, MIDI ports as standard.

        The Mac... usability.

        Until it was ported elsewhere, the Acorn Archimedes/BBC A3000 was the exclusive home of the excellent Sibelius composition software.

  9. MachDiamond Silver badge

    And how many USB ports will the laptops have? They can delete the card reader, as it doesn't work with the cards that go in my (pro) camera and I have a much faster external reader anyway. What I need is more built-in USB ports, rather than having to carry a hub and wall wart in my bag.

  10. BGatez

    f*** a bunch of Apple, with their ridiculous overcharges for upgrades, welded-on RAM and storage, and killing of the right to repair

    1. Wibble

      You're obviously not their target market then.

      Whatever you think of the price, they're exceedingly good bits of kit that prioritise usability and reliability, with tight integration between the hardware and software stack. Why do you think Microsoft try to copy Apple kit?

      1. Anonymous Coward
        Anonymous Coward

        Which Apple kit did Microsoft copy?

  11. anonymous boring coward Silver badge

    "Will it provide fanbois with a dramatically improved experience, or will it be more of the same, leaving the sole point of differentiation the TouchBar? Finger nail biting time for some"

    It seems the Register writers are always bitter about not being able to afford Macs. Not a good look.
