'Bigger is better' is back for hardware – without any obvious benefits

When I first saw an image of the 'wafer-scale engine' from AI hardware startup Cerebras, my mind rejected it. The company's current product is about the size of an iPad and uses 2.6 trillion transistors that contribute to over 850,000 cores. I felt it was impossible. Not a chip – not even a bunch of chiplets – just a monstrous …

  1. tiggity Silver badge

    whatever personal computer performance we have

    Some crappy pointless JavaScript overkill website will bring it grinding to a halt.

    1. Binraider Silver badge

      Re: whatever personal computer performance we have

      So much this!

  2. fch

    You're speaking from my soul there!

    Somehow, it feels as if the increases in the "power at your fingertips" we've seen over the last few decades have largely been used to ... make the eye candy a little sweeter still. Advances in usability? Word'365 or whatever it's called today is not so far ahead of what Winword'95 could do; user interfaces haven't become more obvious or snappier either (but thanks to the huge compute power increases, also not slower, even though every mouse pointer move in your Electron-Framework-App pushes around 100 REST API requests through a number of grpc-to-json-to-yaml-to-binary-to-toml data conversions. Oh, forgot about that old Java component in there that's using a REST-to-XMLRPC bridge thing. Anyway ...).

    Mobile apps have seen advances; the interface on current Android or iOS devices beats '00s-era S60 or Windows Mobile any day, for sure. Desktop apps have stagnated, though, and for a long time. Still no voice dictation. Still no talking to my computer. It's not even taking advantage of my two widescreen displays ... still puts taskbar, menu bars, toolbars, ribbon bars ... vertically on top of each other, further shrinking the space that already shrank from the screens getting wider ... so I can now see four top halves of A4 pages next to each other, alright. Thanks Microsoft, but please train the monkey brigade doing your UI reviews a bit more in practical tasks.

    Still, much of the time it feels like modern computing is like modern cars - only available as a huge SUV or an ultra-huge-hyperluxurious-SuperSUV, with fuel efficiency quite a bit worse than an early-1980s Corsa, but oh yeah, it'll have air conditioning inside the tires as well as an exhaust pipe camera livestreaming the glow of the catalytic converter to the cloud.

    1. John Robson Silver badge

      Spin one of those screens through 90 degrees, and then you'll have a decent document viewer.

      Soon MS will spot the Adobe-style "detachable menu" and decide that's a great idea; then three years later they'll stop doing it, then rinse and repeat.

      They just changed the UI for track changes such that even when my wife had figured out *how* to turn it off and on, and was pointing me at the change that indicated it... it still took me ~30 seconds to see what the indicator was. And I could only tell by repeatedly turning it off/on and watching for the change.

      It used to be a little slider button that was green when it was on (right hand side), and grey when it was off (left hand side). Now the icon *background* changes between two subtly different greyscales...

    2. jmch Silver badge

      *Still no voice dictation. Still no talking to my computer*

      One of my greatest bugbears. For all the local power available, Google, Apple and Amazon* insist on transmitting audio to their servers, where it is not only processed but also harvested and stored**. Most other dictation/voice-command apps use a third-party cloud provider.

      Yet surely there is enough local power, even on a mobile, to do the processing locally. And if it's 92% accurate instead of 97%, I can live with that.

      *and other culprits

      **in whatever form they can get away with

      1. DS999 Silver badge

        Siri's voice recognition is on device

        Not sure about the Mac, but presumably it is on the M1 Macs.

      2. doublelayer Silver badge

        "One of my greatest bugbears. For all the local power available, Google, apple and amazon* insist on transmitting audio to their servers, where it is not only processed but also harvested and stored**."

        I don't know about storage, but most of those places offer offline dictation and have for some time:

        Apple: On Mac OS, go to the dictation settings and select the offline option. Download each language file you are interested in. On iOS it's less clear, but they claim that if you select languages under Settings -> General -> Keyboard, the processing will be offline. It works when I set my phone to airplane mode.

        Google: On Android, go to Settings -> System -> Language and Input -> Google keyboard and select offline languages to download.

        Amazon: I don't know about their tablets, but for Alexa devices, you're out of luck.

        1. jmch Silver badge

          Thanks for the feedback.

          Re Android, though: it's possible to work offline, and it will work in aeroplane mode, but if there is a connection it will do it online. Not sure if there is a setting to make it permanently offline even when the phone is online.

      3. Davegoody

        Apple don’t, at least not on iThings

        iPhones and iPads, for the last couple of generations no longer require an internet connection to parse most Siri requests. Means that it’s a hell of a lot more responsive.

      4. Anonymous Coward
        Anonymous Coward

        *Still no voice dictation. Still no talking to my computer*

        I work in an open-plan office. The thought of having to listen to half a dozen colleagues bellowing their relentless barrage of pointless emails into a microphone all day makes my blood run cold.

      5. plrndl

        Still No Voice Dictation

        Most of the people who are smart enough to communicate with a computer directly through language are already doing so. They're called computer programmers, and they have to be specially educated to do so. Most humans don't have a sufficient grasp of the syntax of their native language to communicate clearly with one another, let alone with a strictly logical computer.

        Maybe with another 20 years of development of quantum computing and AI, we will have a computer capable of understanding typical human babble.

    3. Anonymous Coward
      Anonymous Coward

      modern cars

      And can we stop with the 10"+ iPad-like displays in the center console? I don't want a TV in my vehicle; it is supposed to be a simple radio and maybe a few extras like navigation and finding me the nearest fuel, food and/or toilet.

    4. doublelayer Silver badge

      "Still no voice dictation. Still no talking to my computer."

      I'm confused by this. We have dictation. In my experience, it works rather well when using it to type, though like everything else you have to check it for the mistakes it will eventually make. Of course, I know some people whom the software seems to hate and frequently misunderstands, but there's a reasonable chance you're not one of them. We have had dictation software for some time now. If you meant conversational dictation where the computer talks back, we don't really have that: the computer doesn't understand and construct responses, but it can listen and write down what you said just fine.

      1. Robert Carnegie Silver badge

        Windows speech recognition has been "standard" since Windows XP Tablet Edition, which wasn't quite standard, and has been built in from Windows Vista onwards. You could also get it with Microsoft Office. Windows XP Tablets weren't especially powerful, though.

        You do have to find it, activate it, and train it, and avoid being steered into using the cloud-processing version if you prefer not to. The first part is "key Win+U then select Speech" as of Windows 10.

  3. Nugry Horace
    Gimp

    Wafer-scale devices? That was going to be Sir Clive Sinclair's Next Big Thing in the 1980s. He'd even got Rick Dickinson to design a casing for his range of wafer-scale SSDs.

    1. Contrex

      Ah yes Anamartic. Wasn't good old Ivor Catt involved in some way?

      1. Roland6 Silver badge

        Catt certainly held a number of key patents back then.

        I remember talking to him about the cooling problems, particularly when you link wafers together to form a transputer-style grid computer. I forget the numbers, just that we managed to cram something ridiculous into the space of an upright piano; the only problem was getting rid of circa 8 kW of heat.

        The other area of concern was getting data on and off the wafer.

        But then Catt was more focused on small-footprint CPUs and memory, like the Transputer, not a large single CPU and large memory (40 GB+) on a single wafer.

    2. ChipsforBreakfast

      Somewhere, if I dredge the depths of the storage boxes in my attic I still have a working Sinclair QL, complete with the original user manual. I suspect the microdrive cartridges are long dead though - they never were the most reliable of things.

      I must dig it out sometime and see if it'll still power on....

    3. Robert Carnegie Silver badge

      I thought it rang a bell. Including "each small device on the wafer self tests at power up and disables itself if faulty".

  4. confused and dazed

    death spiral

    Absolutely agree - the use case for the recent Mac Studio Ultra is pretty much niche.

    Wafer fabs need to keep shrinking and ship large quantities of wafers to stay viable. Hardware vendors need to trump the other hardware vendors for sales. Users don't need the extra compute. The result must be that things last longer. I'm writing this on a four-year-old machine and feel no need to change it. Once every house is saturated and there is no new app, then there will be a culling.

    1. Flocke Kroes Silver badge

      Re: death spiral

      I thought my 15-year-old laptop had finally reached end of life, but I found I could still get another replacement keyboard for it, so it should be good for five more years.

    2. YetAnotherXyzzy

      Re: death spiral

      Agreed. I only last month replaced my 16 year old desktop, and that only because the 4 GB RAM maximum just wasn't cutting it for a box that needs to visit mainstream websites in a mainstream browser.

      So I replaced it with... a 10-year-old refurbished desktop, and immediately maxed out the RAM to 16 GB. DDR3 isn't going to get any cheaper or easier to find, you know. That ought to be good for a few years.

    3. Robert Carnegie Silver badge

      Re: death spiral

      We do still have a worldwide shortage of microprocessors...?

  5. Mike 137 Silver badge

    "Raw capacity has never been the point of computing"

    We have to remember that technological 'innovation' is driven by vendor interest, not customer demand. Raw capacity (i.e. ever bigger numbers) leads to constant churn, which is what has kept the industry rolling for so long. If everyone's kit was felt to be adequate for its functional lifetime, the revenue stream would dry up. I still have a perfectly adequate and reliable system with an Athlon single-core processor which has been driving a SCSI professional graphics scanner since 2005, but obviously that represents lots of lost opportunities for vendors to take money over the intervening 17 years.

    1. nintendoeats

      Re: "Raw capacity has never been the point of computing"

      I hope that it's JUST driving the scanner... I use a much newer (but still ancient) i5-2400S to connect to work, and it struggles with many day-to-day tasks.

      It also must be said, energy efficiency has improved a lot since then.

  6. Charlie Clark Silver badge

    More false premises and equivalences… Pesce never fails

    my next MacBook Pro will be bigger and heavier than my 2015 model

    Based on what evidence? The notebook's weight is now largely dependent upon the screen and the battery. Apple has kept the weight of the MBP constant since it ditched the DVD drive, and the ARM-based ones use the extra space for more battery: few users complain about better battery life. Not that I'm about to rush out and buy one, but I do appreciate what they've done.

    As for larger wafers/dies, this is simple physics: communication on the die is faster than to anything connected. Apple has done this with memory on the M1 and shouts the numbers at anyone who'll listen: memory shared by CPU and GPU makes some operations a lot faster.

  7. Jonathon Green

    Hardware running ahead of software is hardly new (sometimes the pendulum swings the other way and we’re waiting for hardware capable of putting The New Shiny in software onto the desktop, sometimes it’s the other way round).

    Give it a year or two and somebody out there will find a use for it. Of course, it's possible we'll then all wish they hadn't, but once again "plus ça change, plus c'est la même chose" and all that…

  8. Howard Sway Silver badge

    These monsters need to be tamed, trained, and put to work

    Isn't the real problem that PC (and processor) manufacturers have just been trapped in a cycle of having to constantly stay at the high end of the latest and greatest chips, because that's what people have been taught to buy to run up-to-date software?

    How many office workers need a 4 GHz processor in order to run the software they use? Almost none, but that's what you'll have to get if you buy a new one. Why aren't 2 GHz machines still available, and at a much lower price point?

    If you want a new car, you don't have to buy a Ferrari; you can buy a modestly performing one for much less money to meet your more modest needs. Surely this should be the case for PCs too.

    1. Snapper

      Re: These monsters need to be tamed, trained, and put to work

      The CPU is but one part of a desktop or laptop computer. Change it and you only change the cost of that one part, ergo you would very quickly hit the point where that computer is not profitable to sell.

      The only way round that is to reduce the quality of the remaining parts, and looking at some of the fragile examples out there now I don't think we have too far to go before they only last three years and ........oh!

    2. Flocke Kroes Silver badge

      Re: Smaller and cheaper

      Think of your soon-to-be ex-PC supplier and blow a loud Raspberry while counting backwards from Pi.

    3. This post has been deleted by its author

  9. Anonymous Coward
    Anonymous Coward

    Wasn't this a Sinclair idea in the 80s?

    Whole wafers that blew the connectors linking the working dies on first power-up?

  10. rafff

    Celebrate Cerebras

    To the best of my knowledge Cerebras have a special compiler to schedule work on the beast. And it really is a beast; I have been in the Presence.

  11. Anonymous Coward
    Anonymous Coward

    I disagree completely. I'm a programmer. There are never enough compute cycles, cores, and IOs for doing modern web development, where you have to run multiple servers/services and multiple nodes, sometimes running as a local test implementation of a fault-tolerant network.

    But for the workloads I'm talking about, it is the core count that matters, or at least the number of full hardware threads, not raw CPU speed.

    The one limitation that continues to haunt processing is the plethora of single-threaded utilities and tasks that run on a standard system, severely limiting throughput. Take a kernel update, for example: instead of spawning multiple kernel link cycles in parallel, Linux does them one at a time in sequence, even on a 12-core box!
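
    For illustration, a minimal Python sketch of that constraint, where `burn()` is hypothetical busy-work standing in for any CPU-bound step (not the commenter's actual workload): the same twelve independent jobs finish far sooner once something bothers to spread them across cores.

```python
# Sketch: independent CPU-bound jobs run serially vs. across all cores.
# burn() is hypothetical busy-work, a stand-in for e.g. one link step.
import time
from multiprocessing import Pool, cpu_count

def burn(n: int) -> int:
    total = 0
    for i in range(n):
        total += i * i
    return total

if __name__ == "__main__":
    jobs = [2_000_000] * 12           # twelve independent tasks

    t0 = time.perf_counter()
    serial = [burn(j) for j in jobs]  # one at a time, one core busy
    t1 = time.perf_counter()

    with Pool(processes=cpu_count()) as pool:  # one worker per core
        parallel = pool.map(burn, jobs)
    t2 = time.perf_counter()

    assert serial == parallel
    print(f"serial:   {t1 - t0:.2f}s")
    print(f"parallel: {t2 - t1:.2f}s on {cpu_count()} cores")
```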

  12. TimMaher Silver badge
    Alien

    SETI

    Now, finally, I can get something that will run Boinc and Boinc Manager, without grinding to a halt as the GPU fan powers up to eleven.

  13. mdmaus

    Horses for courses…

    Please don't write articles that suggest everyone shares the same requirements. There are plenty of professional users who can use all the computing power they can get from these machines (and more). I'm working in such an environment myself and have been frustrated for years by the achingly slow development from Intel. Apple has lit a fire under them, and not a moment too soon.

  14. Boris the Cockroach Silver badge

    Remember

    the old saying

    that 90% of users use 10% of the software functions...

    The only reason this windows 10 PC leaps into life faster than my oldest win95 pc is because of the SSD bolted to the motherboard.

    If it loaded from the HDD, it would take as long.... 25 years of 'progress'? 25 years of bloat, more like.

    And let's face it... most workers could get by on Office 2003/Win XP with no problem and no disruption.

  15. DS999 Silver badge

    Not all hardware is bigger

    The bulk of PC sales are things like Celeron and i3. The bulk of Mac sales are ordinary M1, not the Pro, Max or Ultra. Just because something is available for those who need it, or think they need it, doesn't mean the whole market is turning in that direction.

    1. Anonymous Coward
      Anonymous Coward

      Re: Not all hardware is bigger

      At this point in time, that is just not true anymore. The bulk of sales are midrange systems that the sales droids tell them will play games - usually an i5 or AMD equivalent with a paltry 8-16 GB of RAM and a 250 GB SSD.

      The bulk of business sales is low-end machines for the masses of staff; that is where things get skewed, because those same businesses are responsible for the bulk of the high end purchases as well. Those aren't really "Windows boxes" in the sense you or I think of them, though - they're just glorified modern green screens for the servers that the staff use. If the staff weren't so insistent on it being "usable", most of their work could still be done on dumb terminals.

      1. DS999 Silver badge

        Re: Not all hardware is bigger

        The majority of consumers don't play games on their PC, so sales droids using that approach won't influence them.

    2. ayay

      Re: Not all hardware is bigger

      I can't even find anything lower end than an i5 or Ryzen 5 pretty much anywhere. Even from OEMs.

  16. DenTheMan

    Softie !

    At least with physical hardware there is a good actual return.

    Windows 3.11 would run fine on 4 MB of memory, with 16 MB feeling superb.

    And the install would deplete your hard drive by a massive 15 MB.

    I feel that on the software side we have had minimal return since Windows 3.11, the primary reason for future Microsoft bloat being to replenish sales of both hardware and software. And what have we inherited from this shifty behaviour?

    Yes, 1000s and 1000s of new avenues for malware.

    1. juice

      Re: Softie !

      > I feel that on the software side we have had minimal return since Windows 3.11, the primary reason for future Microsoft bloat being to replenish sales of both hardware and software

      Linux Mint requires 20 GB of hard drive space.

      Mac OS requires at least 35 GB of space.

      Even the Pi's Raspbian distro takes up 4.7 GB of space.

      Yes, there's arguably a lot of pointless bloat in modern OSes, and if you're brave and/or have lots of time, you can no doubt manually trim them down significantly.

      But it's not a purely "Microsoft" issue.

  17. porlF

    In bioscience, bigger is sometimes better ...

    THE GOOD

    I supported a biotech research computing environment for more than three decades, and in my experience there were occasions when the availability of bigger systems enabled new research. For example, the emergence of TiB-scale RAM systems (SGI UV) enabled the de novo sequencing of very large genomes (of plants), which up to then had been impractical. Available software was written with much smaller genomes and smaller hardware platforms in mind, and it was inefficient and wobbly, but it worked. Phew!

    Also, some researchers might not attempt ambitious computational tasks if they (mistakenly) believe it's not possible, when those of us in support groups can say "of course we can try, we just need to make the box bigger".

    THE NOT YET PERFECT

    Inefficient code delivered with cutting-edge lab equipment required unnecessarily large resources to run. Some years ago, I spec'd and installed a multi-cabinet HPC/HTC cluster solution for the initial processing of data from a cutting-edge high-throughput sequencing instrument ... only for the equipment vendor to later spend a bit of time optimising the code, which meant it would now run on a single quad-core desktop! This is the nature of cutting-edge laboratory equipment coupled with cutting-edge software to handle it. The cluster capacity was then swallowed up by the unexpected volume of work to analyse the avalanche of (complex) processed data.

    THE BAD

    A lot of open-source research-grade software is written by people who are trying to prove a research objective, and will choose language, frameworks and platform that will just get the job done. Once achieved, there is precious little time or resource available to make the workflow or the code run efficiently, or even to re-write. This means a lot of HPC/HTC cluster time is lost to inefficient software and workflows ... and researchers use each other's code a lot, so useful (if inefficient or wobbly) software rapidly spreads within a global research community.

    CONCLUSION

    If we were to ask science to progress more slowly and spend more time and money on software engineering, we could make much better use of existing hardware, but I end my comment with a defence of science: in many cases it would be wrong to do this. Sometimes the pace of change in science is so fast there is no time or funding available to optimise the many hundreds of apps used in research, much less to keep pace with advances in code libraries, hardware, storage, networking etc. I feel the benefits of rapid advances in biological science often (not always) far outweigh the apparent background wastage of CPU cycles and IOs. Bigger is better if we want science to move ahead ... especially so when we need it to move fast (remember COVID?).

    1. philstubbington

      Re: In bioscience, bigger is sometimes better ...

      I guess profilers are still available? When I was a programmer in the 80s and 90s, I used to find it quite illuminating to see just how much time was spent running particular bits of code.

      Seems like a lost art :(

      1. porlF

        Re: In bioscience, bigger is sometimes better ...

        @philstubbington .... I agree yes, profiling has become something of a lost art, or at least a less prevalent one in some aspects. We created tools to assist with profiling, but the barrier has always been the sheer weight of new and updated apps, rapid churn of favoured apps and the availability of time, money and researcher expertise to actually do this.

        Any software tool implemented in an HPC/HTC environment will perform differently at different institutions, on different architectures and, more importantly, in different data centre ecosystems where scheduling, storage and networking can vary considerably. Software tools are rarely used in isolation and there is normally a workflow comprising several different/disparate tools and data movements. Ideally then, tools and entire workflows need to be re-profiled, and this is not cost effective if a workflow is only going to be used for one or two projects.

        We had over 300 software items in our central catalogue, including local code but mostly created at other places, plus an unknown number sitting in user homedirs, and there was no realistic way to keep on top of all of them. There are one or two companies out there who have specialised in profiling bioinformatics codes, and this will work well for a lab that is creating a standardised workflow, e.g. for bulk production sequencing or similar over a significant time period, e.g. many months or years. We had a different problem, where a lot of the research was either cutting-edge or blue-skies discovery, so nearly all workflows were new and experimental, and then soon discarded as the science marched forwards.

        Bioinformatics codes are generally heavy on IO, so one of the quickest wins could be achieved by looking at the proximity of storage placement of input data, and the output location, and how to then minimise the number of steps required for it to be ingested by the next part of the workflow.
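
        For anyone wanting to make that first profiling pass cheap, Python's built-in cProfile is still there; a minimal sketch, where `analyse_reads()` is a purely hypothetical stand-in rather than any particular bioinformatics tool:

```python
# Sketch: a first-pass profile with Python's built-in cProfile/pstats.
import cProfile
import pstats

def analyse_reads(reads):
    # Hypothetical stand-in for a real analysis step.
    return sorted(len(r) for r in reads)

def main():
    reads = ["ACGT" * (i % 50 + 1) for i in range(100_000)]
    analyse_reads(reads)

if __name__ == "__main__":
    profiler = cProfile.Profile()
    profiler.enable()
    main()
    profiler.disable()
    # Show the ten most expensive call sites by cumulative time.
    pstats.Stats(profiler).sort_stats("cumulative").print_stats(10)
```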

  18. BebopWeBop
    Facepalm

    but my iPhone 13 is significantly chunkier than my iPhone X, and my next MacBook Pro will be bigger and heavier than my 2015 model.

    Maybe - but that is battery and/or screen!!!

  19. Joe Gurman

    Breathes fire?

    The maximum sustained power consumed by the Mac Studio with the M1 Ultra CPU is 370 W. Quite a bit more than the most efficient, full-sized laptops, but quite a bit less than a lot of full-sized desktops. Don't mistake efficient thermal design for wasteful power consumption (e.g. Xeons).

  20. steelpillow Silver badge
    Boffin

    Yield or die

    "the 17-year cicada strategy, applied to computing – availability in such overwhelming numbers that it simply doesn't matter if thirty per cent of capacity disappears into the belly of every bird and lizard within a hundred kilometers."

    That is known in the waferfab world as "yield". Usually we bin the ones that don't make it; with a new bleeding-edge product the proportion binned is often around 80% at launch, and it falls slowly as the fab beds in and we all learn to do our latest jobs properly. Waferscale hinges on the idea that it is more efficient to leave the duds in and wire round them. It is a meme that (like broadband over the power grid) recurs regularly but, as others have pointed out, has yet to prove itself deserving of Darwinian survival since the days of Ivor Catt and Clive Sinclair (better known as the 1980s).
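
    A rough Python sketch of that trade-off using the simple Poisson yield model (the defect density, die areas and tile count below are made-up illustrative figures, not any fab's data): a monolithic wafer-sized die almost never comes out clean, but carve it into tiles that can be disabled individually and most of the wafer survives.

```python
# Sketch: Poisson yield model with made-up numbers, for illustration only.
import math

def die_yield(defects_per_cm2: float, die_area_cm2: float) -> float:
    # Probability that a die has zero killer defects (Poisson model).
    return math.exp(-defects_per_cm2 * die_area_cm2)

D0 = 0.1           # assumed defect density, defects per cm^2
small_die = 1.0    # cm^2, a conventional chip
wafer_die = 460.0  # cm^2, roughly wafer-scale

print(f"conventional die yield:       {die_yield(D0, small_die):.1%}")
print(f"monolithic wafer-sized yield: {die_yield(D0, wafer_die):.2e}")

# The wafer-scale approach: treat the wafer as many small tiles and
# wire round the dead ones instead of binning the whole thing.
tiles = 400
good_tiles = tiles * die_yield(D0, wafer_die / tiles)
print(f"expected working tiles:       {good_tiles:.0f} of {tiles}")
```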

    P.S. Our hack is clearly no gamer. "A monstrous 1000mm2 die"? That's only just over 30 mm on a side. The idea that the CPU is a monster and will catch fire unless extreme cooling is employed is a fundamental axiom of said community.

  21. Anonymous Coward
    Anonymous Coward

    "Moar transistors!" only gets you so far with a real-world computer. Multithreading is hard to do well (and doesn't help with every use case), so if you have an application screaming away on one vCPU, it doesn't matter if you have a 10,000-core monster.

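    Amdahl's law puts a number on that; a minimal sketch, where the 90 per cent parallel fraction is purely an illustrative assumption:

```python
# Sketch: Amdahl's law - overall speedup is capped by the serial fraction.
def amdahl_speedup(parallel_fraction: float, cores: int) -> float:
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / cores)

# Assume 90% of the workload parallelises (illustrative figure only).
for cores in (1, 4, 12, 10_000):
    print(f"{cores:>6} cores -> {amdahl_speedup(0.9, cores):5.2f}x speedup")
```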