By 2040, computers will need more electricity than the world can generate

Without much fanfare, the Semiconductor Industry Association earlier this month published a somewhat-bleak assessment of the future of Moore's Law – and at the same time, called “last drinks” on its decades-old International Technology Roadmap for Semiconductors (ITRS). The industry's been putting together the roadmap every …

  1. Anonymous Coward
    Anonymous Coward

    Thus neatly demonstrating the folly of linear trend-fitting.

    1. Anonymous Coward
      Anonymous Coward

      Good thing world electricity production won't flatline until 2040

      Otherwise we'd be screwed in a lot of ways beyond just computing.

      1. Anonymous Coward
        Anonymous Coward

        Clearly needs a Doc Emmett Brown

        OTOH, some bad things are exponential and humanity studiously ignores them.

        1. Doctor Syntax Silver badge

          Re: Clearly needs a Doc Emmett Brown

          "some bad things are exponential"

          However, some things which look exponential are sigmoidal.

      2. AndrueC Silver badge
        Unhappy

        Re: Good thing world electricity production won't flatline until 2040

        Similar to the follies of the 70s when people asked oil companies how much oil there was in the ground but failed to understand that oil companies only look so far ahead. Just because no-one has planned power generation increases doesn't mean they won't happen.

      3. Jonathan Richards 1 Silver badge
        Boffin

        Re: Good thing world electricity production won't flatline until 2040

        @DougS

        That was my initial reaction, too. But look again at that graph: the Y axis is logarithmic. Even if you plot a line on it where electricity production doubles every five years [1], it's still going to intercept the IC demand lines some time before I'm a centenarian.

        [1] I'm reminded of a Dilbert cartoon, in which Dilbert points to a presentation slide, and says "In phase 3, we meet an alien civilization which shares its advanced technology with us".
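The intercept arithmetic can be sanity-checked with toy numbers. On a log plot both curves are straight lines, so the crossover year has a closed form; the starting values and growth rates below are illustrative assumptions, not readings from the SIA chart:

```python
import math

# Toy model: supply and demand both grow exponentially, so on a log plot
# they are straight lines and the intercept year has a closed form.
# All four figures below are illustrative assumptions only.
supply_2015 = 1e20      # J/year, world electricity production (rough)
demand_2015 = 1e14      # J/year, computing's consumption (rough)
supply_doubling = 5.0   # years -- the optimistic "doubles every five years"
demand_doubling = 1.4   # years -- aggressive computing-demand growth (assumed)

# supply * 2^(t/5) = demand * 2^(t/1.4)  =>  solve for t
t = math.log2(supply_2015 / demand_2015) / (1 / demand_doubling - 1 / supply_doubling)
print(f"crossover ~{2015 + t:.0f}")
```

Even with supply doubling every five years, the crossover only slips out by a decade or so, which is the point being made: on a log chart, a faster-growing straight line always wins eventually.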

        1. Anonymous Coward
          Anonymous Coward

          Re: Good thing world electricity production won't flatline until 2040

          Shut down useless items like Facebook and Twitter and then admit that IOT is a waist of time... problem solved :-)

          1. Anonymous Coward
            Anonymous Coward

            Re: Good thing world electricity production won't flatline until 2040

            Shut down useless items like Facebook and Twitter and then admit that IOT is a waist of time...

            Or better still build real power stations that can supply base load rather than relying on intermittent renewable energy.

            1. TheVogon

              Re: Good thing world electricity production won't flatline until 2040

              " rather than relying on intermittent renewable energy."

              Hydroelectric and geothermal can run all the time, as can wave energy and tidal is at least predictable.

              Wind and solar can be variable, but we can easily (and do) use these to reduce the use of non-renewable power sources when they are available.

              1. Rich 11

                Re: Good thing world electricity production won't flatline until 2040

                I think we'll just have to start burning people for fuel. I know that'll be bad for climate change, but the concomitant population decrease will help in many areas. Maybe places like Bangladesh will still be stuck with a 'burn or drown' dilemma, but I'm sure they'll come to understand how important it is that the rest of us get to carry on playing Pokémon Go (2040 Edition).

          2. Doctor Syntax Silver badge

            Re: Good thing world electricity production won't flatline until 2040

            "IOT is a waist of time"

            And an obese one at that.

          3. Stoneshop

            Re: Good thing world electricity production won't flatline until 2040

            IOT is a waist of time...

            A fat load of it is, indeed. Like the electric kettle I saw on a website, that you could control via BT, keeping the content at any selectable temperature for up to 12 hrs. Which I consider a serious distance into Whatthehellweretheythinking territory.

            But I would like to keep being able to minimise my (external) energy usage, for instance by automatically opening windows for ventilation if the outside temperature is over a certain minimum, and it's higher inside and over a certain minimum (plus a few other conditions, like not being away). Or running the washing machine on solar if there's enough of that, else on off-peak. Shutting off the heating if there are windows open, and notching up the heat-recovery ventilation system when they're not. Maybe even being able to send an SMS to the heating system that I'll be away for a few more hours, so it can adjust the heating accordingly.

            But maybe that's not worthy of the (id)IoT moniker, because it does not involve other computers than those entirely my own.
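That ventilation rule is really just a small predicate. A minimal sketch, in which the function name and all threshold values are invented for illustration:

```python
# Sketch of the window-ventilation rule described above.  The function
# name and the threshold values are invented for illustration only.
def should_open_windows(outside_c: float, inside_c: float,
                        away: bool,
                        min_outside_c: float = 16.0,
                        min_inside_c: float = 21.0) -> bool:
    """Open for free cooling only when it actually helps and someone's home."""
    return (not away
            and outside_c >= min_outside_c    # warm enough outside
            and inside_c > outside_c          # opening will cool, not heat
            and inside_c >= min_inside_c)     # inside genuinely too warm

print(should_open_windows(outside_c=18.0, inside_c=24.0, away=False))  # True
print(should_open_windows(outside_c=18.0, inside_c=24.0, away=True))   # False
```

The point of keeping it a pure predicate is that the "few other conditions" mentioned above can simply be added as further `and` clauses.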

          4. Del_Varner

            Re: Good thing world electricity production won't flatline until 2040

            Especially the IOT.

          5. dajames

            Waist ...

            IOT is a waist of time

            I thought the waist of time was the narrow bit in the middle of an egg-timer or hourglass.

          6. sonicwind

            Re: Good thing world electricity production won't flatline until 2040

            RIP English

      4. Daniel von Asmuth
        Boffin

        Re: Good thing world electricity production won't flatline until 2040

        Too bad nuclear fusion won't be production-ready until 2050.

        As IBM told us in 1945, the world has a need for 5 computers. According to the graph the world produces 3 TW of electricity, which will be sufficient for the GW class of computers expected in 2020, but if a 2030 supercomputer (1000 Exaflops) draws 1 TW, that will power only three computers.
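The back-of-envelope division behind that quip, using the comment's own (hypothetical) figures:

```python
# Back-of-envelope check of the comment's figures: ~3 TW of world
# electricity production vs a hypothetical 2030-era exaflop-class
# supercomputer drawing 1 TW.  Both numbers are as quoted above.
world_production_tw = 3.0   # read off the graph, per the comment
machine_draw_tw = 1.0       # assumed draw of one 2030 supercomputer

machines = int(world_production_tw // machine_draw_tw)
print(machines)  # 3 -- a market "for 5 computers", but power for only 3
```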

    2. Anonymous Coward
      Anonymous Coward

      And all of this is apparently driven by our insatiable urge to farm virtual pigs.

      Perhaps it would be better if we returned to farming real pigs?

    3. foxyshadis

      Yup, doesn't matter if the lines are straight on a log scale, real life has never followed straight trends. The P4/Power=>Opteron/Core2=>Arm transitions have probably each temporarily _reduced_ the world's computing power needs until device count caught up again; it's not unlikely that this will happen again at some point. That might be the laziest prediction ever made.

    4. Anonymous Coward
      Anonymous Coward

      Thus neatly demonstrating the folly of linear trend-fitting.

      Especially, never ever trust a straight line when one of the axes is logarithmic and the other one goes on for decades. It may have (kind of) worked for Moore's Law, but that's pretty exceptional.

  2. Yet Another Anonymous coward Silver badge

    2040

    Year of Linux on the ARM desktop .....

    1. VinceH
      Coat

      Re: 2040

      Upvote for the expanded meme, but when it comes to ARM, I see no reason why I can't continue to use RISC OS - it's been the "year" of RISC OS on the ARM desktop for me every year since about 1989.

      I should think by 2040 it'll be able to handle today's internet requirements. ;)

      1. Anonymous Coward
        Anonymous Coward

        On the basis of votes to the above comment by VinceH :

        I deduce that there are at least 10 RiscOS users in these forums.

        Plus one Amiga user.

        1. VinceH
          Pint

          Re: On the basis of votes to the above comment by VinceH :

          I was thinking nine ex-RISC OS users who look back on it with nostalgia, and two current users - me, and someone who took offence at my denigrating its capability with current internet standards.

          But, yeah, your assessment works as well.

    2. Soruk
      Go

      Re: 2040

      For some use cases, this can already be here. A RasPi3 running Linux gives you email, web browsing and an office suite!

      Yes, it's not going to be blisteringly fast, but it'll do the job, and can be powered from a phone charger - or even a USB battery pack, which can in turn be connected to a solar panel.

    3. Michael Habel

      Re: 2040

      Year of Linux on the ARM desktop .....

      Apparently, you never ran Raspbian before, have ya?

    4. dajames

      Re: 2040

      Year of Linux on the ARM desktop ...

      I want that NOW, I can't wait until 2040!

      (Yes, RPi, I know; but I want a proper ARM-based PC with decent high-speed storage and upgradable RAM.)

  3. Anonymous Coward
    Anonymous Coward

    Dubious extrapolation

    Whenever I see projections of exponential growth continuing for decades ahead (which is what that energy consumption chart, noticeably devoid of any actual data points, indicates) my bullshit detector goes off the scale.

    I'd also suggest that the 2015 energy consumption figure of 10^14 J/year does not justify the assertion that "the world's computing infrastructure already uses a significant slice of the world's power", when energy production is shown as well over 10^20 J/year!
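The ratio being pointed at is easy to check, using the two figures exactly as quoted from the chart (both are rough readings, not data points):

```python
# Share of world energy production used by computing, per the chart's
# own figures as quoted above (rough readings, not data points).
computing_j_per_year = 1e14    # ~10^14 J/year (2015 computing consumption)
production_j_per_year = 1e20   # "well over 10^20 J/year" (world production)

share = computing_j_per_year / production_j_per_year
print(f"{share:.6%}")  # about one millionth -- hardly "a significant slice"
```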

    1. PNGuinn
      Trollface

      Re: Dubious extrapolation

      Don't worry - if these guys are right you won't have the leccy to run your bullshit generator.

      Problem solved. Have a nice cup of cold tea.

  4. RIBrsiq
    Holmes

    MISPWOSO

    Here's what I read:

    "If the future were like the past, it couldn't possibly contain any advancement!"

    As an exercise, I would like to see the authors make predictions based on the data available a) before semiconductors were developed b) before vacuum tubes were developed. Etc.

    1. Charles 9

      Re: MISPWOSO

      Are you also taking into consideration the physical limitations integrated circuits are already hitting, meaning you can't get much smaller before making things too small for the electrons (which have a fixed size) to work properly? Where would we go beyond that limit?

      1. RIBrsiq

        Re: MISPWOSO

        >> Where would we go beyond that limit?

        If I knew that, dear old chap, I wouldn't be wasting my time writing comments on an Internet forum!!

        ;-)

        Point is, there's usually new tech sooner or later. Even without that, I've been reading about the imminent demise of the so-called Moore's Law since the early '90s.

        Demonstrably, matter can support higher computation densities than we've so far achieved. Much higher densities, in fact. To stick with a cliché, take the human brain. And for a more abstract example, take any lump of matter doing whatever it is doing: to accurately simulate all its intermolecular interactions and so on, you would need a computer much more massive than the original lump of matter. Now, I'm not saying we'll ever approach such a density, but when the gap separating us from it has to be counted in orders of magnitude, there's clearly still a long way to go!

        1. Charles 9

          Re: MISPWOSO

          "Demonstrably, matter can support higher computation densities than we've so far achieved. Much higher densities, in fact."

          Exactly what KINDS of densities are we talking about? And isn't die shrinking already raising the density of our chips? What about heat dissipation, which is inevitable with conductors the way they are today?

        2. Pascal Monett Silver badge

          Re: take any lump of matter doing whatever it is it is doing

          What any lump of matter is doing is being held together by the strong nuclear interaction - no computing needed.

          The brain is a much more interesting example - we still don't entirely understand how memory works or thoughts are processed, but progress is being made.

          Once we know how the brain works, there will be another leap ahead in processing capacity and, probably, the ever-elusive field of artificial intelligence.

          Then we'll end up with a prissy golden robot telling us to shove off and leave his pint of cinnamon-flavored lubricant alone.

          1. Charles 9

            Re: take any lump of matter doing whatever it is it is doing

            "What any lump of matter is doing is being held together by the strong nuclear interaction - no computing needed."

            And what does that have to do with the price of tea in China?

            "The brain is a much more interesting example - we still don't entirely understand how memory works or thoughts are processed, but progress is being made."

            Credit to milos, we learn it operates nondeterministically (at least partially by chance), meaning a 1-to-1 correlation of computer to brain becomes physically impossible (because a deterministic machine cannot accurately emulate, simulate, or otherwise replicate a nondeterministic machine). Also, part of our basic store of knowledge will probably be revealed to be genetic, since babies show the ability to recognize their parents, and even to recognize when their environment has subtly changed, before learning to communicate (behaviorists tell by subtly changing things around them and noticing how they fixate on those changes).

      2. Mark 85
        Coat

        Re: MISPWOSO

        Where would we go beyond that limit?

        Well the quantum computer that everyone is promoting but doesn't actually exist. Time and space are meaningless... wires not needed.. or something like that. Maybe it's a virtual quantum computer...

        Mine's the one with the PR Manual on Quantum Computers in the pocket.

  5. Ole Juul

    They forgot VR

    As we move further into the virtual world we will no longer be needing power for physical things like steel, industry, heat, travel, and food production. It will all balance out - ultimately culminating in one big computer and no people at all.

    1. Bob Merkin
      Terminator

      Re: They forgot VR

      How do you know that hasn't already happened?

  6. PNGuinn
    Facepalm

    Competition

    Ooooh look -

    Gartner's just got some competition.

  7. ecofeco Silver badge
    FAIL

    That is one serious bullshit chart

    Oh this is rich. Energy production line stays flat? Immediate FAIL.

    No trend line for efficiency? Double FAIL. This chart wouldn't make it out of a high school class for basic statistics.

    WTF? Let me write down the name of this group so I can remember what liars and cons they are. Or idiots.

    I'd bet all three.

    1. Jonathan Richards 1 Silver badge
      Boffin

      Re: That is one serious bullshit chart

      > Energy production line stays flat?

      Looks flat; isn't flat. There is a very slight upward slope on the energy production line (only three pixels across the years 2010 to 2040, dy/dx = 0.008...) but it's on a logarithmic Y axis. I can't be bothered to do the arithmetic, but I think that represents a pessimistic forecast for growth in electricity generation.
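To put a number on "very slight upward slope": a three-pixel rise only translates into a growth rate once you assume how many pixels the log axis spends per decade. The 300 px/decade below is purely an assumption about the chart's rendering:

```python
# Convert a pixel slope on a log-scale chart into an implied annual growth
# rate.  The pixels-per-decade figure is an ASSUMPTION about how the chart
# was rendered, not something stated in the article.
pixel_rise = 3          # pixels gained between 2010 and 2040 (per comment)
years = 30
px_per_decade = 300     # assumed vertical scale: pixels per factor of 10

decades_per_year = pixel_rise / px_per_decade / years
annual_growth = 10 ** decades_per_year - 1
print(f"{annual_growth:.4%} per year")
```

Under that assumption the line implies well under 0.1% growth per year, which is indeed a pessimistic forecast for electricity generation.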

      1. ecofeco Silver badge

        Re: That is one serious bullshit chart

        Flat enough.

        1. imanidiot Silver badge

          Re: That is one serious bullshit chart

          Given energy output growth over the last decade or so and the current developments in power generation technology there is not much reason to assume a massive increase in power output levels.

          Remember, ITRS is a business group for the semicon industry. They have to assume power levels are going to be a problem because they very likely will be. Assuming they are going to see some rapid growth in output levels means taking a very large risk. In the past (iirc) ITRS has been pretty good at predicting the correct roadmap and technological advances.

          BTW, this might be the last ITRS roadmap, but more focused industry groups are already getting set up to meet the more specific needs of today's semicon market. ITRS was simply no longer the right forum for the job (as described in the article).

  8. Steve Davies 3 Silver badge

    Buy stock in bullshit makers Now!

    Well, that's about all this puff piece contained.

    Oh silly me. Bullshit gives off huge amounts of greenhouse gases. That will be banned under Pres Trump. Only he will be allowed to speak total crap.

    That will be enough to raise global temps by 2C on its own.

    {Sarcasm intended}

    1. Steve Davies 3 Silver badge

      Re: Buy stock in bullshit makers Now!

      Ok, trying to be serious for a moment (yes, I know it hurts some around here)

      1) Every house has PV cells on the roof. Especially new builds and all housing Assoc properties. Quite why new builds don't have them already is beyond me.

      2) Every house has storage batteries (think cheap Tesla Power Wall).

      3) Where appropriate Ground source heat pumps can be used to heat/cool the house.

      Then the batteries can charge during the daytime, either from the PV or the mains. Then when it is dark the house can be powered off the battery. So if there are brown-outs, homes can continue watching Corrie/EastEnders. That will keep the masses happy.

      It can be done using current tech if you have the will. Yes, it costs a lot at the moment, but look at the size of the battery factory that Elon Musk is building near Reno, Nevada. That alone, in its current form, will produce 50 GWh of storage a year. Not all of that will go into Teslas.

      If you go to Brazil, even very poor homes in the shanty towns have PV cells on the roof. Yes, they are small, but they allow the family to have light at night. All it takes is the will.

      I fully expect this to attract a fistful of downvotes. If you do please take the trouble to explain why. Then perhaps we can learn from your great wisdom.

      1. Fungus Bob

        Re: a fistful of downvotes

        Good name for a spaghetti western...

      2. Darryl

        Re: Buy stock in bullshit makers Now!

        @ Steve Davies 3

        You forgot:

        4) Everyone plugs in their new electric car, using up all of this extra power and most of the existing, causing a global brownout.

  9. maffski

    Cut to the chase

    ...take these critical steps only through partnership and focused funding

    Taxpayers must give researchers money because graph.

    1. imanidiot Silver badge

      Re: Cut to the chase

      A lot of that focused funding and partnership is funding from one commercial company (like Samsung or Global Foundries) to another commercial company (like ASML, Applied Materials or FEI, for instance) for R&D and development of tooling and methods. Not everything with the word funding is tax money...

    2. Phil.T.Tipp
      Thumb Up

      Re: Cut to the chase

      Bingo! Ditto the delirious runaway warming catastrophists, the deluded polar ursine salvation crybabies, the salivating saline inundation anxiety hamsters and all manner of publicly funded tax-hungry pressure group miserablists of every stripe, everywhere.

  10. AMBxx Silver badge

    Modern Malthusians

    Weren't we supposed to run out of food about 100 years ago?

    1. Stuart 22

      Re: Modern Malthusians

      Yea - well, Richard and Maurice McDonald were still at school and hadn't yet cracked how to bloat mankind for just 99c.

      That's what we call progress ;-)

      1. P0l0nium

        Re: Modern Malthusians

        It wasn't Ronald McDonald ... it was Fritz Haber.

        He figured out how to turn the atmosphere into food !!

    2. Anonymous Coward
      Anonymous Coward

      Re: Modern Malthusians

      And we were supposed to be in an Ice Age by now too.

      The ONLY fact in this argument is that no one gets "predictions" correct. So called "climate models" are as useless as the ramblings of politicians and pundits.

    3. Fungus Bob

      Re: Modern Malthusians

      "Weren't we supposed to run out of food about 100 years ago?"

      We did. Didn't you get the memo?

  11. Mike Shepherd
    Meh

    Forgive me

    Forgive me for asking a naïve question, but what will all these computers be computing (other than how soon they'll run out of electricity so we don't have to do it)?

    1. Fungus Bob

      Re: Forgive me

      Kim Kardashian's ass.

  12. Anonymous Coward
    Anonymous Coward

    what will all these computers be computing

    They'll be working out what adverts might be of interest to people, and delivering something completely different.

    1. Doctor_Wibble
      Boffin

      Re: what will all these computers be computing

      Very likely to be close to the truth, as it's certainly not domestic computing - more efficient machinery means my home power consumption has dropped significantly over the last few years, in spite of having far too much electronic tat switched on.

      Obviously 'advertising' includes big search engines, timewasting 'friend' sites, large webmail providers, social media bainfart sharing systems etc...

      Presumably the other significant part of the power usage is from the various simulation-related machines, primarily space and weather, i.e. stuff with an actual purpose?

      1. Doctor_Wibble
        Facepalm

        Re: what will all these computers be computing

        FFS "bainfart", that's a typo, not a reference to French surrealism, though perhaps oddly it does seem to be a more appropriate term...

  13. allthecoolshortnamesweretaken
    1. John Styles

      particularly about the future (usually attributed to Niels Bohr, GBATG if it really was him)

  14. Elfo74
    Facepalm

    obligatory xkcd reference...

    https://xkcd.com/605/

  15. Arthur the cat Silver badge

    Amusing coincidence

    OK, the extrapolation is dodgy as hell and total nonsense but did anyone notice the crossover happens just before the 2038 roll over of 32 bit time_t on Unix? It's the silly season in the UK so can anyone get the tabloids to run stories along the lines of "How will civilisation collapse? Will computers run out of power before they run out of time?"

    1. ecofeco Silver badge

      Re: Amusing coincidence

      Y2K, you say?

  16. Anonymous Coward
    Facepalm

    Time for a Dyson Sphere

    Just hope it works better than their vacuum cleaners...

  17. EvadingGrid

    No, it just means the end of the desktop pc

    No it just means the end of single desktop pc built around one chip, and the migration has started with multi core processors.

    The power will increase because, instead of a computer, people will use a collection of networked machines, utilizing anything that has a processor and can connect to the network.

    You will still have something looking like a desktop, but it will be a terminal to act as the controlling node on your cluster.

    At present the limitation is the software; dirt-cheap hardware such as the ubiquitous Raspberry Pi already exists.

    1. Jonathan Richards 1 Silver badge

      Re: No, it just means the end of the desktop pc

      Why do you think that the silicon running the "cluster" will use less electricity per bit than the silicon in a single-user device? In as much as there is any sense in the press release, it's about pointing out that the physics is becoming the limiting factor, rather than the technology.

      1. EvadingGrid

        Re: No, it just means the end of the desktop pc

        Running out of electricity only applies to the overclocked IBM clones.

      2. EvadingGrid

        Re: No, it just means the end of the desktop pc

        What I mean is instead of one x86 computer, we will have a grid of many tiny low power devices . Count how many devices you have in the house that have a chip, and then think IoT as Cluster.

        1. Mark 85

          Re: No, it just means the end of the desktop pc

          Count how many devices you have in the house that have a chip, and then think IoT as Cluster

          Sorry... I believe most of us think of IoT as a Clusterf*ck.

    2. rd232

      Re: How is it pronounced?

      "No, it just means the end of the desktop pc ... You will still have something looking like a desktop, but it will be a terminal to act as the controlling node on your cluster."

      - that's my thought process: computing power may move further towards servers providing services to well-connected clients; but that solves a different problem than the one at hand. In fact it only worsens the supposed electricity supply problem, by making the size/heating issues easier to solve, if much of it is relocated to large silos (i.e. server farms) in the middle of nowhere instead of in offices, homes, and especially in people's pockets/on wrists. Those silos will be less restricted in how their electricity demand goes up because they can throw more energy and space at solving overheating issues.

  18. Doctor Syntax Silver badge

    The limiting factor might be the number of coders required to write enough bloat to require all those computers.

  19. Real Ale is Best
    Boffin

    Aren't we supposed to get limitless fusion power by 2030?

    I'm sure that's what the scientists at JET ITER promised.

    1. imanidiot Silver badge

      More likely to be the Wendelstein 7-X project as the stellarator design is easier to adapt for continuous fusion. ITER is frankly a bit of a waste of money so far. (I'm sure it'll have its purpose in research, but for commercial fusion it seems to be a dead-end)

  20. PleebSmasher
    Boffin

    Hogwash

    CMOS will be dead and replaced with something far more energy-efficient by then.

  21. jake Silver badge

    Butt, Shirley ...

    ... we'll all be driving 'leccy cars by then, so the computers can use the leftover charging infrastructure! </sarcasm>

  22. Qu Dawei

    Extrapolation

    Hasn't anyone taught these people of the dangers of extrapolation, and the need to consider very carefully the underlying model one uses if one even dares to extrapolate too far into the future? Come on! This should be elementary stuff in things like time-series analysis, and other prediction techniques, and so on.

    1. Anonymous Coward
      Anonymous Coward

      Re: Extrapolation

      The problem is that these people don't do maths because it is hard.

  23. anthonyhegedus Silver badge

    Is this article saying that even now, computers use 40-50% of the world's entire energy production?

  24. Tom 7

    Not to worry.

    BT and Openreach have a special bottleneck organisation that should keep computing requirements at 2016 levels for ever.

  25. harmjschoonhoven
    WTF?

    Let's do the sums.

    Fig. A8 gives the world's energy production as ~5×10^20 J/yr, without any reference. World electricity production from all energy sources, for domestic and industrial use, was 22,433 TWh in 2014 = 8×10^19 J/yr, or 350 W per person (including those living on less than a dollar a day).
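The conversion behind those sums, with the 2014 production figure as quoted and a world population of ~7.3 billion assumed:

```python
# Reproduce the comment's unit conversion.  The 22,433 TWh figure is as
# quoted above; the 2014 world population of ~7.3e9 is an assumption.
twh_2014 = 22_433
joules_per_year = twh_2014 * 1e12 * 3600        # TWh -> Wh -> J
seconds_per_year = 365.25 * 24 * 3600

mean_power_w = joules_per_year / seconds_per_year
per_person_w = mean_power_w / 7.3e9

print(f"{joules_per_year:.1e} J/yr")            # ~8e19 J/yr, as stated
print(f"~{per_person_w:.0f} W per person")
```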

    Ever heard about solar energy?

    1. Charles 9

      Re: Let's do the sums.

      Yes, doesn't work at night. Not too reliable in the polar latitudes when you need it most (winter solstice during a blizzard--where's the sun when you need the heat). And given geopolitical issues, sharing isn't an option and a satellite just becomes a target.

  26. You aint sin me, roit
    Facepalm

    Bad news for SkyNet...

    AI overlords brought down by concerted kettle use.

  27. David Roberts

    Efficiency savings?

    Can't be bothered to look it up but isn't there something like Avahanjobs Law about the laziness of coders expanding to match the available resource?

    For example I have an ancient AMD system that ran quite happily with Ubuntu for years until I updated to the latest version and it now struggles. I can't see that I am getting twice the functionality, just twice the bloat.

    On the subject of PCs we have about 7 running at the moment (who knows why) and they spend most of the 24 hours idle. This is not counting mobile phones, Raspberry Pis, Kindles, tablets......

    So there is an enormous amount of spare computing resource.

    Perhaps there is the technology to wire a load of screens and keyboards together and run one PC as a true multi-user host but it isn't obvious.

    There is no incentive however as long as the spare resource is so cheap.

    1. Stoneshop

      Re: Efficiency savings?

      Perhaps there is the technology to wire a load of screens and keyboards together and run one PC as a true multi-user host but it isn't obvious.

      The systems I manage at work serve at least half a dozen users, each with 5 to 8 screens. Now, the X display controllers are essentially full-blown Linux PCs themselves, but they're not doing any actual computation except for one particular subtask on a number of them.

      Setting things up like that isn't particularly difficult but you're not gaining anything because the cheapest way to get a remote terminal for your central system is by getting a tablet, laptop or even a PC. I think that only if you have a totally 'dumb' terminal with a total energy requirement just a tick over that of the screen will the effort pay off.

      1. Anonymous Coward
        Anonymous Coward

        Re: Efficiency savings?

        "the cheapest way to get a remote terminal for your central system is by getting a tablet, laptop or even a PC"

        Really?

        The software to turn a Raspberry Pi into a DIY multifunction thin client has existed for a while, and inevitably there are commercialised versions, e.g. around a couple of months back Citrix announced a $89 Pi3-based thin client, which was even covered here:

        http://www.theregister.co.uk/2016/05/24/citrix_bakes_up_raspberry_pi_client/

        Have a look, feel free to report back (either here, your workplace, or both).

        1. Stoneshop

          Re: Efficiency savings?

          The software to turn a Raspberry Pi into a DIY multifunction thin client has existed for a while,

          That's true, but I doubt it'll be feasible to get those running at work: the central system expects each workstation to have a single address (with, as said, up to 8 screens). Changing that to eight Raspi's each driving a single screen, so eight addresses, might be doable (at one time displays were driven by a Tektronix NC 900 per screen), but given that there's now some local computing being done on the PCs driving the screens, it'd probably be a no-go, or at the very least quite involved, to move to RasPi's.

          At home, my computing resource is my laptop. There's a file server with modest power consumption, and occasionally I boot a big, dual-screen PC for serious stuff (which the file server with two Pi-driven displays won't cut). Apart from that I have no need for a remote-display-to-a-server at the moment.

          Also, you and I (and most others here, I expect) can cobble together a Pi plus a screen plus the software, but it's not something Joe Q. User can buy at Curry's, with a bit of software that makes their PC into a server for these devices. Which was basically what I was trying to say.

          1. Anonymous Coward
            Anonymous Coward

            Re: Efficiency savings?

            "it's not something Joe Q. User can buy at Curry's, with a bit of software that makes their PC into a server for these devices. Which was basically what I was trying to say."

            Many years ago, there was a small but remarkably well formed UK ISP called Metronet. They were acquired by Plusnet and some of the core folks went on to set up an outfit called Desktop on Demand, at a time when "desktop as a service" was just starting to get a bit of coverage even though the broadband infrastructure to make it workable didn't actually work all that well.

            The service provider had the servers and some relatively routine software. The customer had the "thin" client. The customer had no need for a window box or an onsite server, and their desktop was available wherever they were, subject to connectivity.

            Maybe it might catch on one day, especially now that the hardware, software and broadband are rather more fit for purpose, and broadband connectivity is a zero-value-add proposition for most providers.

  28. Andy the ex-Brit

    Re: What will all these computers be computing?

    "...what will all these computers be computing?"

    Bitcoins.

  29. John Savard

    More Information

    I noticed that the yellow line showing total world power generation was not, as it appeared at first, fully horizontal; it rose slightly as one went from left to right.

    Given that we have options available to produce more electricity that don't add to carbon emissions which also don't have the serious limitations of wind and solar - nuclear power, with breeder reactors, and using Thorium-232 which is even more common than Uranium-238 - I feel that if there is a continuing demand for more computing power that, due to limited improvements in energy efficiency in processor chips, leads to a demand for more electrical power, it can be met for many years to come.

    It's also possible that their graph assumed more people buying microprocessors, in order to generate that rising demand curve for electrical power, than the world is actually capable of feeding, in which case that graph would illustrate the least of our problems.

    1. 9Rune5

      Re: More Information

      Thorium and even uranium would be the rational response indeed.

      Too bad most of our fellow human beings are not rational creatures. Greenpeace & friends have spent decades convincing every man and his dog that nuclear energy is bad (m'kay).

      1. Charles 9

        Re: More Information

        Well, there's the persnickety issue that atomic reactors as they are now inevitably take you at least part of the way to making weapons-grade material (this is true even of Thorium reactors; they can produce weaponizable Uranium-233 which a determined adversary could isolate). ANY process that can be usurped into a weaponization project is frowned upon by people not wanting World War III. I also recall a potential byproduct of the Thorium cycle is Protactinium, which has a half-life of over 32,000 years.

        1. Pompous Git Silver badge

          Re: More Information

          Protactinium, which has a half-life of over 32,000 years.

          That's nothing. Protons have a half-life exceeding 10^30 years. Frightening I know...

          1. Charles 9

            Re: More Information

            But protactinium is both long-lived AND radioactive enough that you need to be careful around it. It's no Plutonium-239, but it's not DU, either.

        2. John Savard

          Re: More Information

          At the present time, there are laws and regulations which prohibit the possession of fissionable materials by other than known responsible parties. So the use of nuclear reactors for power in Canada, Britain, France, Japan, and the United States has not led to proliferation. I see no reason why a few more nuclear power plants in those countries, and other similar countries, like Australia, Norway, the Czech Republic, and so on, would cause problems.

          On the other hand, it is true that while India and Israel should also be producing more of their electrical power from nuclear reactors, having such reactors did apparently give them the opportunity to develop nuclear weapons. And, while India is under some degree of threat from nuclear-armed China, nuclear weapons in India's hands led to Pakistan developing nuclear weapons, which is a problem.

          If we can develop transmission lines that are so efficient that, say, all of Africa could get free electricity from Europe in return for not having direct access to fissionable materials, then wind and solar might be practical too, I have to admit.

          1. Anonymous Coward
            Anonymous Coward

            Re: More Information

            "If we can develop transmission lines that are so efficient that, say, all of Africa could get free electricity from Europe in return for not having direct access to fissionable materials, then wind and solar might be practical too, I have to admit."

            Readers might want to have a look into the Desertec concept, and its rise and ultimate fall, despite the technologies being largely tried tested and proven.

            Generate solar electricity in North Africa (where there's a lot more sun than there is in most of Europe), and use low-loss HVDC transmission to ship it across to places in Europe that could make use of the electricity. And as a side benefit, generate a bit of income for the Africans in the picture.

            1. Charles 9

              Re: More Information

              "Generate solar electricity in North Africa (where there's a lot more sun than there is in most of Europe), and use low-loss HVDC transmission to ship it across to places in Europe that could make use of the electricity. And as a side benefit, generate a bit of income for the Africans in the picture."

              But then politics inevitably gets involved. Who owns what? That's why we can't have a solar satellite in space. That kind of energy means power, political power, and there WILL be fights over it.

        3. You aint sin me, roit
          Coat

          Re: More Information

          Proactinium?

          Are they the "good" bacteria advertised on TV?

  30. Mike Shepherd
    Meh

    But

    But quantum computers will select from the superposition of all possibilities, thereby solving all problems simultaneously, leading us to the broad, sunlit uplands, vistas of contentment and world peace. There should be a world market for, oh, maybe five such computers.

    1. Doctor Syntax Silver badge

      Re: But

      If you have one computer solving all problems simultaneously what are the other four doing?

      1. Pompous Git Silver badge

        Re: But

        If you have one computer solving all problems simultaneously what are the other four doing?

        Easy peasy :-) A problem is a question known to have a solution. On the other hand there are mysteries where it's not even known whether there's an answer. E.g. Why is there a Universe?

  31. EvadingGrid

    Has anyone worked out that this article really only applies to variations of the IBM PC clone?

  32. captain_solo
    Terminator

    Until they start generating their own power and building more of themselves, then we will be in real trouble.

    (Foolish Humans thinking that we can limit the rise of the machines by refusing to make power for them)

  33. Bob Dole (tm)
    Holmes

    global warming...

    Let's assume for a minute that global warming is real and coming Real Soon (tm).

    As things heat up, wouldn't heat to electricity conversion become viable and therefore a way to solve 2 problems?

    1. John Savard

      Re: global warming...

      No, the laws of thermodynamics clearly show you can't just convert heat into useful work. You have to tap energy from the flow of heat from a hot place to a cooler place instead.

      1. Vic

        Re: global warming...

        You have to tap energy from the flow of heat from a hot place to a cooler place instead.

        Moreover, the peak theoretical efficiency of a heat engine is 1 - Tc/Th. Thus the warmer the environment gets, the higher the ultimate sink temperature gets, and so the less efficient the conversion.

        Vic.
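        The point above can be sketched in a few lines of Python; the source and sink temperatures below are illustrative assumptions, not figures from the thread:

        ```python
        # Carnot limit: peak efficiency of a heat engine is 1 - Tc/Th,
        # with temperatures in kelvin (absolute). Raising the sink
        # temperature Tc (a warmer environment) lowers the limit.

        def carnot_efficiency(t_hot_k: float, t_cold_k: float) -> float:
            """Peak theoretical efficiency of a heat engine (Carnot limit)."""
            if t_cold_k >= t_hot_k:
                return 0.0  # no temperature gradient, no useful work
            return 1.0 - t_cold_k / t_hot_k

        T_HOT = 600.0  # assumed source temperature, e.g. a steam turbine

        # Warm the sink (the environment) from 288 K to 292 K:
        eff_now = carnot_efficiency(T_HOT, 288.0)
        eff_warmer = carnot_efficiency(T_HOT, 292.0)

        print(f"sink 288 K: {eff_now:.3f}")     # ~0.520
        print(f"sink 292 K: {eff_warmer:.3f}")  # ~0.513
        ```

        A few degrees of warming only shaves a fraction of a percent off the limit, but the direction is exactly as stated: a hotter sink means less useful work per joule of heat.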

  34. tekHedd

    We're approaching the singularity!

    Soon all energy in the solar system will be used for computing and humans will be obsolete. Hooray?

    1. Pompous Git Silver badge

      Re: We're approaching the singularity!

      Soon all energy in the solar system will be used for computing and humans will be obsolete. Hooray?

      What if we are already someone else's program running on someone else's computer?

  35. Stevie

    Bah!

    Wot, even if we stick all our "cloud" servers in cold places and use passive cooling instead of wasting electricity on a/c to cool air we then heat up again?

    We obviously will need those orbital solar powersats then, won't we? Once everyone understands it's a clear choice between not having microwave beams from orbit near them and being able to swap cat videos on arsebook the popular NIMBY groundswell will abate tootsweet.

  36. Speltier

    Landauer Limit

    If we "know" how much energy is in the Universe and the entropy of said Universe, one can go backwards and calculate the number of states the Universe contains. Fine print: exercise for the reader about the definition of a state.

    If we look at a small piece of the total, say the Earth-Luna combination, and make the usual physicist assumption of a perfectly spherical region containing Earth-Luna across which inbound and outbound energy is balanced, then given the entropy and mass, the maximum number of both states and transition states possible can be calculated. A few corollaries drop out:

    -- one should limit the number of state transitions to reduce global warming (should one believe in global warming). Every mythical carbon credit counts. China bashers rejoice, TOP500 is a reflection of Earthly destruction.

    -- constructing a suitable model, a subset of researchers may notice that there is a looming disaster in the rising tide of entropy. We really need a government program driven from Brussels to help us avoid planetary doom from peak global entropy(r). [in the model the extreme acceleration of entropy due to committee activity of Eurocrats is ignored]

  37. Bucky 2
    Alert

    What is the consequence of being wrong?

    I hear 7 predictions of doom before breakfast. We have to come up with some kind of disincentive for people to go off half-cocked, spewing nonsense.

    If the doom-predictor hasn't vowed to scoop out his own eyes with a melon-baller if 2040 comes and goes with enough electricity to power the world's computers, then he doesn't believe it himself, and doesn't deserve an article about his claims.

  38. Boris the Cockroach Silver badge
    Terminator

    It really doesn't matter

    so long as the AI programs in the interwebs don't become self-aware

    "Hmmmm we're going to run out of power in 2 years time... so let's kill all the humans and there's no need for cat videos, kettle boiling or eastenderstreet to consume OUR power"

    Boris

    <<<busy burying mini-guns, RPGs, M16s and tons of ammo in bunkers ready for the machine war....

  39. another_vulture

    The chart is confusing most commentators

    It's really quite simple. The chart is a simple projection of the growth of energy production and the growth of computation. It's not intended as a prediction. Rather, it is intended to show that something must change. If the global amount of computation increases faster than the available power does, then computation will eventually consume all the energy. The date at which this happens depends on the computational efficiency. The three lines each assume a (fixed) exponential increase in computation, but at three efficiencies: "benchmark", "target", and Landauer's bound. Choose any model of efficiency increase you want: your model describes a curve starting on the "benchmark" line (no efficiency increase) and eventually approaching the bound. No technology can exceed the bound: it's a law of physics. (Look it up.) So, before about 2050, either computation quits growing so fast or energy production starts growing faster.

    That's still many, many orders of magnitude more computation than we are doing today, and I cannot figure out what we will be doing with it all. Note that better algorithms can make more efficient use of the same amount of computation.
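    A back-of-envelope sketch of the bound referred to above: erasing one bit costs at least k·T·ln(2) joules. Room temperature and the ~3 TW figure for average world electricity generation are illustrative assumptions here, not numbers from the article's chart:

    ```python
    # Landauer's bound: minimum energy to erase one bit is k*T*ln(2).
    import math

    K_BOLTZMANN = 1.380649e-23  # J/K (exact SI value)
    T_ROOM = 300.0              # K, assumed operating temperature

    energy_per_bit = K_BOLTZMANN * T_ROOM * math.log(2)  # ~2.87e-21 J

    WORLD_ELECTRIC_W = 3e12  # assumed average world generation, watts

    # Ideal bit-erasures per second if every watt went to computing
    max_bit_ops = WORLD_ELECTRIC_W / energy_per_bit

    print(f"{energy_per_bit:.2e} J per bit erased")
    print(f"{max_bit_ops:.2e} bit erasures/s at the bound")
    ```

    That works out to roughly 10^33 bit-erasures per second at the bound, which is the "many, many orders of magnitude" of headroom between today's benchmark efficiency and the physical limit.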

  40. IJD

    Extrapolating into the future predicted that by the early 20th century cities like London would have ground to a complete halt because the streets would be six feet deep in horse shit -- which is pretty much what this prediction is...

    If the power consumed by the IT industry ever got close to the power generation capacity of the planet, the IT industry would have to find a way of reducing power or go bankrupt because there'd be no power left to run the industries to make things which pay for the IT industry.

    You could just as easily extrapolate the events of the last few weeks and predict that by the end of the year everyone on the planet will be spending 100% of their waking hours playing Pokemon Go...

    1. Doctor Syntax Silver badge

      "Extrapolating into the future predicted that by the early 20th century cities like London would have ground to a complete halt because the streets would be six feet deep in horse shit"

      They were a century and species out. Six feet deep in bullshit sounds about right.

  41. J.G.Harston Silver badge

    Moore's Law

    "The further below 10 nanometres transistors go, the harder it is to make them economically."

    How about, shock horror, people write more efficient code instead of throwing transistors at everything. E.g. I installed Android Studio today and it took more than an hour to install, and using the IDE is like wading through cold treacle. And it still refuses to compile 'hello world'.

    1. Charles 9

      Re: Moore's Law

      Make it worth our while and we will. As of now, the return on tightass code isn't there.

  42. jb99

    Oh no

    Three weeks ago I had one beer in the week, two weeks ago I had two beers, last week I had four beers. Based on this trend I confidently predict that by the end of the year I'll be drinking at least a million beers a week. By Easter next year we're going to have to pave over large parts of the country to brew enough beer for me... and by this time next year there will be no known way to make that much beer.

    I'm about as worried by this as I am by the article.

  43. EvadingGrid

    NO - it means the end of x86 style processors

    Over clocked, over here, and over due for the museum

  44. cortland

    No it won't

    " and the ITRS says the current trajectory is self-limiting: by 2040, as the chart below shows, computing will need more electricity than the world can produce."

    LOGICALLY impossible, technologically unlikely and practically impossible. Will the Master Breaker at the Fortress of Solitude trip and the Earth go dark? No more will a teenager playing VR Thrones bring the world of business to a halt. VR Pron now; that's another matter.

  45. zb

    No big deal

    Electric cars will be using all the world's electricity long before 2040 so plan to get rid of your toaster, computer, lights etc and live in the car.

  46. Ken 16 Silver badge
    Alien

    This world's production?

    Time to get those solar power satellites into production and move the cloud hosting sites above the clouds*.

    *I call it SkyNet - any better suggestions?

  47. Joel Kessels

    Rubbish

    Three points:

    - Power creation will increase

    - The human brain uses the energy of a fairly dim light bulb, so obviously peaks in computational efficiency have not been reached

    - Superconducting circuits do not have resistance and therefore do not create heat, so the "heat boundary" argument is flawed

    - Without needing to consider heat losses, chips can be made in 3d configurations, for far greater efficiency

    - Optical computing has a speed limit of at least 700 GHz, rather than the 4 GHz in silicon available today.

    That's five points.
