Thus neatly demonstrating the folly of linear trend-fitting.
By 2040, computers will need more electricity than the world can generate
Without much fanfare, the Semiconductor Industry Association earlier this month published a somewhat-bleak assessment of the future of Moore's Law – and at the same time, called “last drinks” on its decades-old International Technology Roadmap for Semiconductors (ITRS). The industry's been putting together the roadmap every …
COMMENTS
-
-
-
Monday 25th July 2016 09:06 GMT AndrueC
Re: Good thing world electricity production won't flatline until 2040
Similar to the follies of the 70s when people asked oil companies how much oil there was in the ground but failed to understand that oil companies only look so far ahead. Just because no-one has planned power generation increases doesn't mean they won't happen.
-
Monday 25th July 2016 09:08 GMT Jonathan Richards 1
Re: Good thing world electricity production won't flatline until 2040
@DougS
That was my initial reaction, too. But look again at that graph: the Y axis is logarithmic. Even if you plot a line on it where electricity production doubles every five years [1], it's still going to intercept the IC demand lines some time before I'm a centenarian.
[1] I'm reminded of a Dilbert cartoon, in which Dilbert points to a presentation slide, and says "In phase 3, we meet an alien civilization which shares its advanced technology with us".
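A rough sketch of that intercept arithmetic (every starting value and growth rate below is an illustrative guess, not a figure read off the SIA chart):

import math

# Illustrative assumptions only -- not numbers taken from the ITRS/SIA chart.
supply_2016 = 5e20           # world energy production, J/yr (assumed)
demand_2016 = 1e18           # energy used by computing, J/yr (assumed)
supply_doubling_years = 5    # the optimistic "doubles every five years" case
demand_tenfold_years = 10    # computing demand assumed to grow tenfold per decade

# Solve supply_2016 * 2**(t/5) == demand_2016 * 10**(t/10) for t (years after 2016).
a = math.log10(2) / supply_doubling_years
b = 1.0 / demand_tenfold_years
t = (math.log10(supply_2016) - math.log10(demand_2016)) / (b - a)
print("lines cross around", round(2016 + t))    # ~2084 with these made-up inputs

With these made-up inputs the lines still cross later this century; pick different numbers and the date moves, but as long as demand compounds faster than supply, the crossover is only a matter of when.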
-
-
-
Monday 25th July 2016 14:14 GMT TheVogon
Re: Good thing world electricity production won't flatline until 2040
" rather than relying on intermittent renewable energy."
Hydroelectric and geothermal can run all the time, as can wave energy, and tidal is at least predictable.
Wind and solar can be variable, but we can easily (and do) use these to reduce the use of non-renewable power sources when they are available.
-
Monday 25th July 2016 15:16 GMT Rich 11
Re: Good thing world electricity production won't flatline until 2040
I think we'll just have to start burning people for fuel. I know that'll be bad for climate change, but the concomitant population decrease will help in many areas. Maybe places like Bangladesh will still be stuck with a 'burn or drown' dilemma, but I'm sure they'll come to understand how important it is that the rest of us get to carry on playing Pokémon Go (2040 Edition).
-
-
-
Monday 25th July 2016 17:28 GMT Stoneshop
Re: Good thing world electricity production won't flatline until 2040
IoT is a waste of time...
A fat load of it is, indeed. Like the electric kettle I saw on a website, that you could control via BT, keeping the content at any selectable temperature for up to 12 hrs. Which I consider a serious distance into Whatthehellweretheythinking territory.
But I would like to keep being able to minimise my (external) energy usage, for instance by automatically opening windows for ventilation if the outside temperature is over a certain minimum, and it's higher inside and over a certain minimum (plus a few other conditions, like not being away). Or running the washing machine on solar if there's enough of that, else on off-peak. Shutting off the heating if there are windows open, and notching up the heat-recovery ventilation system when they're not. Maybe even being able to send an SMS to the heating system that I'll be away for a few more hours, so it can adjust the heating accordingly.
But maybe that's not worthy of the (id)IoT moniker, because it does not involve other computers than those entirely my own.
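For what it's worth, the window-ventilation rule above is only a handful of comparisons; a minimal sketch, with the sensor names and thresholds invented purely as placeholders:

# Minimal sketch of the ventilation rule described above.
# All thresholds and argument names are invented placeholders.
MIN_OUTSIDE_C = 18.0     # don't ventilate with cold outside air
MIN_INSIDE_C = 22.0      # only ventilate when it's actually warm indoors

def should_open_windows(outside_c, inside_c, anyone_home, raining=False):
    return (anyone_home
            and not raining
            and outside_c > MIN_OUTSIDE_C
            and inside_c > MIN_INSIDE_C
            and inside_c > outside_c)

print(should_open_windows(outside_c=20.5, inside_c=25.0, anyone_home=True))   # True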
-
-
-
Monday 25th July 2016 17:50 GMT Daniel von Asmuth
Re: Good thing world electricity production won't flatline until 2040
Too bad nuclear fusion won't be production-ready until 2050.
As IBM told us in 1945, the world has a need for 5 computers. According to the graph the world produces 3 TW of electricity, which will be sufficient for the GW class of computers expected in 2020, but if a 2030 supercomputer (1000 Exaflops) draws 1 TW, that will power only three computers.
-
Monday 25th July 2016 19:01 GMT foxyshadis
Yup, doesn't matter if the straight lines are on a log scale, real life has never followed straight trends. The P4/Power => Opteron/Core2 => Arm transitions have probably each temporarily _reduced_ the world's computing power needs until device count caught up again; it's not unlikely that this will happen again at some point. That might be the laziest prediction ever made.
-
-
-
-
-
Monday 25th July 2016 19:20 GMT VinceH
Re: On the basis of votes to the above comment by VinceH :
I was thinking nine ex-RISC OS users who look back on it with nostalgia, and two current users - me, and someone who took offence at my denigrating its capability with current internet standards.
But, yeah, your assessment works as well.
-
-
-
Monday 25th July 2016 14:40 GMT Soruk
Re: 2040
For some use cases, this is already here. A RasPi3 running Linux gives you email, web browsing and an office suite!
Yes, it's not going to be blisteringly fast, but it'll do the job, and can be powered from a phone charger - or even a USB battery pack, which can in turn be connected to a solar panel.
-
-
Monday 25th July 2016 02:17 GMT Anonymous Coward
Dubious extrapolation
Whenever I see projections of exponential growth continuing for decades ahead (which is what that energy consumption chart, noticeably devoid of any actual data points, indicates) my bullshit detector goes off the scale.
I'd also suggest that the 2015 energy consumption figure of 10^14 J/year does not justify the assertion that "the world's computing infrastructure already uses a significant slice of the world's power", when energy production is shown as well over 10^20 J/year!
-
-
-
Monday 25th July 2016 04:05 GMT RIBrsiq
Re: MISPWOSO
>> Where would we go beyond that limit?
If I knew that, dear old chap, I wouldn't be wasting my time writing comments on an Internet forum!!
;-)
Point is, there's usually new tech sooner or later. Even without that, I've been reading about the imminent demise of the so-called Moore's Law since the early '90s.
Demonstrably, matter can support higher computation densities than we've so far achieved. Much higher densities, in fact. To stick with a cliché, take the human brain. And for a more abstract example, take any lump of matter doing whatever it is it's doing: to accurately simulate all its intermolecular interactions and so on, you would need a computer much more massive than the original lump of matter. Now, I'm not saying we'll ever approach such a density, but when the gap separating us from it still has to be counted in orders of magnitude, there's clearly a long way to go!
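To put a very rough number on that gap (the joules-per-operation figure for current kit is a ballpark guess, and treating one operation as one bit erased is itself generous):

import math

k_B = 1.380649e-23                 # Boltzmann constant, J/K
T = 300.0                          # room temperature, K
landauer = k_B * T * math.log(2)   # ~2.9e-21 J per bit erased

# Ballpark guess for an efficient 2016-era machine: ~10 GFLOPS/W, i.e. ~1e-10 J per op.
current_per_op = 1e-10

print("Landauer bound: %.1e J per bit" % landauer)
print("headroom: roughly %.0f orders of magnitude" % math.log10(current_per_op / landauer))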
-
Monday 25th July 2016 08:48 GMT Charles 9
Re: MISPWOSO
"Demonstrably, matter can support higher computation densities than we've so far achieved. Much higher densities, in fact."
Exactly what KINDS of densities are we talking about? And isn't die shrinking already raising the density of our chips? What about heat dissipation, which is inevitable with conductors the way they are today?
-
Monday 25th July 2016 10:36 GMT Pascal Monett
Re: take any lump of matter doing whatever it is it is doing
What any lump of matter is doing is being held together by the strong nuclear interaction - no computing needed.
The brain is a much more interesting example - we still don't entirely understand how memory works or thoughts are processed, but progress is being made.
Once we know how the brain works, there will be another leap ahead in processing capacity and, probably, the ever-elusive field of artificial intelligence.
Then we'll end up with a prissy golden robot telling us to shove off and leave his pint of cinnamon-flavored lubricant alone.
-
Monday 25th July 2016 11:45 GMT Charles 9
Re: take any lump of matter doing whatever it is it is doing
"What any lump of matter is doing is being held together by the strong nuclear interaction - no computing needed."
And what does that have to do with the price of tea in China?
"The brain is a much more interesting example - we still don't entirely understand how memory works or thoughts are processed, but progress is being made."
Odds are we'll learn it operates nondeterministically (at least partially by chance), meaning a 1-to-1 correlation of computer to brain becomes physically impossible (because a deterministic machine cannot accurately emulate, simulate, or otherwise replicate a nondeterministic machine). Also, part of our basic store of knowledge will probably be revealed to be genetic, since babies show the ability to recognize their parents, and even to recognize when their environment has subtly changed, before learning to communicate (behaviorists can tell by subtly changing things around and watching how the babies fixate on those changes).
-
-
-
Monday 25th July 2016 05:45 GMT Mark 85
Re: MISPWOSO
Where would we go beyond that limit?
Well the quantum computer that everyone is promoting but doesn't actually exist. Time and space are meaningless... wires not needed.. or something like that. Maybe it's a virtual quantum computer...
Mine's the one with the PR Manual on Quantum Computers in the pocket.
-
-
-
Monday 25th July 2016 05:41 GMT ecofeco
That is one serious bullshit chart
Oh this is rich. Energy production line stays flat? Immediate FAIL.
No trend line for efficiency? Double FAIL. This chart wouldn't make it out of a high school class for basic statistics.
WTF? Let me write down the name of this group so I can remember what liars and cons they are. Or idiots.
I'd bet all three.
-
Monday 25th July 2016 09:32 GMT Jonathan Richards 1
Re: That is one serious bullshit chart
> Energy production line stays flat?
Looks flat; isn't flat. There is a very slight upward slope on the energy production line (only three pixels across the years 2010 to 2040, dy/dx = 0.008...) but it's on a logarithmic Y axis. I can't be bothered to do the arithmetic, but I think that represents a pessimistic forecast for growth in electricity generation.
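The arithmetic is quick enough if you guess the chart geometry (only the three-pixel rise comes from the comment above; the axis span and plot height are assumptions):

# Convert a three-pixel rise on a log-scale Y axis into an implied annual growth rate.
pixels_rise = 3
plot_height_px = 300        # assumed height of the plot area in pixels
axis_span_decades = 8       # assumed: the Y axis covers 8 orders of magnitude
years = 2040 - 2010

decades = pixels_rise / plot_height_px * axis_span_decades
growth = 10 ** (decades / years) - 1
print("implied growth: %.2f%% per year" % (growth * 100))   # well under 1% with these guesses

Well under one per cent a year with these guesses, which is indeed a pessimistic forecast next to recent real-world electricity growth.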
-
-
Monday 25th July 2016 20:18 GMT imanidiot
Re: That is one serious bullshit chart
Given energy output growth over the last decade or so and the current developments in power generation technology, there is not much reason to assume a massive increase in power output levels.
Remember, ITRS is a business group for the semicon industry. They have to assume power levels are going to be a problem because they very likely will be. Assuming they are going to see some rapid growth in output levels means taking a very large risk. In the past (iirc) ITRS has been pretty good at predicting the correct roadmap and technological advances.
BTW, this might be the last ITRS roadmap, but more focused industry groups are already being set up to meet the more specific needs of today's semicon market. ITRS was simply no longer the right forum for the job (as described in the article).
-
-
-
-
Monday 25th July 2016 05:52 GMT Steve Davies 3
Buy stock in bullshit makers Now!
Well, that's about all this puff piece contained.
Oh silly me. Bullshit gives off huge amounts of greenhouse gases. That will be banned under Pres Trump. Only he will be allowed to speak total crap.
That will be enough to raise global temps by 2C on its own.
{Sarcasm intended}
-
Monday 25th July 2016 17:32 GMT Steve Davies 3
Re: Buy stock in bullshit makers Now!
OK, trying to be serious for a moment (yes, I know it hurts some around here)
1) Every house has PV cells on the roof. Especially new builds and all housing Assoc properties. Quite why new builds don't have them already is beyond me.
2) Every house has storage batteries (think cheap Tesla Power Wall).
3) Where appropriate Ground source heat pumps can be used to heat/cool the house.
Then the batteries can charge during the daytime, either from the PV or the mains. Then when it is dark the house can be powered off the battery, so if there are brown-outs homes can continue watching Corrie/EastEnders. That will keep the masses happy.
It can be done using current tech if you have the will. Yes, it costs a lot at the moment, but look at the size of the battery factory that Elon Musk is building near Reno, Nevada. That alone in its current form will produce 50 GWh of storage a year. Not all of that will go into Teslas.
If you go to Brazil, even very poor homes in the shanty towns have PV cells on the roof. Yes, they are small, but they allow the family to have light at night. All it takes is the will.
I fully expect this to attract a fistful of downvotes. If you do please take the trouble to explain why. Then perhaps we can learn from your great wisdom.
-
-
-
Monday 25th July 2016 20:21 GMT imanidiot
Re: Cut to the chase
A lot of that focused funding and partnership is funding from one commercial company (like Samsung or Global Foundries) to another commercial company (like ASML, Applied Materials or FEI, for instance) for R&D and development of tooling and methods. Not everything with the word funding in it is tax money...
-
-
-
-
Monday 25th July 2016 10:09 GMT Doctor_Wibble
Re: what will all these computers be computing
Very likely to be close to the truth, as it's certainly not domestic computing - more efficient machinery means my home power consumption has dropped significantly over the last few years, in spite of having far too much electronic tat switched on.
Obviously 'advertising' includes big search engines, timewasting 'friend' sites, large webmail providers, social media brainfart sharing systems etc...
Presumably the other significant part of the power usage is from the various simulation-related machines, primarily space and weather, i.e. stuff with an actual purpose?
-
-
Monday 25th July 2016 08:22 GMT allthecoolshortnamesweretaken
Aww, c'mon guys - predictions are tricky.
-
Monday 25th July 2016 08:40 GMT Arthur the cat
Amusing coincidence
OK, the extrapolation is dodgy as hell and total nonsense, but did anyone notice the crossover happens just before the 2038 rollover of 32-bit time_t on Unix? It's the silly season in the UK, so can anyone get the tabloids to run stories along the lines of "How will civilisation collapse? Will computers run out of power before they run out of time?"
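For anyone who hasn't seen the 2038 party trick, the rollover moment drops straight out of a signed 32-bit time_t (a quick illustration, nothing to do with the article's chart):

from datetime import datetime, timezone

# A signed 32-bit time_t counts seconds from the Unix epoch and tops out at 2**31 - 1.
last_second = 2**31 - 1
print(datetime.fromtimestamp(last_second, tz=timezone.utc))
# 2038-01-19 03:14:07+00:00 -- one second later, a 32-bit time_t wraps negative.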
-
Monday 25th July 2016 08:55 GMT EvadingGrid
No, it just means the end of the desktop pc
No, it just means the end of the single desktop PC built around one chip, and the migration has started with multi-core processors.
The power will increase because instead of a single computer, people will use a collection of networked machines, utilizing anything that has a processor and can connect to the network.
You will still have something looking like a desktop, but it will be a terminal acting as the controlling node on your cluster.
At present the limitation is the software; dirt-cheap hardware such as the ubiquitous Raspberry Pi already exists.
-
Monday 25th July 2016 09:37 GMT Jonathan Richards 1
Re: No, it just means the end of the desktop pc
Why do you think that the silicon running the "cluster" will use less electricity per bit than the silicon in a single-user device? In as much as there is any sense in the press release, it's about pointing out that the physics is becoming the limiting factor, rather than the technology.
-
Monday 25th July 2016 12:05 GMT rd232
Re: How is it pronounced?
"No, it just means the end of the desktop pc ... You will still have something looking like a desktop, but it will be a terminal to act as the controlling node on your cluster."
- that's my thought process: computing power may move further towards servers providing services to well-connected clients; but that solves a different problem than the one at hand. In fact it only worsens the supposed electricity supply problem: relocating much of the computing to large silos (i.e. server farms) in the middle of nowhere, instead of offices, homes and especially people's pockets and wrists, makes the size and heating issues easier to solve, so those silos will be less restricted in how far their electricity demand can rise - they can throw more energy and space at the overheating problem.
-
-
Monday 25th July 2016 09:37 GMT Qu Dawei
Extrapolation
Hasn't anyone taught these people of the dangers of extrapolation, and the need to consider very carefully the underlying model one uses if one even dares to extrapolate too far into the future? Come on! This should be elementary stuff in things like time-series analysis, and other prediction techniques, and so on.
-
Monday 25th July 2016 11:44 GMT harmjschoonhoven
Let's do the sums.
Fig. A8 gives the world's energy production as ~5*10^20 J/yr without any reference. World electricity production from all energy sources for domestic and industrial use was 22,433 TWh in 2014 = 8*10^19 J/yr, or 350 W per person (including those living on less than a dollar a day).
Ever heard about solar energy?
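Those sums check out; here is the conversion spelled out (the population figure is my own round assumption for 2014):

# Reproduce the sums above: 2014 world electricity production in joules, then watts per head.
twh_2014 = 22433                        # TWh, as quoted above
joules_per_year = twh_2014 * 3.6e15     # 1 TWh = 3.6e15 J
population = 7.3e9                      # assumed world population for 2014
seconds_per_year = 365.25 * 24 * 3600

print("%.1e J/yr" % joules_per_year)                                            # ~8.1e19 J/yr
print("%.0f W per person" % (joules_per_year / seconds_per_year / population))  # ~350 W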
-
Monday 25th July 2016 13:13 GMT David Roberts
Efficiency savings?
Can't be bothered to look it up but isn't there something like Avahanjobs Law about the laziness of coders expanding to match the available resource?
For example I have an ancient AMD system that ran quite happily with Ubuntu for years until I updated to the latest version and it now struggles. I can't see that I am getting twice the functionality, just twice the bloat.
On the subject of PCs we have about 7 running at the moment (who knows why) and they spend most of the 24 hours idle. This is not counting mobile phones, Raspberry Pis, Kindles, tablets......
So there is an enormous amount of spare computing resource.
Perhaps there is the technology to wire a load of screens and keyboards together and run one PC as a true multi-user host but it isn't obvious.
There is no incentive however as long as the spare resource is so cheap.
-
Monday 25th July 2016 18:13 GMT Stoneshop
Re: Efficiency savings?
Perhaps there is the technology to wire a load of screens and keyboards together and run one PC as a true multi-user host but it isn't obvious.
The systems I manage at work serve at least half a dozen users, each with 5 to 8 screens. Now, the X display controllers are essentially full-blown Linux PCs themselves, but they're not doing any actual computation except for one particular subtask on a number of them.
Setting things up like that isn't particularly difficult but you're not gaining anything because the cheapest way to get a remote terminal for your central system is by getting a tablet, laptop or even a PC. I think that only if you have a totally 'dumb' terminal with a total energy requirement just a tick over that of the screen will the effort pay off.
-
Monday 25th July 2016 19:06 GMT Anonymous Coward
Re: Efficiency savings?
"the cheapest way to get a remote terminal for your central system is by getting a tablet, laptop or even a PC"
Really?
The software to turn a Raspberry Pi into a DIY multifunction thin client has existed for a while, and inevitably there are commercialised versions, e.g. around a couple of months back Citrix announced a $89 Pi3-based thin client, which was even covered here:
http://www.theregister.co.uk/2016/05/24/citrix_bakes_up_raspberry_pi_client/
Have a look, feel free to report back (either here, your workplace, or both).
-
Monday 25th July 2016 20:07 GMT Stoneshop
Re: Efficiency savings?
The software to turn a Raspberry Pi into a DIY multifunction thin client has existed for a while,
That's true, but I doubt it'll be feasible to get those running at work: the central system expects each workstation to have a single address (with, as said, up to 8 screens). Changing that to eight Raspi's each driving a single screen, so eight addresses, might be doable (at one time displays were driven by a Tektronix NC 900 per screen), but given that there's now some local computing being done on the PCs driving the screens, it'd probably be a no-go, or at the very least quite involved, to move to RasPi's.
At home, my computing resource is my laptop. There's a file server with modest power consumption, and occasionally I boot a big, dual-screen PC for serious stuff (which the file server with two Pi-driven displays won't cut). Apart from that I have no need for a remote-display-to-a-server at the moment.
Also, you and I (and most others here, I expect) can cobble together a Pi plus a screen plus the software, but it's not something Joe Q. User can buy at Curry's, with a bit of software that makes their PC into a server for these devices. Which was basically what I was trying to say.
-
Tuesday 26th July 2016 08:57 GMT Anonymous Coward
Re: Efficiency savings?
"it's not something Joe Q. User can buy at Curry's, with a bit of software that makes their PC into a server for these devices. Which was basically what I was trying to say."
Many years ago, there was a small but remarkably well formed UK ISP called Metronet. They were acquired by Plusnet and some of the core folks went on to set up an outfit called Desktop on Demand, at a time when "desktop as a service" was just starting to get a bit of coverage even though the broadband infrastructure to make it workable didn't actually work all that well.
The service provider had the servers and some relatively routine software. The customer had the "thin" client. The customer had no need for a Windows box or an onsite server, and their desktop was available wherever they were, subject to connectivity.
Maybe it might catch on one day, especially now the hardware and software and broadband is rather more fit for purpose, and that broadband connectivity is a zero-value-add proposition for most providers.
-
-
-
-
-
Monday 25th July 2016 14:24 GMT John Savard
More Information
I noticed that the yellow line showing total world power generation was not, as it appeared at first, fully horizontal; it rose slightly as one went from left to right.
Given that we have options for producing more electricity that don't add to carbon emissions and don't have the serious limitations of wind and solar - nuclear power, with breeder reactors, and using Thorium-232, which is even more common than Uranium-238 - I feel that if a continuing demand for more computing power, combined with limited improvements in the energy efficiency of processor chips, leads to a demand for more electrical power, it can be met for many years to come.
It's also possible that their graph assumed more people buying microprocessors, in order to generate that rising demand curve for electrical power, than the world is actually capable of feeding, in which case that graph would illustrate the least of our problems.
-
-
Monday 25th July 2016 19:30 GMT Charles 9
Re: More Information
Well, there's the persnickety issue that atomic reactors as they are now inevitably take you at least part of the way to making weapons-grade material (this is true even of Thorium reactors; they can produce weaponizable Uranium-233 which a determined adversary could isolate). ANY process that can be usurped into a weaponization project is frowned upon by people not wanting World War III. I also recall a potential byproduct of the Thorium cycle is Protactinium, which has a half-life of over 32,000 years.
-
Tuesday 26th July 2016 06:28 GMT John Savard
Re: More Information
At the present time, there are laws and regulations which prohibit the possession of fissionable materials by other than known responsible parties. So the use of nuclear reactors for power in Canada, Britain, France, Japan, and the United States has not led to proliferation. I see no reason why a few more nuclear power plants in those countries, and other similar countries, like Australia, Norway, the Czech Republic, and so on, would cause problems.
On the other hand, it is true that while India and Israel should also be producing more of their electrical power from nuclear reactors, having such reactors did apparently give them the opportunity to develop nuclear weapons. And, while India is under some degree of threat from nuclear-armed China, nuclear weapons in India's hands led to Pakistan developing nuclear weapons, which is a problem.
If we can develop transmission lines that are so efficient that, say, all of Africa could get free electricity from Europe in return for not having direct access to fissionable materials, then wind and solar might be practical too, I have to admit.
-
Tuesday 26th July 2016 08:45 GMT Anonymous Coward
Re: More Information
"If we can develop transmission lines that are so efficient that, say, all of Africa could get free electricity from Europe in return for not having direct access to fissionable materials, then wind and solar might be practical too, I have to admit."
Readers might want to have a look into the Desertec concept, and its rise and ultimate fall, despite the technologies being largely tried tested and proven.
Generate solar electricity in North Africa (where there's a lot more sun than there is in most of Europe), and use low-loss HVDC transmission to ship it across to places in Europe that could make use of the electricity. And as a side benefit, generate a bit of income for the Africans in the picture.
-
Wednesday 3rd August 2016 22:15 GMT Charles 9
Re: More Information
"Generate solar electricity in North Africa (where there's a lot more sun than there is in most of Europe), and use low-loss HVDC transmission to ship it across to places in Europe that could make use of the electricity. And as a side benefit, generate a bit of income for the Africans in the picture."
But then politics inevitably gets involved. Who owns what? That's why we can't have a solar satellite in space. That kind of energy means power, political power, and there WILL be fights over it.
-
-
-
-
-
-
-
Friday 29th July 2016 19:02 GMT Vic
Re: global warming...
You have to tap energy from the flow of heat from a hot place to a cooler place instead.
Moreover, the peak theoretical efficiency of a heat engine is 1 - Tc/Th. Thus the warmer the environment gets, the higher the ultimate sink temperature gets, and the less efficient the conversion.
Vic.
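A couple of lines make that concrete (the temperatures are picked arbitrarily, just to show the trend):

# Carnot limit 1 - Tc/Th for a fixed hot side and a slowly warming heat sink.
def carnot(t_hot_k, t_cold_k):
    return 1.0 - t_cold_k / t_hot_k

t_hot = 600.0                            # arbitrary hot-side temperature, K
for t_cold in (288.0, 290.0, 292.0):     # sink temperature creeping up by a few kelvin
    print("Tc = %.0f K -> ceiling %.1f%%" % (t_cold, carnot(t_hot, t_cold) * 100))
# Every extra degree on the sink shaves a little more off the maximum efficiency.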
-
-
-
Monday 25th July 2016 19:03 GMT Stevie
Bah!
Wot, even if we stick all our "cloud" servers in cold places and use passive cooling instead of wasting electricity on a/c to cool air we then heat up again?
We obviously will need those orbital solar powersats then, won't we? Once everyone understands it's a clear choice between not having microwave beams from orbit near them and being able to swap cat videos on arsebook the popular NIMBY groundswell will abate tootsweet.
-
Monday 25th July 2016 19:48 GMT Speltier
Landauer Limit
If we "know" how much energy is in the Universe and the entropy of said Universe, one can go backwards and calculate the number of states the Universe contains. Fine print: exercise for the reader about the definition of a state.
If we look at a small piece of the total, say, the Earth-Luna combination, and make the usual physicist assumption about a perfectly spherical region containing Earth-Luna across which inbound and outbound energy is balanced, then given the entropy and mass, the maximum number of both states and transition states possible can be calculated. A few corollaries drop out:
-- one should limit the number of state transitions to reduce global warming (should one believe in global warming). Every mythical carbon credit counts. China bashers rejoice, TOP500 is a reflection of Earthly destruction.
-- constructing a suitable model, a subset of researchers may notice that there is a looming disaster in the rising tide of entropy. We really need a government program driven from Brussels to help us avoid planetary doom from peak global entropy(r). [in the model, the extreme acceleration of entropy due to the committee activity of Eurocrats is ignored]
-
Monday 25th July 2016 19:57 GMT Bucky 2
What is the consequence of being wrong?
I hear 7 predictions of doom before breakfast. We have to come up with some kind of disincentive for people to go off half-cocked, spewing nonsense.
If the doom-predictor hasn't vowed to scoop out his own eyes with a melon-baller if 2040 comes and goes with enough electricity to power the world's computers, then he doesn't believe it himself, and doesn't deserve an article about his claims.
-
Monday 25th July 2016 20:18 GMT Boris the Cockroach
It really doesn't matter
so long as the AI programs in the interwebs don't become self-aware
"Hmmmm, we're going to run out of power in 2 years' time... so let's kill all the humans and there's no need for cat videos, kettle boiling or eastenderstreet to consume OUR power"
Boris
<<<busy burying mini-guns, RPGs, M16s and tons of ammo in bunkers ready for the machine war....
-
Monday 25th July 2016 21:50 GMT another_vulture
The chart is confusing most comentators
It's really quite simple. The chart is a simple projection of the growth of energy production and the growth of computation. It's not intended as a prediction. Rather, it is intended to show that something must change. If the global amount of computation increases faster than does the available power, then computation will eventually consume all the energy. The date at which this happens depends on the computational efficiency. The three lines each assume a (fixed) exponential increase in computation, but at three efficiencies: "benchmark", "target", and Landauer's bound. Choose any model of efficiency increase you want: your model describes a curve starting on the "benchmark" line (no efficiency increase) and eventually approaching the bound. No technology can exceed the bound: it's a law of physics. (Look it up.) So, before about 2050, either computation quits growing so fast or energy production starts growing faster.
That's still many, many orders of magnitude more computation than we are doing today, and I cannot figure out what we will be doing with it all. Note that better algorithms can make more efficient use of the same amount of computation.
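A toy version of that reading of the chart, with every number invented purely to show the shape of the argument:

import math

# Toy reading of the chart: flat supply, demand growing tenfold per decade,
# evaluated at three assumed efficiency gains over today's "benchmark".
# Every number here is invented; only the structure mirrors the chart.
supply = 5e20          # J/yr, held flat for simplicity
demand_2016 = 1e18     # J/yr at benchmark efficiency (assumed)

for label, gain in (("benchmark", 1), ("target", 1e3), ("near Landauer", 1e6)):
    # Solve (demand_2016 / gain) * 10**(t / 10) == supply for t, in years after 2016.
    t = 10 * math.log10(supply * gain / demand_2016)
    print("%-14s crossover ~%d" % (label, 2016 + round(t)))
# Better efficiency pushes the intercept out, but compounding demand always catches up.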
-
Monday 25th July 2016 21:53 GMT IJD
Extrapolating into the future once predicted that by the early 20th century cities like London would have ground to a complete halt because the streets would be six feet deep in horse shit -- which is pretty much what this prediction is...
If the power consumed by the IT industry ever got close to the power generation capacity of the planet, the IT industry would have to find a way of reducing power or go bankrupt because there'd be no power left to run the industries to make things which pay for the IT industry.
You could just as easily extrapolate the events of the last few weeks and predict that by the end of the year everyone on the planet will be spending 100% of their waking hours playing Pokemon Go...
-
Tuesday 26th July 2016 00:31 GMT J.G.Harston
Moore's Law
"The further below 10 nanometres transistors go, the harder it is to make them economically."
How about, shock horror, people write more efficient code instead of throwing transistors at everything? E.g., I installed Android Studio today and it took more than an hour to install, and using the IDE is like wading through cold treacle. And it still refuses to compile 'hello world'.
-
Tuesday 26th July 2016 07:28 GMT jb99
Oh no
Three weeks ago I had one beer in the week, two weeks ago I had two beers, last week I had four beers. Based on this trend, I confidently predict that by the end of the year I'm going to be drinking at least a million beers a week. By Easter next year we're going to have to pave over large parts of the country to be able to brew enough beer for me... By this time next year there is no known way to make that much beer.
I'm about as worried by this as I am by the article.
-
Tuesday 26th July 2016 16:58 GMT cortland
No it won't
" and the ITRS says the current trajectory is self-limiting: by 2040, as the chart below shows, computing will need more electricity than the world can produce."
LOGICALLY impossible, technologically unlikely and practically impossible. Will the Master Breaker at the Fortress of Solitude trip and the Earth go dark? No more will a teenager playing VR Thrones bring the world of business to a halt. VR Pron now; that's another matter.
-
Thursday 5th May 2022 13:43 GMT Joel Kessels
Rubbish
Three points:
- Power creation will increase
- The human brain uses the energy of a fairly dim light bulb, so obviously peaks in computational efficiency have not been reached
- Superconducting circuits do not have resistance and therefore do not create heat, so the "heat boundary" argument is flawed
- Without needing to consider heat losses, chips can be made in 3D configurations, for far greater efficiency
- Optical computing has a speed limit of at least 700 GHz, rather than the 4 GHz in silicon available today.
That's five points.