"what the report calls a rebound effect"
Which will be nicely compounded by Google's decision to remove data saving.
Well done, Alphabet. Way to go to help save the planet.
Modern 5G network infrastructure is more power-efficient than prior generations, but the Centre for Research into Energy Demand Solutions (CREDS) says it isn't clear whether this will deliver a cut in overall energy consumption, or whether consumption may in fact rise. While 5G is sold as a technology that has the potential to …
Google's YouTube Autoplay toggle defaulting to 'ON' (whenever cookies are cleared) is a far bigger data hog and energy-wasting feature than the loss of Chrome Mobile's data saving feature, certainly overnight.
And for any (clueless, far-from-proactive) regulators out there reading: the simple change of forcing Google to default the Autoplay toggle to 'OFF' would make a massive difference, quite literally overnight, to 4G/5G and fixed (fibre) network energy consumption.
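A rough back-of-envelope of that overnight waste, with every figure assumed rather than measured (published network energy-intensity estimates in particular vary by an order of magnitude):

```python
# Rough, illustrative estimate of overnight Autoplay waste.
# All figures are assumptions, not measurements.

BITRATE_GB_PER_HOUR = 1.4   # assumed: a 720p YouTube stream
HOURS_ASLEEP = 8            # assumed: phone left playing overnight
KWH_PER_GB = 0.2            # assumed network energy intensity; published
                            # estimates vary by an order of magnitude

data_gb = BITRATE_GB_PER_HOUR * HOURS_ASLEEP
energy_kwh = data_gb * KWH_PER_GB

print(f"Overnight Autoplay: ~{data_gb:.1f} GB, ~{energy_kwh:.1f} kWh")
# -> Overnight Autoplay: ~11.2 GB, ~2.2 kWh
```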
And regarding the article: it's marketing spiel. There is no standard definition of what constitutes 5G, in terms of the RAN and the edge computing linking back to the core. And once data is sitting in a data centre somewhere, is that latent data still part of the 5G energy consumption calculation? If it's required at some future time to facilitate cell handover, then it clearly is.
The article is pure greenwash.
"device manufacturers take steps to increase user awareness of energy use"
Perhaps all devices should come labelled with the energy required for manufacture, like some things do for their energy use in operation. That way we, the users, can finally make informed decisions about buying products based on actual energy use, not manufacturers' marketing twaddle.
We should also stop using terminology like "90% more efficient than..." as if the previous system was already efficient and the new one is even better. "x% less energy intensive" would be better, to reinforce in the consumer's mind that energy is still being used (wasted?) and that simply being less energy intensive than the previous system, whilst a step in the right direction, isn't everything we need to do.
It is yet another sales pitch.
Consider the following: "Our system is 50% more efficient in use." What they didn't mention is that you now use three times as many resources.
In principle they are correct, but the real-world scenario is way off. This is what one calls a perfectly marketable and highly profitable device. The "efficiency" justifies a higher price, and the extra usage doubles up at the bottom line. That is what counts: how much money the pushers can make.
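The arithmetic is worth spelling out. A minimal sketch with made-up numbers, reading "50% more efficient" as 1.5x useful output per joule:

```python
# "50% more efficient" per unit, but 3x the units: total energy doubles.
# Illustrative numbers only.

old_energy_per_unit = 1.0
new_energy_per_unit = old_energy_per_unit / 1.5  # "50% more efficient"

old_units, new_units = 100, 300                  # usage triples

old_total = old_units * old_energy_per_unit      # 100.0
new_total = new_units * new_energy_per_unit      # 200.0

print(f"Old total: {old_total:.0f}, new total: {new_total:.0f}")
# The "efficient" system consumes twice the energy overall.
```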
I think the problem here revolves around understanding what a "unit of traffic" is and getting through the marketing-hype.
Thinking about my 10/100/1000 Ethernet adaptor, I suspect my adaptor uses the same amount of energy over time regardless of the data rate. If a "unit of traffic" is one 1518-byte frame, then it's obvious that in 1000Mbps mode it will send more frames (units of traffic) in a given time period for the same energy consumed, and thus "energy consumption per unit of traffic" will be lower. My adaptor still consumes the same amount of energy, but in 1000Mbps mode it uses it more efficiently.
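To make "per unit of traffic" concrete, a toy calculation assuming a constant 1 W adaptor draw (an assumption, not a measurement):

```python
# Energy per 1518-byte frame at different line rates, assuming the
# adaptor draws constant power regardless of rate (assumption).

ADAPTER_WATTS = 1.0          # assumed constant draw
FRAME_BITS = 1518 * 8        # one maximum-size Ethernet frame

for rate_mbps in (10, 100, 1000):
    frames_per_sec = (rate_mbps * 1e6) / FRAME_BITS  # at full line rate
    joules_per_frame = ADAPTER_WATTS / frames_per_sec
    print(f"{rate_mbps:>4} Mbps: {joules_per_frame*1e6:.2f} microjoules/frame")

# Same watts, 100x the frames: energy *per frame* falls 100-fold,
# even though the adaptor's total consumption is unchanged.
```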
Following through, from an installation point of view, the best solution is for the 5G kit not to require any power supply and/or cooling upgrade over an existing 4G installation, hopefully facilitating a simple equipment-swap upgrade.
The thing is, most of us are completely unaware of how much data we consume.
Networks have become so good and fast (generally speaking) that they are no longer a constraint for what we want to do.
With that:
- websites have become more content rich and more network dependent / dynamic
- videos have gone from 480 or 576-line resolution to HD or even UHD at 2160 lines (not that you need that on a relatively small smartphone screen; see the sketch after this list)
- images are often not optimised in documents (e.g. logos which make a standard Word document 2MB or more in size before you add any text)
- ...
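To put the video point in numbers, a rough sketch with assumed streaming bitrates (real services vary by codec and settings):

```python
# Rough per-hour data volumes at typical streaming bitrates.
# Bitrates are assumptions; real services vary by codec and settings.

BITRATES_MBPS = {
    "480p (SD)":   1.5,
    "1080p (HD)":  5.0,
    "2160p (UHD)": 16.0,
}

for label, mbps in BITRATES_MBPS.items():
    gb_per_hour = mbps * 3600 / 8 / 1000  # Mbit/s -> GB/h
    print(f"{label:>12}: ~{gb_per_hour:.1f} GB/hour")

# UHD moves roughly ten times the data of SD for the same hour of
# viewing -- on a phone screen, largely indistinguishable.
```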
Wouldn't it be nice if data consumption were considered in the building and running of applications and content, to ensure they are fit for purpose but not wasteful? Every byte processed, transmitted and consumed requires energy.
I am living that reality working with battery-powered LPWAN devices and networks - and this is constantly on my mind.
At a much larger scale it also applies to mobile/cellular, WiFi and wired communications.
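This is the kind of budget that makes it concrete. A minimal battery-life sketch, with every figure assumed for a LoRa-class radio:

```python
# Back-of-envelope battery-life budget for a battery-powered LPWAN node.
# Current/voltage/timing figures are assumptions for a LoRa-class radio.

BATTERY_MAH = 2400          # assumed: lithium primary cell
SLEEP_UA = 5                # assumed sleep current, microamps
TX_MA = 40                  # assumed transmit current, milliamps
TX_SECONDS = 1.0            # assumed airtime per uplink
MSGS_PER_DAY = 24           # one message per hour

# Average current in mA, duty-cycled over a day
tx_ma_avg = TX_MA * (TX_SECONDS * MSGS_PER_DAY) / 86400
avg_ma = SLEEP_UA / 1000 + tx_ma_avg

days = BATTERY_MAH / avg_ma / 24
print(f"Average draw ~{avg_ma*1000:.1f} uA -> ~{days/365:.1f} years")
# Halving payload/airtime nearly halves the transmit budget -- every
# byte really does count at this scale.
```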
One can keep taking things into account this way ad infinitum. We create faster networks because there's demand for them, and that demand is itself driven by requirements for efficiency. If I want a video call with 10 people and I live in an area typical of the UK, with only 500Kbit/s ADSL and a crap mobile signal, I'll *drive* to my colleague who has gigabit fibre, using more energy than the entire call. Every single week. Now, if I could get 5G and make a decent video call over a faster network...
The whole point is that yes, we could use less energy if we did less stuff. If we didn't create use cases for more bandwidth requirements, if we didn't stream so many movies, etc. But we are talking about technologies that REPLACE energy-guzzling alternatives like driving, flying etc. The much-heralded 3D-hologram-style immersive video calls will use incredible amounts of bandwidth but they save energy in the long run.
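To put rough numbers on the drive-versus-call trade-off: every figure below is an assumption for illustration, and network energy-intensity estimates in particular vary widely.

```python
# Weekly drive vs. video call: rough energy comparison.
# Every figure here is an assumption for illustration.

ROUND_TRIP_KM = 30          # assumed weekly drive to the colleague
CAR_KWH_PER_KM = 0.65       # assumed petrol car (~7 l/100 km)

CALL_HOURS = 1.0
CALL_MBPS = 3.0             # assumed 10-way video call stream
KWH_PER_GB = 0.3            # assumed network energy intensity

drive_kwh = ROUND_TRIP_KM * CAR_KWH_PER_KM
call_gb = CALL_MBPS * CALL_HOURS * 3600 / 8 / 1000
call_kwh = call_gb * KWH_PER_GB

print(f"Drive: ~{drive_kwh:.1f} kWh, call: ~{call_kwh:.2f} kWh "
      f"({drive_kwh/call_kwh:.0f}x)")
# Even with generous network-energy figures, the drive dwarfs the call.
```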
Ultimately, I think this report about 5G shows that the people doing the study are just looking for a problem. Energy savings in content creation can be looked at separately, but if we start complaining about 5G because it creates more energy usage further down the line (or up the line), then ultimately we'll become stagnant without any innovation.
Same here. The phone I'm writing this on is very, very quickly coming to its end of support on my carrier (end of March).
I was under the impression that, all things being equal, higher frequencies [the 5G bands] will always require more power across the electronics (higher switching losses, etc.). This is of course offset by the next generation of transistor lithography (smaller, using less power), but adding those two together doesn't yield a huge power saving, only a modest one. So, although I am not a telecommunications engineer and don't have access to the formulae, I don't see how they can claim large power savings in the telecommunications sector simply by switching transmission 'generations'.
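For what it's worth, that intuition matches the first-order CMOS switching-power formula, P = aCV^2f. A toy sketch with made-up node parameters, showing why a process shrink only partly offsets a higher switching frequency:

```python
# First-order CMOS dynamic power: P = a * C * V^2 * f.
# Node parameters below are illustrative, not real process data.

def dynamic_power(activity, cap_farads, volts, freq_hz):
    """Classic switching-power approximation."""
    return activity * cap_farads * volts**2 * freq_hz

old = dynamic_power(0.2, 1e-9, 1.0, 2e9)    # older node, 2 GHz
new = dynamic_power(0.2, 0.6e-9, 0.8, 5e9)  # shrunk node, 2.5x frequency

print(f"old: {old:.3f} W, new: {new:.3f} W ({new/old:.2f}x)")
# Lower C and V help, but the 2.5x frequency claws most of it back:
# a modest net change, not a dramatic saving.
```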
https://en.wikipedia.org/wiki/Jevons_paradox
It's entirely expected that reducing the price per bit of mobile communications will make the total spending on them go up.
This isn't necessarily a bad thing if they're displacing other activities that would have cost even more energy, as other people have mentioned.
The thing about Jevons paradox is that it applies when increased efficiency leads directly to an increase in use. But data use is going to increase regardless of any efficiency gains. In the timeframes considered in the article most people still won't even have access to 5G, and in any case by far the biggest contributor to data use in recent years has been the ubiquity of wifi, not improved mobile connections. Meanwhile, speculation that more efficient infrastructure will lead to mobile providers offering better deals to their customers seems hilariously naive.
So I'm not entirely convinced that there's anything especially bad happening here. It's not that improved efficiency will lead to an increase in use that will more than cancel out any benefit. But rather that improved efficiency will be beneficial given the huge increase in data use that's going to happen anyway.
Sure. It did; past tense. Adding a second, similarly sized pipe in exactly the same location isn't likely to have the same effect. The problem is that the article compares 5G to 4G in isolation and assumes the improvements will lead to people using more data. But people are already using 5G levels of data, just over wifi instead of mobile networks. Rolling out 5G might lead them to migrate some of their data use from wifi to mobile, but it's not a step up in capacity or efficiency, so there's no reason to expect a significant increase in data use as a result.
In the words of Pogo, "We have met the enemy and he is us."
Society, as a whole, wants more. People are not content with simple text-based websites; they want pictures, zoomable pictures, rotatable pictures, video, virtual reality, etc. They don't read books; they binge TV shows based on books. Their news comes from talking-head video recaps of the headlines. They download apps rather than seek out websites. They live on social media and look forward to the day when they can live in a virtual universe.
The more that 5G allows them to consume, the more they will and the same will be true of 6G and 7G.
For better or worse, it's the world we live in.
Can't help wondering how much energy saving an ad blocker might achieve?
I block a lot of advertising, and try to avoid auto-run videos, etc. But it is quite possible that the blocking makes the web usable - I might give up browsing if every ad did get through.
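A rough annual estimate, every figure assumed (the ad share of page weight varies hugely from site to site):

```python
# Rough annual energy saving from blocking ads, illustrative only.

PAGES_PER_DAY = 100
PAGE_MB = 2.5               # assumed average page weight
AD_FRACTION = 0.4           # assumed share of page weight that is ads
KWH_PER_GB = 0.3            # assumed network energy intensity

saved_gb_year = PAGES_PER_DAY * PAGE_MB * AD_FRACTION * 365 / 1000
saved_kwh = saved_gb_year * KWH_PER_GB

print(f"~{saved_gb_year:.0f} GB/year blocked, ~{saved_kwh:.0f} kWh saved")
# Roughly tens of GB and around a dozen kWh a year, per heavy browser.
```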
A bandwidth levy applied to online advertising might help? Though how you'd collect it is a bit of a problem, given the lack of borders.
The physics of using higher frequencies (the new bands added with 5G) doesn't stack up. Sure, you get a wider frequency band, and that's bandwidth right there, but hang on! We learn in high school that E=hf: energy is proportional to frequency. Plus, penetration gets worse and worse as you move up from GHz towards THz; these frequencies are scattered by airborne particulates and also resonate with the collective atomic motion of water and biomolecules, so you get less range and are obliged to pump the power up even more. Why won't people just ditch this stupid telecoms standard?
Isn't it the case that 5G *supports* higher frequencies... They aren't mandated, just available if there are capacity issues.
For instance, I understand that the high UHF band that used to be used for TV has been cleared for 5G. That would make it the lowest frequency any standard uses.