
French fry me right up
Ah! That'll explain why Paris will see 60F (15C) temperatures tomorrow ... hopefully they don't run this 50 million degrees machine right smack in the middle of summer now!
France's Commissariat à l'énergie atomique et aux énergies alternatives on Tuesday claimed it's topped China's recently established record for maintaining fusion plasma in a tokamak, and therefore taken another step towards building a fusion reactor capable of producing cheap energy. The Commissariat (French Alternative …
Hold - ENORMOUS PRESSURE ??? Hey, that is a near vacuum in there to make those H and ²H fly! Else they'd be constantly bumping into other atoms, never reach the speed, never reach the heat, and we could not control their direction with magnetic fields but with baseball bats strategically placed...
Article lost, not worth reading any more after those five words.
Peak plasma pressures in the tokamaks I know about are measured in millionths of an atmosphere. The peak pressure is near vacuum, and pressures get lower (ideally) as you get closer to the walls of the chamber, or else you leak too much heat. Because the temperatures are so high, the density of the plasma needed to produce such (low) pressures is very, very low (high temperature = high particle velocity, and the pressure is roughly the particle density times the temperature, p ≈ n·k·T).
I don't know what plasma pressure CEA West achieved, but ITER is expected to run at about 2.6 atmospheres of pressure in its plasma, while MIT's Alcator C-Mod achieved 2.05 atmospheres in 2016.
I have found that CEA West's divertors are cooled with water at 30 bar, but that should have little to do with plasma pressure unless there's a leak and a nascent Stackpole event. So, rampant speculation: since CEA West is meant to inform ITER's design and operation, its plasma pressure is in the 1 to 2 atmosphere range. (I have found a nice article on CEA West's vacuum pumping system when it was "Tore Supra," but the vacuum pressure between the plasma and the plasma-facing components when it was a graphite-lined reactor isn't quite answering the question.)
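For anyone who wants to sanity-check the "near vacuum but still atmospheres of pressure" point, here's a rough back-of-envelope using the ideal-gas relation p = n·k·T. The density and temperature figures are illustrative assumptions roughly in the range quoted for ITER-class machines, not CEA numbers.

```python
# Back-of-envelope: plasma pressure from p = n * k_B * T (ideal gas).
# Density and temperature below are illustrative assumptions for an
# ITER-class plasma, not published CEA WEST figures.

k_B = 1.380649e-23          # Boltzmann constant, J/K

n = 1.0e20                  # assumed particle density, particles per m^3
T = 150e6                   # assumed temperature, kelvin (~150 million K)

p_pascal = n * k_B * T      # pressure in pascals
p_atm = p_pascal / 101325   # convert to atmospheres

print(f"pressure ~ {p_pascal:.0f} Pa ~ {p_atm:.2f} atm")
# ~2 atm of pressure from a gas hundreds of thousands of times less dense
# than air: the enormous temperature does the work, not the particle count.
```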
How do you know we haven't?
Like the everlasting razor, the bagless vacuum cleaner and turkeys voting for Thanksgiving, Christmas and the Superbowl, how likely do you think it is that the companies that make a lot of money from us for electrickery are going to admit when someone does develop a source of almost limitless, almost completely free power? Have you not noticed that the more of our power is provided by "free" renewables, the higher our bills go?
Could it have anything to do with trying to make as much money as they can before the public realise what they are up to, and carpe the illegitimi?
Not sure about razors but I think lightbulbs were some of the first products shown to have built-in obsolescence.
Renewables are not free but have near zero marginal generation costs. You've still got to pay for making, installing and connecting them – that's where most of the cost related to them now occurs – and in many countries there are residual feed-in tariffs, which should be phased out as soon as possible. All in all, and in the right places, renewables are among the cheapest sources of electricity available, but storage is still a problem. And as long as we're still increasing generating capacity, we'll face higher bills. Decades ago, Amory Lovins of the Rocky Mountain Institute noticed this and coined the term "Negawatt" as a way to describe, and price, energy efficiency that could benefit both producers and consumers. Almost worthy of a Nobel prize, and tellingly missing from many projects and products, which is why cars have got bigger and heavier and we're still building power plants.
There's a Technology Connections video about light bulbs and 'built-in obsolescence'. In short, whilst there *was* a short-lived conspiracy among manufacturers in the early 20th century, it didn't last long, and obsolescence is an inbuilt feature of incandescent lamps. Basically it's a trade-off between cost, power usage, and longevity. Long-life incandescent bulbs exist but the increased cost is not worth their additional power consumption.
Where that falls down is
1) This shouldn't apply to LEDs. They should last for absolutely ages, and failures appear to be due to cheap electronics.
2) Light bulb manufacturers, or rather their marketing, don't help themselves. They advertise 'n years of usage' but then in incredibly small print this is defined as '2-3 hours a day', which I suppose might be realistic for a kitchen, but certainly isn't for many lounges or landings (quick arithmetic below).
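To put numbers on that small print (the 15,000-hour rating below is just an assumed example, not any particular manufacturer's spec):

```python
# How '2-3 hours a day' stretches a rated life into 'n years'.
# 15,000 rated hours is an assumed example figure, not a quoted spec.

rated_hours = 15_000

for hours_per_day in (2.5, 6, 12):   # kitchen-ish vs lounge/landing-ish usage
    years = rated_hours / hours_per_day / 365
    print(f"{hours_per_day:>4} h/day -> about {years:.0f} years")
# 2.5 h/day -> ~16 years; 6 h/day -> ~7 years; 12 h/day -> ~3 years.
```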
For razors you're getting 15-20 shaves for the fancy 3-5 blade versions (figures from Gillette - I'd say they're reasonably accurate, possibly slightly conservative) so there's not a huge amount of room to compete.
I think given the cost per usage of light bulbs, razors, or vacuum cleaner bags they're not the best targets for a conspiracy. The better target, given this is an IT news site, is subscription-based software. There is absolutely no reason why the old model of defined releases and per-version patches could not be retained; the companies simply want more money for less effort.
> 1) This shouldn't apply to LEDs. They should last for absolutely ages, and failures appear to be due to cheap electronics.
Go watch Big Clive as he disassembles LED bulbs, pops in a resistor and gets them running at lower currents, so they no longer get pointlessly hot.
Well, pointless unless you have some reason to want to have them fail early...
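For anyone curious what that resistor fiddling actually buys you, here's a rough sketch of the arithmetic, assuming the common style of cheap linear constant-current driver where the LED current is set by a sense resistor; all the component values are made-up examples, not from any particular teardown.

```python
# Rough arithmetic for down-rating an LED bulb's linear driver.
# Assumes the driver sets current as I = V_ref / R_sense; all component
# values are made-up examples, not from a specific Big Clive teardown.

V_ref = 0.6            # assumed current-sense reference voltage, volts
R_sense_stock = 6.0    # assumed stock sense resistor, ohms
R_sense_mod = 12.0     # modified value (bigger resistor -> lower current)

V_led_string = 72.0    # assumed total forward voltage of the LED string, volts

for label, r in (("stock", R_sense_stock), ("modified", R_sense_mod)):
    current = V_ref / r                 # amps through the LED string
    power = current * V_led_string      # watts dissipated in the LEDs
    print(f"{label:8s}: {current*1000:.0f} mA, ~{power:.1f} W in the LEDs")
# Halving the current roughly halves the heat; light output drops somewhat
# less than linearly, and the LEDs run far cooler, so they last much longer.
```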
"Renewables are not free but have near zero marginal generation costs. "
There can be nothing in terms of fuel costs, but there are often substantial running and wear costs. Doing maintenance on a very large wind turbine is not a cheap affair; blades degrade, etc. Solar panels degrade over time, so some long division will give a cost per day over the lifetime of the panel (rough sketch below), and the loss in generation over time factors in as well. Not keeping the panels clean also has a cost. Some initial investments do last a substantial amount of time: pylon mounts for both ground-mounted solar and turbines can last significantly longer than the turbines and panels, so it's a matter of not needing to do that again if later upgrades are built to mount in the same place with the same hardware. An issue that might crop up is a government coming in and reassessing the sites in terms of valuation for property taxes and aiming for the moon, or committees might insist on painting pylons a new color every couple of years to keep up with the latest thinking in aesthetics.
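The long division is easy enough to sketch; the panel price, lifetime, yield and degradation rate below are assumed round numbers for illustration, not real quotes.

```python
# Rough cost-per-day of a solar panel over its life, with output degradation.
# Price, lifetime, yield and degradation rate are assumed round numbers.

panel_cost = 150.0          # assumed installed cost per panel
lifetime_years = 25         # assumed service life
degradation = 0.005         # assumed 0.5% output loss per year
first_year_kwh = 400.0      # assumed first-year output per panel, kWh

cost_per_day = panel_cost / (lifetime_years * 365)

total_kwh = sum(first_year_kwh * (1 - degradation) ** y
                for y in range(lifetime_years))
cost_per_kwh = panel_cost / total_kwh

print(f"hardware cost: {cost_per_day:.3f} per day, "
      f"{cost_per_kwh:.3f} per kWh over {lifetime_years} years")
# Maintenance, cleaning and inverter replacement would add on top of this.
```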
I see wind and solar electricity generation as a very good use of land that's otherwise not very useful for other things. What's missing is the technology that the power plugs into that can roll with the intermittency either by how it's used or priced in real time.
Mentioning solar panel degradation reminds me of Europe's oldest grid-connected solar installation, in Switzerland, set up in 1982. All of the modules are still operational; degradation was between 0.2% and 0.7% per year. They used different cell types and materials to test degradation. The oldest installation in Germany, at Universität Oldenburg, is one year younger, and smaller.
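Compounding those rates over the four decades since 1982 gives a feel for what "still operational" means (pure arithmetic, nothing assumed beyond the quoted 0.2-0.7% per year):

```python
# Remaining output after decades of compounding degradation,
# using the 0.2%-0.7% per-year range quoted for the 1982 Swiss installation.

years = 2024 - 1982   # ~42 years of service

for rate in (0.002, 0.007):
    remaining = (1 - rate) ** years
    print(f"{rate*100:.1f}%/year -> about {remaining*100:.0f}% of original output")
# 0.2%/year leaves ~92%; even 0.7%/year still leaves ~74% after four decades.
```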
About cleaning: I see around here that roof-mounted modules at about 45° angle or steeper get automatically cleaned by rain. My installation is quite young, but there are others within visible range from my balcony which hit the 10-year mark and look fine. I've never seen a panel-cleaner in my neighborhood on those steep roof-mounted panels, though that doesn't mean they were never cleaned :D. Those with only 30° or less are known to need regular cleaning.
But: if you have 50° or 60° balcony-mounted panels positioned so that bird poop lands on them, i.e. birds on your handrail "let it go", rain may not be enough. In that case only 90° works if you don't want to clean them often.
"Those with only 30° or less are known to need regular cleaning."
There's an optimum angle to get the best performance so just bolting panels to any ol' roof and hoping for the best can be a big waste. A general rule of thumb is an angle equal to your latitude with some bias depending on which season you want to have more power. If you are really fancy, you can have a system that adjusts.
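A minimal sketch of that rule of thumb, using the common "± about 15° seasonal bias" heuristic; the latitude is just an example, and a real installer would use local irradiance data rather than this.

```python
# Rule-of-thumb panel tilt: roughly equal to latitude, biased ~15 degrees
# steeper for winter harvest or shallower for summer. Example latitude only.

latitude = 48.0   # example site, degrees north

year_round = latitude
winter_bias = latitude + 15   # steeper catches the low winter sun
summer_bias = latitude - 15   # shallower favours the high summer sun

print(f"year-round ~{year_round:.0f} deg, winter-biased ~{winter_bias:.0f} deg, "
      f"summer-biased ~{summer_bias:.0f} deg")
```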
Commercial installations are more sensitive to efficiency so will monitor performance and clean the panels when the cost to do it makes sense. Private owners might not give a toss. There are examples posted of what 10% occlusion looks like and it's not that much. When panels have the appearance of being dirty, they can be down by 50% or more. Bird doodaws are a big problem.
Well built panels using quality materials will last a fair amount of time. I've seen super cheap panels where the backing and other bits are just flapping in the breeze in a relatively short period of time. Once the cells are exposed to weather, even from underneath, they last a fair bit less.
Besides checking for bad shadows (chimneys, electricity lines etc.), filling the roof does make sense. Even a north-facing direction (in Germany) makes sense since panels are so cheap. It shows when there is bad weather, when the little light the clouds let through comes from practically all angles. When those evil clouds kick your 400 W panels down to 30 watts per panel (darkest day in Germany 2024), you can add those (six or more) north panels with 20 to 25 watts each. When the sun is shining those northern panels still generate 50 to 100 watts each, though how much diffuse light comes around depends on the surrounding houses or mountains. Of course that only works 'cause that stuff got so damn cheap.
I read an article a while ago (no idea where so I can't link to it) where it said (assuming you pay someone to do the cleaning) the cost of cleaning solar panels will never be recouped by the extra electricity generated as a result of the cleaner panels. The conclusion was, never clean solar panels as it's not economically viable.
"Bagless vacuum cleaner - newsflash theyve been on the market for 3 decades now"
The implementations can be less than impressive as compromises are made to fit into the marketing department's form factor. Mine does two stages of separation and then captures the really fine particulates in a disposable bag. I had a long-haired cat and my pure-bag vacuum filled up very quickly. I got the newer one to mitigate having to buy so many bags. I tried a washable bag and it was rubbish.
Water has a high specific heat capacity, unsurprisingly has convenient phase-change temperatures*, is a good conductor and, above all, abundant. It's not used to generate heat, but to transport and transfer it.
* Life on Earth would be a very different place if the solid, liquid and gaseous phases of water weren't all possible.
And what is your point exactly?
You complained that we still use water-aggregate transitions to generate power. Okay.
Question 1: What exactly is the problem with that method in your opinion?
Question 2: Do you have a better approach?
Question 3: What makes you believe that scientists are not doing this research already? Just because you haven't seen any results yet doesn't mean the research isn't being done. And that, my friend, isn't the scientists' fault, but rather the idiotic process of science communication that society imposed on itself due to, *drumroll*, capitalism! Because NEGATIVE results, even though scientifically speaking they are easily as important as positive ones, don't get people published, don't result in funding, don't result in jobs.
> we still use water-aggregate transitions to generate power. Okay.
> Question 1: What exactly is the problem with that method in your opinion?
It just seems like a shockingly complicated way to get enough heat to boil water to spin a turbine generator.
Why not just exploit the temperature difference between the surface and deep in the earth's crust? Maybe use thermoelectric generators, rather than turbines.
Or use the sun to irradiate photovoltaic generators*? Distribution cables to move it to the dark side of the globe could well be cheaper than thousands of fusion power plants.
How about putting the same research effort into drastically reducing our energy needs?
* yes, I know we do that bit:)
"Why not just exploit the temperature difference between the surface and deep in the earth's crust?"
It's easy enough to go back to Carnot's work and also look at the typical temperature gradient as one drills down into the Earth. In a volcanic area the delta is great; in more stable patches of the crust it's not that good. I spent a nice bit of time chatting with an engineer that works(ed) at Teledyne on RTGs about some ideas for lunar applications and got a bunch of schooling that I had never put together from my uni classes.
Once you get the theory, you have to work out the engineering. Drilling really deep holes in the Earth is crazy expensive. Once you get one drilled and haven't broken off a bit, you have to install some sort of heat transfer infrastructure to get the heat to some point where you have enough of a gradient to do useful work at an efficiency that is financially viable.
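To put numbers on why the crust gradient is so unforgiving outside volcanic areas, here's a Carnot-ceiling calculation; the 28 °C/km gradient, surface temperature and depths are generic textbook-style assumptions, not data for any particular site.

```python
# Carnot efficiency ceiling for geothermal heat from a typical crust gradient.
# Gradient, surface temperature and depths are generic assumptions.

T_surface_C = 15.0     # assumed surface/rejection temperature, Celsius
gradient = 28.0        # assumed geothermal gradient, Celsius per km

for depth_km in (3, 5, 10):
    T_hot_C = T_surface_C + gradient * depth_km
    T_hot_K = T_hot_C + 273.15
    T_cold_K = T_surface_C + 273.15
    eta_carnot = 1 - T_cold_K / T_hot_K
    print(f"{depth_km:>2} km: ~{T_hot_C:.0f} C at depth, "
          f"Carnot ceiling ~{eta_carnot*100:.0f}%")
# Real plants get only a fraction of the Carnot figure, which is why deep,
# expensive holes in ordinary crust rarely pay for themselves.
```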
"How about putting the same research effort into drastically reducing our energy needs?"
At how much an impact to how people are accustomed to living?
We do know how to be much more efficient in many different areas. Electric trains are a big one. In the US, there's perishing little use of electricity for moving trains. It's a very large country with a gazillion miles of track, but having locos that have a diesel engine, a battery tender and the ability to use electricity from overhead lines would be a good step. It will take some time before there's even a chance, but electrifying even a small portion of track could yield far more than enough ROI on those locos to be getting on with. I can see a point where diesel "helpers" are used in places where installing overhead lines and powering them would be too expensive and hard to maintain. Not only would the helper supply traction, it would also be able to recharge a battery tender car to shorten the time it would be needed.
I'm always working on making my house more efficient. It's worth it to me as I own the house. Where's the value for somebody in council housing or a rented downtown flat? Even with ownership, a solar light tube is too expensive for me to see a return given how cheap it is to run an LED lamp instead. For much less, I can install a solar panel that's dedicated to powering lights in a room. I still really like the solar tube lighting as it's so simple. If I ever find a kit selling for a really low price, I know right where I'd put it.
> Why not just exploit the temperature difference between the surface and deep in the earth's crust?
Iceland does it. The problem is that it may generate earthquakes, and can wreak havoc on houses several kilometres / miles away. In Germany, Staufen im Breisgau is one of the worst examples, though there they drilled right in the middle of the town. That experience showed how far such an effect can spread and how unpredictable it can be.
If you are further away from a town, and got a heat source near the surface, it works well. If you need to go too far down it gets too expensive.
But turbines are still more efficient. For thermoelectric generation you preferably need a temperature difference of > 500°C, and even then the efficiency is still worse than turbines. In space it is different; see Voyager and several other probes using plutonium (an alpha-emitting isotope) to generate heat for thermoelectric conversion.
> Distribution cables to move it to the dark side of the globe
We have resistance in cables, and over a large distance, when using AC, even more loss. The energy transfer speed in copper (not to be confused with the electron drift speed) is somewhere around 250,000 km/s, so for large distances you need DC on top of a very high voltage. China uses 800,000 volt and even 1,000,000 volt DC lines for exactly that reason. Going around the globe with such cables would be very inefficient. Then add the cost of politics, of countries with monetary interests, on top; it could lead to issues like the Putin-gas-Germany thing. Storing locally suddenly looks much better then.
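The "very high voltage" point is just Ohm's law: for a given power, doubling the voltage quarters the resistive loss. A rough sketch with made-up line parameters; the resistance per km, distance and power are illustrative, not the figures for any real Chinese link.

```python
# Resistive loss on a long DC line: for a given transmitted power P,
# loss fraction = P * R / V^2. Line resistance, length and power here are
# illustrative values, not the parameters of any real UHVDC link.

P = 5e9                 # assumed transmitted power, watts (5 GW)
r_per_km = 0.005        # assumed line resistance, ohms per km
distance_km = 2000      # assumed line length

R = r_per_km * distance_km

for V in (400e3, 800e3, 1_100e3):     # line voltage, volts
    I = P / V                         # line current, amps
    loss = I**2 * R                   # resistive loss, watts
    print(f"{V/1e3:>5.0f} kV: {loss/P*100:5.1f}% lost over {distance_km} km")
# Going from 400 kV to 1,100 kV cuts the same line's loss by (1100/400)^2,
# roughly 7.5x, which is why ultra-long links are built as very-high-voltage DC.
```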
> Why not just exploit the temperature difference between the surface and deep in the earth's crust? Maybe use thermoelectric generators, rather than turbines.
Marvelous idea.
Small problem though: The efficiency of a TEG is between 5-8%. That of a steam-turbine-generator is between 44-49%. So, what else you got?
> Distribution cables to move it to the dark side of the globe could well be cheaper than thousands of fusion power plants.
First of all, you don't need "thousands of plants". France, which is a really big country, has 18 nuclear power stations, fission plants which are much less powerful than fusion plants would be. They cover more than 70% of their electricity requirements from those 18 sites.
And now to why this "global cabling" thing doesn't work...
a) Cables have electrical resistance. If you run cabling halfway around the globe, hardly any power makes it to the other side. Years ago, planning was underway for giant solar farms in the Sahara desert to cover the EU's power usage. The cabling was one of the major reasons why the projects went nowhere.
b) No nation on earth would make its nightly energy budget dependent on the goodwill of some country on the other side of the earth
c) How is the "sunny side" supposed to manage its own energy budget if it has to provide power for the other side at the same time? Solar already cannot provide enough power to cover the base load (which is precisely why we have nuclear power plants, and coal, and gas, and hydro, etc.)
> How about putting the same research effort into drastically reducing our energy needs?
Again, this research is already being done. We are developing more energy efficient tech all the time.
But we are also getting more humans every year, and more tech every year. A technological civilization, unless it is in decline, will ALWAYS grow its power demands; that's the basis for the Kardashev scale.
"Water has a high specific capacity, unsurpisingly has convenient phase change temperatures*"
It's handy, but I'm using a low-melt alloy metal for my thermal storage battery. I need to iterate it again as I'm finding better system efficiency with some mods. The metal transitions from a solid to a liquid at around 70C and stores a whole lotta energy. I use a water/glycol mix to move the heat around. I need to put another loop in the reservoir to tap off a feed to the tankless water heater so the water going into that isn't as cold.
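Out of curiosity, here's a sketch of the energy sums for that kind of phase-change store; the alloy properties and mass are rough assumptions in the ballpark of generic low-melt fusible alloys, not the poster's actual figures.

```python
# Rough energy stored in a low-melt-alloy phase-change store.
# Alloy properties and size are generic assumptions for a fusible alloy,
# not the original poster's actual figures.

mass_kg = 200.0           # assumed mass of alloy in the store
latent_heat = 35e3        # assumed heat of fusion, J/kg
specific_heat = 170.0     # assumed specific heat, J/(kg*K)
delta_T = 20.0            # assumed sensible swing around the ~70 C melt point

q_latent = mass_kg * latent_heat
q_sensible = mass_kg * specific_heat * delta_T
q_total_kwh = (q_latent + q_sensible) / 3.6e6

print(f"latent {q_latent/1e6:.1f} MJ + sensible {q_sensible/1e6:.1f} MJ "
      f"~ {q_total_kwh:.1f} kWh")
# Per kilogram that's modest next to hot water, but the alloy is dense enough
# that per litre of tank it comes out ahead, and it delivers most of the
# energy at a near-constant ~70 C, which is handy for heating loops.
```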
Iain_Cognito,
the world's greatest scientists still cannot conceive of an energy generation system that doesn't require heating water
You are so very wrong... What kind of a boffin would they be, if they weren't researching better ways of making tea?
[Dear El Reg, where's our teapot icon?]
" the world's greatest scientists still cannot concieve of an energy generation system that doesn't require heating water"
Many LFTR designs use super-critical CO2 as the working fluid since the temperatures are too high to use water practically and a dry system is less prone to wear issues that are seen with HP steam. Water is still an excellent medium to cool the gas back down and carry heat away. If there were a business located adjacent to the power plant that could make use of "waste" heat, getting it there with water is the simplest way about it.
There are plenty of alternatives to water/steam driven power generation. But they all have limitations.
Thermocouples/thermopiles convert heat directly to electricity, but their output is far too small for use at scale (useful for RTGs in spacecraft and submarines though). Same with Peltier devices.
Solar and wind don't use the water state cycle. Hydroelectric systems use the water to drive turbines directly - no need to heat it.
But in other power stations, particularly at large scale, water is the most efficient and safe substance to use for the job. And the anthropic principle applies here: it's because we have evolved in an environment where water exists in solid, liquid and gaseous states that the corresponding temperature range is most useful to us and thus water is useful for transferring heat energy.
Oh great, the world's greatest scientists still cannot conceive of an energy generation system that doesn't require heating water to produce steam.
I once encountered a fellow who was deriding bronze as an old and primitive material. He did not like my description of modern bronzes and their exceptional performance in assorted modern applications ranging from pumps to propulsors, nor pointing out that some bronzes had tensile strengths rivaling that of the strongest titanium alloys. Generally, the thinking on his part seemed to be that because bronze came in ye olden times before the modern iron-and-steel age, bronze was primitive and unworthy of consideration.
As others have noted, water has exceptional features for power generation. Sure, the oldest electrical generating power stations boiled water to generate electricity, but does the longer history of water in power generation mean that there's something fundamentally wrong with boiling water for electricity?
Look at nuclear power plants: yes, they're boiling water, but water is also performing numerous critical functions beyond cooling and driving turbines. Water is moderating the nuclear reactions, and there are few moderators so convenient as water. (Graphite: burns, isn't a coolant, has its own cooling problems.) Water doesn't mind radiolysis; it is easily recombined unlike, say, organic coolants, which turn into sludge and shut down reactors. Water is a self-regulating buffer against power fluctuations that gaseous coolants (helium, nitrogen, carbon dioxide) can't match. Water has convenient phase changes that not only allow reactor regulation but also mean problems like "freezing solid" (lead, lead-bismuth) aren't an issue. While water isn't the most innocuous chemical, it presents less of a headache than some coolants (sodium, sodium-potassium). Water's just great stuff for getting electricity out of fission reactions.
Water answers a lot of problems for fusion reactors, too. Why is that a bad thing?
Your song needs a verse that praises its use in bushings and bearings if it's going to chart.
I was thinking about it since the place I most recently used bronze was a linear plain bearing. Bronze allowed us to get rid of a recirculating linear bearing that had been crapping millimeter-scale ball bearings into some optics and their motors. But "pumps and propulsors" popped into my head first.
"I was thinking about it since the place I most recently used bronze was a linear plain bearing. "
I had a design that used a powdered-metal linear bearing with a very light oil pushed through the bronze and recaptured to be used over and over. The proto was pretty cool, but the project never completed. The moving section of the device was very heavy, so having a large bearing surface was important, and pushing oil through the material ensured there would be good dispersion across the surface without a flood.
Which do direct energy capture and do not need to boil water.
Unfortunately they require even higher temperatures than what stuff like this and ITER are trying to achieve, but I think long term that's where the future lies. Because not only is that much more efficient, you also don't have a bunch of neutrons bashing around irradiating and embrittling the containment vessel (there would be some stray neutrons, but fewer and slower).
Hear hear. The article said:
"It’s not easy to create plasma, never mind maintain it, because doing so first requires heating gases under enormous pressure to the point at which some electrons are freed from atomic nuclei."
Wrong on both points. Neon lights contain plasma when on, and so do plasma displays. Neither are under "enormous pressure" or significantly above room temperature.
(For fun bonus points, put a potent electromagnet under an expanded section of neon light tube. I did a science experiment almost 30 years ago showing it was possible to split the beam that way.)