WHY????
Putting a data centre in orbit is crazy stupid. The first three problems that come to mind are servicing, power and cooling.
Space startup Aetherflux says it plans to put its first data center satellite into orbit during the first quarter of 2027. The company, founded and run by Baiju Bhatt, co-founder of financial firm Robinhood, sees satellites as a time-saving alternative to terrestrial data center construction, which can take five or more years …
Well yes, the output might end up making sense or even being correct. If it ended up being used for AI Overview on Google I'd then have to go back to checking actual sources for information — as it stands at the moment I can get the correct answer to my queries by scanning the AI Overview and inferring the polar opposite of whatever it says.
Servicing isn't a problem if it's treated as a sealed, disposable module.
Power and cooling... AI shills are calling for up to 400 kW per rack. The ISS solar panels were rated for 90 kW at launch to run the entire f-ing station. That rating is actually double what the station can use continuously, because it spends 45 minutes of each 90-minute orbit in darkness and runs off batteries, so your actual power budget is <45 kW. To run a 42U-equivalent AI satellite, you need maybe 800 kW of solar at LEO, or a mere 400 kW if you go to a (higher/more expensive) always-sunny orbit. That's an incredible amount of solar capacity to launch just to run a single rack equivalent of cutting-edge, 2nm compute, whose longevity in space conditions is unknown, but probably not good.
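The arithmetic behind that 800 kW figure is simple enough to sketch. This is a minimal back-of-envelope version, assuming a 50% eclipse fraction and lossless batteries (real battery round-trip losses would push the number higher):

```python
# Back-of-envelope LEO power budget for an orbital AI rack.
# Assumptions: 45 of every 90 minutes in eclipse, lossless batteries.

RACK_POWER_KW = 400      # claimed per-rack draw for AI hardware
SUNLIT_FRACTION = 0.5    # fraction of each LEO orbit in sunlight

# In sunlight the array must run the rack AND recharge the batteries
# that carried it through the dark half of the orbit.
array_kw_leo = RACK_POWER_KW / SUNLIT_FRACTION   # 800 kW at LEO
array_kw_sun_sync = RACK_POWER_KW / 1.0          # 400 kW in an always-sunny orbit

print(array_kw_leo, array_kw_sun_sync)  # 800.0 400.0
```

Battery charging inefficiency and panel degradation only make the ratio worse, so 800 kW is the optimistic floor.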
The panels themselves will last more than 3 years, so you could have a "power satellite" with a docking/mating mechanism to which you launch a rack of compute; 3 years later you launch a new compute module, plug it in, and deorbit the old one (full of obsolete and/or failed chips). Maybe you get 3-4 generations of compute per solar array.
But then you need a hell of a cooling package, which various space engineers have raised eyebrows at.
The whole thing is just unnecessary. A way of separating fools from their money. I can see a way to an architecture, but I don't see it being cost-effective, even if there were sufficient demand for AI to make it profitable. Which there isn't. If you can't make money on the ground, you won't in space.
The future is under the sea. Microsoft's underwater data center blazed the path, and now China is going to try it out with wind power. Space is really for sensors and communication (and maybe interceptors eventually). Here's a competing platform article on the Chinese effort: https://www.tomshardware.com/tech-industry/cnina-deploys-wind-powered-underwater-data-center
There is a trade off between two types of 'rent'.
High orbit: Your satellite can stay up effectively forever but will get damaged by fragments of dead satellites. After a while it contributes fragments to damage other satellites. On top of that, maximum payload decreases with altitude and you get less bandwidth per watt for a fixed mass antenna.
Low orbit: Your satellite stays up as long as you provide thrust. The good news is that fragments of dead satellites do not have thrust, so they will likely de-orbit before causing damage. The rocket equation is not kind: the propellant required to provide thrust will make up a proportion of the mass, as will the solar panels needed to power the thrusters.
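To put a rough number on that rocket-equation tax: here's a Tsiolkovsky sketch for LEO drag make-up. The delta-v budget (50 m/s per year) and ion-thruster Isp (1500 s) are illustrative assumptions on my part, not figures from the thread; actual values depend heavily on altitude and area-to-mass ratio.

```python
import math

# Tsiolkovsky rocket equation: propellant fraction = 1 - exp(-dv/ve)
# Assumed numbers (illustrative only):
ISP_S = 1500.0        # ion thruster specific impulse, seconds
G0 = 9.81             # standard gravity, m/s^2
DV_PER_YEAR = 50.0    # drag make-up delta-v budget, m/s per year
YEARS = 3             # design life of the compute module

ve = ISP_S * G0                       # effective exhaust velocity, m/s
dv = DV_PER_YEAR * YEARS              # total delta-v over the mission
prop_fraction = 1.0 - math.exp(-dv / ve)

print(f"{prop_fraction:.1%} of wet mass is propellant")
```

With efficient electric propulsion the propellant fraction stays small (on the order of a percent here), but the solar panels and power processing to drive those thrusters eat into the mass budget on top of that.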
Earth receives a continuous rain of space dust. We are roughly at the point where falling satellites dump as much mass into the upper atmosphere as natural causes. The effect this has is mostly unknown, being studied and likely to be a problem at some unknown point in the future - or present. Communications constellations are bad enough. Orbital chat-bots will be far worse if they launch before the bubble bursts.
Is it now? Purely an article of faith, and on pretty shaky ground even at this point.
I suppose a perfectly serviceable article of bad faith if the intention is to flog the idea, grab the cash and leg it.
Personally I don't believe any LLM, even one on a scale comparable with the human brain, will ever exhibit general intelligence — leaving aside that such an LLM would likely require a planetary-scale power supply.
Exactly. The notion that AGI is just a really, really big LLM is something Nvidia and datacenter operators are pushing like hell for obvious reasons, but there's really no hard reason to believe that. In fact, we've been making bigger and bigger LLMs for a while now, and the main problems are not going away.
I'd just like to point out that Natural General Intelligence already runs on processors weighing a couple of pounds and consuming a couple of dozen Watts of power each. New ones don't need multi-billion-quid factories, but can be made by many people in only nine months, using materials they have around the home. Training them is time-consuming, typically a couple of decades before they can be used to solve real-world problems, but once done you can easily get four or five decades' use out of them till the performance declines. You can even put them in space, or underwater as well, if you want.
Radiative cooling is a thing... but *all* of the energy going into a rack ends up as waste heat in the end, so if your rack consumes 400kW of power, that also means 400kW out. I've done a *very* rough hand calc, and assuming your radiative panel is completely black (not actually possible), I think you'll need it to be about 25m x 25m to keep the *average* temperature of the rack at about 50 degrees C (323 K). This assumes that heat transfer from the rack to the panel is perfect (not actually possible), that you can completely insulate the side of the panel facing the sun, and that the other side *always* faces deep space. On the bright side, I believe the solar panels needed to power the 400kW rack will be bigger than that, so you can use them as part of your multi-layer insulation to prevent solar heating!
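For anyone who wants to check the hand calc: it's just the Stefan-Boltzmann law with the same idealised assumptions (perfectly black one-sided panel, ε = 1, deep space treated as a 0 K sink, perfect heat transfer from rack to panel):

```python
# Stefan-Boltzmann radiator sizing: P = epsilon * sigma * A * T^4
SIGMA = 5.67e-8   # Stefan-Boltzmann constant, W/m^2/K^4
P_WASTE = 400e3   # W; all 400 kW of rack power becomes heat
T_PANEL = 323.0   # K, i.e. ~50 degrees C
EPSILON = 1.0     # perfectly black panel (not achievable in practice)

area = P_WASTE / (EPSILON * SIGMA * T_PANEL**4)  # one-sided radiating area, m^2
side = area ** 0.5                               # side of a square panel, m

print(round(area), round(side, 1))  # 648 25.5
```

About 650 m², or a square roughly 25.5 m on a side, which matches the 25m x 25m estimate above. Real emissivities below 1 and imperfect heat transfer only make the panel bigger.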