
Last time these folks were mentioned on HN, there was a lot of skepticism that this is really possible to do. The issue is cooling: in space, you can't rely on convection or conduction for passive cooling, so you can only radiate heat away. However, the radiator would need to be several kilometers across to provide enough cooling, and launching such a large object into space would therefore eat up any cost savings from the "free" solar power.

More discussion: https://news.ycombinator.com/item?id=43977188



Maybe this is reductive, but there are times I'm concerned that the only thing keeping me from getting gobs and gobs of startup funds is the fact that I understand basic principles of engineering in space.

I could be wrong and this will be a slam dunk. To me, however, the costs and complexity (cooling, SRP perturbation, station-keeping, rendezvous, etc.) far outweigh the benefits of the Cheap as Free (tm) solar power.


Assuming these people don't understand that their ideas are unworkable is a mistake. Don't believe for a second they are stupid or ignorant.

The difference between a criminal and a law-abiding citizen isn't that the citizen knows that crimes are wrong, it's that the citizen cares that crimes are wrong and the criminal doesn't.


>they are stupid or ignorant.

Nope, probably the more apt description is 'in denial'.


I think that's very charitable.

I'd describe it as "kayfabe".


Reading the paper they wrote on this, from their GitHub site, they do take the thermal management aspects into account quite considerably.

https://starcloudinc.github.io/wp.pdf

Your thinking seems more risk averse, which is similar to my own. However, that doesn't mean these types of things can't happen without the usual business drivers, if enough attention is given to them. Costs are often high because we're comparing one thing with significant efficiencies built into its supply chain against something without them, which by its nature drives up the cost. Perhaps Nvidia has money to burn on trying something.


You've also got the problem of cosmic radiation flipping bits. A fault-tolerant architecture can mitigate this with redundancy, but the extra servers again eat into the purported advantages of the extra solar power. Dealing with the PITA of single-event upsets is something developers of edge data-processing software in space put up with to avoid the latency issues that data clouds in space would introduce.


In all seriousness, if AI models can handle quantization, they can handle some flipped bits from time to time! There are probably some fascinating papers to be written around how to choose which layers in an LLM architecture could benefit more than others from redundant computation in a high-radiation environment.
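In the same spirit, *which* bit flips is what matters. A small sketch (assuming IEEE-754 float32 weights; `flip_bit` is a hypothetical helper, not from any framework) showing that a mantissa flip is noise while an exponent flip is catastrophic:

```python
import struct

def flip_bit(x: float, bit: int) -> float:
    """Flip one bit of a float32 (bit 0 = mantissa LSB, bit 31 = sign)."""
    (as_int,) = struct.unpack("<I", struct.pack("<f", x))
    (flipped,) = struct.unpack("<f", struct.pack("<I", as_int ^ (1 << bit)))
    return flipped

w = 0.5  # a typical-magnitude model weight
print(flip_bit(w, 0))   # mantissa LSB flip: ~0.50000006, harmless noise
print(flip_bit(w, 30))  # top exponent bit flip: ~1.7e38, catastrophic
```

This is part of why quantization tolerance doesn't automatically imply radiation tolerance: quantization perturbs every weight a little, while an exponent-bit upset perturbs one weight enormously.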


Brilliant, to turn up the model temperature we just hinge open the shielding. I call dibs on the patent!


OK, has anyone patented chips with a radioactive source glued to them? For "true" randomness.

If not, I want dibs on it.


https://en.wikipedia.org/wiki/Hardware_random_number_generat...

> and even the nuclear decay (due to practical considerations the latter, as well as the atmospheric noise, is not viable except for fairly restricted applications or online distribution services)


yeah, I think the space weather experts would have fun statistically analysing the single-event-upset RNG :)


I wonder if "normal" RDIMM ECC would be enough to mitigate most of those radiation bit-flip issues. If so, it wouldn't really make a difference relative to earth-based servers, since most enterprise servers use RDIMM ECC too.


You'll get bit flips elsewhere besides just RAM. A bit flip in L1 or L3 cache will be propagated to your DIMM and no one will be the wiser.


I thought server CPUs already handled this? E.g. for Epyc https://moorinsightsstrategy.com/wp-content/uploads/2017/05/...

> Because caches hold the most recent and most relevant data to the current processing, it is critical that this data be accurate. To enable this, AMD has designed EPYC with multiple tiers of cache protection. The level 1 data cache includes SEC-DED ECC, which can detect two-bit errors and correct single-bit errors. Through parity and retry, L1 data cache tag errors and L1 instruction cache errors are automatically corrected. The L2 and L3 caches are extended even further with the ability to correct double errors and detect triple errors.
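The SEC-DED protection described there is a Hamming-style code: extra parity bits whose XOR "syndrome" directly names the position of a flipped bit. A toy Hamming(7,4) sketch in Python (illustrative only; real memory uses wider codes such as (72,64) SEC-DED, and this toy omits the overall parity bit needed for double-error *detection*):

```python
def hamming_encode(nibble: int) -> int:
    """Encode 4 data bits as a 7-bit Hamming(7,4) codeword."""
    d = [(nibble >> i) & 1 for i in range(4)]
    c = [0] * 8  # codeword positions 1..7; index 0 unused
    c[3], c[5], c[6], c[7] = d     # data bits
    c[1] = c[3] ^ c[5] ^ c[7]      # parity over positions 1,3,5,7
    c[2] = c[3] ^ c[6] ^ c[7]      # parity over positions 2,3,6,7
    c[4] = c[5] ^ c[6] ^ c[7]      # parity over positions 4,5,6,7
    return sum(c[i] << (i - 1) for i in range(1, 8))

def hamming_correct(word: int) -> int:
    """Correct up to one flipped bit and return the 4 data bits."""
    c = [0] + [(word >> (i - 1)) & 1 for i in range(1, 8)]
    syndrome = ((c[1] ^ c[3] ^ c[5] ^ c[7])
                + 2 * (c[2] ^ c[3] ^ c[6] ^ c[7])
                + 4 * (c[4] ^ c[5] ^ c[6] ^ c[7]))
    if syndrome:          # a nonzero syndrome is the flipped position
        c[syndrome] ^= 1
    return c[3] | (c[5] << 1) | (c[6] << 2) | (c[7] << 3)

cw = hamming_encode(0b1011)
assert all(hamming_correct(cw ^ (1 << b)) == 0b1011 for b in range(7))
```

Flipping any single bit of the codeword, parity bits included, still recovers the original data, which is exactly the property SEUs demand.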


Sun Microsystems famously had this problem with their servers using the UltraSPARC II chips, with cache SRAM that didn’t have ECC. Later versions of their processors had ECC added.


Those do ECC already


What about the registers?


What about the ALU/FPU/TPU itself?


My initial thought was "cooling is going to be a fun challenge, in addition to data transfer, latency, hardware maintenance and all that other fun stuff". It truly feels like one of those, you-have-too-much-money moments.


This is a big thing never shown in sci-fi. For example, those huge torch ships in The Expanse would need gigantic radiators. Even if the drive were upwards of 90% efficient the waste heat would melt the engine and the rest of the ship.

Even the ISS has sizable radiators. The Shuttle had deployable radiators in the form of the bay doors if my memory serves me correctly.

Oddly enough the otherwise dumb Avatar films are among the only ones to show starships with something approaching proper radiators.

There’s no air resistance in space so radiators don’t impact your flight characteristics.


Your memory serves well with respect to the Shuttle. Astronaut Mike Mullane, from his autobiography Riding Rockets:

> Next [after loading the computers with on-orbit software] we opened the payload bay doors. The inside of those doors contained radiators used to dump the heat generated by our electronics into space. If they failed to open, we’d have only a couple hours to get Discovery back on Earth before she fried her brains. But both doors swung open as planned, another milestone passed.


The Mass Effect video games talk about cooling ships, with the warships glowing red from heat if they go too fast


I enjoyed seeing it described in those games :)

I'm pretty sure it was that series that also described https://en.wikipedia.org/wiki/Liquid_droplet_radiator , with the side effects of different ships having very distinct heat patterns because of their radiator patterns. And that if a ship ever had to make a turn while they were active, big glowing arcs of slowly-cooling droplets would be flung out into space and leave a kind of heat plume.


> Oddly enough the otherwise dumb Avatar films are among the only ones to show starships with something approaching proper radiators.

I imagine it's the same reason James Cameron is a world expert on submersibles - the guy picks individual topics in his movies to really get right.


Neal Stephenson's _Seveneves_ covers these dynamics in detail :)


The book Saturn Run has an interesting design utilized for a spaceship.


By my back-of-the-envelope calculations, the radiators would be comparable to the solar arrays: probably somewhat smaller, and at least not massively bigger.


Care to share them?


Extremely rough one significant digit analysis from first principles, containing a lot of assumptions:

For solar panels:

Assuming an area of 1000 square meters (roughly a 32 m x 32 m square), solar irradiance of 1 kW/m^2, and efficiency of 0.2, the power is 200 kW.

For radiators:

Stefan-Boltzmann constant ~6E-8, radiator temperature of 300 K radiating to cold sky, emissivity of one: total radiated power is 1000 x 6E-8 x 300^4 = 486 kW.

The radiator number is bigger, so the radiator could be smaller than the solar panels and still radiate away all the heat. With caveats.

The temperature profile across the radiator is the biggest open question, and the design is very sensitive to it. Say your chips run at 70 C (~340 K): what temperature does the coolant need to be cooled back down to, what solar and Earth flux hits the radiator (which depends on geometry), and so on. And in reality part of the radiator runs cooler and radiates far less, so most of the energy is radiated from the hot end. How low does the cold-end temperature need to be to not fry your chips? I guess you could run very high flow rates with a small temperature delta to minimize radiator size, but then the rest of the system becomes heavier.
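The same estimate is easy to play with in code. A sketch under the assumptions above (one-sided radiation to cold sky, ignoring solar/Earth flux and the cool-end effects just mentioned):

```python
SIGMA = 5.67e-8  # Stefan-Boltzmann constant, W/(m^2 K^4)

def radiator_area(power_w: float, t_radiator: float,
                  t_env: float = 0.0, emissivity: float = 1.0) -> float:
    """One-sided radiator area (m^2) needed to reject power_w watts
    at a given radiator temperature, against an environment at t_env (K)."""
    return power_w / (emissivity * SIGMA * (t_radiator**4 - t_env**4))

heat = 1000 * 1000 * 0.2  # 1000 m^2 of 20%-efficient panels -> 200 kW of heat
print(radiator_area(heat, 300))  # ~435 m^2: smaller than the solar array
print(radiator_area(heat, 340))  # ~264 m^2: running hotter helps a lot (T^4)
```

The T^4 dependence is why the radiator temperature dominates the design: a modest increase in running temperature shrinks the required area substantially.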


There's a very clever scheme I remember reading about a while ago where you dump the heat into an oil that you then spray in a fine mist towards a collector. You get a colossal surface area that way, in a very confined volume, with not much more mass than a coolant fluid which you already need; and it's relatively easy to homogenise the temperature across the radiating particles. I seem to recall that it got as far as Dupont coming up with a specific coolant mix for the job; the rest of the system is a relatively well-understood (if precise) nozzle/collector design, so you don't end up squirting your coolant off somewhere you can't catch it.


In space this wouldn't really work beyond a point, since there's no conduction or convection.

Think of a big ball of droplet mist. From the point of view of a droplet in the center, it receives thermal radiation from all the droplets around it, and it can only radiate to whatever black sky it sees, which might be none: its "sky" is filled with other hot droplets. So it barely cools at all.

No trick can push the total radiated power beyond what's proportional to the macro surface area.


Are there any liquids with a low enough vapour pressure for this sort of thing?


Question: for a larger system, can a heat pump be used to increase the temperature of the radiator without making the rest of the system hotter? Thus radiating more heat from fewer panels?


Your radiator already sits near 300 K, so the pump's efficiency needs to be high enough. A 50 K lift gives (350/300)^4 ≈ 1.85x the radiated power per unit area, while at COP = 5 the pump's work only adds 20% more heat to reject, so it can pay off if the pump itself isn't too heavy...
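The T^4 scaling makes this tradeoff quick to quantify. A sketch, assuming a 300 K baseline radiator and defining COP as heat moved per unit of work input (pump mass and complexity ignored):

```python
def pumped_area_ratio(t_base: float, lift: float, cop: float) -> float:
    """Radiator area with a heat pump raising the radiator by `lift` kelvin,
    relative to the area needed with no pump at t_base.
    The pump's work input must also be radiated, hence the (1 + 1/cop)."""
    extra_heat = 1 + 1 / cop
    t4_gain = ((t_base + lift) / t_base) ** 4  # Stefan-Boltzmann T^4 boost
    return extra_heat / t4_gain

print(pumped_area_ratio(300, 50, cop=5))  # ~0.65: hotter radiator wins on area
```

A ratio below 1 means the pumped system needs less radiator area; whether that wins overall depends on the mass and reliability cost of the pump itself.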


In addition to the math, you can also look at existing examples, like how large the ISS radiators need to be relative to its solar panels. Like this project, it is essentially a closed system where all power generated by the solar panels will eventually be converted to heat that needs to be dissipated.

I'm skeptical that it makes any economic sense to put a datacenter in orbit, but the focus on the radiators in the last discussion was odd - if you can make the power generation work, you can make the heat dissipation work.


Their white paper touches on the issue, but it seems slightly hand-wavy, without much quantification. They could potentially take advantage of the temperature gradient against deep space and explore the Seebeck effect while dissipating heat.


Deep space? So they want to be outside geostationary orbit?


p3 of their white paper https://starcloudinc.github.io/wp.pdf akshually...


In LEO, pointing at deep space just means pointing away from the Sun and Earth. You don't have to be in deep space to use it for radiating heat away.


Even beyond cooling, just getting all the hardware up there is extremely costly, and for what benefit over ground based DCs? The cooling is the ongoing problem but the cost of lifting it there obliterates all the other problems, IMO.


SpaceX thinks they will reduce the cost by 90% with Starship, so they are probably calculating off that.


On the linked page there are animations using Starship.


And who is The Law in space? What's to prevent, e.g., Amazon's Kuiper or Musk's Starlink from crashing one of their vehicles into the array when they want to take over the market?


My understanding is that the normal rule here is that the launching state has jurisdiction over (and international legal responsibility for) what is done by a spacecraft, but I’d bet that if private parties crashing their spacecraft into those of other private parties with widespread, economically significant use became a thing, a whole lot of countries in which one or more of the companies have assets or interests would discover jurisdiction in underused provisions of their domestic law rather quickly, no matter where either of the craft involved were launched.


Their website pitches it as 16 square km


Makes me wonder about building a 16 km² datacenter on Earth. I wonder if building it that way, with a lower "data density", would allow for more passive cooling, and you'd get the large solar field too.

Wonder if that would be less impactful than however many rockets they'll need to send up; plus you could, ya know, ~drive~ bike to a failed machine.


It says "Starcloud plans to build a 5-gigawatt orbital data center with super-large solar and cooling panels approximately 4 kilometers in width and length."

So, it's the solar/cooling panels that make up that space, not the data centre per se.


I know. I'm saying what if you build lower density data centers that could be more passively cooled. Apparently being in space is no issue for latency, so I can't see why building it on earth in a remote-ish area would matter.


I can think of some parts of earth where passive cooling isn't a major problem, and some of them even have power sources...


Should we be adding massive sources of heat (datacenters) to regions that can easily passively cool them? That sounds like somewhere around the Arctic, which is already seeing record high temperatures in both winter and summer. Maybe if we manage to radiate all the heat directly back into space by mimicking snow…?


Wouldn't a gigantic 16 km² solar roof on Earth already cover the energy needs they're pitching for this space data center?


No. It would need to be larger, probably by a factor of 3 or 4, for a few reasons.

1) The atmosphere attenuates sunlight (even when it's not cloudy)

2) The solar array in orbit can pivot to face the sun all the time.

3) While most orbits spend some time in Earth's shadow, on average they're in sunlight more of the time than a typical point on the surface.

see https://en.wikipedia.org/wiki/Solar_irradiance
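A rough way to check the factor (a sketch; the ~63% sunlit fraction for a typical LEO and the ~25% desert capacity factor are assumed illustrative values, and the real numbers vary by orbit and site):

```python
AM0 = 1361           # W/m^2, solar constant above the atmosphere
GROUND_PEAK = 1000   # W/m^2, clear-sky noon at a good surface site

orbit_avg = AM0 * 0.63           # sun-pointing panel, ~63% of orbit in sunlight
ground_avg = GROUND_PEAK * 0.25  # fixed desert array, ~25% capacity factor

print(orbit_avg / ground_avg)    # ~3.4, consistent with "a factor of 3 or 4"
```

A dawn-dusk sun-synchronous orbit (nearly always in sunlight) or a ground array with tracking would shift the ratio in either direction, but a small single-digit factor is the right ballpark.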


There's already about 0.4 square km of solar panels across the Starlink constellation (~4,000 v2 satellites at ~100 m^2 each).


This project seems 40x larger than all of Starlink's constellation combined. So quite huge.


If the Mass Effect games have taught me anything, it's that heat dissipation in outer space is hard.


They taught me that Sir Isaac Newton is the deadliest son-of-a-bitch in space. Which is probably something else these space data centers will struggle against, it'll be interesting to see how much shielding they have against impacts. There was a Soyuz that had a coolant leak blamed on a micrometeorite strike.


I learned something interesting here, thanks. I've never really thought about it so I'd always assumed space = cold so that would be fine.


Space is cold. There are just very few cold molecules to take the energy from your hot molecules.

Here on Earth we are surrounded by many molecules that are not so cold, but colder than us, and together they can carry away a lot of our excess heat.


Yeah, it's totally obvious now that it's been pointed out, I'd just never thought it through properly.


Space is not cold. Space is empty. It has no real value for temperature.

Stuff in space does.


> Space is empty.

This prompted my curiosity. None of the following contradicts the thrust of your message, but I thought the nuance is interesting to share.

Interstellar space isn't a vacuum. Space is mostly empty compared to Earthly standards, but it still contains gas (mostly hydrogen and helium), dust, radiation, magnetic fields, and quantum activity.

The emptiest regions are incredibly sparse, but not completely empty. Even in a perfect vacuum, quantum mechanics predicts that particle-antiparticle pairs constantly pop in and out of existence, so empty space can be said to be buzzing with tiny fluctuations.

> Space is not cold. It has no real value for temperature. Stuff in space does.

The cosmic microwave background radiation, the left-over energy from the Big Bang, sets a baseline temperature of about 2.7K (-270°C), just above absolute zero.

Temperature depends on particle collisions, and since space isn't a vacuum, just incredibly sparse, one can talk about the temperature of space, but you're right that what is typically more relevant is the temperature of "specific" objects.


Not sure I follow, really. Cooling from its own generated heat? Are we even sure the system would get that hot in the first place? Temperatures out there can plunge to -200 degrees. If needed, they'd cool it just like they keep the James Webb Telescope cool.


The Webb telescope is a _wildly_ different apparatus, designed from the ground up to run as cool as possible, and with an effectively unlimited budget. It hides behind multiple layers of sunshield, permanently pointed away from the Sun. These "data centers" need to live in direct sunlight and operate as cheaply as possible _at scale._ Very little of Webb's tech is applicable.


Keeping things cool in space is very hard. On earth we usually transfer heat from one medium to another (water to water, water to air, etc.). In space that's not possible because even though the matter in space is quite cold, there is very little. Therefore the only real way to get rid of heat in space is to radiate it away (think infrared light bulb). The James Webb Telescope does the same thing.


There are two real challenges in running a data center: how to get power in (reliably), and how to get heat out.

Any data center that isn't generating massive heat is a waste of our time.

And no, JWST is not doing industrial scale cooling.


Thank you for the responses. I understand the issue a bit more now.


Figure 3 shows they're hoping to have a 4km x 4km combo solar panel and radiator. The main truss of ISS is 110 meters long.

These plans are so much larger than anything built so far that they're sci-fi.


The Github paper seems to indicate they have considered the thermal aspects fairly heavily and mention that "conduction and convection to the environment are not available in space".


Alternatively, assuming they are aware of the cost, what does this say about what they are implying the cost of electricity is going to be?


TFA> “In space, you get almost unlimited, low-cost renewable energy,”

Why is this exclusive to space? If you're powering datacenters on solar, one would think covering the Sahara or another large desert in datacenters would be easier than launching them into space. Renewable energy is just as plentiful and free there, you can connect it to the rest of the world with multiple TB/s of fiber links, and the construction/maintenance costs would be a few orders of magnitude less.


But then you wouldn't be able to launch to space. It would also seem like a very mundane project wouldn't it?


Hey, at least it's not going to end up with a bunch of actual people getting treatment based on invalid blood test results.


Interesting: what if we put datacenters on the ocean floor with nuclear power, like the Army's Janus program?


Given the water needs of data centers and the ongoing and upcoming water scarcity, I imagine the problem of heat dissipation seems easier to solve, long term, in space.


We can and do build data centres that don't use evaporative cooling, evaporation is just often the cheapest option in places with large natural water sources.


Wut?



