Even as technology companies are projected to spend more than $5 trillion globally on earth-based data centers by the end of the decade, Elon Musk is arguing the future of AI computing power lies in space—powered by solar energy—and that the economics and engineering to make it work could align within a few years.

Over the past three weeks, SpaceX has filed plans with the Federal Communications Commission for what amounts to a million-satellite data-center network. Musk has also said he plans to merge his AI startup, xAI, with SpaceX to pursue orbital data centers. And at an all-hands meeting last week, he told xAI employees the company would ultimately need a factory on the moon to build AI satellites—along with a massive catapult to launch them into space.

“The lowest-cost place to put AI will be in space, and that will be true within two years, maybe three at the latest,” Musk said at the World Economic Forum meeting in Davos this January.

Still, while Musk and some other bulls argue that space-based AI could become cost-effective within a few years, many experts say anything approaching meaningful scale remains decades away—especially as the bulk of AI investment continues to flow into terrestrial infrastructure. That includes Musk’s own Colossus supercomputer in Memphis, which analysts estimate will cost tens of billions of dollars.

Read more: https://fortune.com/2026/02/19/ai-data-centers-in-space-elon-musk-power-problems/

by fortune

11 Comments

  1. It makes no sense from a physics point of view, because cooling is one of the biggest datacenter / compute issues.

    Cooling AND data transfer are both significantly more difficult IN SPACE.

    Microsoft has already done experiments ON EARTH (Project Natick) with sealed datacenter units run underwater for cooling but fed with power and high-speed data from land.

    They proved the completely sealed containers worked (the kind of sealing you would need for space).
    They didn’t end up going this route, however, as the project was deemed logistically and economically impractical for widespread use.

    Considering that space would be WORSE in terms of deployment, data, and cooling, there is no logical point to this.

    Edit: radiation may also present an issue for compute in space.

  2. Elon really is so ignorant that I don’t understand how he became a billionaire. I don’t understand how you could hear this concept and think this is a good idea.

    The lack of power and cooling is the nail, and since he owns a space launch system, that is his hammer.

    But it is a horrible idea when you look at the power and cooling requirements, and we haven’t even gotten to the issue of shielding fragile electronics from radiation.

    The great thing about datacenters is that they don’t need to be close to population centers or stuff like that, so you can put them in the cheapest places.

    Space is the most expensive place.

  5. A post I made on LinkedIn a couple of weeks ago:

    I’ve been hearing more and more talk about “data centers in space.” But there’s a physics problem that isn’t being discussed.

    On Earth, data center cooling is straightforward. Fans blow air across heatsinks. Water circulates through cooling towers. Convection moves heat from hot components into cooler fluids that carry it away. Cheap, proven, and scalable.

    In space, none of this works.

    Space is a vacuum. No air, no water, no medium to absorb heat. The only option is thermal radiation: slowly emitting infrared photons from massive radiator panels. The entire ISS uses this approach to handle about 70 kW of waste heat. A single modern AI rack generates 40-120 kW. That figure will probably be more like 200 kW in about two years.

    Everyone thinks “space is cold” because of movies and simple explanations. In reality, space “cold” isn’t the same type of “cold” we have on earth. The background temperature of space is ~2.7 K, but temperature without a transfer medium is meaningless. A vacuum thermos keeps your coffee hot for hours using this same principle. Vacuum insulates. It does not cool. Heat in space has to be radiated away, not actively carried off by air or water the way it is on earth. And that passive radiation is MUCH slower than active heat transfer.

    For a modest 10 MW AI training facility (small by terrestrial standards), you’d need 30,000+ square meters of radiator surface, hundreds of tonnes of cooling infrastructure, and billions of dollars. The total system cost is 50-200x what the same facility costs on the ground. The cooling alone would cost more than the GPUs, the solar array, and the launches combined. (A quick back-of-the-envelope check of the radiator numbers is at the end of this post.)

    Elon Musk knows this. SpaceX engineers absolutely know this. Yet the narrative around space-based compute keeps growing, and it’s not hard to see why. SpaceX is heading toward an IPO, and every ambitious-sounding use case (Mars colonization, orbital data centers, point-to-point travel) inflates the total addressable market story that drives valuation.

    Musk has a pattern of announcing physics-defying timelines and capabilities to move markets. The Cybertruck was supposed to be bulletproof and cost $40K. Full self-driving was supposed to be solved by 2020. The Boring Company was supposed to revolutionize transit.

    Space data centers for AI are the same playbook. The engineering challenges are real and unsolved. The promises serve a financial purpose, not a technical one. Musk and the techbros that regurgitate his talking points only have one goal: artificially inflate the SpaceX IPO, especially since xAI has been absorbed by it.

    Next time someone pitches you on orbital compute, ask them one question: how do you disperse the heat? If the answer doesn’t include specific radiator mass budgets, you’re not hearing engineering. You’re hearing a pre-IPO narrative based on lies.

    I’ll post later about how to actually solve the AI problem, and why it is incredibly dangerous for people to continue to platform the lies of billionaire capitalists.
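
    For anyone who wants to sanity-check the radiator numbers above, here is a minimal back-of-the-envelope sketch using the Stefan-Boltzmann law. The emissivity, radiator temperature, and rack load are assumed round numbers of my own, and it ignores solar and Earth-shine heating entirely, so treat it as a best case rather than a design:

    ```python
    # Idealized radiator sizing via the Stefan-Boltzmann law: q = eps * sigma * T^4
    # Assumptions (mine, not from any SpaceX filing):
    #   - grey-body radiators with emissivity ~0.9
    #   - radiator surface temperature ~300 K
    #   - no heat input from the Sun or Earth, 0 K background (best case)
    SIGMA = 5.670e-8      # Stefan-Boltzmann constant, W / (m^2 * K^4)
    EMISSIVITY = 0.9      # assumed radiator emissivity
    T_RADIATOR = 300.0    # assumed radiator temperature, K

    # Radiated flux per square meter of emitting surface (~413 W/m^2 here).
    flux_w_per_m2 = EMISSIVITY * SIGMA * T_RADIATOR ** 4

    def ideal_radiator_area_m2(heat_load_w: float) -> float:
        """Emitting area needed to reject a heat load under the assumptions above."""
        return heat_load_w / flux_w_per_m2

    for label, load_w in [("one 100 kW AI rack", 100e3),
                          ("10 MW training facility", 10e6)]:
        print(f"{label}: ~{ideal_radiator_area_m2(load_w):,.0f} m^2 of radiator")

    # Prints roughly:
    #   one 100 kW AI rack: ~242 m^2 of radiator
    #   10 MW training facility: ~24,193 m^2 of radiator
    ```

    Even in this best case, a single rack needs a couple of hundred square meters of emitting surface and 10 MW lands in the low tens of thousands; adding sunlight and Earth-shine, realistic view factors, and the pumps, coolant loops, and deployable structure that feed the panels is what pushes real requirements toward the 30,000+ m² figure quoted above.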