Commentary: What is the deal with space-based data centers for Artificial Intelligence?

by Abhimanyu Ghoshal

February 7, 2026 - Big Tech believes orbital data centers are the best way to scale up compute infrastructure needed to run AI services.

Terrestrial data centers are so 2025. We are taking our large-scale compute infrastructure into orbit, or at least that is what Big Tech is shouting from the rooftops at the moment.

Let us start with the basics. You might already know that a data center is essentially a large warehouse filled with thousands of servers that run 24/7.

Artificial Intelligence companies like Anthropic, OpenAI and Google use data centers in two main ways.

Training AI models - This is incredibly compute-intensive. Training a model like those powering OpenAI's ChatGPT or Anthropic's Claude requires running calculations across thousands of specialized chips (GPUs) simultaneously for weeks or months.

Running AI services - When you converse with those models' chatbots, your messages go to a data center, where servers process them and send back the model's response. Multiply that by millions of users having conversations simultaneously and you need enormous computing power ready on demand.

AI companies need data centers because they provide the coordinated power of thousands of machines working in tandem on these functions, plus the infrastructure to keep them running reliably around the clock.

To that end, these facilities are always online with ultra-fast Internet connections, and they have vast cooling systems to keep those servers running at peak performance levels. All this requires a lot of power, which puts a strain on the grid and squeezes local resources.

The idea of putting data centers in space has been bandied about for a while now as a vastly better alternative, one that can harness virtually unlimited solar energy and radiative cooling hundreds of miles above the ground in low Earth orbit.

Powerful GPU-equipped servers would be housed in satellites, and they'd move through space together in constellations, beaming data back and forth as they travel around the Earth from pole to pole in a sun-synchronous orbit.

The thinking behind space data centers is that they will allow operators to scale up compute resources far more easily than on Earth. Up there, operators face none of the terrestrial constraints on power, real estate, and the fresh water supplies needed for cooling.

There are a number of firms getting in on the action, including big familiar names and plucky upstarts. You've got Google partnering with Earth monitoring company Planet on Project Suncatcher to launch a couple of prototype satellites by next year. Aetherflux, a startup that was initially all about beaming down solar power from space, now intends to make a data center node in orbit available for commercial use early next year. Nvidia-backed Starcloud, which is focused exclusively on space-based data centers, sent a GPU payload into space last November, and trained and ran a large language model on it.

The latest to join the fold is SpaceX, which is set to merge with Elon Musk's AI company, xAI, in a reported $1.25 trillion deal, with a view to ushering in the era of orbital data centers.

According to Musk's calculations, SpaceX should be able to scale up both the number of rocket launches and the data center satellites those rockets can carry. "There is a path to launching 1 TW/year (1 terawatt of compute power per year) from Earth," he noted in a memo, adding that AI compute resources will become cheaper to generate in space than on the ground within three years.

In an excellent article in The Verge from last December, Elissa Welle laid out the numerous challenges these orbital data centers will have to overcome in order to operate as advertised. For starters, they would have to safely navigate the roughly 6,600 tons of space debris in orbit, as well as the 14,000-plus active satellites already circling the planet. Dodging all of that requires fuel.

You have also got to dissipate heat from the space-based data centers, and have astronauts maintain them periodically. That is to say nothing of how these satellites will affect the work of astronomers or potentially increase light pollution.

Ultimately, there is a lot of experimentation and learning to be gleaned from these early efforts to build out compute resources in space before any company or national agency can realistically scale them up as planned.

While it might eventually become possible to do so despite substantial difficulties, it is worth asking ourselves whether AI is actually on track to benefit humanity in all the ways we have been promised, and whether we need to continually build out infrastructure for it - whether on the ground or way up beyond the atmosphere.