The AI boom has a dirty secret: it runs on electricity. Lots of it. While the world argues about chips and models, the real bottleneck is turning into power, cooling, and build time.
Now SpaceX is floating a sci-fi answer that’s also oddly logical: put data-center-style compute in space, power it with near-constant solar, and skip a big chunk of the land-based energy and permitting headaches.
The proposal in plain English
In a new filing with the Federal Communications Commission, Elon Musk’s company asks for permission tied to an ambitious concept: a constellation of satellites designed to harvest solar energy and support AI data-center workloads in orbit.
The filing describes the appeal as straightforward:
- sunlight in orbit is abundant and consistent
- satellites don’t need fuel deliveries or on-site staff
- the setup could reduce ongoing “operating/maintenance” burdens compared with terrestrial facilities
- the company argues it could lower the environmental footprint relative to land-based expansion
The number that grabs attention: “one million”
The filing mentions a constellation size on the order of one million satellites. That's not a prediction of what will actually launch; it's a regulatory ceiling that preserves design flexibility. Satellite operators routinely request approvals far above what they ultimately deploy.
For context, the total number of satellites currently in orbit is far smaller, and even Starlink itself—already the dominant mega-constellation—sits at a tiny fraction of that proposed scale.
Why “orbital data centers” isn’t as crazy as it sounds
If you strip away the hype, the pitch is basically: move compute closer to power.
On Earth, AI compute is constrained by:
- grid availability and interconnection queues
- permitting and community pushback
- water and cooling constraints
- supply chains for transformers, turbines, and switchgear
- “years, not months” construction timelines
In orbit, the power source (sunlight) is persistent—and the “real estate” doesn’t require zoning approval. The challenge shifts from land constraints to launch cost, reliability, thermal management, and networking.
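One reason thermal management looms so large: in vacuum there is no air or water to carry heat away, so every watt of waste heat must be radiated. A back-of-envelope sketch using the Stefan-Boltzmann law shows the scale of the problem (all inputs are illustrative assumptions, not figures from the filing):

```python
# Rough radiator sizing for an orbital compute module.
# Every input below is an illustrative assumption, not a SpaceX figure.
SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W/(m^2 K^4)

def radiator_area_m2(waste_heat_w: float, temp_k: float = 300.0,
                     emissivity: float = 0.9) -> float:
    """Panel area needed to radiate `waste_heat_w` at temperature
    `temp_k`, ignoring absorbed sunlight and Earth infrared."""
    return waste_heat_w / (emissivity * SIGMA * temp_k ** 4)

# A hypothetical 1 MW compute module:
area = radiator_area_m2(1e6)
print(f"{area:,.0f} m^2 of radiator")  # ~2,400 m^2 under these assumptions
```

Under these assumed numbers, a single megawatt of compute needs on the order of 2,400 square meters of radiator, which is why heat rejection, not sunlight collection, may end up driving the hardware design.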
Starship is the make-or-break dependency
This plan leans heavily on Starship becoming what it’s meant to be: a high-cadence, reusable launch system that dramatically lowers the cost per kilogram to orbit.
SpaceX’s argument is essentially: if fully reusable launch becomes routine, then deploying truly large on-orbit infrastructure becomes feasible—faster than building the equivalent on land.
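The cost-per-kilogram argument can be made concrete with a toy calculation. Every number below is an assumption for illustration (Starship's eventual price and the mass budget of orbital solar-plus-compute hardware are both unknown):

```python
# Toy launch-economics sketch: the launch cost of one watt of
# solar-powered compute in orbit. All inputs are assumptions.
def launch_cost_per_watt(cost_per_kg: float,
                         specific_power_w_per_kg: float) -> float:
    """Launch cost in $/W for hardware that delivers
    `specific_power_w_per_kg` of usable power per kilogram."""
    return cost_per_kg / specific_power_w_per_kg

# Assumed: ~100 W/kg at the system level for solar + compute;
# ~$2,000/kg for a Falcon-class launch today versus a hypothetical
# ~$100/kg for a routine, fully reusable Starship flight.
for label, price_per_kg in [("today", 2000.0), ("reusable target", 100.0)]:
    print(label, launch_cost_per_watt(price_per_kg, 100.0), "$/W")
```

Under these assumed numbers, launch cost alone falls from roughly $20 to $1 per watt of orbital power, which is the whole reason the filing's feasibility hinges on Starship's cadence and price rather than on the satellites themselves.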
The questions regulators (and everyone else) will ask
Even if the idea is technologically seductive, the hurdles are real:
- Spectrum & interference: what frequencies, how much bandwidth, and how does the system coexist with everyone else in space?
- Collision risk & debris: more satellites means more traffic and more consequences when something fails
- De-orbit plans: how are end-of-life satellites removed cleanly?
- Security: what does “AI compute in orbit” imply for data protection and misuse?
- Network routing: how does data get uplinked, processed, and downlinked at scale without turning the sky into a congestion problem?
Bottom line
This is xAI-era Musk logic: solve an Earth bottleneck by leaving Earth. Whether the “orbital AI data center” becomes real infrastructure or remains an ambitious filing, it points to a deeper truth about the AI race: the binding constraint is no longer chips or models, but power, and whoever secures power at scale sets the pace.