The rapid advancement of large language models is fueling a global data center boom and driving a surge in energy demand. But the electricity required to power data centers is straining the grid, pushing infrastructure operators to search for alternative sources of power. Some are even looking beyond Earth.
One company that’s looking to the stars for energy is Orbital Inc. In mid-April, the Los Angeles–based startup emerged from stealth and announced plans to build space data centers. Backed by Andreessen Horowitz (A16z), Orbital is designing infrastructure for AI inference, where trained models generate outputs. Much like other companies advocating for space-based data centers, Orbital is banking on the “free” energy generated by the sun to power compute for workloads such as chatbots and agents, sidestepping terrestrial energy constraints.
“There simply isn’t enough capacity here [on Earth], and the only way is up,” says Euwyn Poon, Orbital’s founder and CEO. “There’s actually abundant solar energy that’s not being harnessed.”
Orbital’s vision is a mesh constellation of small satellites in low Earth orbit. Each satellite would be equipped with a GPU server rack powered by solar panels roughly the size of a tennis court, plus radiative cooling panels of comparable size. The long-term goal is up to 10,000 fridge-sized satellites—each with 100 kilowatts of power—forming a distributed cloud, similar to SpaceX’s proposed AI Sat Mini.
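The scale implied by those figures can be sanity-checked with simple arithmetic. A minimal sketch, using only the numbers quoted in the article (10,000 satellites at 100 kilowatts each); the comparison to a power plant is illustrative, not from Orbital:

```python
# Back-of-envelope check on the constellation's total power budget,
# using the article's figures: 10,000 satellites at 100 kW each.
SATELLITES = 10_000
POWER_PER_SAT_KW = 100

total_gw = SATELLITES * POWER_PER_SAT_KW / 1_000_000  # kW -> GW
print(f"Total constellation power: {total_gw:.1f} GW")
# 1 GW is on the order of a single large terrestrial power plant's output.
```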
Orbital’s first test will come in 2027, when it plans to launch a prototype satellite aboard a SpaceX Falcon 9 to validate its GPU operations in orbit and run commercial inference workloads. Another company, Starcloud, ran a similar test last year. Orbital’s differentiator is its plan to match the solution with a problem: small satellites built specifically to run inference workloads could benefit from lower launch costs. However, the company faces the same difficulties as other space data center hopefuls: every watt of “free” energy must be dissipated as heat via large radiative coolers; radiation in low Earth orbit degrades compute hardware; and regular maintenance in space is difficult and costly.
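The cooling constraint can be estimated from first principles. A rough sketch using the Stefan-Boltzmann law for an ideal radiator facing deep space; the 100-kilowatt heat load comes from the article, while the emissivity and radiator temperature are assumed values, not Orbital's specifications:

```python
# Estimate the radiator area needed to reject 100 kW of waste heat in
# orbit, via the Stefan-Boltzmann law (ideal case: radiating to deep
# space, ignoring absorbed sunlight and Earth's infrared/albedo load).
SIGMA = 5.670374419e-8   # Stefan-Boltzmann constant, W/(m^2 K^4)
heat_load_w = 100_000    # 100 kW per satellite (figure from the article)
emissivity = 0.9         # typical radiator coating (assumed)
radiator_temp_k = 300.0  # ~27 degC radiating surface (assumed)

area_m2 = heat_load_w / (emissivity * SIGMA * radiator_temp_k**4)
print(f"Required radiator area: {area_m2:.0f} m^2")
# A tennis court is about 260 m^2, so the result is consistent with the
# article's "comparable size" comparison for the cooling panels.
```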
Orbital’s inference focus
Poon says that Orbital’s focus on a distributed network of smaller satellites, running inference workloads across independent GPU nodes rather than on large, tightly coupled systems, makes execution more feasible.
That idea shapes Orbital’s design. Training large AI models typically relies on tightly coupled GPU clusters optimized for massive compute throughput. Inference workloads, by contrast, are generally less compute-intensive per request and can often run on smaller numbers of GPUs, making them easier to distribute across systems. Capping each satellite at roughly 100 kilowatts, Poon says, greatly simplifies the design. “It’s very simple,” Poon says, referring to the concept behind the satellites’ engineering. “Engineers would…
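The distinction above can be sketched in code: because each inference request is independent, a router can hand it to any single node, with no cross-node synchronization. This is a minimal illustration of that general property, not Orbital's architecture; the class names and round-robin policy are invented for the example:

```python
# Sketch of why inference distributes more easily than training: each
# request lands on exactly one node, so nodes never need to communicate.
from itertools import cycle


class InferenceNode:
    def __init__(self, node_id: int):
        self.node_id = node_id

    def infer(self, prompt: str) -> str:
        # Stand-in for a model forward pass on this node's GPU.
        return f"node-{self.node_id} answered: {prompt!r}"


class Router:
    """Round-robin dispatch: requests are independent, so any node will do."""

    def __init__(self, nodes):
        self._next = cycle(nodes)

    def handle(self, prompt: str) -> str:
        return next(self._next).infer(prompt)


router = Router([InferenceNode(i) for i in range(3)])
for prompt in ["hello", "weather?", "summarize"]:
    print(router.handle(prompt))
```

Training, by contrast, would require the nodes to exchange gradients every step, which is why tightly coupled clusters are hard to spread across separate satellites.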
The post “Orbital Inference Data Center Bets On Space GPUs” by Aaron Mok was published on 05/10/2026 by spectrum.ieee.org