Orbital sets 2027 test mission for space-based AI data centers as energy limits hit compute growth

Startup backed by a16z Speedrun is developing solar-powered AI infrastructure in orbit as demand for compute outpaces power and cooling capacity on Earth.

Orbital has set a 2027 launch date for its first space-based AI data center test mission, positioning low Earth orbit as a potential answer to the energy constraints increasingly limiting AI infrastructure as global demand for compute accelerates.

The Los Angeles-based startup, backed by a16z Speedrun, is building satellite-based AI data centers designed to run inference workloads in orbit using continuous solar power and radiative cooling, removing reliance on terrestrial power grids and traditional cooling systems.

Funding and first mission plans

Orbital has secured backing from a16z Speedrun to support Orbital-1, its first test mission, scheduled to launch on a SpaceX Falcon 9 in April 2027. The mission will test sustained GPU performance in orbit, including radiation resilience, with commercial AI inference workloads to follow once the hardware is validated.

Andrew Chen, General Partner at a16z Speedrun, says: "Speedrun backs founders to explore ambitious ideas — the harder the problem, the better. Orbital is taking on AI's biggest constraint with a bold and radical idea."

The company is also developing Factory-1, a research and development facility in Los Angeles, and is filing with the Federal Communications Commission as it works toward deploying a wider constellation of AI compute satellites.

AI infrastructure shifts beyond terrestrial limits

Orbital is building a network of satellites equipped with NVIDIA-powered servers, designed to operate in low Earth orbit where solar energy is continuous and cooling can be managed by radiating heat into space.

Euwyn Poon, CEO and Founder of Orbital, says: "AI progress is being constrained by the grid. Data center economics are dominated by electricity and cooling, and both are getting harder. In orbit, solar power is continuous and cooling is fundamentally different. Orbital is building compute infrastructure that removes the energy ceiling and scales with AI's potential."

The company’s approach reflects a growing pressure point in AI development, where access to power, rather than chips, is becoming a limiting factor for scaling models and deploying infrastructure.

Focus on inference and distributed AI compute

Orbital is targeting AI inference workloads, in which tasks can be distributed across multiple nodes rather than requiring the tightly coupled GPU clusters used in model training. This allows compute to be scaled across a satellite constellation, with each node handling parallel requests independently and without the same latency constraints.
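The distinction above comes down to request independence: each inference request can be answered by any single node, so a scheduler can simply spread requests across the fleet. A minimal illustrative sketch of that idea follows; the node names and the round-robin policy are assumptions for illustration, not Orbital's actual scheduling system.

```python
from itertools import cycle

def dispatch(requests, nodes):
    """Spread independent inference requests round-robin across nodes.

    Each request is self-contained, so no inter-node communication is
    needed -- unlike training, where GPUs must constantly exchange
    gradients and therefore need tightly coupled clusters.
    """
    assignment = {node: [] for node in nodes}
    ring = cycle(nodes)
    for req in requests:
        assignment[next(ring)].append(req)
    return assignment

# Example: five requests spread across three hypothetical satellite nodes.
plan = dispatch(["q1", "q2", "q3", "q4", "q5"], ["sat-a", "sat-b", "sat-c"])
# sat-a serves q1 and q4, sat-b serves q2 and q5, sat-c serves q3.
```

Because no node ever waits on another, adding satellites to the constellation adds serving capacity roughly linearly, which is the scaling property the company is counting on.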

The Orbital-1 mission will test whether this distributed model can operate reliably in orbit and support commercial AI workloads.

Orbital was founded by Euwyn Poon, who previously founded Spin, acquired by Ford. Poon says: "The energy ceiling on AI isn't theoretical, it's a real constraint that will impede the advancement of intelligence. This is the solution."

If successful, the model could shift how AI infrastructure is deployed, particularly as energy demand from data centers continues to rise faster than available supply.
