China Begins Building World’s First Supercomputer in Space

China’s Long March rocket launches from Launch Complex 101. Credit: China News Service / CC BY 3.0

China has begun building what could become the world’s first supercomputer in space. On May 14, a Long March 2D rocket launched from the Jiuquan Satellite Launch Center, carrying 12 advanced satellites into orbit.

This marks the first phase of a larger effort to deploy a space-based computing system capable of processing massive amounts of data without relying on Earth-based infrastructure. Led by ADA Space and Zhejiang Lab, the project aims to build a 2,800-satellite network known as the Three-Body Computing Constellation.

Designed to perform high-speed data processing directly in orbit, the constellation represents a major shift in how artificial intelligence may be deployed beyond Earth—and signals China’s growing lead in the race to bring supercomputing power to space.

Cooling and computing in space

The satellites will radiate waste heat directly into the cold of space, using it as a passive cooling system while processing data at extreme speeds. According to Chinese officials, the completed system is expected to reach 1,000 peta operations per second, equivalent to one quintillion (10^18) calculations every second.

Each satellite carries an AI model with 8 billion parameters and can perform up to 744 trillion operations per second. Together, they reach a collective computing power of five quadrillion operations per second. In comparison, Microsoft’s newest AI-powered laptops operate at roughly 40 trillion operations per second.
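As a back-of-the-envelope check, the figures quoted above can be put on a common scale. This sketch simply restates the article's numbers in plain arithmetic; the variable names are illustrative, not from any official source.

```python
# Unit prefixes for operations per second.
TERA = 10**12
PETA = 10**15
EXA = 10**18

per_satellite_ops = 744 * TERA   # 744 trillion ops/s per satellite
constellation_ops = 5 * PETA     # 5 quadrillion ops/s for the first 12 satellites
target_ops = 1000 * PETA         # 1,000 peta ops/s goal for the full network
laptop_ops = 40 * TERA           # ~40 trillion ops/s for a recent AI laptop

# One satellite delivers roughly 18.6x the quoted laptop figure.
print(per_satellite_ops / laptop_ops)   # 18.6

# The full-network target is exa-scale: one quintillion ops/s.
print(target_ops / EXA)                 # 1.0
```

At these quoted figures, a single satellite out-computes the laptop by more than an order of magnitude, and the full 2,800-satellite target sits at exa-scale.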

Technology leaders see space as the next frontier

At a technology conference in Macau on May 21, Zhejiang Lab director Wang Jian described space as the next major stage for AI development. As reported by the South China Morning Post, Wang said the time has come to think beyond phones and laptops, calling space a new frontier for innovation over the next few decades.

Edge computing reduces lost data

Satellites have long been used for navigation, climate monitoring, weather forecasting, and communication. However, they typically send raw data back to Earth for analysis—a process constrained by limited bandwidth and brief transmission windows. Much of the information is delayed or lost along the way.

The new constellation uses edge computing to address these challenges. By processing data directly on board, the satellites can transmit only final results back to Earth. This saves time, reduces bandwidth requirements, and enables faster, more efficient use of satellite resources.
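The edge-computing idea described above can be sketched in a few lines: rather than downlinking every raw reading, the satellite filters and summarizes data on board and transmits only the result. All function names and thresholds here are illustrative assumptions, not part of the constellation's actual software.

```python
def onboard_summarize(raw_readings, threshold=0.8):
    """Process sensor data on board; return only the compact result to downlink."""
    # Raw readings stay on the satellite; only detections matter to the ground.
    detections = [r for r in raw_readings if r >= threshold]
    return {
        "raw_count": len(raw_readings),   # volume that never leaves orbit
        "detections": len(detections),    # what actually gets transmitted
        "max_value": max(detections, default=None),
    }

# Simulated sensor stream: five readings, two above the detection threshold.
raw = [0.1, 0.95, 0.3, 0.85, 0.2]
summary = onboard_summarize(raw)
print(summary)  # a few bytes downlinked instead of the full raw stream
```

The design choice is the same one the article describes: bandwidth is spent on conclusions, not measurements, so the link budget and transmission windows go much further.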

Solar power and space cooling lower environmental impact

The system is also designed with sustainability in mind. The satellites rely on solar energy and release heat directly into space, helping to lower their environmental footprint.

The 12 satellites communicate through laser links, forming a connected network in orbit. One satellite also carries a specialized detector to observe high-energy space events such as gamma-ray bursts.

Project inspired by science and fiction

The constellation is named after the “three-body problem,” a scientific question first raised by Isaac Newton, which deals with the unpredictable movement of three objects under gravitational forces. The name also references the acclaimed science fiction trilogy The Three-Body Problem by Chinese author Liu Cixin, now adapted into a Netflix series.

According to Wang, the project reflects the complexity of managing many interconnected systems. He called for international cooperation, noting that the computing network will be open to organizations worldwide for development and use.

US eyes similar advancements

While the United States and Europe have conducted limited tests of space-based computing, China’s constellation is the first to reach operational scale. Other efforts are also taking shape.

Former Google CEO Eric Schmidt, now an investor in the California-based space firm Relativity Space, has proposed building entire data centers in orbit.

Rising demand drives new energy concerns

At an April 9 hearing at the US House Committee on Energy and Commerce, Schmidt discussed the increasing energy demands of artificial intelligence and data infrastructure. “People are planning 10 gigawatt data centers,” he said. “It gives you a sense of how big this crisis is.”

He estimated that data centers could require 29 additional gigawatts of power by 2027, and 67 more by 2030. “These things are industrial at a scale that I have never seen in my life,” Schmidt told lawmakers.
