Orbital AI data centers could work, but they might ruin Earth in the process

Engadget

A single collision could cause a cascading effect in orbit. Elon Musk's plan to launch millions of AI satellites could be disastrous for the planet. At the start of the month, Elon Musk announced that two of his companies -- SpaceX and xAI -- were merging and would jointly launch a constellation of 1 million satellites to operate as orbital data centers. Musk's reputation might suggest otherwise, but according to experts, such a plan isn't complete fantasy. However, if executed at the scale suggested, some of them believe it would have devastating effects on the environment and on the sustainability of low Earth orbit.


Why did SpaceX just apply to launch 1 million satellites?

New Scientist

Why did SpaceX just apply to launch 1 million satellites? We are only a month into 2026, yet it's already clear what one of the major space stories of the year is going to be: mega-constellations, and the ongoing attempts to launch thousands of satellites into Earth's orbit. The latest development is that SpaceX has asked the US Federal Communications Commission (FCC) for permission to launch 1 million orbital data centre satellites. The previous largest filing with the FCC, also by SpaceX, was for 42,000 Starlink satellites in 2019. "This is beyond what's been proposed by any constellation," says Victoria Samson at the Secure World Foundation in the US.


Amazon's 180 internet satellites are already too bright. It wants 3,000 more.

Popular Science

Amazon's 180 internet satellites are already too bright. A new study determined 92% of Amazon Leo's satellites may currently impede research. Amazon is racing to catch up to Starlink in the battle for satellite internet dominance, and it's creating problems for everyone else. Only 180 of the proposed 3,236 Amazon Leo satellites are currently in low Earth orbit, but they're already routinely bright enough to disrupt astronomical research, according to a forthcoming study.


Stratospheric internet could finally start taking off this year

MIT Technology Review

High-altitude platforms could help connect over 2 billion people around the world who are still offline. Today, an estimated 2.2 billion people remain without internet access, but that number could drop this year, thanks to tests of stratospheric airships, uncrewed aircraft, and other high-altitude platforms for internet delivery. Even with nearly 10,000 active Starlink satellites in orbit and the OneWeb constellation of 650 satellites, solid internet coverage is not a given across vast swathes of the planet. One of the most prominent efforts to plug the connectivity gap was Google X's Loon project. Launched in 2011, it aimed to deliver access using high-altitude balloons stationed above predetermined spots on Earth. But the project faced literal headwinds -- the Loons kept drifting away and new ones had to be released constantly, making the venture economically unfeasible.


China applies to launch 200,000 satellites into space, sparking concerns they plan to build a 'mega-constellation'

Daily Mail - Science & tech

Each of these enormous collections of spacecraft, dubbed CTC-1 and CTC-2, would contain 96,714 satellites spread over 3,660 different orbits. If completed, China's new mega-constellation would dwarf even SpaceX's bold ambition to put 49,000 Starlink satellites in orbit. Together, CTC-1 and CTC-2 would be the largest assembly of satellites ever put in orbit, and would effectively lock competitors out of a region of low-Earth orbit. With Chinese authorities remaining quiet about the satellites' intended use, experts have raised concerns that the constellation may pose a security or defence threat. As reported by China in Space, the Nanjing University of Aeronautics claims that the satellites will focus on: 'Low-altitude electromagnetic space security, integrated security defence systems, electromagnetic space security assessment of airspace, and low-altitude airspace safety supervision services.'


Decentralized Trust for Space AI: Blockchain-Based Federated Learning Across Multi-Vendor LEO Satellite Networks

Elmahallawy, Mohamed, Akbarfam, Asma Jodeiri

arXiv.org Artificial Intelligence

The rise of space AI is reshaping government and industry through applications such as disaster detection, border surveillance, and climate monitoring, powered by massive data from commercial and governmental low Earth orbit (LEO) satellites. Federated satellite learning (FSL) enables joint model training without sharing raw data, but suffers from slow convergence due to intermittent connectivity and introduces critical trust challenges: biased or falsified updates can arise across satellite constellations, including updates injected through cyberattacks on inter-satellite or satellite-ground communication links. We propose OrbitChain, a blockchain-backed framework that enables trustworthy multi-vendor collaboration in LEO networks. OrbitChain (i) offloads consensus to high-altitude platforms (HAPs) with greater computational capacity, (ii) ensures transparent, auditable provenance of model updates from different orbits owned by different vendors, and (iii) prevents manipulated or incomplete contributions from affecting global FSL model aggregation. Extensive simulations show that OrbitChain reduces computational and communication overhead while improving privacy, security, and global model accuracy. Its permissioned proof-of-authority ledger finalizes over 1,000 blocks with sub-second latency (0.16 s, 0.26 s, and 0.35 s for 1-of-5, 3-of-5, and 5-of-5 quorums). Moreover, OrbitChain reduces convergence time by up to 30 hours on real satellite datasets compared to a single-vendor baseline, demonstrating its effectiveness for real-time, multi-vendor learning. Our code is available at https://github.com/wsu-cyber-security-lab-ai/OrbitChain.git
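The k-of-5 quorums mentioned in the abstract can be illustrated with a minimal sketch of quorum finality in a permissioned proof-of-authority ledger. All names here (the HAP validator set, `Block`, `is_final`) are illustrative assumptions, not OrbitChain's actual code:

```python
# Hypothetical sketch: k-of-n quorum finality for a permissioned
# proof-of-authority ledger whose validators run on HAPs.
from dataclasses import dataclass, field

AUTHORITIES = {"hap-1", "hap-2", "hap-3", "hap-4", "hap-5"}  # fixed validator set

@dataclass
class Block:
    height: int
    payload: str                       # e.g. digest of a model-update bundle
    signatures: set = field(default_factory=set)

def sign(block: Block, authority: str) -> None:
    """An authority endorses the block; unknown signers are ignored."""
    if authority in AUTHORITIES:
        block.signatures.add(authority)

def is_final(block: Block, quorum: int) -> bool:
    """A block is final once `quorum` distinct authorities have signed it."""
    return len(block.signatures & AUTHORITIES) >= quorum

blk = Block(height=1, payload="model-update-digest")
for a in ["hap-1", "hap-2", "hap-3"]:
    sign(blk, a)

print(is_final(blk, quorum=3))  # True: 3-of-5 quorum reached
print(is_final(blk, quorum=5))  # False: full 5-of-5 quorum not reached
```

The reported latencies (0.16 s to 0.35 s) then correspond to how long it takes to collect the required number of signatures over inter-HAP links.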


Graph Theory Meets Federated Learning over Satellite Constellations: Spanning Aggregations, Network Formation, and Performance Optimization

Nadimi, Fardis, Abdisarabshali, Payam, Chakareski, Jacob, Mastronarde, Nicholas, Hosseinalipour, Seyyedali

arXiv.org Artificial Intelligence

In this work, we introduce Fed-Span: federated learning with spanning aggregation over low Earth orbit (LEO) satellite constellations. Fed-Span aims to address critical challenges inherent to distributed learning in dynamic satellite networks, including intermittent satellite connectivity, heterogeneous computational capabilities of satellites, and time-varying satellite datasets. At its core, Fed-Span leverages minimum spanning tree (MST) and minimum spanning forest (MSF) topologies to introduce spanning model aggregation and dispatching processes for distributed learning. To formalize Fed-Span, we offer a fresh perspective on MST/MSF topologies by formulating them through a set of continuous constraint representations (CCRs), thereby integrating these topologies into a distributed learning framework for satellite networks. Using these CCRs, we obtain the energy consumption and latency of operations in Fed-Span. Moreover, we derive novel convergence bounds for Fed-Span, accommodating its key system characteristics and degrees of freedom (i.e., tunable parameters). Finally, we propose a comprehensive optimization problem that jointly minimizes model prediction loss, energy consumption, and latency of Fed-Span. We unveil that this problem is NP-hard and develop a systematic approach to transform it into a geometric programming formulation, solved via successive convex optimization with performance guarantees. Through evaluations on real-world datasets, we demonstrate that Fed-Span outperforms existing methods, with faster model convergence, greater energy efficiency, and reduced latency.
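The MST topology at the heart of Fed-Span can be sketched with standard Kruskal's algorithm: given inter-satellite links weighted by latency, keep the cheapest edges that connect all satellites without cycles, and route model aggregation along the resulting tree. The satellite names and latencies below are made up, and this is textbook Kruskal, not the paper's CCR formulation:

```python
# Illustrative sketch: minimum spanning tree over inter-satellite link
# latencies, the kind of topology Fed-Span aggregates models along.
def kruskal_mst(nodes, edges):
    """edges: (latency, u, v) triples. Returns the MST edge list via union-find."""
    parent = {n: n for n in nodes}

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path halving
            x = parent[x]
        return x

    mst = []
    for w, u, v in sorted(edges):      # cheapest links first
        ru, rv = find(u), find(v)
        if ru != rv:                   # edge joins two components: keep it
            parent[ru] = rv
            mst.append((w, u, v))
    return mst

sats = ["s1", "s2", "s3", "s4"]
links = [(5, "s1", "s2"), (3, "s2", "s3"), (9, "s1", "s3"),
         (4, "s3", "s4"), (8, "s2", "s4")]

tree = kruskal_mst(sats, links)
print(tree)                        # [(3,'s2','s3'), (4,'s3','s4'), (5,'s1','s2')]
print(sum(w for w, _, _ in tree))  # 12: total link latency of the spanning tree
```

A minimum spanning *forest* is the same construction applied when connectivity gaps split the constellation into several components, yielding one tree per reachable cluster.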


Towards a future space-based, highly scalable AI infrastructure system design

Arcas, Blaise Agüera y, Beals, Travis, Biggs, Maria, Bloom, Jessica V., Fischbacher, Thomas, Gromov, Konstantin, Köster, Urs, Pravahan, Rishiraj, Manyika, James

arXiv.org Artificial Intelligence

If AI is a foundational general-purpose technology, we should anticipate that demand for AI compute -- and energy -- will continue to grow. The Sun is by far the largest energy source in our solar system, and it is therefore worth considering how future AI infrastructure could most efficiently tap into that power. This work explores a scalable compute system for machine learning in space, using fleets of satellites equipped with solar arrays, inter-satellite links using free-space optics, and Google tensor processing unit (TPU) accelerator chips. To facilitate high-bandwidth, low-latency inter-satellite communication, the satellites would be flown in close proximity. We illustrate the basic approach to formation flight via an 81-satellite cluster of 1 km radius, and describe an approach for using high-precision ML-based models to control large-scale constellations. Trillium TPUs have been radiation tested: they survive a total ionizing dose equivalent to a five-year mission life without permanent failures, and have been characterized for bit-flip errors. Launch costs are a critical part of overall system cost; a learning curve analysis suggests launch to low-Earth orbit (LEO) may fall to roughly $200/kg or less by the mid-2030s.
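The learning-curve argument follows the familiar Wright's-law form: cost per kilogram falls by a fixed fraction with every doubling of cumulative mass launched. The sketch below illustrates that shape only; the starting cost, learning rate, and doubling count are assumed numbers, not the paper's fitted parameters:

```python
# Illustrative Wright's-law cost curve: each doubling of cumulative launched
# mass multiplies $/kg by (1 - lr). Parameters here are assumptions.
import math

def launch_cost_per_kg(cost0: float, cum0: float, cum: float, lr: float) -> float:
    """Wright's law: cost = cost0 * (cum/cum0)^(-b), with b set so that
    one doubling of cumulative output scales cost by (1 - lr)."""
    b = -math.log2(1.0 - lr)               # learning exponent
    return cost0 * (cum / cum0) ** (-b)

cost0 = 1500.0  # assumed present-day $/kg to LEO, for illustration only
for doublings in range(7):
    cum = 2.0 ** doublings
    print(doublings, round(launch_cost_per_kg(cost0, 1.0, cum, lr=0.20), 1))
```

With a 20% learning rate, each doubling multiplies cost by 0.8, so six doublings cut the starting figure by roughly a factor of four.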


Bringing Federated Learning to Space

Kim, Grace, Svoboda, Filip, Lane, Nicholas

arXiv.org Artificial Intelligence

Abstract -- As Low Earth Orbit (LEO) satellite constellations rapidly expand to hundreds and thousands of spacecraft, the need for distributed on-board machine learning becomes critical to address downlink bandwidth limitations. Federated learning (FL) offers a promising framework for collaborative model training across satellite networks. Realizing its benefits in space naturally requires addressing space-specific constraints, from intermittent connectivity to the dynamics imposed by orbital motion. This work presents the first systematic feasibility analysis of adapting off-the-shelf FL algorithms for satellite constellation deployment. We introduce a comprehensive "space-ification" framework that adapts terrestrial algorithms (FedAvg, FedProx, FedBuff) to operate under orbital constraints, producing an orbital-ready suite of FL algorithms. We then evaluate these space-ified methods through extensive parameter sweeps across 768 constellation configurations that vary cluster sizes (1-10), satellites per cluster (1-10), and ground station networks (1-13). Our analysis demonstrates that space-adapted FL algorithms efficiently scale to constellations of up to 100 satellites, achieving performance close to the centralized ideal. Multi-month training cycles can be reduced to days, corresponding to a 9X speedup through orbital scheduling and local coordination within satellite clusters. These results provide actionable insights for future mission designers, enabling distributed on-board learning for more autonomous, resilient, and data-driven satellite operations. Low Earth Orbit (LEO) satellite constellations are expanding rapidly, supporting applications in Earth observation (EO), telecommunications, and navigation. Large-scale constellations such as Planet Labs' Dove fleet, SpaceX's Starlink, and Amazon's Project Kuiper already consist of hundreds to thousands of spacecraft, representing some of the largest distributed systems ever deployed.
This unprecedented scale is driving a dramatic increase in the volume and diversity of space-based data. Earth observation missions in particular bear the brunt of this data challenge. High-resolution missions such as Landsat-8 produce 1.8 GB per scene and more than 400 TB annually [1]. At constellation scale, Planet Labs' fleet of over 200 satellites generates terabytes of imagery each day [2].
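The FedAvg baseline the paper "space-ifies" can be sketched in a few lines: each reachable satellite contributes its local model weighted by its local sample count, and satellites out of ground-station contact simply sit out the round. The satellite names, sample counts, and two-parameter models below are illustrative, not the paper's setup:

```python
# Minimal FedAvg sketch under intermittent connectivity: only satellites
# currently visible contribute, and the average is weighted by each
# participant's local sample count.
def fedavg(updates):
    """updates: list of (num_samples, weights) pairs from reachable satellites."""
    total = sum(n for n, _ in updates)
    dim = len(updates[0][1])
    return [sum(n * w[i] for n, w in updates) / total for i in range(dim)]

# Three satellites hold local models; only two are in contact this round.
sat_models = {
    "sat-a": (100, [1.0, 2.0]),
    "sat-b": (300, [3.0, 4.0]),
    "sat-c": (600, [9.0, 9.0]),   # out of contact: skipped this round
}
visible = ["sat-a", "sat-b"]

round_update = fedavg([sat_models[s] for s in visible])
print(round_update)   # [2.5, 3.5]: sat-b's 300 samples outweigh sat-a's 100
```

Variants such as FedProx and FedBuff change how stale or straggling updates are handled, which matters when a satellite like `sat-c` rejoins several rounds late.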