Event-RGB Fusion for Spacecraft Pose Estimation Under Harsh Lighting
Mohsi Jawaid, Marcus Märtens, Tat-Jun Chin
arXiv.org Artificial Intelligence
Spacecraft pose estimation is crucial for autonomous in-space operations such as rendezvous, docking and on-orbit servicing. Vision-based pose estimation methods, which typically employ RGB imaging sensors, are a compelling solution for spacecraft pose estimation, but they are challenged by harsh lighting conditions that produce imaging artifacts such as glare, over-exposure, blooming and lens flare. Due to their much higher dynamic range, neuromorphic or event sensors are more resilient to extreme lighting conditions. However, event sensors generally have lower spatial resolution and suffer from reduced signal-to-noise ratio during periods of low relative motion. To combine the strengths of the two modalities, a beam-splitter prism was employed to achieve precise optical and temporal alignment of the RGB and event channels. A RANSAC-based technique was then developed to fuse the information from the two channels into a single pose estimate. The pipeline was complemented by dropout uncertainty estimation to detect extreme conditions that affect either channel. To benchmark the performance of the proposed event-RGB fusion method, we collected a comprehensive real dataset of RGB and event data for satellite pose estimation in a laboratory setting under a variety of challenging illumination conditions. Encouraging results on the dataset demonstrate the efficacy of our event-RGB fusion approach and further support the usage of event sensors for spacecraft pose estimation. To support community research on this topic, our dataset has been released publicly.

Keywords: event-based pose estimation, rendezvous, domain gap, sensor fusion, close proximity, harsh lighting

1. Introduction

Spacecraft pose estimation is the problem of determining the 6-degrees-of-freedom (6DoF) pose, consisting of the position and orientation, of a space-borne object, typically a satellite.
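The abstract describes a RANSAC-based technique that pools correspondences from the RGB and event channels to reach a consensus pose. The sketch below illustrates that fusion idea only in outline: the paper fuses 2D-3D correspondences via PnP, whereas for self-containedness this example uses 3D-3D correspondences with a Kabsch solver inside a RANSAC loop. All function names, thresholds and iteration counts are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def kabsch(src, dst):
    """Least-squares rigid transform (R, t) mapping src points onto dst.
    Illustrative solver; the paper uses PnP on 2D-3D correspondences."""
    cs, cd = src.mean(axis=0), dst.mean(axis=0)
    H = (src - cs).T @ (dst - cd)
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))    # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cd - R @ cs
    return R, t

def ransac_fuse(src_rgb, dst_rgb, src_evt, dst_evt,
                iters=200, thresh=0.05, seed=0):
    """Pool correspondences from both channels and let RANSAC pick the
    consensus pose, so outliers in either channel are rejected jointly."""
    rng = np.random.default_rng(seed)
    src = np.vstack([src_rgb, src_evt])
    dst = np.vstack([dst_rgb, dst_evt])
    n = len(src)
    best_inl = np.zeros(n, dtype=bool)
    for _ in range(iters):
        idx = rng.choice(n, 3, replace=False)       # minimal sample
        R, t = kabsch(src[idx], dst[idx])
        err = np.linalg.norm(src @ R.T + t - dst, axis=1)
        inl = err < thresh
        if inl.sum() > best_inl.sum():
            best_inl = inl
    R, t = kabsch(src[best_inl], dst[best_inl])     # refit on all inliers
    return R, t, best_inl
```

Pooling the two channels before the RANSAC loop means that when one modality is degraded (e.g. the RGB channel under glare), its corrupted correspondences are simply voted out, while the healthy channel still anchors the consensus.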
It is a critical step in a wide range of space applications, including rendezvous, close proximity operations, debris removal, refueling and on-orbit servicing [1, 2, 3, 4]. Robust pose estimation is paramount to safely and effectively executing these tasks [5, 6]. Several types of sensor technologies can be employed for spacecraft pose estimation, but they are all subject to size-weight-power and cost (SWaP-C) constraints. Optical sensors such as RGB imaging sensors are favored due to their low SWaP-C requirements, high resolution and the availability of established vision-based algorithms. However, operating in the space environment can present nontrivial challenges to vision-based systems.
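The pipeline outlined in the abstract also uses dropout uncertainty estimation to flag extreme conditions affecting either channel. A minimal Monte-Carlo dropout sketch of that general idea is shown below, in plain NumPy: several stochastic forward passes are averaged and their spread serves as an uncertainty signal. The two-layer regressor, its shapes, and the dropout rate are illustrative assumptions and do not reflect the authors' network.

```python
import numpy as np

def mc_dropout_predict(x, W1, W2, p=0.5, T=100, seed=0):
    """Monte-Carlo dropout: run T stochastic forward passes of a tiny
    two-layer regressor; the per-output std acts as an uncertainty score.
    Hypothetical network for illustration only."""
    rng = np.random.default_rng(seed)
    preds = []
    for _ in range(T):
        h = np.maximum(x @ W1, 0.0)          # ReLU hidden layer
        mask = rng.random(h.shape) >= p      # random dropout mask
        h = h * mask / (1.0 - p)             # inverted-dropout scaling
        preds.append(h @ W2)
    preds = np.stack(preds)
    return preds.mean(axis=0), preds.std(axis=0)
```

In a fusion setting, a large predictive std for one channel's pose output can be used to down-weight or discard that channel before the final estimate is formed.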
Nov-4-2025