Abstract
Deep-space missions to asteroids rely heavily on autonomous navigation systems to perform proximity operations around their targets, which requires perceiving the target and estimating its relative pose (position and attitude). The dynamical environment around asteroids is challenging due to poor illumination conditions and large uncertainties in the target's shape and motion, so developing robust pose estimation methods is essential to mission success. This work aims to evaluate how sensor data fusion can improve the robustness of pose estimation, allowing the shortcomings of an individual sensor to be addressed by combining it with complementary data from other sensors. The fusion of visible camera data, thermal camera data, and laser altimeter (range) data is explored in this study, as these are common sensors onboard spacecraft operating around asteroids. This allows the limitations of visible data in shadow and in depth to be addressed by thermal and range data. A Convolutional Neural Network (CNN) is applied for feature extraction, as CNN architectures have achieved superior results over hand-crafted methods for many image processing tasks, including artificial satellite pose estimation. The feature maps output by the CNN are then used to infer the relative pose and the change in pose between frames. Different fusion levels are explored. Source-level fusion is performed by training a CNN on a tensor input containing the data from multiple sensors. Decision-level fusion is performed by producing covariance matrices for the positions of the features identified by the CNN and fusing them using an Unscented Kalman Filter (UKF). Due to the lack of real data sets available for CNN training and testing, synthetic sensor data sets are generated using Blender. Range data is generated using the BLAINDER plugin. Thermal images required simulating a thermal model of the asteroid before importing the temperature data into Blender. This study uses the Didymos binary asteroid as the target, which is also the target of ESA's Hera mission, to be launched later this year. The accuracy and computational cost of the developed methodology are investigated and compared to a control scenario where only visible images are used for pose estimation. This research demonstrates how different fusion methods and levels could improve the robustness of pose estimation under poor lighting conditions and dynamical uncertainties, enabling robust, autonomous navigation design for future asteroid missions.
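As a rough illustration of the source-level fusion described above, the sketch below stacks visible, thermal, and range frames along the channel axis and passes the combined tensor through a small CNN. This is a minimal sketch assuming PyTorch; the image dimensions, layer sizes, and variable names are illustrative placeholders, not the architecture used in the paper.

```python
import torch
import torch.nn as nn

# Hypothetical frame size; each sensor produces one H x W channel.
H, W = 128, 128
visible = torch.rand(1, 1, H, W)  # grayscale visible image
thermal = torch.rand(1, 1, H, W)  # normalised temperature map
rng = torch.rand(1, 1, H, W)      # range image from the laser altimeter

# Source-level fusion: concatenate along the channel axis so a single
# CNN sees all three modalities in one input tensor.
fused = torch.cat([visible, thermal, rng], dim=1)  # shape (1, 3, H, W)

# Minimal feature extractor; the paper's CNN architecture is not specified here.
backbone = nn.Sequential(
    nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(),
    nn.MaxPool2d(2),
    nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
    nn.MaxPool2d(2),
)
feature_maps = backbone(fused)  # feature maps used downstream for pose inference
print(feature_maps.shape)       # torch.Size([1, 32, 32, 32])
```

Stacking channels this way lets a single network learn cross-modal features directly, at the cost of requiring all sensor inputs to be spatially registered to a common frame.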
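Decision-level fusion can be illustrated in a similar spirit: per-sensor measurements of a feature position, each with its own covariance matrix, are fused by sequential UKF updates. The sketch below assumes the filterpy library; the constant-position model, sigma-point parameters, and covariance values are illustrative, not the paper's filter design.

```python
import numpy as np
from filterpy.kalman import UnscentedKalmanFilter, MerweScaledSigmaPoints

# State: 2-D feature position; constant-position model for brevity.
def fx(x, dt):
    return x  # feature assumed static between frames in this toy model

def hx(x):
    return x  # each sensor measures the position directly

points = MerweScaledSigmaPoints(n=2, alpha=1e-3, beta=2.0, kappa=0.0)
ukf = UnscentedKalmanFilter(dim_x=2, dim_z=2, dt=1.0,
                            hx=hx, fx=fx, points=points)
ukf.x = np.array([10.0, 20.0])
ukf.P = np.eye(2) * 5.0
ukf.Q = np.eye(2) * 0.01

# Per-sensor measurement covariances (illustrative values): the visible
# camera is trusted more than the noisier thermal features here.
R_visible = np.diag([0.5, 0.5])
R_thermal = np.diag([2.0, 2.0])

# Decision-level fusion: one UKF update per sensor, each weighted by
# its own covariance matrix.
ukf.predict()
ukf.update(np.array([10.3, 19.8]), R=R_visible)
ukf.update(np.array([10.9, 20.5]), R=R_thermal)
print(ukf.x, np.diag(ukf.P))
```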
Original language | English |
---|---|
Title of host publication | IAF Astrodynamics Symposium |
Subtitle of host publication | Held at the 75th International Astronautical Congress (IAC 2024) |
Pages | 466-475 |
Publication status | Published - 18 Oct 2024 |
Event | 75th International Astronautical Congress, MICO Convention Centre, Milan, Italy. Duration: 14 Oct 2024 → 18 Oct 2024. Conference number: 75. https://www.iac2024.org/ |
Conference
Conference | 75th International Astronautical Congress |
---|---|
Abbreviated title | IAC 2024 |
Country/Territory | Italy |
City | Milan |
Period | 14/10/24 → 18/10/24 |
Internet address | https://www.iac2024.org/ |
Keywords
- AI
- deep learning (DL)
- sensor fusion