AI-based sensor fusion for robust pose estimation and autonomous navigation of spacecraft missions to asteroids

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

Deep space missions to asteroids depend significantly on autonomous navigation systems to perform proximity operations around the target asteroids, which requires perceiving the target and estimating its relative pose (position and attitude). The dynamical environment around asteroids is challenging due to poor illumination conditions and large uncertainties in the target's shape and motion. Developing robust pose estimation methods is essential to mission success. This work aims to evaluate how sensor data fusion can improve the robustness of pose estimation, allowing the shortcomings of an individual sensor to be addressed by combining it with complementary data from other sensors. The fusion of visible camera data, thermal camera data, and laser altimeter (range) data is explored in this study, as these are common sensors on board spacecraft at asteroids. This allows the limitations of visible data in shadowed regions and in depth perception to be addressed by thermal and range data. A Convolutional Neural Network (CNN) is applied for feature extraction, as CNN architectures have achieved superior results over hand-crafted methods for many image processing tasks, including artificial satellite pose estimation. The feature maps output by the CNN are then used to infer the relative pose and the change in pose between frames. Different fusion levels are explored. Source-level fusion is performed by training a CNN on a tensor input containing the data from multiple sensors. Decision-level fusion is performed by producing covariance matrices for the positions of the features identified by the CNN and fusing them using an Unscented Kalman Filter. Due to the lack of available real data sets for CNN training and testing, synthetic sensor data sets are generated using Blender. Range data is generated using the BLAINDER plugin. Generating thermal images required simulating a thermal model of the asteroid before importing the temperature data into Blender.
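The two fusion levels described above can be sketched as follows. This is a minimal illustrative example, not the authors' implementation: source-level fusion is shown as stacking co-registered visible, thermal, and range maps into one multi-channel input tensor for a CNN, and decision-level fusion is shown as a static inverse-covariance (information-weighted) combination of two feature-position estimates — a simplified stand-in for the measurement-fusion step that the paper performs inside an Unscented Kalman Filter. All function names and the per-channel min-max normalisation are assumptions for illustration.

```python
import numpy as np


def stack_sensor_tensor(visible, thermal, rng):
    """Source-level fusion (sketch): stack co-registered H x W sensor maps
    into a single H x W x 3 tensor suitable as CNN input.

    Each channel is min-max normalised to [0, 1] so the sensors'
    differing physical units do not dominate one another.
    """
    channels = []
    for img in (visible, thermal, rng):
        img = np.asarray(img, dtype=np.float64)
        lo, hi = img.min(), img.max()
        channels.append((img - lo) / (hi - lo) if hi > lo else np.zeros_like(img))
    return np.stack(channels, axis=-1)


def fuse_feature_positions(z1, P1, z2, P2):
    """Decision-level fusion (sketch): combine two estimates of the same
    feature position, each with its own covariance, by inverse-covariance
    weighting. The fused covariance is smaller than either input, and the
    fused position is pulled toward the more certain estimate.
    """
    I1, I2 = np.linalg.inv(P1), np.linalg.inv(P2)
    P_fused = np.linalg.inv(I1 + I2)
    z_fused = P_fused @ (I1 @ z1 + I2 @ z2)
    return z_fused, P_fused
```

For two equally uncertain estimates (identity covariances), the fused position is the midpoint of the two measurements and the fused covariance is halved, which matches the intuition that agreeing sensors tighten the estimate.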
This study is performed with the Didymos binary asteroid as the target, which is also the target of ESA's Hera mission, due to launch later this year. The accuracy and computational cost of the developed methodology are investigated and compared against a control scenario in which only visible images are used for pose estimation. This research demonstrates how different fusion methods and levels could improve the robustness of pose estimation under poor lighting conditions and dynamical uncertainties, enabling robust and autonomous navigation design for future asteroid missions.
Original language: English
Title of host publication: IAF Astrodynamics Symposium
Subtitle of host publication: Held at the 75th International Astronautical Congress (IAC 2024)
Pages: 466-475
DOIs
Publication status: Published - 18 Oct 2024
Event: 75th International Astronautical Congress - MICO Convention Centre, Milan, Italy
Duration: 14 Oct 2024 - 18 Oct 2024
Conference number: 75
https://www.iac2024.org/

Conference

Conference: 75th International Astronautical Congress
Abbreviated title: IAC 2024
Country/Territory: Italy
City: Milan
Period: 14/10/24 - 18/10/24

Keywords

  • AI
  • deep learning (DL)
  • sensor fusion
