Abstract
Asteroid missions depend on autonomous navigation to carry out operations. Estimating the asteroid's relative position is a key step, but it can be challenging under poor illumination. We explore how machine-learning-based fusion of optical and thermal sensor data can enable more robust position estimation. Source-level fusion of visible and thermal images using Convolutional Neural Networks is developed and tested on synthetic images based on ESA's Hera mission scenario. Results show that thermal images improve feature extraction, and that source-level sensor fusion outperforms the use of thermal images alone. This yields better identification of the asteroid's centroid, but has a much smaller effect on range estimation.
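The source-level fusion described in the abstract can be sketched as below: the two modalities are normalised and stacked into one multi-channel input *before* any network processing, so the first convolutional layer mixes visible and thermal information together. The image sizes, normalisation scheme, and random kernel here are illustrative assumptions, not the paper's actual pipeline.

```python
import numpy as np

# Placeholder images standing in for a visible-band and a thermal frame
# (assumed sizes; the real Hera-scenario images differ).
H, W = 64, 64
rng = np.random.default_rng(0)
visible = rng.random((H, W)).astype(np.float32)
thermal = rng.random((H, W)).astype(np.float32)

def normalise(img):
    """Per-image min-max normalisation (an assumption, so the two
    modalities share a comparable dynamic range before fusion)."""
    return (img - img.min()) / (img.max() - img.min() + 1e-8)

# Source-level fusion: stack modalities as channels of one input tensor.
fused = np.stack([normalise(visible), normalise(thermal)], axis=0)  # (2, H, W)

# A single 3x3 convolution over the fused input; a fusion CNN's first
# layer combines both channels in exactly this way.
kernel = rng.random((2, 3, 3)).astype(np.float32)

def conv2d_valid(x, k):
    """Naive 'valid' 2D convolution summing over all input channels."""
    c, h, w = x.shape
    kh, kw = k.shape[1:]
    out = np.zeros((h - kh + 1, w - kw + 1), dtype=np.float32)
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(x[:, i:i + kh, j:j + kw] * k)
    return out

feat = conv2d_valid(fused, kernel)
print(fused.shape, feat.shape)  # (2, 64, 64) (62, 62)
```

In a real system the random kernel would be a learned layer (e.g. in PyTorch or TensorFlow), and the network head would regress the centroid and range; the point of the sketch is only that fusion happens at the input, not at the feature or decision level.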
Original language | English |
---|---|
Title of host publication | Proceedings of SPAICE2024 |
Subtitle of host publication | The First Joint European Space Agency / IAA Conference on AI in and for Space |
Publisher | Zenodo |
Pages | 470-475 |
Number of pages | 5 |
DOIs | |
Publication status | Published - 19 Sept 2024 |
Event | SPAICE: AI in and for Space - European Centre for Space Applications and Telecommunications (ECSAT), Oxford, United Kingdom |
Duration | 17 Sept 2024 → 19 Sept 2024 |
Conference number | 1 |
Internet address | https://spaice.esa.int/ |
Conference
Conference | SPAICE |
---|---|
Abbreviated title | SPAICE |
Country/Territory | United Kingdom |
City | Oxford |
Period | 17/09/24 → 19/09/24 |
Internet address | https://spaice.esa.int/ |
Keywords
- AI
- Navigation
- asteroids
- sensor fusion