Vision enabled smart manipulations for in-space construction

Xiu-Tian Yan, Nassir Oumer, Mutian Li, Chenhao Ran, Maximo Roa, Baixiang Zhao, Youhua Li, Pierre Letier

Research output: Contribution to conference › Paper › peer-review

Abstract

In recent years, vision systems, and in particular the algorithms developed to process images, have made significant advances in recognizing many types of objects, from faces to agricultural produce and engineering parts. Vision hardware has also become much more accessible. Vision systems therefore have the potential to enable automatic monitoring and manoeuvring of spacecraft and robots in space. Meaningful and critical spatial and geometric information about a targeted object can be extracted from camera images and communicated to a robotic planner, enabling real-time replanning based on newly identified information; it can also be relayed to a human operator for immediate reaction to unforeseen events. Such vision systems have been deployed successfully in factories and terrestrial robotic applications, but they face unique challenges in space, where the environment is unstructured and may also be unknown. Unreliable and unstable illumination affects image quality over extended observations. Furthermore, satellites and spacecraft are built from highly reflective materials, and parts may no longer be recognizable due to excessive reflected brightness. As space industry practice requires the deployment of highly reliable systems, it is imperative that these problems are systematically analysed and addressed before a vision system can become operational in space. This article is a step towards reliable space vision systems. It introduces a systematic approach to measuring and validating the performance of vision algorithms under different illumination conditions. Firstly, a test environment, a test rig purpose-built for space applications, and a new image dataset for testing 3D surface reconstruction algorithms are presented. Thereafter, the performance measures and test results for object localization algorithms are described, supporting the wider European space demonstrator project MOSAR, which aims to demonstrate autonomous assembly of spacecraft using a walking manipulator.
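The abstract refers to performance measures for object localization under varying illumination without spelling them out. As an illustrative sketch only (not the paper's actual evaluation code), a common way to quantify 6-DoF localization performance is the translation error and the geodesic rotation error between estimated and ground-truth poses, aggregated per illumination condition. The function names below are hypothetical.

```python
import numpy as np

def pose_errors(R_est, t_est, R_gt, t_gt):
    """Translation error (same units as t) and rotation error (degrees)
    between an estimated and a ground-truth object pose."""
    t_err = np.linalg.norm(t_est - t_gt)
    # Geodesic distance on SO(3): angle of the relative rotation R_est^T R_gt.
    cos_angle = (np.trace(R_est.T @ R_gt) - 1.0) / 2.0
    r_err = np.degrees(np.arccos(np.clip(cos_angle, -1.0, 1.0)))
    return t_err, r_err

def summarise_by_illumination(results):
    """results: iterable of (illumination_label, R_est, t_est, R_gt, t_gt).
    Returns mean (translation, rotation) error per illumination condition,
    e.g. to compare localization accuracy in dim vs. glare-heavy images."""
    buckets = {}
    for label, R_est, t_est, R_gt, t_gt in results:
        buckets.setdefault(label, []).append(
            pose_errors(R_est, t_est, R_gt, t_gt))
    return {label: tuple(np.mean(np.array(errs), axis=0))
            for label, errs in buckets.items()}
```

Grouping errors by illumination label, rather than reporting a single aggregate, is what allows a systematic comparison of algorithm robustness across lighting conditions of the kind the article describes.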
Original language: English
Publication status: Published - 12 Oct 2020
Event: 71st International Astronautical Congress - Virtual
Duration: 12 Oct 2020 - 14 Oct 2020
Conference number: 71
https://www.iafastro.org/events/iac/iac-2020/

Conference

Conference: 71st International Astronautical Congress
Abbreviated title: IAC 2020
Period: 12/10/20 - 14/10/20
Internet address: https://www.iafastro.org/events/iac/iac-2020/

Keywords

  • vision systems
  • automatic monitoring
  • manoeuvring
  • spacecraft
