Visual pose estimation system for autonomous rendezvous of spacecraft

Mark A. Post, Xiu T. Yan, Junquan Li, Craig Clark

Research output: Contribution to conference › Paper › peer-review



In this work, a tracker spacecraft equipped with a short-range vision system is tasked with visually identifying a target spacecraft and determining its relative angular velocity and relative linear velocity using only visual information from onboard cameras. Focusing on methods that are feasible for implementation on relatively simple spacecraft hardware, we locate and track objects in three-dimensional space using conventional high-resolution cameras, saving cost and power compared to laser or infrared ranging systems. Identification of the target is done by means of visual feature detection and tracking across rapid, successive frames, taking the perspective matrix of the camera system into account, and building feature maps in three dimensions over time. Features detected in two-dimensional images are matched and triangulated to provide three-dimensional feature maps using structure-from-motion techniques. This methodology allows one, two, or more cameras with known baselines to be used for triangulation, with more images resulting in higher accuracy. Triangulated points are organized by means of orientation histogram descriptors and used to identify and track parts of the target spacecraft over time. This allows some estimation of the target spacecraft's motion even if parts of the spacecraft are obscured or in shadow. The state variables with respect to the camera system are extracted as a relative rotation quaternion and relative translation vector for the target. Robust tracking of the state variables for the target spacecraft is accomplished by an embedded adaptive unscented Kalman filter. In addition to estimation of the target quaternion from visual information, the adaptive filter can also identify when tracking errors have occurred by measurement of the residual.
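The triangulation step described above can be illustrated with a minimal sketch, not the authors' implementation: the midpoint method stands in for the paper's structure-from-motion triangulation, assuming two rectified cameras with parallel optical axes separated by a known baseline, and normalized image coordinates for the matched feature. All names and values here are illustrative assumptions.

```python
def triangulate_midpoint(uv1, uv2, baseline):
    """Triangulate one matched feature from two cameras.

    uv1, uv2 : normalized image coordinates (x/z, y/z) of the same
    feature in camera 1 and camera 2; camera 2 is offset from camera 1
    by `baseline` along the x axis, with parallel optical axes.
    Returns the midpoint of closest approach of the two viewing rays.
    """
    c1 = (0.0, 0.0, 0.0)
    c2 = (baseline, 0.0, 0.0)
    d1 = (uv1[0], uv1[1], 1.0)   # ray direction from camera 1
    d2 = (uv2[0], uv2[1], 1.0)   # ray direction from camera 2

    dot = lambda p, q: sum(pi * qi for pi, qi in zip(p, q))
    r = tuple(a - b for a, b in zip(c1, c2))

    # Solve the 2x2 normal equations for the ray parameters t1, t2
    # that minimize the distance between the two rays.
    a, b, c = dot(d1, d1), dot(d1, d2), dot(d2, d2)
    det = b * b - a * c          # zero when the rays are parallel
    t1 = (c * dot(d1, r) - b * dot(d2, r)) / det
    t2 = (b * dot(d1, r) - a * dot(d2, r)) / det

    p1 = tuple(ci + t1 * di for ci, di in zip(c1, d1))
    p2 = tuple(ci + t2 * di for ci, di in zip(c2, d2))
    return tuple((x + y) / 2.0 for x, y in zip(p1, p2))
```

With more than two cameras (or views over time), the same idea generalizes to a least-squares intersection of all rays, which is why additional images raise accuracy.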
Significant variations in lighting can be tolerated as long as the movement of the satellite is consistent with the system model, and illumination changes slowly enough for state variables to be estimated periodically. Inertial measurements over short periods of time can then be used to determine the movement of both the tracker and target spacecraft. In addition, with a sufficient number of features tracked, the center of mass of the target can be located. This method is tested using laboratory images of spacecraft movement with a simulated spacecraft movement model. Varying conditions are applied to demonstrate the effectiveness and limitations of the system for online estimation of the movement of a target spacecraft at close range.
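The residual-based error detection mentioned in the abstract can be sketched in simplified scalar form, assuming a random-walk process model and a chi-square gate on the normalized innovation; the paper's filter is an adaptive unscented Kalman filter over the full quaternion and translation state, so this is only a one-dimensional illustration with assumed names and thresholds.

```python
def kf_step(x, p, z, q, r):
    """One predict/update cycle of a scalar Kalman filter
    (random-walk model). Returns the updated state and covariance,
    plus the innovation (residual) and its variance."""
    p = p + q            # predict: inflate covariance by process noise
    nu = z - x           # innovation: measurement minus prediction
    s = p + r            # innovation variance
    k = p / s            # Kalman gain
    x = x + k * nu       # update state
    p = (1.0 - k) * p    # update covariance
    return x, p, nu, s

def fault_detected(nu, s, gate=9.0):
    """Chi-square gate on the normalized innovation squared: values
    above roughly 3-sigma (gate=9.0, an assumed threshold) suggest a
    tracking error, e.g. a mismatched feature, rather than noise."""
    return nu * nu / s > gate
```

In the full system this check lets the filter reject visual measurements that are inconsistent with the motion model, which is what allows it to ride out lighting changes as long as the target's movement stays consistent and illumination varies slowly.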
Original language: English
Number of pages: 9
Publication status: Published - 13 May 2015
Event: 13th Symposium on Advanced Space Technologies in Robotics and Automation (ASTRA 2015) - Noordwijk, Netherlands
Duration: 11 May 2015 - 13 May 2015


Conference: 13th Symposium on Advanced Space Technologies in Robotics and Automation (ASTRA 2015)
Abbreviated title: ASTRA 2015


  • satellite
  • vision
  • pose estimation
  • spacecraft navigation


