Airbus DS Vision Based Navigation solutions tested on LIRIS experiment data

Aurore Masson¹, Paul Duteis¹, Christoph Haskamp², Ingo Ahrns², Roland Brochard¹, Keyvan Kanani¹, Remi Delage¹
¹ ADS Toulouse, ² ADS Bremen

Document details

Publishing year: 2017
Publisher: ESA Space Debris Office
Publishing type: Conference
Name of conference: 7th European Conference on Space Debris
Pages: n/a
Volume: 7
Issue: 1
Editors: T. Flohrer, F. Schmitz

Abstract

The LIRIS Demonstrator is an experiment with vision-based navigation sensors flown on ATV-5 Georges Lemaître and activated during the approach phase to the International Space Station (ISS). Studies of non-cooperative rendezvous stress the need for a GNC system based on image processing using LiDAR sensors and cameras. During the ATV-5 approach, two infra-red cameras, one monochrome visible camera and one scanning LiDAR recorded images during both a dedicated ISS fly-under and the rendezvous with the ISS, ending with docking. This flight database was completed with a reference trajectory processed from telemetry issued by the ATV nominal navigation, and with information on the LIRIS sensors such as position, orientation, and calibration laws.
Airbus Defence and Space (ADS) has been working for many years on vision-based navigation solutions, including target detection, target tracking and navigation filtering. Our image processing solutions are tested on LIRIS flight data to assess their performance on real images and are compared with the reference trajectory for post-flight analysis. The experiment covered the full range of rendezvous distances, the ISS starting as a point-like object in the images, growing in the field of view, and finally becoming fully resolved in the sensor. The image processing modules (detection and tracking) use an a priori 3D model of the ISS and, when the resolution is sufficient, rely mainly on processing of the target's edges. They feed a navigation filter with line-of-sight (LoS) measurements at long range, and with both position and attitude measurements at short range. The navigation filter then fuses these inputs together with star tracker and IMU measurements to refine its knowledge of the ISS relative state.
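The measurement fusion described above can be illustrated with a deliberately simplified sketch: a 1-D linear Kalman filter tracking relative range and range-rate to a target, updated with range measurements standing in for the image-processing outputs. This is a hypothetical illustration only; the actual LIRIS navigation filter estimates the full relative state and also fuses star tracker and IMU data.

```python
# Hypothetical, simplified sketch of navigation-filter measurement fusion:
# a 1-D linear Kalman filter on state [range, range-rate], updated with
# range measurements (standing in for image-processing outputs).
import numpy as np

def predict(x, P, F, Q):
    """Propagate state estimate x and covariance P through dynamics F."""
    return F @ x, F @ P @ F.T + Q

def update(x, P, z, H, R):
    """Fuse measurement z with measurement model H and noise covariance R."""
    y = z - H @ x                    # innovation
    S = H @ P @ H.T + R              # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)   # Kalman gain
    x = x + K @ y
    P = (np.eye(len(x)) - K @ H) @ P
    return x, P

dt = 1.0
F = np.array([[1.0, dt], [0.0, 1.0]])   # constant-velocity dynamics
Q = 1e-4 * np.eye(2)                    # process noise
H = np.array([[1.0, 0.0]])              # only range is measured
R = np.array([[0.25]])                  # measurement noise (m^2)

x = np.array([100.0, 0.0])              # initial guess: 100 m, at rest
P = np.diag([10.0, 1.0])

# Target actually closes at 0.5 m/s; feed the corresponding range measurements.
for k in range(1, 21):
    x, P = predict(x, P, F, Q)
    z = np.array([100.0 - 0.5 * dt * k])
    x, P = update(x, P, z, H, R)
```

After a few updates the filter converges on the true closing rate even though only range is measured, which is the essence of how LoS-only measurements at long range can still constrain the relative motion estimate.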
In this paper, the main features of the LIRIS demonstration and its sensors are presented. The ADS vision-based navigation solution (image processing + navigation) is described. The test campaign results on LIRIS data are provided, and a discussion on on-board real-time implementation is proposed.