
SurRender is used for the validation of vision-based navigation algorithms for the JUICE mission

Vision-Based Navigation (VBN)

Navigating the future 

Airbus is a leading developer of Vision-Based Navigation (VBN) systems, which use optical sensors and state-of-the-art processing techniques to provide localisation information for moving vehicles – a robust alternative when GPS (Global Positioning System) services are unavailable or insufficient.

VBN is relied upon for some of the most exciting – and critical – moments of space exploration: rendezvous (including meetups with orbiting space stations); reentry, descent and landing; as well as extending to interplanetary travel. It also has terrestrial applications, supporting aircraft and drones. A striking example is Automatic Air to Air Refuelling (A3R) where algorithms are used to track the refuelling boom.

An infographic highlighting Airbus’ expertise in Vision-Based Navigation (VBN) systems.

With more than a decade of experience and investment in VBN, Airbus is well-positioned to guide this emerging, fast-growing technology into the next generation. 

Space capabilities

VBN is a vital component of several high-profile missions in which Airbus plays a leading role, including PILOT (Precise and Intelligent Landing using Onboard Technologies) – a key European element to be flown to the Moon on the robotic Luna-Resource lander as part of the European Space Agency's (ESA) cooperation with its Russian counterpart, Roscosmos.

PILOT will provide key information to Luna-Resource during its autonomous landing phase in the last minutes of the lander’s descent to the Moon’s surface.

Applications leveraging Airbus' VBN expertise for orbital rendezvous and On-Orbit Servicing include International Space Station supply missions using the European Automated Transfer Vehicle (ATV), along with the Space Tug and Moon Cruiser. Rendezvous also extends to uncontrolled objects: it covers space debris removal around the Earth, with the remarkable RemoveDebris mission and its debris-capture demonstrations, as well as rendezvous manoeuvres in the vicinity of Mars for the Mars Sample Return mission.

VBN is also an enabler of complex space exploration missions. At the forefront lies the JPL-led Mars Sample Return (MSR) mission, which involves collecting Mars rock and dust samples with the Mars 2020 rover and returning them to Earth. Once collected, the samples are gathered in a container (the Orbiting Sample) and launched into a low Mars orbit. Airbus is currently working on the ambitious and challenging Earth Return Orbiter, which detects and captures the Orbiting Sample using optical means before bringing it back to Earth.

To explore farther away, Airbus is preparing ESA’s JUICE (JUpiter ICy moons Explorer), which will study Jupiter and its icy moons. The spacecraft will begin its 7.6-year cruise to the planet in 2022 and will spend 3.5 years in the Jovian system – using VBN on its deep space voyage. 

VBN solutions at Airbus

Our key algorithms


A model-based detector and tracker, used on Earth (A3R) and in space (rendezvous, debris removal, etc.). Provides the relative position with respect to a modelled object detected in the image.
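The principle behind model-based tracking can be illustrated with a minimal sketch (not the StarNav implementation, and all names below are illustrative): under a pinhole camera model, the apparent size of an object whose physical dimensions are known from the model yields its range. Real model-based trackers fit a full 3D model to recover all six degrees of freedom of the relative pose.

```python
def range_from_apparent_size(focal_px: float, object_width_m: float,
                             apparent_width_px: float) -> float:
    """Pinhole camera model: range Z = f * W / w, where f is the focal
    length in pixels, W the physical object width in metres, and w the
    apparent width in pixels. A toy range-from-scale cue only."""
    return focal_px * object_width_m / apparent_width_px

# A 3 m-wide target imaged at 150 px by a camera with a 1000 px focal
# length sits at a range of 20 m.
z = range_from_apparent_size(1000.0, 3.0, 150.0)
```

In a full tracker, this scale cue is one of many constraints: edges and silhouettes of the rendered model are fitted to the image to refine the full 6-DoF pose.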


A feature detector and tracker for relative navigation. Tracks a fixed set of features across successive images.
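The association step of feature tracking can be sketched as a nearest-neighbour match between frames (a toy illustration, not the StarNav algorithm: production trackers rely on descriptors and pyramidal optical flow rather than plain distance):

```python
import math

def track_features(prev_pts, curr_pts, max_dist=10.0):
    """Greedily match each feature from the previous frame to its nearest
    feature in the current frame, accepting only moves below max_dist
    pixels. Returns a list of (prev_index, curr_index) pairs; unmatched
    features are considered lost."""
    matches = []
    for i, (px, py) in enumerate(prev_pts):
        best_j, best_d = None, max_dist
        for j, (cx, cy) in enumerate(curr_pts):
            d = math.hypot(cx - px, cy - py)
            if d < best_d:
                best_j, best_d = j, d
        if best_j is not None:
            matches.append((i, best_j))
    return matches

# Two features shift by ~2 px between frames; a third leaves the image.
prev = [(10.0, 10.0), (50.0, 40.0), (90.0, 90.0)]
curr = [(12.0, 11.0), (51.0, 42.0)]
```

The matched pairs feed the navigation filter as measurements of image motion between successive frames.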


A high-genericity landmark-matching solution for absolute navigation. Detects known landmarks in the image using a reference database.
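Absolute navigation from landmarks can be sketched as a descriptor lookup against a database of surveyed features (a hypothetical example, not Airbus' solution; the crater names, descriptors, and coordinates below are invented for illustration):

```python
def match_landmark(observed_desc, database, max_err=0.5):
    """Return the (name, position) of the database landmark whose
    descriptor is closest to the observed one (sum of squared
    differences), or None if no landmark is close enough."""
    best = None
    best_err = max_err
    for name, (desc, position) in database.items():
        err = sum((a - b) ** 2 for a, b in zip(observed_desc, desc))
        if err < best_err:
            best, best_err = (name, position), err
    return best

# Hypothetical crater database: descriptor -> known (lat, lon) in degrees.
db = {
    "crater_A": ([0.9, 0.1, 0.4], (12.3, -45.6)),
    "crater_B": ([0.2, 0.8, 0.7], (13.1, -44.2)),
}
```

Once a landmark is identified, its known surface coordinates anchor the vehicle's absolute position estimate, which is what distinguishes this approach from purely relative feature tracking.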

StarNav UFS

A range of pre-processing algorithms that interface image processing with navigation filters. Links image-processing metrics (points or edges in the image) with navigation metrics (motion between successive images).

StarNav Schmidt EKF

An efficient extended Kalman filter implementation relying on polymorphic model representations. Merges StarNav UFS inputs with those of conventional sensors (accelerometers, gyrometers, star trackers, Doppler measurements, etc.).
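The fusion principle can be sketched with a scalar linear Kalman filter (the extended variant adds linearisation of nonlinear models; Airbus' polymorphic representations are not shown here, and the example is a textbook illustration rather than the StarNav filter):

```python
def kalman_predict(x, p, u, q):
    """Prediction step: propagate the state x with an inertial increment
    u (e.g. integrated accelerometer output) and grow the variance p by
    the process noise q."""
    return x + u, p + q

def kalman_update(x, p, z, r):
    """Measurement update: fuse the predicted state x (variance p) with a
    measurement z of variance r, e.g. a vision-derived position."""
    k = p / (p + r)           # Kalman gain
    x_new = x + k * (z - x)   # corrected state
    p_new = (1.0 - k) * p     # reduced uncertainty
    return x_new, p_new

# One cycle: predict with an IMU increment, correct with a vision fix.
x, p = kalman_predict(0.0, 1.0, 1.0, 0.1)   # x = 1.0, p = 1.1
x, p = kalman_update(x, p, 1.2, 0.1)
```

The corrected estimate lands between prediction and measurement, weighted by their respective uncertainties, and the variance shrinks with every update – the mechanism by which vision measurements bound inertial drift.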

Development and validation methodology

An infographic detailing the software development cycle for Airbus’ Vision-Based Navigation (VBN) systems.

Analysis and specifications

The analysis of mission needs is the root of a navigation design. Top-level needs flow down into system, sub-system and component specifications. Heritage from past missions serves the specification phase by providing rationales for design choices.


Simulations on standard PC platforms are used to verify the compliance of the design. Navigation sensors can be simulated using a range of performance models backed by previous flight data and sensor analysis. Input parameters can be varied to assess the robustness of the solution.
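Varying input parameters for robustness assessment is typically done as a Monte Carlo dispersion campaign. A minimal sketch (a toy stand-in for a navigation simulation, with invented noise and bias figures):

```python
import random

def monte_carlo_worst_error(n_runs, noise_std, bias_range, seed=0):
    """Disperse a sensor bias and Gaussian noise over n_runs and return
    the worst final error of a trivial 'navigation' that estimates the
    true position (0 m) by averaging 100 noisy, biased measurements."""
    rng = random.Random(seed)  # seeded for reproducible campaigns
    worst = 0.0
    for _ in range(n_runs):
        bias = rng.uniform(-bias_range, bias_range)
        meas = [bias + rng.gauss(0.0, noise_std) for _ in range(100)]
        estimate = sum(meas) / len(meas)
        worst = max(worst, abs(estimate))  # error vs. truth at 0 m
    return worst

# 200 dispersed runs: the worst error stays bounded by the bias range
# plus the (averaged-down) noise contribution.
worst = monte_carlo_worst_error(200, noise_std=1.0, bias_range=0.5)
```

Real campaigns disperse many more parameters (initial state, optics, illumination, timing) and report statistics against the specified performance requirements rather than a single worst case.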

The in-house StarNav tool includes a full-fledged VBN library, with highly representative models and simulation means.

High-fidelity images can be simulated through image synthesis by the Airbus tool SurRender.

Representative real-time tests

Real-time tests validate the on-board software architecture including scheduling and input and output logic. Generally, these tests are also performed on the target hardware, and are referred to as Processor-in-the-Loop (PIL) tests. This test bench is a first step towards the Avionics Test Bench (ATB), which also evaluates sensor hardware and is used in more advanced development phases.

The StarNav development process accelerates the transition from simulation tests to PIL tests thanks to a highly portable design: simulation code is flight-ready code.

Flight tests

A flight platform is assembled once compliance with the specifications and the real-time architecture have been validated. Tests on the flight platform implement the real-time architecture and replace sensor simulation with actual sensor acquisition; they are sometimes referred to as Hardware-in-the-Loop (HIL) tests. For advanced studies and feasibility assessments, space hardware is often replaced by small-form-factor COTS hardware with similar characteristics. Flight tests can be performed on board drones or helicopters (e.g. GENEVIS), or using robotic platforms (e.g. EPOS).

Mission heritage

Our VBN expertise is built on past missions. Our heritage includes, non-exhaustively:

  • Relative navigation on board a drone, with extensive flight tests
  • SpaceTug/EPOS: model-based tracking for semi-cooperative rendezvous, tested at the EPOS facilities.
  • A3R: model-based tracking for automatic air-to-air refuelling, flight-tested on board an A330 MRTT.
  • GENEVIS: relative and absolute navigation for precision Moon landing, flight-tested on board a Cabri G2 helicopter.

GENEVIS for lunar touchdowns

Developed by Airbus and ESA, GENEVIS is a full-software VBN solution dedicated to precise lunar landings. The outputs of two image processing algorithms, providing information on relative and absolute positioning respectively, are hybridised with inertial measurements by a highly efficient extended Kalman filter.

GENEVIS validated the performance and robustness of the solution in simulation, its CPU load on space-grade processors in real time, and its in-flight accuracy with real hardware under representative dynamic conditions.

The GENEVIS demonstration raised the solution's maturity level to TRL5 and paved the way to a variety of other applications thanks to the high genericity of Airbus' VBN solutions.
