Master data

Modular Environment-Aware Dynamic Multi-Sensor Fusion for Robust Navigation of SWAP-Constrained Autonomous Systems
Description:

While sensor fusion for state estimation on mobile platforms has a long history, recent issues, particularly concerning the availability of communication and GNSS signals, clearly demonstrate the limited robustness and resilience of current methods in contested environments. Adding sensor modalities mitigates the failure modes of individual sensors, but quickly scales up both the computational and operational complexity of the overall estimation framework. An intelligent approach that uses the right sensor signals at the right time, with the necessary self-awareness to make appropriate decisions about signal usage, motion, state confidence, and state initialization, would provide a robust estimation approach resilient to environmental and sensor anomalies. Importantly, such an approach must not rely on expert knowledge for initialization or operation.

The goal is to enable innovative solutions that provide precision state estimation and PNT for GPS-denied and contested environments and for systems with extreme size, weight, power, and processing constraints, generating leap-ahead technologies by pursuing non-traditional materials, integration methods, and system architectures.

University of Klagenfurt’s Control of Networked Systems Group proposes the development of a robust embedded software architecture and algorithms that perform multi-sensor fusion localization to tactical precision for small unmanned aerial systems (UAS), using multiple GNSS sensors and a run-time enumerated set of aiding sensors. The novelty of the proposed approach lies in its dynamic modularity: the system exploits GNSS signals when available but falls back on proprioceptive and exteroceptive sensors (e.g. IMU, vision) when access to GNSS signals is challenged or degraded. An important capability beyond the state of the art in multi-sensor fusion is run-time automatic self-calibration of extrinsic and intrinsic states (in particular, camera lens parameters), as well as self-healing. The approach is designed around the principle that the set of sensors does not need to be known at build or deployment time; the system can therefore exploit whatever sensors are functioning, in a robust and survivable manner.
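To illustrate the kind of dynamic modularity described above, the following minimal C++ sketch shows how a fusion core might accept aiding sensors that register at run time and fall back from GNSS to other modalities when a GNSS health check fails. All class names, priorities, and the simplified state structure are illustrative assumptions and are not taken from the project's actual software.

```cpp
// Illustrative sketch only: a run-time sensor registry with GNSS fallback.
#include <functional>
#include <iostream>
#include <map>
#include <optional>
#include <string>
#include <utility>

// Simplified state update: position plus a confidence score.
struct StateUpdate {
    double x, y, z;
    double confidence;  // lower when only dead-reckoning aids are available
};

// An aiding sensor exposes a health check and a measurement callback.
struct AidingSensor {
    std::function<bool()> healthy;
    std::function<StateUpdate()> measure;
    int priority;  // lower value = preferred (e.g. GNSS before vision)
};

class FusionCore {
public:
    // Sensors are enumerated at run time, not fixed at build or deployment time.
    void registerSensor(const std::string& name, AidingSensor sensor) {
        sensors_[name] = std::move(sensor);
    }

    // Select the healthy sensor with the highest priority and fuse its update.
    std::optional<StateUpdate> step() {
        const AidingSensor* best = nullptr;
        std::string bestName;
        for (const auto& [name, s] : sensors_) {
            if (!s.healthy()) continue;  // self-awareness: skip degraded signals
            if (best == nullptr || s.priority < best->priority) {
                best = &s;
                bestName = name;
            }
        }
        if (best == nullptr) return std::nullopt;  // no usable aid: hold last estimate
        std::cout << "fusing update from " << bestName << "\n";
        return best->measure();
    }

private:
    std::map<std::string, AidingSensor> sensors_;
};

int main() {
    FusionCore core;
    bool gnssAvailable = false;  // simulate a GNSS-denied episode

    core.registerSensor("gnss", {[&] { return gnssAvailable; },
                                 [] { return StateUpdate{1.0, 2.0, 3.0, 0.95}; },
                                 0});
    core.registerSensor("vio", {[] { return true; },
                                [] { return StateUpdate{1.1, 2.1, 3.1, 0.70}; },
                                1});

    // With GNSS unavailable, the core falls back to visual-inertial aiding.
    if (auto update = core.step()) {
        std::cout << "confidence " << update->confidence << "\n";
    }
}
```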

In addition, a novel equivariant estimator formulation aims at consistent and guaranteed state convergence behavior from practically arbitrary initialization values.
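For context, the symmetry property such estimators rely on can be stated in the standard form used in the equivariant-filtering literature; the notation below is assumed for illustration and is not taken from the project description. A system on a state manifold with a Lie group acting on states and inputs is equivariant when the group actions commute with the dynamics:

```latex
% General equivariance condition (standard in the equivariant-filtering
% literature; notation assumed, not taken from the project).
% System:        \dot{\xi} = f_u(\xi),  \xi \in \mathcal{M},  u \in \mathbb{L}
% State action:  \phi : G \times \mathcal{M} \to \mathcal{M}
% Input action:  \psi : G \times \mathbb{L} \to \mathbb{L}
\[
  \mathrm{d}\phi_X \, f_u(\xi) \;=\; f_{\psi_X(u)}\bigl(\phi_X(\xi)\bigr)
  \qquad \text{for all } X \in G,\ \xi \in \mathcal{M},\ u \in \mathbb{L}.
\]
```

Designing the observer on the symmetry group rather than directly on the state space is what typically underpins convergence behaviour that does not hinge on a good initial guess.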

Keywords: State estimation, sensor fusion, equivariant estimators, sensor identification
Short title: MEDuSe
Period: 08.09.2021 - 07.09.2024
Contact e-mail: -
Homepage: -

Employees

Employee: Stephan Michael Weiss (internal)
Time period: 08.09.2021 - 07.09.2024

Categorisation

Project type Research funding (on request / by call for proposals)
Funding type §27
Research type
  • Fundamental research
  • Applied research
Subject areas
  • 102003 - Image processing
  • 102009 - Computer simulation
  • 202035 - Robotics
Research Cluster
  • Self-organizing systems
Gender aspects Gender relevance not selected
Project focus
  • Science to Science (Quality indicator: n.a.)
Classification scheme of the assigned organisational units:
working groups
  • Control of Networked Systems

Cooperations

No partner organisations selected