Article

VIRAL-Fusion: A Visual-Inertial-Ranging-Lidar Sensor Fusion Approach

Journal

IEEE TRANSACTIONS ON ROBOTICS
Volume 38, Issue 2, Pages 958-977

Publisher

IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC
DOI: 10.1109/TRO.2021.3094157

Keywords

Quaternions; Location awareness; Matrix converters; Visualization; Simultaneous localization and mapping; Laser radar; Distance measurement; Aerial robots; localization; optimization

Funding

  1. Wallenberg AI, Autonomous Systems and Software Program (WASP) - Knut and Alice Wallenberg Foundation, under the Wallenberg-NTU Presidential Postdoctoral Fellowship Program

Abstract

In recent years, onboard self-localization (OSL) methods based on cameras or lidar have made significant progress. However, issues such as estimation drift and lack of robustness in low-texture environments remain inherent challenges for OSL methods. On the other hand, infrastructure-based methods can generally overcome these issues, but at the expense of some installation cost. This poses the interesting problem of how to effectively combine these methods, so as to achieve localization with long-term consistency as well as more flexibility than any single method. To this end, we propose a comprehensive optimization-based estimator for the 15-D state of an unmanned aerial vehicle (UAV), fusing data from an extensive set of sensors: an inertial measurement unit (IMU), ultrawideband (UWB) ranging sensors, and multiple onboard visual-inertial and lidar odometry subsystems. In essence, a sliding window is used to formulate a sequence of robot poses, where relative rotational and translational constraints between these poses are observed in the IMU preintegration and OSL observations, while orientation and position are coupled in the body-offset UWB range observations. An optimization-based approach is developed to estimate the trajectory of the robot in this sliding window. We evaluate the performance of the proposed scheme in multiple scenarios, including experiments on public datasets, high-fidelity graphical-physical simulation, and field-collected data from UAV flight tests. The results demonstrate that our integrated localization method can effectively resolve the drift issue, while incurring minimal installation requirements.
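The abstract notes that orientation and position are coupled in the body-offset UWB range observations: because the UWB tag is mounted at an offset from the body center, the predicted range to an anchor depends on the robot's rotation as well as its position. A minimal sketch of such a range residual is shown below; the function and variable names are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def uwb_range_residual(p_WB, R_WB, t_B, anchor_W, measured_range):
    """Residual of one body-offset UWB range observation (illustrative sketch).

    p_WB:           robot position in the world frame, shape (3,)
    R_WB:           world-from-body rotation matrix, shape (3, 3)
    t_B:            UWB tag offset in the body frame, shape (3,);
                    this offset is what couples orientation into the range
    anchor_W:       fixed UWB anchor position in the world frame, shape (3,)
    measured_range: scalar distance reported by the UWB sensor
    """
    tag_W = p_WB + R_WB @ t_B                 # tag position in the world frame
    predicted = np.linalg.norm(tag_W - anchor_W)
    return measured_range - predicted
```

In a sliding-window optimizer, one such residual would be added per range measurement, alongside the IMU-preintegration and OSL relative-pose factors described above.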
