Journal
SENSORS
Volume 17, Issue 4, Pages -
Publisher
MDPI
DOI: 10.3390/s17040802
Keywords
aerial robots; SLAM; sensor fusion
Funding
- Robocity-III-CM project (Robotica aplicada a la calidad de vida de los ciudadanos, fase III) - Programas de Actividades I+D en la Comunidad de Madrid [S2013/MIT-2748]
- Structural Funds of the EU
- Estrategias de guiado y exploracion para micro-robots aereos en entornos interiores Project of the University of Alcala [CCG2016/EXP-049]
Abstract
One of the main challenges of aerial robot navigation in indoor or GPS-denied environments is position estimation using only the available onboard sensors. This paper presents a Simultaneous Localization and Mapping (SLAM) system that remotely computes the pose and environment map of different low-cost commercial aerial platforms, whose onboard computing capacity is usually limited. The proposed system adapts to the sensory configuration of the aerial robot by integrating different state-of-the-art SLAM methods based on vision, laser and/or inertial measurements using an Extended Kalman Filter (EKF). To do this, a minimum onboard sensory configuration is assumed, consisting of a monocular camera, an Inertial Measurement Unit (IMU) and an altimeter. This configuration makes it possible to improve the results of well-known monocular visual SLAM methods (LSD-SLAM and ORB-SLAM are tested and compared in this work) by resolving the scale ambiguity and providing additional information to the EKF. When payload and computational capabilities permit, a 2D laser sensor can easily be incorporated into the SLAM system, yielding a local 2.5D map and a footprint estimate of the robot position that improves the 6D pose estimation through the EKF. We present experimental results with two different commercial platforms and validate the system by applying it to their position control.
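As a rough illustration of how an altimeter measurement can resolve the scale ambiguity of a monocular SLAM estimate inside an EKF, the sketch below fuses a direct metric altitude reading with an up-to-scale visual altitude. This is a minimal toy model, not the authors' implementation: the two-element state [altitude, scale], the random-walk prediction, and all noise values are assumptions chosen only for illustration.

```python
import numpy as np

# Toy EKF: state x = [z, s], where z is metric altitude and s is the unknown
# metric scale of the monocular SLAM map. The altimeter observes z directly;
# monocular SLAM observes the unscaled altitude z / s. Fusing both lets the
# filter recover s, which is the scale-resolution idea described above.

x = np.array([1.0, 1.0])          # initial guess: 1 m altitude, scale 1
P = np.diag([1.0, 4.0])           # large initial uncertainty on the scale
Q = np.diag([0.01, 1e-4])         # process noise (random-walk model, assumed)
R_alt, R_vis = 0.05**2, 0.02**2   # assumed altimeter / visual noise variances

def predict(x, P):
    # Constant-state (random walk) prediction: F = I
    return x, P + Q

def update(x, P, z_meas, h, H, R):
    # Generic EKF update with measurement prediction h(x) and Jacobian H
    y = z_meas - h
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + (K @ y).ravel()
    P = (np.eye(2) - K @ H) @ P
    return x, P

rng = np.random.default_rng(0)
true_z, true_s = 2.0, 2.5          # ground truth used only to simulate sensors
for _ in range(200):
    x, P = predict(x, P)
    # Altimeter: h(x) = z, Jacobian [1, 0]
    z_alt = true_z + rng.normal(0, 0.05)
    x, P = update(x, P, np.array([[z_alt]]),
                  np.array([[x[0]]]), np.array([[1.0, 0.0]]),
                  np.array([[R_alt]]))
    # Monocular SLAM altitude (up to scale): h(x) = z / s,
    # Jacobian [1/s, -z/s^2]
    z_vis = true_z / true_s + rng.normal(0, 0.02)
    x, P = update(x, P, np.array([[z_vis]]),
                  np.array([[x[0] / x[1]]]),
                  np.array([[1.0 / x[1], -x[0] / x[1] ** 2]]),
                  np.array([[R_vis]]))

print(f"estimated altitude {x[0]:.2f} m, estimated visual scale {x[1]:.2f}")
```

Running the loop drives the scale estimate from its initial value of 1 toward the simulated ground truth, showing why an absolute altitude sensor is enough to make the monocular estimate metric.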