Article

PRGFlow: Unified SWAP-aware deep global optical flow for aerial robot navigation

Journal

ELECTRONICS LETTERS
Volume 57, Issue 16, Pages 614-617

Publisher

WILEY
DOI: 10.1049/ell2.12274

Keywords

-

Funding

  1. Brin Family Foundation
  2. Northrop Grumman Mission Systems University Research Program
  3. ONR [N00014-17-1-2622]
  4. National Science Foundation [BCS 1824198]

Abstract

Global optical flow estimation is the cornerstone of odometry, which in turn enables aerial robot navigation. Any such method must offer low latency and high robustness while respecting the size, weight, area and power (SWAP) constraints of the robot. Cameras coupled with inertial measurement units (IMUs) have proven to be the best sensor combination for obtaining low-latency odometry on resource-constrained aerial robots. Recently, deep learning approaches to visual-inertial fusion have gained momentum due to their high accuracy and robustness. Equally noteworthy benefits of these techniques for robotics are their inherent scalability (adaptation to different-sized aerial robots) and unification (the same method works on different-sized aerial robots). To this end, we present a deep learning approach called PRGFlow for obtaining global optical flow, which is then loosely fused with an IMU for full 6-DoF (degrees-of-freedom) relative pose estimation and integrated to obtain odometry. The network is evaluated on the MSCOCO dataset, and the dead-reckoned odometry on multiple real-flight trajectories, without any fine-tuning or re-training. A detailed benchmark comparing different network architectures and loss functions to enable scalability is also presented. The method is shown to outperform classical feature-matching methods by 2x on noisy data. The supplementary material and code can be found at .
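The loosely coupled pipeline sketched in the abstract (the IMU supplies the attitude change, the network supplies the remaining translation-plus-scale "global flow", and the resulting relative poses are chained into dead-reckoned odometry) can be illustrated with a minimal NumPy sketch. All function names, the altimeter-based metric scaling and the flow parameterisation below are illustrative assumptions for exposition, not the paper's actual implementation.

```python
import numpy as np

def derotate_points(pts, R, K):
    """Warp pixel coordinates by the IMU attitude change R to remove the
    rotational component of image motion (infinite homography x' ~ K R K^-1 x),
    leaving only a translation-plus-scale "global flow" for the network."""
    H = K @ R @ np.linalg.inv(K)
    pts_h = np.column_stack([pts, np.ones(len(pts))])  # to homogeneous coords
    warped = (H @ pts_h.T).T
    return warped[:, :2] / warped[:, 2:3]

def fuse_relative_pose(flow_txy, flow_zoom, R_imu, altitude):
    """Loosely fuse the network's global-flow output (a 2-D translation in
    focal-normalised units and a scale/zoom factor) with the IMU rotation
    into a 6-DoF relative pose. Metric scale is assumed here to come from
    an altimeter reading; this scaling convention is illustrative only."""
    t = np.array([
        flow_txy[0] * altitude,        # lateral motion
        flow_txy[1] * altitude,
        altitude * (flow_zoom - 1.0),  # zoom encodes motion along the optical axis
    ])
    T = np.eye(4)
    T[:3, :3] = R_imu
    T[:3, 3] = t
    return T

def dead_reckon(relative_poses):
    """Chain relative poses into a trajectory (dead-reckoned odometry)."""
    trajectory = [np.eye(4)]
    for T in relative_poses:
        trajectory.append(trajectory[-1] @ T)
    return trajectory
```

Note that in such a loosely coupled scheme the drift of the integrated trajectory grows with the error of each relative pose, which is why the abstract evaluates dead-reckoned odometry on full real-flight trajectories rather than per-frame flow alone.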
