Article

Cortically inspired sensor fusion network for mobile robot egomotion estimation

Journal

ROBOTICS AND AUTONOMOUS SYSTEMS
Volume 71, Pages 69-82

Publisher

ELSEVIER
DOI: 10.1016/j.robot.2014.11.019

Keywords

Sensor fusion; Cortically inspired network; Egomotion estimation; Mobile robots

Funding

  1. Elite Network of Bavaria

All physical systems must reliably extract information from their noisy and partially observable environment and build an internal representation of space to orient their behaviour. Precise egomotion estimation is important to keep external (i.e. environmental) and internal (i.e. proprioceptive) cues coherent. The constructed representation subsequently defines the space of possible actions. Due to the multimodal nature of incoming streams of sensory information, egomotion estimation is a challenging sensor fusion problem. In this paper we present a distributed, cortically inspired processing scheme for sensor fusion which, given various sensory inputs and simple relations defining inter-sensory dependencies, relaxes into a solution that provides a plausible interpretation of the perceived environment. The proposed model has been implemented for egomotion estimation on an autonomous mobile robot. We demonstrate that the model provides a precise estimate of both robot position and orientation. (C) 2015 Elsevier B.V. All rights reserved.
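The abstract's core idea, letting a network of sensory estimates "relax" into a mutually consistent interpretation, can be illustrated with a deliberately simple toy (this is not the paper's cortical network, which operates over distributed neural maps; the `relax` function, the scalar heading measurements, and the quadratic agreement term below are all assumptions made for illustration). Each node holds an estimate anchored to its own noisy measurement, and a coupling term pulls the estimates toward agreement; iterating gradient steps on this energy relaxes the system toward a precision-weighted consensus.

```python
def relax(measurements, precisions, coupling=10.0, lr=0.05, iters=500):
    """Toy relaxation-based fusion of scalar sensor readings.

    Minimizes E(x) = sum_i precisions[i] * (x_i - m_i)^2 / 2
                   + coupling * sum_{i<j} (x_i - x_j)^2 / 2
    by gradient descent; the fixed point balances fidelity to each
    sensor's measurement against inter-sensor agreement.
    """
    x = list(measurements)
    n = len(x)
    for _ in range(iters):
        new = []
        for i in range(n):
            # data term: stay close to this sensor's own measurement
            g = precisions[i] * (x[i] - measurements[i])
            # agreement term: stay close to the other sensors' estimates
            g += coupling * sum(x[i] - x[j] for j in range(n) if j != i)
            new.append(x[i] - lr * g)
        x = new
    return x

# Two heading sensors disagree (0.0 vs 1.0 rad); the second is 3x more
# precise, so the relaxed estimates settle near the weighted mean 0.75.
fused = relax([0.0, 1.0], [1.0, 3.0])
```

With strong coupling the estimates converge close to the precision-weighted mean of the measurements; weak coupling lets each node stay nearer its own reading, which is one way to read the "simple relations defining inter-sensory dependencies" in the abstract.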
