Journal
ROBOTICS AND AUTONOMOUS SYSTEMS
Volume 71, Pages 69-82
Publisher
ELSEVIER
DOI: 10.1016/j.robot.2014.11.019
Keywords
Sensor fusion; Cortically inspired network; Egomotion estimation; Mobile robots
Funding
- Elite Network of Bavaria
All physical systems must reliably extract information from their noisy and partially observable environment and build an internal representation of space to orient their behaviour. Precise egomotion estimation is important for keeping external cues (i.e. environmental information) and internal cues (i.e. proprioception) coherent. The constructed representation subsequently defines the space of possible actions. Due to the multimodal nature of the incoming streams of sensory information, egomotion estimation is a challenging sensor fusion problem. In this paper we present a distributed, cortically inspired processing scheme for sensor fusion which, given various sensory inputs and simple relations defining inter-sensory dependencies, relaxes into a solution that provides a plausible interpretation of the perceived environment. The proposed model has been implemented for egomotion estimation on an autonomous mobile robot. We demonstrate that the model provides a precise estimate of both robot position and orientation. © 2015 Elsevier B.V. All rights reserved.
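The abstract describes a network that, given sensory inputs and inter-sensory dependency relations, relaxes into a mutually consistent estimate. As a rough illustration only (not the authors' model; the sensors, weights, and quadratic energy below are all assumptions for the sketch), two noisy heading cues can be fused by gradient descent on an energy that penalises both disagreement with each raw cue and disagreement between the latent estimates:

```python
def relax_fusion(z_odo, z_gyro, w_odo=1.0, w_gyro=1.0, w_couple=4.0,
                 steps=200, lr=0.05):
    """Relax two latent heading estimates (radians) until they agree.

    Hypothetical sketch: z_odo and z_gyro stand in for noisy heading
    readings from wheel odometry and a gyroscope. The energy being
    minimised is
        E = w_odo*(x0 - z_odo)**2 + w_gyro*(x1 - z_gyro)**2
            + w_couple*(x0 - x1)**2
    where the last term encodes the inter-sensory dependency
    (both cues measure the same heading).
    """
    x0, x1 = z_odo, z_gyro  # initialise latent estimates at the raw cues
    for _ in range(steps):
        # Gradients of E with respect to x0 and x1.
        g0 = 2 * w_odo * (x0 - z_odo) + 2 * w_couple * (x0 - x1)
        g1 = 2 * w_gyro * (x1 - z_gyro) - 2 * w_couple * (x0 - x1)
        x0, x1 = x0 - lr * g0, x1 - lr * g1  # relaxation step
    return 0.5 * (x0 + x1)  # fused heading after relaxation

fused = relax_fusion(0.30, 0.36)  # cues 6 cm/rad apart -> fused ~0.33
```

With equal cue weights the relaxed estimate settles midway between the two readings; unequal weights would pull it toward the more trusted sensor. The actual model in the paper operates on distributed cortical-map-like representations rather than scalar states.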