Article

Autonomous Behavior-Based Switched Top-Down and Bottom-Up Visual Attention for Mobile Robots

Journal

IEEE TRANSACTIONS ON ROBOTICS
Volume 26, Issue 5, Pages 947-954

Publisher

IEEE (Institute of Electrical and Electronics Engineers)
DOI: 10.1109/TRO.2010.2062571

Keywords

Vision-guided robotics; visual attention control

Funding

  1. German Research Foundation (DFG) Excellence Initiative Research Cluster Cognition for Technical Systems (CoTeSys)
  2. Institute for Advanced Study, Technische Universität München

Abstract

In this paper, autonomous switching between two basic attention selection mechanisms, top-down and bottom-up, is proposed. This approach fills a gap in object search using conventional top-down biased bottom-up attention selection, which fails if the searched group of objects has appearances that cannot be uniquely described by the low-level features used in bottom-up computational models. Three internal robot states, namely observing, operating, and exploring, determine the visual selection behavior. A vision-guided mobile robot equipped with an active stereo camera is used to demonstrate our strategy and to evaluate its performance experimentally. This approach facilitates adaptation of visual behavior to different internal robot states and benefits further development toward cognitive visual perception in the robotics domain.
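The core idea of the abstract, switching the attention mechanism according to the robot's internal state, can be sketched roughly as follows. This is a minimal illustration, not the authors' implementation: the state names come from the abstract, while the scoring scheme (a saliency map for bottom-up selection, task-relevance weights for top-down biasing) and all function names are assumptions made for the example.

```python
from enum import Enum, auto

class RobotState(Enum):
    """Internal robot states named in the abstract."""
    OBSERVING = auto()
    OPERATING = auto()
    EXPLORING = auto()

def select_attention_mode(state):
    """Illustrative state-to-mechanism mapping (an assumption):
    exploration favors stimulus-driven bottom-up saliency, while
    goal-directed observing/operating favors top-down selection."""
    if state is RobotState.EXPLORING:
        return "bottom-up"
    return "top-down"

def attend(state, saliency, task_bias):
    """Pick the most attended candidate location.

    `saliency` and `task_bias` are dicts mapping candidate locations
    to scores, stand-ins for the saliency and bias maps of
    computational attention models.
    """
    mode = select_attention_mode(state)
    if mode == "bottom-up":
        scores = saliency
    else:
        # Top-down: weight bottom-up saliency by task relevance.
        scores = {loc: s * task_bias.get(loc, 0.0)
                  for loc, s in saliency.items()}
    return max(scores, key=scores.get), mode
```

For example, with `saliency = {"a": 0.9, "b": 0.4}` and `task_bias = {"b": 1.0}`, an exploring robot attends to the most salient location `"a"`, while an operating robot attends to the task-relevant location `"b"`.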

