Article

Visual Perception for Multiple Human-Robot Interaction From Motion Behavior

Journal

IEEE SYSTEMS JOURNAL
Volume 14, Issue 2, Pages 2937-2948

Publisher

IEEE (Institute of Electrical and Electronics Engineers)
DOI: 10.1109/JSYST.2019.2958747

Keywords

Kinematics; Robot sensing systems; Visual perception; Legged locomotion; Head; Command cognition; human motion analysis; multiple human targets; multiple robots; omnidirectional (O-D) camera; robotic perception; target identification; thermal vision; visual perception; walking behavior

Funding

  1. U.S. Navy, Naval Surface Warfare Center Dahlgren
  2. U.S. Army Research Laboratory
  3. Ministry of National Education of Turkey


Visual perception is an important component of human-robot interaction in robotic systems, and interaction between humans and robots depends on the reliability of the robotic vision system. The variety of available camera sensors, and their capability to capture many types of sensory input, improves visual perception. The activities, motions, skills, and behaviors of humans and robots are analyzed by utilizing the heat signatures of the human body. Human motion behavior is analyzed through body-movement kinematics, and the trajectory of the target is used to distinguish objects from the human target in omnidirectional (O-D) thermal images. Human target identification and gesture recognition with traditional sensors is problematic in multitarget scenarios, since these sensors may not keep all targets in their narrow field of view (FOV) at the same time. An O-D thermal view widens the robots' line of sight and yields better perception in the absence of light. The human target is informed of its position, surrounding objects, and any other human targets in its proximity, so that humans with limited vision or a vision disability can be assisted in navigating their environment. The proposed method identifies human targets over a wide FOV under light-independent conditions, assisting the human target and improving human-robot and robot-robot interaction. The experimental results show that human targets are identified with high accuracy.
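The abstract's idea of identifying a human target from its motion trajectory can be illustrated with a minimal sketch. The code below is not the authors' method; it is a hypothetical example that classifies a 2-D centroid track (as might be extracted from O-D thermal frames) as a walking human or a static object, using mean speed against a typical pedestrian range and a simple path-straightness ratio. All thresholds and function names are illustrative assumptions.

```python
import math


def trajectory_features(points, dt=0.1):
    """Compute mean speed (m/s) and path straightness (0..1) from a
    sequence of 2-D centroid positions sampled every `dt` seconds."""
    speeds = []
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        speeds.append(math.hypot(x1 - x0, y1 - y0) / dt)
    mean_speed = sum(speeds) / len(speeds)
    # Straightness: net displacement over total path length.
    net = math.hypot(points[-1][0] - points[0][0],
                     points[-1][1] - points[0][1])
    total = sum(s * dt for s in speeds)
    straightness = net / total if total > 0 else 0.0
    return mean_speed, straightness


def classify_track(points, dt=0.1, walk_speed=(0.5, 2.5)):
    """Label a track 'human (walking)' if its mean speed falls in an
    assumed pedestrian range and the path is reasonably direct;
    otherwise 'static object'. Thresholds are illustrative only."""
    speed, straightness = trajectory_features(points, dt)
    if walk_speed[0] <= speed <= walk_speed[1] and straightness > 0.5:
        return "human (walking)"
    return "static object"
```

For example, a centroid advancing 0.1 m per 0.1 s frame (1 m/s, a typical walking pace) is labeled a walking human, while a centroid that never moves is labeled a static object. A real system would, per the abstract, combine such trajectory cues with body-movement kinematics and thermal appearance.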



Recommended

Article Computer Science, Artificial Intelligence

Human Behavior-Based Target Tracking With an Omni-Directional Thermal Camera

Emrah Benli, Yuichi Motai, John Rogers

IEEE TRANSACTIONS ON COGNITIVE AND DEVELOPMENTAL SYSTEMS (2019)

Article Engineering, Electrical & Electronic

Dynamic 3-D Reconstruction of Human Targets via an Omni-Directional Thermal Sensor

Emrah Benli, Jonas Rahlf, Yuichi Motai

IEEE SENSORS JOURNAL (2018)

Article Engineering, Electrical & Electronic

Vegetation Segmentation for Sensor Fusion of Omnidirectional Far-Infrared and Visual Stream

David L. Stone, Gaurav Shah, Yuichi Motai, Alex J. Aved

IEEE JOURNAL OF SELECTED TOPICS IN APPLIED EARTH OBSERVATIONS AND REMOTE SENSING (2019)

Article Engineering, Electrical & Electronic

Direct Spherical Calibration of Omnidirectional Far Infrared Camera System

David L. Stone, Gaurav Shah, Yuichi Motai

IEEE SENSORS JOURNAL (2019)

Article Geochemistry & Geophysics

Detecting Infrared Maritime Targets Overwhelmed in Sun Glitters by Antijitter Spatiotemporal Saliency

Bin Wang, Yuichi Motai, Lili Dong, Wenhai Xu

IEEE TRANSACTIONS ON GEOSCIENCE AND REMOTE SENSING (2019)

Article Automation & Control Systems

Thermal Multisensor Fusion for Collaborative Robotics

Emrah Benli, Richard Lee Spidalieri, Yuichi Motai

IEEE TRANSACTIONS ON INDUSTRIAL INFORMATICS (2019)

Article Automation & Control Systems

Electric Load Forecasting Model Using a Multicolumn Deep Neural Networks

Ammar O. Hoori, Ahmad Al Kazzaz, Rameez Khimani, Yuichi Motai, Alex J. Aved

IEEE TRANSACTIONS ON INDUSTRIAL ELECTRONICS (2020)

Article Geochemistry & Geophysics

DeepFuseNet of Omnidirectional Far-Infrared and Visual Stream for Vegetation Detection

David L. Stone, Sumved Ravi, Emrah Benli, Yuichi Motai

Summary: This study investigates applying deep learning to the fusion of omnidirectional infrared and visual sensors to enhance the perception of autonomous robotic systems. The proposed fusion methods improve the extraction of vegetation, reduce false detection rates, and demonstrate efficiency in experiments.

IEEE TRANSACTIONS ON GEOSCIENCE AND REMOTE SENSING (2021)

Article Computer Science, Artificial Intelligence

Robust Detection of Infrared Maritime Targets for Autonomous Navigation

Bin Wang, Emrah Benli, Yuichi Motai, Lili Dong, Wenhai Xu

IEEE TRANSACTIONS ON INTELLIGENT VEHICLES (2020)

Article Engineering, Biomedical

Elastographic Tomosynthesis From X-Ray Strain Imaging of Breast Cancer

Corey Sutphin, Eric Olson, Yuichi Motai, Suk Jin Lee, Jae G. Kim, Kazuaki Takabe

IEEE JOURNAL OF TRANSLATIONAL ENGINEERING IN HEALTH AND MEDICINE (2019)
