Review

A Survey on Hand Pose Estimation with Wearable Sensors and Computer-Vision-Based Methods

Journal

SENSORS
Volume 20, Issue 4

Publisher

MDPI
DOI: 10.3390/s20041074

Keywords

human-computer interaction; computer vision; data gloves; hand pose estimation; deep learning; wearable devices

Funding

  1. National Natural Science Foundation of China [61772217, 71771098]
  2. Fundamental Research Funds for the Central Universities of China [2017KFYXJJ225]

Abstract

Real-time sensing and modeling of the human body, especially the hands, is an important research endeavor for applications such as natural human–computer interaction. Hand pose estimation remains a significant academic and technical challenge due to the complex structure and dexterous movement of human hands. Driven by advances in both hardware and artificial intelligence, various data glove prototypes and computer-vision-based methods have been proposed in recent years for accurate and rapid hand pose estimation. However, existing reviews have focused either on data gloves or on vision-based methods, or even on a single type of camera, such as the depth camera. The purpose of this survey is to provide a comprehensive and timely review of recent research advances in sensor-based hand pose estimation, covering both wearable and vision-based solutions. Hand kinematic models are discussed first. An in-depth review is then conducted of data gloves and vision-based sensor systems, together with the corresponding modeling methods. In particular, this review also covers deep-learning-based methods, which are highly promising for hand pose estimation. Finally, the advantages and drawbacks of current hand pose estimation methods, their scope of application, and the related challenges are discussed.
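To make the notion of a hand kinematic model concrete, the sketch below builds a commonly used 21-keypoint hand skeleton (wrist plus MCP/PIP/DIP/TIP joints for each finger) and computes a fingertip position for one finger with planar forward kinematics. This convention and the joint names are assumptions for illustration, not a model taken from the survey itself.

```python
import math

# Assumed convention: 21 keypoints = wrist + 4 joints per finger
# (MCP, PIP, DIP, TIP). Many vision-based estimators regress
# exactly such a keypoint set.
FINGERS = ["thumb", "index", "middle", "ring", "pinky"]
JOINTS = ["MCP", "PIP", "DIP", "TIP"]

KEYPOINTS = ["wrist"] + [f"{f}_{j}" for f in FINGERS for j in JOINTS]


def fingertip_2d(bone_lengths, joint_angles):
    """Planar forward kinematics for a single finger chain:
    accumulate joint angles along the chain and sum bone vectors."""
    x = y = theta = 0.0
    for length, angle in zip(bone_lengths, joint_angles):
        theta += angle  # each joint rotates relative to its parent
        x += length * math.cos(theta)
        y += length * math.sin(theta)
    return x, y


print(len(KEYPOINTS))  # 21 keypoints in total
# Fully extended finger (all joint angles zero): tip lies at the
# sum of the bone lengths along the x-axis.
print(fingertip_2d([4.0, 2.5, 2.0], [0.0, 0.0, 0.0]))  # (8.5, 0.0)
```

A glove-based estimator would measure the joint angles directly (e.g. with flex sensors) and run such forward kinematics, while a vision-based estimator typically predicts the keypoint positions and may recover angles by inverse kinematics.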

