Article

Saliency-Driven Hand Gesture Recognition Incorporating Histogram of Oriented Gradients (HOG) and Deep Learning

Farzaneh Jafari and Anup Basu
Department of Computing Science, University of Alberta, Edmonton, AB T6G 2E8, Canada
* Author to whom correspondence should be addressed.
Sensors 2023, 23(18), 7790; https://doi.org/10.3390/s23187790
Submission received: 10 June 2023 / Revised: 23 August 2023 / Accepted: 1 September 2023 / Published: 11 September 2023

Abstract

Hand gesture recognition is a vital means of conveying information between humans and machines. We propose a novel model for hand gesture recognition based on computer vision methods and compare its results on images with complex scenes. Although extracting skin-color information is an efficient way to locate hand regions, complicated image backgrounds make it difficult to recognize the exact area of the hand shape. Features such as saliency maps, the histogram of oriented gradients (HOG), Canny edge detection, and skin color help maximize the accuracy of hand-shape recognition. Combining these features, we propose an efficient hand-posture detection model that raises test accuracy to over 99% on the NUS Hand Posture Dataset II and to more than 97% on a hand gesture dataset with challenging, varied backgrounds. In addition, we added noise to around 60% of the images in our datasets. Repeating the experiments on these noisy data, we achieved more than 98% and nearly 97% accuracy on the NUS and hand gesture datasets, respectively. The experiments show that the saliency method combined with HOG performs stably across a wide range of images with complex backgrounds and varied hand colors and sizes.
Keywords: Canny edge detection; convolutional neural network (CNN); hand gesture detection; histogram of oriented gradients (HOG); saliency map; skin color
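The abstract combines several hand-related cues (a saliency map, HOG features, Canny edges, and a skin-color mask) before a deep network classifies the gesture. The snippet below is a minimal sketch of how such cues could be extracted with OpenCV and scikit-image; it is not the authors' code, and the thresholds, color-space bounds, and HOG parameters are illustrative assumptions rather than values taken from the paper.

```python
# Sketch: extracting saliency, HOG, Canny-edge, and skin-color cues for one image.
# Requires opencv-contrib-python (for the saliency module) and scikit-image.
import cv2
import numpy as np
from skimage.feature import hog

def extract_hand_cues(bgr_image):
    """Return (saliency_map, hog_vector, canny_edges, skin_mask) for a BGR image."""
    # Spectral-residual static saliency highlights visually conspicuous regions.
    saliency = cv2.saliency.StaticSaliencySpectralResidual_create()
    ok, saliency_map = saliency.computeSaliency(bgr_image)
    saliency_map = (saliency_map * 255).astype(np.uint8) if ok \
        else np.zeros(bgr_image.shape[:2], np.uint8)

    # HOG descriptor on the grayscale image (cell/block sizes are illustrative).
    gray = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2GRAY)
    hog_vector = hog(gray, orientations=9, pixels_per_cell=(8, 8),
                     cells_per_block=(2, 2), feature_vector=True)

    # Canny edge map (the 100/200 thresholds are a common default, not from the paper).
    canny_edges = cv2.Canny(gray, 100, 200)

    # Skin-color mask in YCrCb space; the Cr/Cb bounds are a widely used heuristic,
    # not necessarily the ranges the authors used.
    ycrcb = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2YCrCb)
    skin_mask = cv2.inRange(ycrcb, (0, 133, 77), (255, 173, 127))

    return saliency_map, hog_vector, canny_edges, skin_mask
```

In a pipeline like the one described, these cue maps and descriptors would be fused and fed to a CNN classifier; the fusion strategy and network architecture are specified in the full article.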

