Research Article
Open access
Published on 31 March 2025
Lou, X. (2025). Vision-based Hand Gesture Recognition Technology. Applied and Computational Engineering, 141, 54-59.

Vision-based Hand Gesture Recognition Technology

Xinyue Lou 1, *
  • 1 College of Chemistry and Materials Engineering, Zhejiang A&F University, Hangzhou 311300, China

* Author to whom correspondence should be addressed.

https://doi.org/10.54254/2755-2721/2025.21696

Abstract

Human-computer interaction has broad application prospects in fields such as medicine, entertainment, industry, and education. Gesture recognition is one of the key technologies for gesture-based interaction between humans and robots, and vision-based gesture recognition offers the user greater comfort and freedom than data-glove recognition. Based on the literature, this paper summarizes the general pipeline of vision-based gesture recognition, comprising three steps: pre-processing, feature extraction, and gesture classification. It also defines static and dynamic gestures and compares their differences and recognition emphases. On this basis, the paper surveys commonly used vision-based gesture recognition methods: for static gestures, methods such as template matching and AdaBoost-based approaches; for dynamic gestures, methods such as hidden Markov models and dynamic time warping. Finally, some applications of vision-based gesture recognition are introduced, for example a non-contact system for operating rooms and smart-home control.
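As an illustration of the dynamic time warping (DTW) approach mentioned in the abstract, the following is a minimal sketch (not taken from the paper) of aligning two 1-D gesture trajectories recorded at different speeds; the sequences and function name are hypothetical:

```python
import numpy as np

def dtw_distance(a, b):
    """Classic DTW distance between two 1-D sequences via dynamic programming."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)  # cumulative-cost matrix
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            # Extend the cheapest of the three allowed warping steps.
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

# The "same" gesture performed slowly and quickly:
slow = [0.0, 0.0, 1.0, 1.0, 2.0, 2.0, 3.0]
fast = [0.0, 1.0, 2.0, 3.0]
print(dtw_distance(slow, fast))  # 0.0 — DTW absorbs the speed difference
```

This elasticity in the time axis is why DTW suits dynamic gestures, where a plain Euclidean comparison would penalize executions of different durations.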

Keywords

gesture recognition, posture recognition, computer vision, human-computer interaction



Data availability

The datasets used and/or analyzed during the current study are available from the corresponding author upon reasonable request.

Disclaimer/Publisher's Note

The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of EWA Publishing and/or the editor(s). EWA Publishing and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.

About volume

Volume title: Proceedings of the 3rd International Conference on Mechatronics and Smart Systems

Conference website: https://2025.confmss.org/
ISBN: 978-1-83558-997-7 (Print) / 978-1-83558-998-4 (Online)
Conference date: 16 June 2025
Editor: Mian Umer Shafiq
Series: Applied and Computational Engineering
Volume number: Vol. 141
ISSN: 2755-2721 (Print) / 2755-273X (Online)

© 2025 by the author(s). Licensee EWA Publishing, Oxford, UK. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license. Authors who publish in this series agree to the following terms:
1. Authors retain copyright and grant the series right of first publication with the work simultaneously licensed under a Creative Commons Attribution License that allows others to share the work with an acknowledgment of the work's authorship and initial publication in this series.
2. Authors are able to enter into separate, additional contractual arrangements for the non-exclusive distribution of the series's published version of the work (e.g., post it to an institutional repository or publish it in a book), with an acknowledgment of its initial publication in this series.
3. Authors are permitted and encouraged to post their work online (e.g., in institutional repositories or on their website) prior to and during the submission process, as it can lead to productive exchanges, as well as earlier and greater citation of published work (See Open access policy for details).