Research Article
Open access
Published on 29 November 2024
Han, J. (2024). Human-Computer Interaction on Facial Recognition and Emotional Feedback. Applied and Computational Engineering, 111, 26-34.

Human-Computer Interaction on Facial Recognition and Emotional Feedback

Jinze Han 1,*
  • 1 Brunel London School, North China University of Technology, Beijing, China

* Author to whom correspondence should be addressed.

https://doi.org/10.54254/2755-2721/111/2024CH0114

Abstract

The advancement of science and technology has made facial recognition technology increasingly prevalent in human-computer interaction, particularly in emotion recognition, where it shows significant potential. By identifying a user's facial expressions and emotional responses, a system can further analyze and predict the user's needs, thereby improving the emotion recognition experience of human-computer interaction. This article elucidates the principles of facial recognition technology, with a particular focus on its practical applications and technical realization in emotion recognition. Furthermore, this article examines the current limitations of the technology, discusses potential avenues for improvement, and speculates on the future development of facial recognition technology in the field of human-computer interaction. The technology is currently used in a variety of areas, including smart home technology, medical rehabilitation, and public services. It has the potential to improve the user experience by making devices more intelligent and providing more personalized services through sentiment analysis.

Keywords

Human-computer interaction, facial recognition, emotional identification.



Data availability

The datasets used and/or analyzed during the current study are available from the authors upon reasonable request.

Disclaimer/Publisher's Note

The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of EWA Publishing and/or the editor(s). EWA Publishing and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.

About volume

Volume title: Proceedings of CONF-MLA 2024 Workshop: Mastering the Art of GANs: Unleashing Creativity with Generative Adversarial Networks

Conference website: https://2024.confmla.org/
ISBN: 978-1-83558-745-4 (Print) / 978-1-83558-746-1 (Online)
Conference date: 21 November 2024
Editors: Mustafa ISTANBULLU, Marwan Omar
Series: Applied and Computational Engineering
Volume number: Vol. 111
ISSN: 2755-2721 (Print) / 2755-273X (Online)

© 2024 by the author(s). Licensee EWA Publishing, Oxford, UK. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license. Authors who publish this series agree to the following terms:
1. Authors retain copyright and grant the series right of first publication with the work simultaneously licensed under a Creative Commons Attribution License that allows others to share the work with an acknowledgment of the work's authorship and initial publication in this series.
2. Authors are able to enter into separate, additional contractual arrangements for the non-exclusive distribution of the series's published version of the work (e.g., post it to an institutional repository or publish it in a book), with an acknowledgment of its initial publication in this series.
3. Authors are permitted and encouraged to post their work online (e.g., in institutional repositories or on their website) prior to and during the submission process, as it can lead to productive exchanges, as well as earlier and greater citation of published work (See Open access policy for details).