1 Introduction
Brain-Computer Interface (BCI) technology has become a focal point of research in neuroscience and human-computer interaction, offering a direct communication link between the brain and external devices. Originally conceptualized in the late 19th century through discoveries in brain electrical activity, BCI technology has since evolved into a sophisticated system with wide applications, particularly in neurorehabilitation, communication, and human enhancement [1]. The core of BCI technology revolves around its ability to collect, process, and apply brain signals, making it an invaluable tool in fields ranging from medical applications to entertainment. With its growing capabilities, BCI has increasingly been used to investigate and influence complex human behaviors, including emotional states.
Emotions play a critical role in human experience, affecting decision-making, social interactions, and overall well-being. Emotional dysregulation can lead to various mental health disorders, including depression and anxiety, making the ability to recognize and regulate emotions a key area of focus for researchers. By leveraging neurophysiological signals such as electroencephalography (EEG), BCI technology allows for real-time monitoring of emotional states, providing insights into how emotions are formed, perceived, and controlled. Models such as Ekman's basic emotions theory and Russell's circumplex model of affect serve as foundational frameworks for categorizing emotional states within BCI applications. These models enable the classification of emotions along dimensions such as valence (positive or negative) and arousal (high or low), which are crucial for developing accurate emotion recognition systems. The ability to regulate emotions is equally significant, particularly for individuals suffering from emotional disorders. Techniques such as neurofeedback, which allows individuals to consciously alter their brain activity, and brain stimulation methods like transcranial direct current stimulation (tDCS) have shown promise in improving emotional regulation. These methods target specific brain regions associated with emotional control, such as the prefrontal cortex, and have been used in both clinical and non-clinical settings to enhance emotional resilience and reduce symptoms of emotional dysregulation [2].
This paper provides a comprehensive review of the existing literature on BCI applications in emotion recognition and regulation. The primary focus is on synthesizing current research to map out key developments, challenges, and future directions. By integrating insights from multiple studies, this review consolidates knowledge on how BCI systems are being employed to decode and modulate emotional states, and serves as a resource for researchers building upon existing work in this evolving field.
2 Overview of BCI Technology
BCI technology traces its origins to the work of Richard Caton, Adolf Beck, and Hans Berger in the late nineteenth and early twentieth centuries, whose discoveries regarding the brain's continuous electrical activity provided a foundation for measuring and manipulating nervous system signals. Broadly construed, BCIs are intended to develop, restore, or rehabilitate function, ultimately aiming to improve users' abilities to communicate, interact with their environment, and achieve personal goals. BCIs for the restoration of lost function typically bypass lesions caused by disease or trauma to directly replace the lost function [3].
A BCI system comprises three fundamental components, each serving a specific role: signal collection (acquisition), signal processing, and application. These components are interlinked and collaborate to transmit signals to the intended BCI application. Under certain conditions, the BCI application can transmit control signals back to the brain, stimulating basic human functions such as visual and auditory perception [4]. The BCI microcontroller must process the collected signals to remove noise and artifacts arising from external or device-specific factors. The analysis of the acquired signals and the recognition of corresponding commands are performed by an artificial neural network with advanced data-processing and adaptive capabilities. Ultimately, the decoded signals are interpreted, according to their specific characteristics, as commands on the controlled device.
Neural interfaces are classified into three categories based on their level of invasiveness: invasive, non-invasive, and semi-invasive. Invasive neural interfaces necessitate the implantation of intracortical microelectrodes (IM) directly into the brain, delivering the highest effectiveness while presenting the greatest risk. Non-invasive neural interfaces utilize methods such as electroencephalography (EEG), magnetoencephalography (MEG), or functional magnetic resonance imaging (fMRI) to assess brain activity from the surface of the head, avoiding electrode implantation. Electrodes in semi-invasive BCIs are located under the skull and rest on the brain's surface, as in the case of electrocorticography (ECoG).
EEG, functional near-infrared spectroscopy (fNIRS), MEG, and ECoG each offer distinct advantages and limitations in BCI applications, with notable differences across platforms. EEG stands out for its affordability, portability, and non-invasiveness, making it a versatile option for various uses; however, its precision is hindered by low spatial resolution and susceptibility to artifacts. fNIRS, similarly non-invasive and portable, excels in detecting changes in blood oxygenation with high accuracy, but falls short in temporal resolution and struggles to reach deeper brain structures through the skull. MEG shines in both temporal resolution and source localization, outperforming EEG and fNIRS in these areas, yet its high cost, complexity, and sensitivity to environmental interference reduce its practicality. ECoG, though invasive due to requiring surgical implantation, delivers the highest spatial accuracy and excellent signal quality, rivaling MEG in temporal resolution, but carries significant risks such as infection and tissue damage. Overall, fNIRS and EEG offer more user-friendly, non-invasive solutions, while MEG and ECoG push the boundaries of accuracy and detail at the cost of increased expense, complexity, and invasiveness [5].
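For illustration, the acquire-process-decode flow described above can be sketched compactly in Python on synthetic data. The sampling rate, the 8-30 Hz band, the log band-power features, and the LDA decoder below are illustrative assumptions rather than a prescribed configuration.

```python
# Minimal BCI pipeline sketch: acquisition -> signal processing -> application.
# All numbers are illustrative assumptions; real systems tune them per task.
import numpy as np
from scipy.signal import butter, filtfilt
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

fs = 256                                  # assumed sampling rate (Hz)
rng = np.random.default_rng(0)
raw = rng.normal(size=(200, 8, fs * 2))   # stand-in for acquisition: 200 trials, 8 channels, 2 s
labels = rng.integers(0, 2, size=200)     # two hypothetical commands

# 1) Processing: band-pass filter to suppress drift, noise, and artifacts.
b, a = butter(4, [8, 30], btype="bandpass", fs=fs)
clean = filtfilt(b, a, raw, axis=-1)

# 2) Feature extraction: log band power per channel.
features = np.log(np.mean(clean ** 2, axis=-1))

# 3) Application: decode features into commands for the controlled device.
clf = LinearDiscriminantAnalysis().fit(features[:150], labels[:150])
print("held-out accuracy:", clf.score(features[150:], labels[150:]))
```

On random data the accuracy hovers near chance; the point is the structure of the loop rather than the score.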
3 Application of BCIs in Emotion Recognition
Emotions are typically triggered by ideas, recollections, or events in a person's living environment. They play a crucial role in decision-making and human interaction: unpleasant emotions can lead to both psychological and physical problems, whereas positive emotions promote better living conditions. Historically, researchers identified six core emotions (sadness, surprise, happiness, disgust, fear, and anger) that are universally expressed through facial cues; more complex emotions, such as shyness, guilt, and contempt, derive from these foundational ones (the Ekman model). Researchers have since developed numerous multidimensional approaches to emotion modeling. Russell's circumplex model, one of the most widely adopted, is a two-dimensional framework that categorizes emotions along arousal and valence dimensions and can encompass up to 150 affective labels. In this model, high arousal positive valence (HAPV) emotions span a spectrum from pleasure to excitement, while high arousal negative valence (HANV) emotions range from nervousness to irritation. In contrast, low arousal negative valence (LANV) emotions include states such as sadness, boredom, and sleepiness, while low arousal positive valence (LAPV) emotions are characterized by relaxation and calmness. The brain regions most directly associated with emotion include the frontal lobe (including the premotor and primary motor areas), the parietal lobe, and the temporal lobe [6].
Work on the DEAP dataset typically proceeds in two main stages: signal processing and feature extraction. In the processing stage, signals are decomposed, for example via Empirical Mode Decomposition (EMD) into Intrinsic Mode Functions (IMFs), or via Variational Mode Decomposition (VMD), to extract valuable signal information. In the feature extraction stage, features such as entropy and Higuchi's fractal dimension (HFD) are derived for emotion classification. These features are then vectorized and split into training and testing sets, and machine learning algorithms such as Naive Bayes, k-nearest neighbors (KNN), convolutional neural networks (CNN), and decision trees (DT) are applied for training; finally, a performance measurement module evaluates classification accuracy on the testing set. DEAP, a dataset freely accessible online, is commonly used for studying human emotions through EEG signals [7]. Studies based on DEAP mainly revolve around the selection of classifiers for emotion recognition from EEG signals. Machine learning approaches to EEG emotion recognition typically involve two steps: manual feature extraction and classifier selection. Feature extraction methods include time-domain, frequency-domain, and time-frequency-domain analysis, multivariate statistical analysis, and nonlinear dynamic analysis. Commonly used methods for reducing and summarizing EEG data include principal component analysis (PCA), linear discriminant analysis (LDA), and independent component analysis (ICA) [8].
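The DEAP-style pipeline above (entropy and HFD features, circumplex-quadrant labels from valence and arousal ratings, a standard classifier) can be sketched end to end. Everything below runs on synthetic signals and random ratings; the exact feature definitions, the rating midpoint of 5, and the KNN classifier are illustrative assumptions, not the precise configuration of the cited studies.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

def higuchi_fd(x, kmax=8):
    """Higuchi's fractal dimension: the HFD complexity feature named above."""
    n = len(x)
    lk = []
    for k in range(1, kmax + 1):
        lm = []
        for m in range(k):
            idx = np.arange(m, n, k)
            length = np.sum(np.abs(np.diff(x[idx])))
            lm.append(length * (n - 1) / ((len(idx) - 1) * k) / k)
        lk.append(np.mean(lm))
    # The fractal dimension is the slope of log(L(k)) against log(1/k).
    slope, _ = np.polyfit(np.log(1.0 / np.arange(1, kmax + 1)), np.log(lk), 1)
    return slope

def amplitude_entropy(x, bins=32):
    """Shannon entropy of the amplitude histogram (an 'entropy' feature)."""
    p, _ = np.histogram(x, bins=bins)
    p = p[p > 0] / p.sum()
    return -np.sum(p * np.log2(p))

def quadrant(valence, arousal, midpoint=5.0):
    """Map DEAP-style 1-9 ratings onto the four circumplex quadrants."""
    if arousal >= midpoint:
        return "HAPV" if valence >= midpoint else "HANV"
    return "LAPV" if valence >= midpoint else "LANV"

rng = np.random.default_rng(0)
trials = rng.normal(size=(160, 4, 512))      # synthetic: 160 trials, 4 channels
ratings = rng.uniform(1, 9, size=(160, 2))   # (valence, arousal) per trial
y = np.array([quadrant(v, a) for v, a in ratings])

# Two features per channel, concatenated into one vector per trial.
X = np.array([[f(ch) for ch in tr for f in (amplitude_entropy, higuchi_fd)]
              for tr in trials])

Xtr, Xte, ytr, yte = train_test_split(X, y, test_size=0.25, random_state=0)
clf = KNeighborsClassifier(n_neighbors=5).fit(Xtr, ytr)
print("test accuracy:", clf.score(Xte, yte))
```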
Conventional fMRI designs are typically divided into two primary categories: resting-state and task-based. Task-based fMRI employs combinations of simple, controlled, and frequently abstract stimuli, organized to examine specific processes in isolation. A standard experimental task design thus alternates between task and control periods composed of sequentially presented stimuli, and participants are typically instructed to respond to the stimuli. Conversely, resting-state fMRI monitors spontaneous brain activity over extended periods without experimental constraints or active responses, uncovering unique patterns of both static and dynamic functional connectivity (FC) that can predict behavior and differ among individuals, clinical conditions, and affective states. Both approaches have their respective strengths and limitations. Current research suggests that film stimuli serve as an effective tool for examining functional brain networks in neuroscience studies, and evidence shows that distinct storylines within narratives evoke progressively differentiated neural signatures over time. BCIs detect signals related to emotion recognition as subjects engage with such film stimuli [9].
4 Application of BCIs in Emotion Regulation
The capacity to regulate emotions is fundamental to healthy development and functioning across various domains; conversely, dysregulated emotional control has been identified as a transdiagnostic risk factor for numerous mental health disorders. Consequently, emotion regulation is considered a crucial developmental task. Many definitions have been offered to outline the contours of "emotion regulation"; these definitions tend to converge on the processes or competencies related to the awareness, evaluation, maintenance, and/or modulation of emotional states in order to achieve one's goals. Emotion regulation can be either conscious and deliberate or unconscious and automatic; it can be self-managed or externally supported, and it may occur within both positive and negative emotional contexts. Given the intricate array of processes that emotion regulation encompasses (physiological, cognitive, and social), it is widely regarded as a continuous developmental task, evolving significantly from birth through childhood, adolescence, and into adulthood [10].
Emotion is reflected in two ways: externally and internally. External reactions include facial expressions, gestures, and speech, while internal reactions encompass skin electrical responses, heart rate, blood pressure, respiratory rate, and signals such as the electroencephalogram (EEG), electrooculogram (EOG), and magnetoencephalogram (MEG). A game assistant system for emotional feedback regulation has been constructed based on EEG and physiological signals, providing players with a fully immersive and highly interactive experience. The implementation of EEG-based games for concentration training and emotion-based applications, such as web-based music therapy, represents a significant application of EEG in emotional regulation [11].
The regulation of emotions through BCI technology is inseparable from neurofeedback, a technique that enables individuals to regulate their brain activity in real time and that has attracted significant attention for its potential applications in clinical therapy, cognitive enhancement, and performance optimization [12]. Transcranial electrical stimulation (tES) employs various current waveforms, including transcranial direct current stimulation (tDCS), transcranial alternating current stimulation (tACS), and transcranial random noise stimulation (tRNS), applied to the scalp to modulate neuronal states. tDCS and transcranial magnetic stimulation (TMS) are two commonly used brain stimulation methods, with research demonstrating their efficacy in targeting the prefrontal cortex (PFC) to modulate emotion and emotion perception. Furthermore, emerging evidence suggests that repetitive TMS (rTMS) and anodal tDCS can enhance PFC activity during emotion regulation, potentially improving emotion regulation abilities, which may benefit the treatment of emotion regulation deficits in psychiatric disorders such as anxiety and depression [13].
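At its core, a neurofeedback protocol is a closed loop: estimate a brain-activity index from a short EEG window and return a reward signal when it moves in the trained direction. The sketch below, on synthetic data, uses relative alpha (8-12 Hz) power and a simple baseline threshold; both choices are illustrative assumptions, since clinical protocols select the target band and reward rule per indication.

```python
# Minimal neurofeedback loop sketch on synthetic "EEG": calibrate a baseline,
# then score short windows online and issue reward/neutral feedback.
import numpy as np
from scipy.signal import welch

fs = 256
rng = np.random.default_rng(1)

def alpha_power(window, fs):
    """Relative 8-12 Hz power of one EEG window (the trained index here)."""
    f, pxx = welch(window, fs=fs, nperseg=len(window))
    band = (f >= 8) & (f <= 12)
    return pxx[band].sum() / pxx.sum()

baseline = alpha_power(rng.normal(size=fs * 10), fs)   # calibration phase
for t in range(5):                                     # online loop (5 mock windows)
    window = rng.normal(size=fs)                       # stand-in for live EEG
    score = alpha_power(window, fs)
    feedback = "reward" if score > baseline else "neutral"
    print(f"window {t}: alpha={score:.3f} -> {feedback}")
```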
5 Multimodal Emotion Recognition Based on BCI
Emotion recognition can also be performed using multimodal physiological recordings [14]. Single-modality information is easily corrupted by various types of noise, making it difficult to capture emotional states reliably, which highlights the need for multimodal affective BCIs [15,16].
The signal flow of a multimodal emotion regulation and interaction system begins with the collection of brain signals, proceeds through multimodal blending and a fusion strategy, and then moves into decision making. The decision results are conveyed to an emotion response controller, which engages in real-world interaction. During this interaction, diagnosis is conducted and social-effect feedback is generated. This feedback forms heterogeneous sensory stimuli, such as audio-visual and visual-olfactory stimuli, which in turn influence the user. In the multimodal blending process, the input signals include not only traditional single-modality BCI signals such as EEG or other neuroimaging signals, but also physiological signals such as eye movements and facial expressions. Integrating these neurophysiological modalities enables a comprehensive analysis of brain behavior and states.
The fusion strategy consists of three components: data fusion, feature fusion, and decision fusion. Data fusion integrates multimodal data from various sources so that the signals can operate cooperatively. Feature fusion extracts and selects useful features after data fusion to optimize the subsequent decision-making process. Decision fusion further combines the results of the previous fusion steps into a higher level of signal integration, providing the foundation for system decisions and responses. During decision making, mathematical models including tensor methods, machine learning, and deep learning techniques analyze the fused signals to determine the actions or decisions the system should take. Once the decision results enter the emotion response controller, the system's internal decisions are translated into actual emotional and behavioral responses, including regulation of the user's emotional state. Overall, a feedback loop is formed: the system continuously monitors and adjusts its interaction with the user through this feedback mechanism, and the feedback results further influence the emotion response controller, allowing real-time adjustment and optimization.
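The three fusion levels described above differ mainly in where the modalities are combined. The sketch below contrasts feature fusion (concatenate features, then train one classifier) with decision fusion (train per-modality classifiers, then average their probabilities) on two synthetic modalities; the logistic-regression models and the simple averaging rule are illustrative assumptions.

```python
# Feature-level vs decision-level fusion on two synthetic modalities
# (EEG-like and eye-movement-like feature vectors).
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(2)
n = 400
y = rng.integers(0, 2, size=n)                     # binary emotional state
eeg = rng.normal(size=(n, 16)) + y[:, None] * 0.4  # modality 1
eye = rng.normal(size=(n, 6)) + y[:, None] * 0.3   # modality 2
tr, te = train_test_split(np.arange(n), test_size=0.25, random_state=0)

# Feature fusion: concatenate modality features, then train one classifier.
fused = np.hstack([eeg, eye])
feat_clf = LogisticRegression(max_iter=1000).fit(fused[tr], y[tr])
print("feature fusion accuracy:", feat_clf.score(fused[te], y[te]))

# Decision fusion: train per-modality classifiers, then average probabilities.
p = np.mean([LogisticRegression(max_iter=1000)
             .fit(m[tr], y[tr]).predict_proba(m[te])[:, 1]
             for m in (eeg, eye)], axis=0)
print("decision fusion accuracy:", np.mean((p > 0.5).astype(int) == y[te]))
```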
Emotionally relevant facial features are processed in three primary stages: face detection, feature extraction, and expression classification. For face detection, two key methodologies are employed: feature-based techniques, which use approaches such as low-level analysis, feature analysis, and active shape models to identify facial features, and image-based techniques, which employ methods such as example-based learning and support vector machines (SVMs). Upon detecting a face, the system performs feature extraction, focusing on geometric and texture features (GTF) and facial action units (FAU). GTF involves techniques such as Gabor wavelets and local binary patterns to extract geometric shapes and textures, while FAU examines specific facial action units (the brow, eyelid, cheek, nose, nasolabial fold, lip, chin, and mouth) along with their corresponding movements, such as raising, lowering, pulling, pressing, and stretching. Once sufficient facial features have been extracted, the system proceeds to expression classification, which sorts these features into basic expressions (fear, disgust, happiness, anger, sadness, surprise, and contempt) and compound expressions, encompassing more complex, abnormal, and micro expressions.
Beyond facial expressions, the system also evaluates fundamental eye-movement features: pupil diameter, which helps capture visual attention or emotional responses; fixations, characterized by the areas and duration of gaze; and saccades, the rapid eye movements between focal points, whose trajectories are tracked. The analysis of these fundamental features generates statistical metrics, divided into frequency events and dispersion measures (average, maximum, and minimum). Frequency events include blink frequency, fixation frequency, and saccade frequency, all of which reveal patterns of eye-movement behavior. Dispersion metrics encompass total and maximum fixation dispersion, average saccade duration, and average saccade amplitude and latency, providing deeper insight into the variation in both fixation and saccade behavior. Together, these analyses enable the system to detect and classify the user's facial expressions more accurately, recognize their emotional state or psychological response, and provide more comprehensive behavioral analysis through the observation of basic and statistical features.
D'Mello applied statistical methods to evaluate the performance of emotion recognition systems, comparing single-modality approaches with multimodal ones across various algorithms and datasets. The most effective multimodal emotion recognition system achieved an 85% accuracy rate, outperforming the best single-modality system by an average of 9.83%, with a median improvement of 6.60% [11]. A thorough analysis of multiple signals and their interdependencies can thus be used to build a model that more accurately captures the underlying nature of human emotional expression.
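The eye-movement statistics listed above can be computed from raw gaze samples once fixations and saccades are separated. The sketch below applies a simple velocity-threshold rule to synthetic gaze data; the 60 Hz sampling rate and the 30 deg/s threshold are illustrative assumptions, and blink detection (which needs pupil or eyelid signals) is omitted.

```python
# Fixation/saccade separation and summary statistics from gaze samples.
import numpy as np

fs = 60                                   # assumed gaze sampling rate (Hz)
rng = np.random.default_rng(3)
gaze = np.cumsum(rng.normal(scale=0.2, size=(fs * 10, 2)), axis=0)  # degrees

vel = np.linalg.norm(np.diff(gaze, axis=0), axis=1) * fs  # deg/s per sample
is_saccade = vel > 30.0                                   # velocity-threshold rule

# Frequency event: saccade onsets per second.
onsets = np.flatnonzero(np.diff(is_saccade.astype(int)) == 1)
print("saccade frequency (per s):", len(onsets) / (len(gaze) / fs))

# Dispersion: spread of gaze position across fixation samples.
fix = gaze[1:][~is_saccade]
print("fixation dispersion (deg):", np.ptp(fix, axis=0).sum())
```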
6 Applications and Case Studies
Deep brain stimulation (DBS) applies electrical stimulation to specific regions of the brain, thereby influencing neural network activity and regulating emotional responses. This method can sustain long-term antidepressant effects and reduce the recurrence of symptoms: the literature indicates that the majority of patients in follow-up studies maintained significant antidepressant responses for up to eight years, suggesting that DBS may provide long-term alleviation of emotional disorders such as depression by modulating emotional networks [15].
Combining BCI technology with virtual reality (VR) enables emotion recognition to provide patients with more intuitive and interactive regulation methods. In a VR environment, patients can experience the regulatory effects of virtual scenarios through visual, auditory, and even tactile feedback. This approach not only encourages active patient participation but also promotes emotional stability and improvement through feedback mechanisms. BCI paradigms such as motor imagery (MI) and P300, widely used in neurorehabilitation, can also be extended to emotional regulation: engaging patients in tasks or training within virtual scenes can stimulate neuroplasticity and thereby affect emotional response mechanisms. For patients with depression, BCI-VR systems can achieve personalized emotional regulation by adjusting task difficulty and feedback mechanisms within the virtual environment. By making training more engaging and enjoyable, these systems are expected to shorten treatment cycles and enhance therapeutic outcomes, and they can be applied not only to rehabilitation training but also to improving patients' mental health more broadly [16].
Emotion regulation also holds great potential in neuromarketing, particularly where BCI technology can recognize and regulate emotions. This technology can help businesses gain deep insight into consumers' emotional responses, thereby optimizing strategies for advertising, product design, and brand experiences. Emotions play a crucial role in consumer decision-making, especially in shaping purchasing behavior and brand loyalty. Through BCI technologies such as EEG, companies can track consumers' emotional reactions to products and marketing messages in real time. Using these data, businesses can better adjust the content, duration, and presentation of advertisements to evoke positive emotions and increase consumers' purchase intentions. Moreover, BCI can aid in the personalized customization of marketing strategies: for instance, businesses could adjust advertisement content in real time based on individual consumers' emotional responses, or present more motivational advertisements when emotions are low to boost their emotional state. This real-time, emotion-aware marketing approach can help companies understand consumer psychology more accurately and thereby influence purchasing decisions more effectively. Emotion regulation likewise plays a significant role in brand experience design. Overall, BCI's emotion recognition and regulation technologies present new opportunities in neuromarketing, providing businesses with more precise and personalized marketing strategies [17].
7 Challenges and Future
Although BCI technology holds the potential to provide novel interaction methods in various fields such as healthcare, industry, and entertainment, its limitations in information transmission rate remain evident. A low transmission rate means that a BCI system cannot rapidly and accurately convey sufficient information while extracting and relaying data from the brain, which hinders tasks requiring efficient decision-making and real-time control. This inefficiency is also linked to the complexity and instability of brain signals: because human brain signals are nonlinear and unpredictable, extracting useful information and classifying it precisely becomes more challenging, further reducing overall transmission efficiency [18]. BCI applications also require training and learning before use, which can be time-consuming and particularly inconvenient for individuals with mental disabilities [19]. These applications often involve complex interface operations, specialized brain-signal recognition training, and continuous adjustment based on user behavior feedback; for individuals with mental disabilities, this can lead to mental fatigue, anxiety, and even frustration. In the future, within the context of Industry 4.0, emotional BCI could enhance industrial performance by optimizing operator cognitive load, promoting human-machine interaction, and improving workplace safety. Although current BCI technology is still in the early stages of industrial application, large-scale deployment is expected in the coming era of Industry 4.0 [20].
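The information transmission rate discussed at the start of this section is commonly quantified with the Wolpaw information transfer rate (ITR), which converts the number of selectable targets, the classification accuracy, and the trial duration into bits per minute. The example values below are illustrative assumptions.

```python
# Wolpaw ITR: bits/trial = log2(N) + P*log2(P) + (1-P)*log2((1-P)/(N-1)),
# scaled by the number of trials per minute.
import math

def wolpaw_itr(n_targets, accuracy, trial_seconds):
    """Information transfer rate in bits per minute for an N-target BCI."""
    n, p = n_targets, accuracy
    if p <= 1.0 / n:
        return 0.0          # at or below chance, no information is conveyed
    bits = math.log2(n)
    if p < 1.0:
        bits += p * math.log2(p) + (1 - p) * math.log2((1 - p) / (n - 1))
    return bits * 60.0 / trial_seconds

# e.g. a hypothetical 4-target speller at 85% accuracy with 4 s trials:
print(f"{wolpaw_itr(4, 0.85, 4.0):.1f} bits/min")
```

Raising accuracy, adding targets, or shortening trials all raise the ITR, which is why signal quality and decoding speed dominate the challenges above.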
8 Conclusion
This review highlights the significant advancements in the application of BCI technology for emotion recognition and regulation. Through an in-depth exploration of pivotal methodologies, including emotion classification frameworks such as Ekman's model and Russell's circumplex model, as well as intervention techniques such as neurofeedback and tDCS, the potential of BCI systems to profoundly impact emotional health and well-being is evident. These technologies offer promising tools for emotion-based interventions, showing efficacy in both clinical and therapeutic settings.
Despite these considerable advancements, BCI technology still faces substantial challenges. Key issues include low information transmission rates, less-than-optimal user experiences, and the inherent complexity of multimodal emotion recognition systems. The difficulties in effectively transmitting and processing complex brain signals hamper real-time applications, while the intricacy of system interaction presents a significant barrier to widespread adoption, particularly among users unfamiliar with such technologies and in practical, everyday contexts.
Future research must focus on improving BCI system performance by advancing signal processing techniques, simplifying user interactions, and enhancing the accuracy of multimodal emotion recognition methods. The incorporation of advanced machine learning algorithms holds considerable promise for making BCI technology more efficient and user-friendly. Moreover, investigating BCI's potential within the framework of Industry 4.0 offers intriguing possibilities: as industries increasingly shift towards more intelligent and automated systems, BCI technology could play a pivotal role in optimizing human-machine interactions, managing cognitive workloads, and enhancing overall workplace productivity. This review also underscores the potential of BCI to revolutionize emotional regulation, particularly through its integration with immersive technologies such as VR, which can provide novel avenues for engaging users in emotional therapy. Although significant challenges remain, BCI technology stands as a promising frontier in the domain of emotion recognition and regulation, with far-reaching implications across various fields, paving the way for more intuitive, adaptive human-computer interfaces.
References
[1]. Kawala-Sterniuk, A., Browarska, N., Al-Bakri, A., Pelc, M., Zygarlicki, J., Sidikova, M., Martinek, R., & Gorzelanczyk, E. J. (2021). Summary of over Fifty Years with Brain-Computer Interfaces-A Review. Brain Sci, 11(1), 43. https://doi.org/10.3390/brainsci11010043
[2]. Figeys, M., Villarey, S., Leung, A. W., Raso, J., Buchan, S., Kammerer, H., ... & Kim, E. S. (2022). tDCS over the left prefrontal cortex improves mental flexibility and inhibition in geriatric inpatients with symptoms of depression or anxiety: A pilot randomized controlled trial. Front. Rehabil. Sci., 3, 997531. https://doi.org/10.3389/fresc.2022.997531
[3]. Young, M. J., Lin, D. J., & Hochberg, L. R. (2021). Brain-Computer Interfaces in Neurorecovery and Neurorehabilitation. Semin Neurol, 41(2), 206–216. https://doi.org/10.1055/s-0041-1725137
[4]. Maiseli, B., Abdalla, A. T., Massawe, L. V., Mbise, M., Mkocha, K., Nassor, N. A., Ismail, M., Michael, J., & Kimambo, S. (2023). Brain-computer interface: trend, challenges, and threats. Brain Inf, 10(1), 20. https://doi.org/10.1186/s40708-023-00199-3
[5]. Peksa, J., & Mamchur, D. (2023). State-of-the-Art on Brain-Computer Interface Technology. Sensors, 23(13), 6001. https://doi.org/10.3390/s23136001
[6]. Houssein, E. H., Hammad, A., & Ali, A. A. (2022). Human emotion recognition from EEG-based brain–computer interface using machine learning: a comprehensive review. Neural Computing and Applications, 34(15), 12527-12557. https://doi.org/10.1007/s00521-022-07292-4
[7]. Alhalaseh, R., & Alasasfeh, S. (2020). Machine-learning-based emotion recognition system using EEG signals. Computers, 9(4), 95. https://doi.org/10.3390/computers9040095
[8]. Wang, X., Ren, Y., Luo, Z., He, W., Hong, J., & Huang, Y. (2023). Deep learning-based EEG emotion recognition: Current trends and future perspectives. Front. Psychol, 14, 1126994. https://doi.org/10.3389/fpsyg.2023.1126994
[9]. Morgenroth, E., Vilaclara, L., Muszynski, M., Gaviria, J., Vuilleumier, P., & Van De Ville, D. (2023). Probing neurodynamics of experienced emotions-a Hitchhiker's guide to film fMRI. Social Cognitive and Affective Neuroscience, 18(1), nsad063. https://doi.org/10.1093/scan/nsad063
[10]. Paley, B., & Hajal, N. J. (2022). Conceptualizing Emotion Regulation and Coregulation as Family-Level Phenomena. Clin Child Fam Psychol Rev, 25(1), 19–43. https://doi.org/10.1007/s10567-022-00378-4
[11]. He, Z., Li, Z., Yang, F., Wang, L., Li, J., Zhou, C., & Pan, J. (2020). Advances in Multimodal Emotion Recognition Based on Brain-Computer Interfaces. Brain Sci, 10(10), 687. https://doi.org/10.3390/brainsci10100687
[12]. Jubair, H., Islam, M., Mehenaz, M., Akter, F., & Yeasmin, N. (2024). Neurofeedback: applications, advancements, and future directions. https://doi.org/10.21203/rs.3.rs-4842929/v1
[13]. Qiu, X., He, Z., Cao, X., & Zhang, D. (2023). Transcranial magnetic stimulation and transcranial direct current stimulation affect explicit but not implicit emotion regulation: a meta-analysis. Behav Brain Funct, 19(1), 15. https://doi.org/10.1186/s12993-023-00217-8
[14]. Erat, K., Şahin, E. B., Doğan, F., Merdanoğlu, N., Akcakaya, A., & Durdu, P. O. (2024). Emotion recognition with EEG-based brain-computer interfaces: a systematic literature review. Multimed Tools Appl, 1-48. https://doi.org/10.1007/s11042-024-18259-z
[15]. Crowell, A. L., Riva-Posse, P., Holtzheimer, P. E., Garlow, S. J., Kelley, M. E., Gross, R. E., Denison, L., Quinn, S., & Mayberg, H. S. (2019). Long-Term Outcomes of Subcallosal Cingulate Deep Brain Stimulation for Treatment-Resistant Depression. American Journal of Psychiatry, 176(11), 949–956. https://doi.org/10.1176/appi.ajp.2019.18121427
[16]. Wen, D., Fan, Y., Hsu, S. H., Xu, J., Zhou, Y., Tao, J., Lan, X., & Li, F. (2021). Combining brain-computer interface and virtual reality for rehabilitation in neurological diseases: A narrative review. Annals of Physical and Rehabilitation Medicine, 64(1), 101404. https://doi.org/10.1016/j.rehab.2020.03.015
[17]. Rawnaque, F. S., Rahman, K. M., Anwar, S. F., Vaidyanathan, R., Chau, T., Sarker, F., & Mamun, K. A. A. (2020). Technological advancements and opportunities in Neuromarketing: a systematic review. Brain Inf, 7(1), 10. https://doi.org/10.1186/s40708-020-00109-x
[18]. Mridha, M. F., Das, S. C., Kabir, M. M., Lima, A. A., Islam, M. R., & Watanobe, Y. (2021). Brain-Computer Interface: Advancement and Challenges. Sensors, 21(17), 5746. https://doi.org/10.3390/s21175746
[19]. Värbu, K., Muhammad, N., & Muhammad, Y. (2022). Past, Present, and Future of EEG-Based BCI Applications. Sensors, 22(9), 3331. https://doi.org/10.3390/s22093331
[20]. Douibi, K., Le Bars, S., Lemontey, A., Nag, L., Balp, R., & Breda, G. (2021). Toward EEG-Based BCI Applications for Industry 4.0: Challenges and Possible Applications. Front. Hum. Neurosci, 15, 705064. https://doi.org/10.3389/fnhum.2021.705064