1. Introduction
Mobility impairments, caused by conditions such as spinal cord injuries, muscular dystrophy, and cerebral palsy, pose significant challenges to individuals in their daily lives, often making routine tasks like adjusting lights or locking doors nearly impossible without assistance [2]. While smart home technologies, such as automated lighting, smart thermostats, and voice-activated assistants, have revolutionized home interaction by reducing physical effort [10], these systems often rely on touch or voice-based interfaces, which are inaccessible to individuals with severe physical or verbal limitations. Brain-Computer Interfaces (BCIs) offer a promising solution by enabling device control through brain activity, bypassing the need for physical or verbal input [4]. Despite this potential, the practical integration of BCIs with smart home systems remains underexplored. While traditional smart home systems still fail to meet the needs of individuals with severe mobility impairments [1], research on BCIs' real-world application, particularly in enhancing accessibility for those with significant mobility challenges, remains limited. Addressing this gap is critical to leveraging BCI technology to create more inclusive and accessible smart home environments.
2. Literature Review
2.1. Classification and Causes of Mobility Impairments
Mobility impairments encompass a wide range of conditions that affect an individual's ability to move or control their body. These impairments can result from congenital disorders, accidents, or progressive diseases. Paralysis is a common form of mobility impairment and can result from spinal cord injuries, strokes, or neurological diseases like multiple sclerosis. Cerebral palsy, a condition caused by brain damage before or during birth, affects muscle control and coordination, resulting in mobility challenges [24]. Muscular dystrophy is another genetic condition that weakens the muscles over time, making movement increasingly difficult [25]. These mobility challenges often necessitate the use of assistive devices to perform daily activities and maintain a certain level of independence.
2.2. Assistive Technologies for Individuals with Mobility Impairments
People with mobility impairments can navigate their environment and perform daily tasks with the help of various assistive technologies. Wheelchairs and mobility scooters provide essential mobility for individuals who cannot walk. For individuals with limited hand function, voice-activated devices or adaptive keyboards can be used to control computers or other devices [26]. Robotic technologies have also made significant advancements with devices like robotic exoskeletons that assist individuals with walking by providing support and movement [27]. Additionally, prosthetic limbs have become increasingly sophisticated, with some models now incorporating neural interfaces that allow users to control the prosthetic with their thoughts, offering a more intuitive and functional replacement for lost limbs [15].
2.3. Smart Home Technologies
2.3.1. Overview of Smart Home Devices and Automation
Smart home technology refers to a network of devices and systems that automate and control various household functions. These include smart lights, thermostats, security systems, and appliances, which can be controlled through apps, remote controls, or voice-activated assistants like Amazon Alexa or Google Assistant [12]. These devices are interconnected via the Internet of Things (IoT), enabling automation and remote management of home systems. For example, users can schedule lights to turn on when they enter a room, set thermostats to adjust based on the weather, or monitor their homes through security cameras remotely [11]. Smart home systems are designed to enhance convenience, security, and energy efficiency, and are increasingly being adapted to improve accessibility for individuals with disabilities.
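To ground this description, the short Python sketch below implements one such rule: when motion is detected in the evening, a light is switched on through a local hub's HTTP API. The endpoint URL, payload fields, and the simple time-based evening check are illustrative assumptions, not any particular vendor's interface.

```python
# Illustrative automation rule only: the hub endpoint and payload schema are
# hypothetical, not a specific smart home vendor's API.
from datetime import datetime, time

import requests

LIGHT_ENDPOINT = "http://192.168.1.50/api/lights/livingroom"  # hypothetical local hub URL


def is_evening(now: datetime) -> bool:
    """Crude stand-in for a sunset check: treat 18:00 onward as evening."""
    return now.time() >= time(18, 0)


def on_motion_detected(now: datetime) -> None:
    """Turn the living-room light on when motion is sensed in the evening."""
    if not is_evening(now):
        return
    try:
        requests.post(LIGHT_ENDPOINT, json={"state": "on", "brightness": 70}, timeout=5)
    except requests.RequestException as exc:
        print(f"could not reach the hub: {exc}")


if __name__ == "__main__":
    on_motion_detected(datetime.now())
```

In a deployed system the same rule would typically run inside a hub or cloud automation engine rather than a standalone script, but the request/response pattern is the same.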
2.3.2. Current Control Methods
Currently, most smart home devices are controlled through smartphone apps, voice commands, or motion sensors. These methods provide users with multiple ways to interact with and manage their home environment. Voice-controlled systems, such as those powered by Amazon Alexa or Google Assistant, allow users to turn lights on or off, control the thermostat, or play music through voice commands. These systems are particularly useful for individuals with limited mobility, as they can control their environment without needing to physically interact with devices. However, for users with speech impairments, these systems may present challenges [18]. Smartphone apps provide a more customizable interface for controlling multiple smart devices, but they require fine motor skills to operate, which may not be feasible for all users. Motion sensors provide an alternative method that activates devices based on the user's movement, but their sensitivity may be insufficient for individuals with subtle or limited mobility.
2.3.3. Challenges and Limitations for Individuals with Mobility Impairments
Despite the advancements in smart home technology, individuals with severe mobility impairments often face challenges when using these systems. Voice-activated systems may not work effectively for individuals with speech impairments, and app-based controls may be difficult for those with limited hand dexterity or fine motor control [18]. Additionally, the cost of installing and maintaining a fully automated smart home can be prohibitive for many individuals. Furthermore, smart home systems often require consistent Internet connectivity, which may not be reliable in all areas [28]. The integration of BCIs into smart home systems offers a promising solution, as BCIs could provide a more intuitive and accessible method for controlling home environments, allowing users to bypass the need for physical interaction or voice commands altogether.
2.4. Overview of Brain-Computer Interfaces
2.4.1. Definition and Basic Principles of BCIs
A Brain-Computer Interface (BCI) is a communication system that enables direct interaction between the brain and external devices, bypassing the conventional neuromuscular pathways that control movement. BCIs translate neural signals, typically recorded from the brain, into commands that can control computers, prosthetics, or other assistive devices. This technology relies on detecting brain activity through methods like electroencephalography (EEG), magnetoencephalography (MEG), or direct cortical implants. The core objective of BCIs is to provide an interface for people with physical impairments to regain control over their environment or to facilitate new forms of communication. BCIs operate based on the principle that mental processes generate measurable electrical activity, which can be captured and interpreted to create outputs for controlling external systems [29].
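As a concrete, simplified illustration of this principle, the sketch below maps a window of (simulated) EEG to a binary on/off command using band-power features and a linear classifier. The sampling rate, frequency band, channel count, and the use of scikit-learn's LDA are assumptions made for the example; it is a minimal stand-in for a real BCI decoding pipeline, not a description of any specific system.

```python
# Minimal sketch (not any particular study's pipeline): decode a binary command
# from a window of EEG via band-power features and a linear classifier.
import numpy as np
from scipy.signal import butter, filtfilt, welch
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

FS = 250  # sampling rate in Hz (assumed)


def bandpass(eeg: np.ndarray, lo: float, hi: float, fs: int = FS) -> np.ndarray:
    """Zero-phase band-pass filter applied along the time axis."""
    b, a = butter(4, [lo / (fs / 2), hi / (fs / 2)], btype="band")
    return filtfilt(b, a, eeg, axis=-1)


def band_power(eeg: np.ndarray, lo: float, hi: float, fs: int = FS) -> np.ndarray:
    """Mean spectral power per channel in the given band (one feature per channel)."""
    freqs, psd = welch(eeg, fs=fs, nperseg=fs)
    mask = (freqs >= lo) & (freqs <= hi)
    return psd[..., mask].mean(axis=-1)


# Simulated training data: 40 trials x 8 channels x 2 s of EEG.
rng = np.random.default_rng(0)
trials = rng.standard_normal((40, 8, 2 * FS))
labels = rng.integers(0, 2, size=40)  # e.g., 0 = "lights off", 1 = "lights on"

features = np.array([band_power(bandpass(t, 8, 30), 8, 30) for t in trials])
clf = LinearDiscriminantAnalysis().fit(features, labels)

# Decode a new window into a device command.
new_trial = rng.standard_normal((8, 2 * FS))
command = clf.predict(band_power(bandpass(new_trial, 8, 30), 8, 30).reshape(1, -1))[0]
print("decoded command:", "lights on" if command else "lights off")
```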
2.4.2. Common Applications of BCIs in Assistive Technology
BCIs have been extensively researched for assistive technology applications, particularly for individuals with severe physical disabilities. Communication devices for individuals with locked-in syndrome or amyotrophic lateral sclerosis (ALS) are among the most well-known applications of BCIs. These devices allow users to control a computer cursor or select letters on a screen using only their brain signals, thus facilitating communication [30]. BCIs are also being integrated with prosthetic limbs, enabling individuals to control artificial limbs with their thoughts, significantly improving their autonomy [17]. Another emerging area is the use of BCIs to control smart home environments, where individuals can manage lighting, appliances, or security systems without physical movement [19]. These applications underscore the potential of BCIs to empower individuals with disabilities by providing them with new tools for interaction and control over their surroundings. The integration of BCIs with smart home devices therefore holds significant promise for enhancing the autonomy and quality of life of individuals with mobility impairments: by translating brain signals into commands, BCIs offer an alternative to physical and voice-based controls, which can be inaccessible to individuals with severe motor disabilities. The convergence of BCIs and smart home technology has opened new possibilities for assistive technologies, allowing users to interact with their environment without physical exertion. However, despite the growing body of research, several challenges remain in improving the functionality, accessibility, and scalability of these systems.
2.4.3. Current Research and Case Studies
A number of studies have explored the potential of BCI-based smart home systems. Early research by Cincotti et al. [1] demonstrated that users could control household appliances using EEG-based BCI systems. Their study employed event-related potentials (ERPs) to generate control signals, allowing participants to operate lights, fans, and other devices. Although the study confirmed the feasibility of BCI-smart home integration, it highlighted issues related to latency and signal interference, which reduced the system’s reliability.
Rezeika et al. [8] conducted a study utilizing Steady-State Visually Evoked Potentials (SSVEPs) to control smart home devices. SSVEP-based BCIs work by detecting brain responses to visual stimuli, such as flashing lights. In their study, participants were able to control home appliances like lights and televisions by focusing on a specific visual cue. This approach offered high accuracy (up to 90%) and faster response times than motor imagery (MI)-based systems. However, the researchers noted that prolonged focus on visual stimuli could lead to mental fatigue, limiting the system’s usability over long periods.
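The standard canonical correlation analysis (CCA) detector sketched below illustrates how an SSVEP system of this kind infers which flickering stimulus a user is attending to: the EEG is compared against sine/cosine references at each candidate flicker frequency, and the best-correlated frequency is taken as the selection. The stimulus frequencies, window length, and sampling rate here are illustrative assumptions, not parameters from the cited study.

```python
# Sketch of the standard CCA approach to SSVEP frequency detection
# (an illustration of the paradigm, not the cited system).
import numpy as np
from sklearn.cross_decomposition import CCA

FS = 250          # sampling rate in Hz (assumed)
WINDOW_S = 2.0    # analysis window length in seconds (assumed)
STIM_FREQS = [8.0, 10.0, 12.0, 15.0]  # example flicker frequencies, one per command


def reference_signals(freq: float, n_samples: int, fs: int = FS, harmonics: int = 2) -> np.ndarray:
    """Sine/cosine references at the stimulus frequency and its harmonics."""
    t = np.arange(n_samples) / fs
    refs = []
    for h in range(1, harmonics + 1):
        refs.append(np.sin(2 * np.pi * h * freq * t))
        refs.append(np.cos(2 * np.pi * h * freq * t))
    return np.column_stack(refs)


def detect_ssvep(eeg: np.ndarray) -> float:
    """Return the stimulus frequency whose references correlate best with the EEG.

    eeg: array of shape (n_samples, n_channels).
    """
    scores = []
    for f in STIM_FREQS:
        refs = reference_signals(f, eeg.shape[0])
        u, v = CCA(n_components=1).fit_transform(eeg, refs)
        scores.append(abs(np.corrcoef(u[:, 0], v[:, 0])[0, 1]))
    return STIM_FREQS[int(np.argmax(scores))]


# Toy example: synthesize a 10 Hz response plus noise and decode it.
rng = np.random.default_rng(1)
n = int(WINDOW_S * FS)
t = np.arange(n) / FS
eeg = np.sin(2 * np.pi * 10.0 * t)[:, None] + 0.5 * rng.standard_normal((n, 4))
print("attended frequency:", detect_ssvep(eeg), "Hz")  # expected: 10.0
```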
Leeb et al. [5] explored the use of a motor imagery-based BCI to control virtual household devices in a smart home simulation. Their study focused on a tetraplegic participant who used motor imagery to operate virtual appliances. The results showed that while the participant could control the devices, the system suffered from slow response times and required high cognitive effort, leading to user fatigue. Despite these limitations, the study provided a strong foundation for future research into motor imagery BCIs for environmental control.
Miao et al. [6] introduced a hybrid BCI system that combined P300 and SSVEP paradigms for smart home control. Hybrid systems offer the advantage of combining multiple control modalities, which can improve accuracy and reduce cognitive load. In their study, users were able to switch between control modes depending on the task at hand, improving system flexibility. The P300 paradigm, which detects brain activity in response to target stimuli, was particularly effective for reducing mental fatigue compared to continuous control methods like motor imagery.
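The sketch below illustrates the mode-switching idea behind such a hybrid system: a small controller routes each EEG window to whichever paradigm-specific decoder is currently active. The decoder callables are placeholders standing in for trained P300 and SSVEP classifiers, not the method used by Miao et al.

```python
# Sketch of hybrid-mode routing: decoders are stubs, not trained classifiers.
from dataclasses import dataclass
from enum import Enum, auto
from typing import Callable, Optional


class Paradigm(Enum):
    P300 = auto()   # discrete target selection; lower sustained visual load
    SSVEP = auto()  # faster selection via flicker; higher visual load


@dataclass
class HybridController:
    """Routes each EEG window to whichever paradigm-specific decoder is active."""

    p300_decoder: Callable[[object], Optional[str]]
    ssvep_decoder: Callable[[object], Optional[str]]
    mode: Paradigm = Paradigm.P300

    def switch_mode(self, mode: Paradigm) -> None:
        # The user (or a fatigue monitor) can change the active paradigm here.
        self.mode = mode

    def decode(self, eeg_window: object) -> Optional[str]:
        if self.mode is Paradigm.P300:
            return self.p300_decoder(eeg_window)
        return self.ssvep_decoder(eeg_window)


# Usage with stub decoders standing in for trained classifiers.
controller = HybridController(
    p300_decoder=lambda window: "select: lights",
    ssvep_decoder=lambda window: "select: television",
)
print(controller.decode(None))          # handled by the P300 decoder
controller.switch_mode(Paradigm.SSVEP)
print(controller.decode(None))          # handled by the SSVEP decoder
```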
3. Discussion
3.1. BCIs Effectively Address Basic Smart Home Tasks
The results of this study underscore the significant potential of Brain-Computer Interfaces (BCIs) to facilitate smart home device control, particularly for individuals with mobility impairments. The high success rates in basic tasks, such as switching lights on and off or operating simple appliances, demonstrate the practicality of BCIs for enhancing independence. Mobility-impaired users achieved success rates of approximately 85%, while healthy controls reached 92%, with most participants demonstrating proficiency after minimal training. This indicates that BCIs can be an effective tool for addressing specific accessibility gaps in smart home interactions.
The simplicity of tasks played a significant role in the system's effectiveness, suggesting that BCIs are most suitable for repetitive, straightforward operations. Participants noted a significant improvement in their ability to independently complete tasks they previously required assistance with, which enhanced their sense of autonomy. This highlights the utility of BCIs as a supplement to existing accessibility technologies.
3.2. Complex Tasks Reveal System Limitations
As task complexity increased, the study revealed several limitations in current BCI systems. Tasks such as managing multiple devices simultaneously, configuring security systems, or adjusting energy management systems posed significant challenges. These scenarios introduced issues related to signal accuracy, latency, and the higher cognitive load required for successful operation. Success rates dropped to 75% for mobility-impaired users and 85% for healthy controls in these advanced tasks, demonstrating the limitations of existing technology.
The increase in cognitive load also led to higher mental fatigue, reducing user performance over time. Participants reported that while they could manage basic functions without difficulty, prolonged engagement with more complex tasks became taxing. These findings indicate that current BCIs may need to be integrated with other modalities, such as voice commands or gesture controls, to better support users in handling intricate tasks.
3.3. Challenges in BCI-Smart Home Integration
Non-invasive EEG-based BCIs, widely used in this study, are prone to external interference and noise, which compromise command accuracy and system responsiveness. These issues are particularly critical in safety-sensitive applications, such as controlling medical or security devices. Advances in signal processing, including adaptive filters, artifact removal algorithms, and machine learning classifiers, are necessary to enhance reliability [1][4].
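As a minimal example of the kind of preprocessing referred to here, the sketch below combines a notch filter for mains interference with a simple amplitude-threshold artifact check before a window is passed to the classifier. The sampling rate, line frequency, and rejection threshold are assumptions; production systems would add adaptive filtering and ICA-based artifact removal on top of this.

```python
# Sketch of basic EEG cleanup: notch out line noise, reject high-amplitude windows.
import numpy as np
from scipy.signal import filtfilt, iirnotch

FS = 250             # sampling rate in Hz (assumed)
LINE_FREQ = 50.0     # mains frequency (assumed; 60 Hz in some regions)
ARTIFACT_UV = 100.0  # reject windows whose peak amplitude exceeds this (microvolts)


def remove_line_noise(eeg: np.ndarray, fs: int = FS) -> np.ndarray:
    """Suppress mains interference with a narrow notch filter."""
    b, a = iirnotch(LINE_FREQ, Q=30.0, fs=fs)
    return filtfilt(b, a, eeg, axis=-1)


def is_clean(eeg: np.ndarray, threshold_uv: float = ARTIFACT_UV) -> bool:
    """Flag windows contaminated by blinks or movement via a peak-amplitude check."""
    return bool(np.max(np.abs(eeg)) < threshold_uv)


# Example: only cleaned, artifact-free windows are forwarded to the classifier.
rng = np.random.default_rng(2)
window = 10.0 * rng.standard_normal((8, 2 * FS))  # 8 channels, 2 s, ~10 uV noise
window = remove_line_noise(window)
print("forward to classifier:", is_clean(window))
```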
Maintaining continuous control of a BCI system demands considerable mental effort, especially for motor imagery (MI) and steady-state visual evoked potential (SSVEP)-based paradigms. Mental fatigue becomes a major limitation over time, reducing the accuracy and efficiency of user inputs [15]. Incorporating hybrid systems that allow users to switch between BCIs and alternative control methods, such as voice or touch interfaces, could mitigate this issue [6].
Effective use of BCIs often requires extensive training to achieve consistent performance, especially for MI-based systems where users must learn to produce reliable brain signals. Training protocols that reduce this learning curve, as well as adaptive systems capable of customizing the interface to individual user needs, are essential to ensuring accessibility [7].
Current BCI systems are expensive, making them inaccessible to many individuals with disabilities. Developing affordable hardware, such as low-cost EEG caps, and open-source software platforms can reduce costs and expand availability. Additionally, standardizing communication protocols for BCI-smart home integration across different manufacturers is critical for ensuring scalability [4].
3.4. User Insights and System Feedback
Participants provided valuable qualitative feedback on their experiences using BCI-smart home systems. Many expressed enthusiasm for the increased independence the technology provided, particularly for tasks they were previously unable to perform without assistance. However, users highlighted several areas for improvement, including system reliability, reduced fatigue, and better responsiveness.
Users also recommended more intuitive user interfaces to make the systems easier to operate. Customizable control schemes, such as the ability to prioritize frequently used functions, were suggested as a means of enhancing user satisfaction. Additionally, participants expressed interest in systems that combine BCI control with other modalities to improve flexibility and reduce cognitive effort.
3.5. Comparison with Previous Research
The findings of this study align closely with existing literature. Vansteensel et al. [23] and Hochberg et al. [22] observed similar trends, where BCIs were effective in controlled environments but faced challenges in real-world applications. Rezeika et al. [8] highlighted the feasibility of SSVEP-based BCIs for smart home integration but noted issues such as mental fatigue and signal instability. Guger et al. [16] found that BCIs were helpful for basic tasks but required significant improvements for more intuitive and scalable applications.
This study builds on these findings by providing new insights into user experiences and the practical application of BCIs in real-life smart home settings. The results underscore the need for further research to address the gaps in system usability and scalability.
3.6. Future Directions for Research
To overcome the challenges identified, future studies should focus on several directions. First, signal processing must be strengthened through more sophisticated machine learning algorithms, noise suppression, and artifact handling, so that recognition errors are reduced and commands are interpreted reliably despite interference. Second, interface design is equally important: adaptive, user-oriented interfaces tailored to users' cognitive characteristics and preferences can lower mental workload and improve usability, and more intuitive training protocols can shorten the time needed to become proficient with a BCI system. Third, incorporating multi-modal control methods, such as voice commands and gestures alongside BCIs, would increase system flexibility and help prevent fatigue by allowing users to switch between modalities according to task complexity or personal preference. Fourth, low-cost and scalable solutions are needed; reducing hardware and software costs and standardizing communication protocols would make BCIs accessible to a larger population. Finally, comprehensive real-world trials are required to evaluate prolonged use, user experience, system reliability, and the overall effect on quality of life.
By addressing these challenges and advancing the technology, BCIs have the potential to become a transformative tool for empowering individuals with mobility impairments, enabling them to achieve greater independence and autonomy in smart home settings.
4. Conclusion
This study has explored the potential of Brain-Computer Interfaces (BCIs) in controlling smart home devices for individuals with mobility impairments. The findings demonstrate that BCIs offer a promising solution for enhancing independence and autonomy, particularly for those who face significant challenges in interacting with traditional home automation systems. The quantitative results show that BCIs are effective for basic smart home control tasks, achieving high success rates, although more complex tasks presented some challenges. Reaction times and performance varied between participants with mobility impairments and healthy controls, with the former group experiencing slightly lower success rates and slower reaction times. Despite concerns about mental workload and system reliability, the qualitative findings indicate that users generally found BCIs easy to use and beneficial for increasing their control over the home environment.
This study makes important contributions to the field of assistive technology by advancing our understanding of how BCIs can be integrated with smart home systems to improve the quality of life for individuals with mobility impairments. It expands on previous research by focusing on real-world applications of BCIs in smart homes, an area that has seen limited exploration. The study also identifies the technical challenges that need to be addressed, such as improving the accuracy of non-invasive BCIs and reducing the mental strain associated with their use. By examining both technical performance and user experience, this research provides a comprehensive view of the potential and limitations of BCIs in assistive technology.
Based on the findings, several recommendations can be made for the future development of BCI systems in smart home environments. First, improving the signal processing capabilities of BCIs is critical to enhancing their reliability and reducing latency, particularly for more complex tasks; advances in machine learning algorithms and noise reduction techniques may help overcome these technical barriers. Second, user interfaces need to be more intuitive and adaptable to individual user needs, especially for those with cognitive or visual impairments, and hybrid systems that combine BCI control with other modalities, such as voice commands, could reduce mental fatigue and provide users with more flexibility. Finally, further research should focus on expanding the functionality of BCI-controlled smart homes and exploring their long-term use in real-world settings, including user satisfaction over extended periods.
References
[1]. Cincotti, F., Mattia, D., Aloise, F., Bufalari, S., Schalk, G., Oriolo, G., … & Babiloni, F. (2008). Non-invasive brain–computer interface system: Towards its application as assistive technology. Brain Research Bulletin, 75(6), 796-803. https://doi.org/10.1016/j.brainresbull.2008.01.007
[2]. Daly, J. J., & Wolpaw, J. R. (2008). Brain–computer interfaces in neurological rehabilitation. The Lancet Neurology, 7(11), 1032-1043. https://doi.org/10.1016/S1474-4422(08)70223-0
[3]. Farwell, L. A., & Donchin, E. (1988). Talking off the top of your head: Toward a mental prosthesis utilizing event-related brain potentials. Electroencephalography and Clinical Neurophysiology, 70(6), 510-523. https://doi.org/10.1016/0013-4694(88)90149-6
[4]. He, H., Wu, D., Chen, Y., & Huang, J. (2018). A novel BCI-controlled smart home system using SSVEP and EOG signals. IEEE Transactions on Biomedical Engineering, 65(5), 1085-1092. https://doi.org/10.1109/TBME.2017.2759700
[5]. Leeb, R., Friedman, D., Müller-Putz, G. R., Scherer, R., Slater, M., & Pfurtscheller, G. (2007). Self-paced (asynchronous) BCI control of a wheelchair in virtual environments: A case study with a tetraplegic. Computational Intelligence and Neuroscience, 2007, Article 79642. https://doi.org/10.1155/2007/79642
[6]. Miao, Y., Li, S., Zhang, Z., Jin, J., & Wang, X. (2020). A hybrid BCI system combining P300 and SSVEP and its application to wheelchair control. IEEE Transactions on Neural Systems and Rehabilitation Engineering, 28(2), 461-471. https://doi.org/10.1109/TNSRE.2020.2967242
[7]. McCreadie, K. A., Coyle, D. H., Prasad, G., & Guger, C. (2014). A sensorimotor rhythm-based brain-computer interface for improving motor imagery performance. IEEE Transactions on Neural Systems and Rehabilitation Engineering, 22(2), 282-290. https://doi.org/10.1109/TNSRE.2013.2282895
[8]. Rezeika, A., Benda, M., Stawicki, P., Gembler, F., Saboor, A., & Volosyak, I. (2018). Brain–computer interface spellers: A review. Brain Sciences, 8(4), 57. https://doi.org/10.3390/brainsci8040057
[9]. Zhang, D., Chen, J., Xu, P., Guo, L., Zhang, R., Zhao, Q., … & Yao, D. (2016). Control of a wheelchair in an indoor environment based on a hybrid brain–computer interface. Journal of Neural Engineering, 13(4), 046003. https://doi.org/10.1088/1741-2560/13/4/046003
[10]. Hwang, H. J., Lim, J. H., Jung, Y. J., Choi, H., & Im, C. H. (2012). Development of an SSVEP-based BCI spelling system adopting a QWERTY-style LED keyboard. Journal of Neuroscience Methods, 208(1), 59-65. https://doi.org/10.1016/j.jneumeth.2012.04.011
[11]. Aldrich, F. K. (2003). Smart homes: Past, present and future. Inside the Smart Home, 17-39.
[12]. Balta-Ozkan, N., Davidson, R., Bicket, M., & Whitmarsh, L. (2013). The development of smart homes market in the UK. Energy, 60, 361-372.
[13]. Birbaumer, N. (2006). Breaking the silence: Brain-computer interfaces (BCI) for communication and motor control. Psychophysiology, 43(6), 517-532.
[14]. Birbaumer, N., Ghanayim, N., Hinterberger, T., Iversen, I., Kotchoubey, B., Kübler, A., & Flor, H. (1999). A spelling device for the paralysed. Nature, 398(6725), 297-298.
[15]. Davis, T. S., Cikanek, S. R., & Greger, B. (2016). Human cortical interface for advanced control of neuroprosthetics. Neurotherapeutics, 13(1), 157-168.
[16]. Guger, C., Daban, S., Sellers, E., Holzner, C., Krausz, G., Carabalona, R., & Edlinger, G. (2009). How many people are able to control a P300-based brain–computer interface (BCI)? Neuroscience Letters, 462(1), 94-98.
[17]. Hochberg, L. R., Bacher, D., Jarosiewicz, B., Masse, N. Y., Simeral, J. D., Vogel, J., & Donoghue, J. P. (2012). Reach and grasp by people with tetraplegia using a neurally controlled robotic arm. Nature, 485(7398), 372-375.
[18]. Kim, J., Riek, L. D., & Kirsch, R. F. (2018). A smart home interface that optimizes speech-based interactions for individuals with motor impairments. ACM Transactions on Accessible Computing (TACCESS), 11(2).
[19]. Birbaumer, N. (2006). Breaking the silence: Brain-computer interfaces (BCI) for communication and motor control. Psychophysiology, 43(6), 517-532.
[20]. Davis, T. S., Cikanek, S. R., & Greger, B. (2016). Human cortical interface for advanced control of neuroprosthetics. Neurotherapeutics, 13(1), 157-168.
[21]. Guger, C., Daban, S., Sellers, E., Holzner, C., Krausz, G., Carabalona, R., & Edlinger, G. (2009). How many people are able to control a P300-based brain–computer interface (BCI)? Neuroscience Letters, 462(1), 94-98.
[22]. Hochberg, L. R., Bacher, D., Jarosiewicz, B., Masse, N. Y., Simeral, J. D., Vogel, J., & Donoghue, J. P. (2012). Reach and grasp by people with tetraplegia using a neurally controlled robotic arm. Nature, 485(7398), 372-375.
[23]. Vansteensel, M. J., Pels, E. G., Bleichner, M. G., Branco, M. P., Denison, T., Freudenburg, Z. V., & Ramsey, N. F. (2016). Fully implanted brain–computer interface in a locked-in patient with ALS. New England Journal of Medicine, 375(21), 2060-2066.
[24]. Rosenbaum, P., Paneth, N., Leviton, A., Goldstein, M., & Bax, M. (2007). A report: The definition and classification of cerebral palsy April 2006. Developmental Medicine & Child Neurology Supplement, 109, 8–14.
[25]. Emery, A. E. H. (2002). The muscular dystrophies. The Lancet, 359(9307), 687–695.
[26]. Ripat, J., & Strock, A. (2004). Users’ perceptions of the impact of electronic aids to daily living throughout the acquisition process. Assistive Technology, 16(1), 63–72.
[27]. Esquenazi, A., Talaty, M., Packel, A., & Saulino, M. (2017). The ReWalk powered exoskeleton to restore ambulatory function to individuals with thoracic-level motor-complete spinal cord injury. American Journal of Physical Medicine & Rehabilitation, 91(11), 911–921.
[28]. Suryadevara, N. K., & Mukhopadhyay, S. C. (2015). Smart homes: Design, implementation, and issues. Springer.
[29]. Wolpaw, J. R., Birbaumer, N., McFarland, D. J., Pfurtscheller, G., & Vaughan, T. M. (2002). Brain–computer interfaces for communication and control. Clinical Neurophysiology, 113(6), 767–791.
[30]. Birbaumer, N., Ghanayim, N., Hinterberger, T., Iversen, I., Kotchoubey, B., Kübler, A., & Flor, H. (1999). A spelling device for the paralysed. Nature, 398(6725), 297–298.