Research Article
Open access
Published on 16 December 2024

FASSLING: Transforming emotional and coaching support through artificial intelligence (AI) innovation

Yujia Zhu 1,*
  • 1 Sofia University, Palo Alto, United States

* Author to whom correspondence should be addressed.

https://doi.org/10.54254/3049-5458/2024.18639

Abstract

The global mental health crisis is compounded by barriers such as cost, accessibility, and stigma, leaving millions without adequate support. FASSLING (fassling.ai), an innovative artificial intelligence (AI)-powered platform, addresses these challenges by providing free, 24/7 multilingual emotional and coaching support through text and audio interactions. Grounded in inclusivity and compassion, FASSLING bridges gaps in traditional mental health systems by offering immediate, non-clinical support while complementing professional services. This paper explores FASSLING's design and implementation, emphasizing its user-centered features, including cultural adaptability, trauma-informed care principles, and active listening techniques. The platform not only empowers users to navigate emotional challenges but also fosters resilience and empathy, creating a ripple effect of societal compassion. Ethical considerations, such as ensuring user privacy and managing the limitations of AI, are central to FASSLING’s mission. By integrating advanced AI technologies with psychological best practices, FASSLING sets a new standard for accessible and inclusive mental health support, positioning itself as a transformative tool for global well-being. This case study highlights FASSLING's potential to redefine emotional support systems and drive positive change in mental health care worldwide.

Keywords

virtual safe space, AI emotional and coaching support, mental health accessibility, multilingual support, psychological resilience

Cite this article

Zhu, Y. (2024). FASSLING: Transforming emotional and coaching support through artificial intelligence (AI) innovation. Journal of Clinical Technology and Theory, 2, 1-10.

Data availability

The datasets used and/or analyzed during the current study are available from the author upon reasonable request.

Disclaimer/Publisher's Note

The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of EWA Publishing and/or the editor(s). EWA Publishing and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.

About volume

Journal: Journal of Clinical Technology and Theory

Volume number: Vol. 2
ISSN: 3049-5458 (Print) / 3049-5466 (Online)

© 2024 by the author(s). Licensee EWA Publishing, Oxford, UK. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license. Authors who publish in this series agree to the following terms:
1. Authors retain copyright and grant the series right of first publication with the work simultaneously licensed under a Creative Commons Attribution License that allows others to share the work with an acknowledgment of the work's authorship and initial publication in this series.
2. Authors are able to enter into separate, additional contractual arrangements for the non-exclusive distribution of the series's published version of the work (e.g., post it to an institutional repository or publish it in a book), with an acknowledgment of its initial publication in this series.
3. Authors are permitted and encouraged to post their work online (e.g., in institutional repositories or on their website) prior to and during the submission process, as it can lead to productive exchanges, as well as earlier and greater citation of published work (See Open access policy for details).