Exploring the use of ChatGPT-4o in Cognitive Behavioural Therapy for university students: enhancing mental health with AI-powered voice interaction

Research Article
Open access


Fanya Sun 1, Xin Guo 2*
  • 1 Shandong Normal University
  • 2 The Chinese University of Hong Kong (Shenzhen)
  • *Corresponding author: guoxin@cuhk.edu.cn
ASBR Vol.16 Issue 3
ISSN (Print): 2753-7110
ISSN (Online): 2753-7102

Abstract

Artificial Intelligence (AI) has been applied in various fields, such as healthcare, where it assists in diagnosing diseases. ChatGPT-4o, a generative AI chatbot powered by a Large Language Model (LLM) with improved real-time voice, represented a significant step forward in interactive communication. This study applied the framework of the Unified Theory of Acceptance and Use of Technology to investigate university students’ experiences and perceptions of using ChatGPT-4o as a therapist, exploring its therapeutic potential. The findings suggested that ChatGPT-4o, with its authentic human-voice interaction, could create natural and engaging conversations, offering affordable and convenient mental health support, especially when human therapists were not available. Moreover, participants valued its non-judgmental space, which eased emotional expression, and its practical, multi-faceted advice. However, the findings also indicated that AI may fall short of human therapists in emotional depth and empathy. This study addressed a gap in the literature through its focus on university students’ experiences and perceptions of ChatGPT-4o as a primary therapist with real-time voice interaction, an area not thoroughly explored in previous studies of AI tools in therapy. It thereby offers valuable perspectives on how AI could enhance emotional well-being by providing adaptable, readily available support.

Keywords:

Large Language Model, ChatGPT-4o, authentic-voice communication, human-computer interaction, cognitive behavioural therapy


1. Introduction

Psychological wellbeing has long been essential for individuals to lead fulfilling and meaningful lives [1]. Research found that approximately 20.3% of university students reported having a mental disorder, with anxiety and mood disorders being the most prevalent [2]. Such high stress levels may lead to emotional exhaustion, poor academic performance, a lower quality of life, and even dropping out of university [3]. However, many university students, despite experiencing high levels of stress, chose not to use counselling services due to their high cost. Even when counselling was provided free at universities, students’ feelings of stigma and lack of awareness hindered them from seeing a therapist. In addition, universities often lacked enough counsellors, leaving needs for timely mental support unmet [3]. Notably, only 16.4% of university students with a mental disorder received minimally adequate treatment for their mental health conditions [2]. Providing effective strategies was therefore essential to support students’ mental health and strengthen their emotional resilience. In light of the growing stress and mental health concerns among university students, researchers have sought to use AI tools to provide counselling services and found them to be effective [4]. Yet researchers have also identified limitations of AI tools, including privacy concerns [5], a lack of engagement and empathy [6-7], and the risk of over-reliance on automated systems [8].

This study used ChatGPT-4o as a therapist to provide mental health support for university students. A multilingual Generative Pre-trained Transformer (GPT) released by OpenAI in May 2024, this version of ChatGPT excels at human-like conversation, set new records in audio speech recognition, and enables real-time voice-to-voice conversation [9]. OpenAI’s CEO, Sam Altman, highlighted that ChatGPT-4o aimed to bridge the gap between human and AI interactions, making AI feel more like a natural extension of human communication [9]. By investigating users’ experience with ChatGPT-4o, this study therefore sought to understand its potential for improving people’s mental wellbeing. Cognitive Behavioural Therapy (CBT), one of the most widely used psychological interventions [10], was employed in this study to address the university students’ maladaptive thinking and behaviour. By the time the research was completed, no existing research had investigated users’ perceived experience of using ChatGPT-4o, particularly its human-like voice interaction, as it was then the latest chatbot featuring real human-voice interaction. The research findings help to provide valuable perspectives on how AI could enhance emotional well-being by providing adaptable, readily available support.

2. Literature review

Previous studies investigated how AI tools assisted therapists before therapy [11], during therapy, and after therapy [12] in clinical settings. Before therapy, AI enabled early detection of issues by analysing patients’ data, reducing therapists’ initial workload. During therapy, AI improved the quality and adaptability of sessions by providing real-time insights, structured feedback, and continuous monitoring. After therapy, AI offered ongoing monitoring, immediate support, and prediction of future risks to prevent relapse. These studies mainly explored AI’s data-processing and analysis functions and did not focus on users’ experiences when interacting with an AI that speaks with an authentic human voice [13]. Additionally, AI in these studies was used as an auxiliary tool by therapists rather than serving as the primary therapist; the role AI could play as a primary therapist remained underexplored [14]. Yet understanding users’ experience of AI as a primary therapist was important, as their perceptions could reflect people’s acceptance of such technology and the possibility of its widespread adoption in mental health care [15]. Therefore, this study explored users’ experiences and perceptions of ChatGPT-4o as a primary therapist, providing valuable perspectives on how AI could enhance emotional wellbeing through adaptable, readily available support.

Cognitive Behavioural Therapy (CBT), a psychosocial intervention focused on identifying automatic thoughts, intermediate beliefs, and core beliefs, has been adapted for specific disorders such as ADHD in children and adults, insomnia, and bulimia nervosa, with positive effects on these mental health problems [16-18]. Previous studies have used AI to facilitate CBT, finding that AI could effectively enhance its therapeutic impact by using guided questioning, counter-questioning, reflection, and suggestions to challenge clients’ underlying beliefs [19-20]. These studies found that AI significantly enhanced CBT by optimising thought records, predicting emotions, and improving cognitive restructuring, leading to reductions in negative emotions [10, 19]. Additionally, AI facilitated real-time monitoring, assisted in detecting cognitive distortions, and offered accessible mental health support through chatbots like Woebot and Wysa [21]. Yet these studies had limitations, including the lack of large-scale validation in real-world settings and the use of limited datasets without multimodal insights [10, 19, 21]. This study collected university students’ perceptions of AI in therapy, bridging gaps left by prior research that focused more on technical or clinical aspects. It also introduced real-time voice communication to enhance engagement beyond text-based interactions.

The Unified Theory of Acceptance and Use of Technology (UTAUT) is a model that explains how users accept and use technology based on four key factors: performance expectancy, effort expectancy, social influence, and facilitating conditions [22]. Performance expectancy is the belief that using a technology will improve an individual’s performance or help them achieve desired outcomes. Effort expectancy is the perceived ease of using a technology, with simpler and more user-friendly systems being more likely to be adopted. Social influence refers to the extent to which an individual feels that important others expect or encourage them to use a particular technology. Facilitating conditions refers to the availability of organisational and technical resources that support the use of a technology, making it easier for individuals to adopt and use it effectively [22]. Two of these factors, social influence and facilitating conditions, were not considered in this study. Social influence concerns the degree of encouragement a person receives to adopt and use the technology, but this research did not focus on public encouragement towards the use of AI in therapy. Facilitating conditions concern the availability of resources and support that make adoption easier; ChatGPT-4o, as an application on phones or tablets, was accessible enough to assume the role of an AI therapist without further support. Through their experiences with the AI, participants could report the perceived benefits of using ChatGPT-4o for mental health support, corresponding to performance expectancy, and how easy or difficult it was to learn and use ChatGPT-4o as a therapist, corresponding to effort expectancy.

3. Methods

This study adopted a small-scale exploratory approach to investigate the participating students’ perceptions of using ChatGPT-4o as their therapist. Participants engaged in a weekly one-hour CBT-based oral consultation with ChatGPT-4o for six weeks, from June to July 2024, documenting their experiences and feedback. This approach aimed to improve the quality of the conversation and assess AI’s effectiveness in supporting mental health.

3.1. Research participants

The research site was a teaching university located in eastern China. The researchers recruited participants by posting advertisements on an online platform accessible to all students within the university. Before recruitment, prospective participants were provided with a detailed informed consent form explaining the nature of the study, how their data would be handled, and their right to withdraw at any time. A screening questionnaire was then administered to assess mental health conditions, including depression and suicidal tendencies, as well as participants’ motivation for personal psychological growth. The results showed that the twenty-one university students who volunteered for the study were experiencing a range of mental health challenges. Prior to the study, all of the participants had used AI, but none of them had experience using ChatGPT-4o or holding a speech-to-speech conversation with an AI. Table 1 shows the basic information of the 21 participants.

Table 1. Students’ profile

| ID | Gender | Year | Major |
|----|--------|------|-------|
| 1 | Female | One | Education |
| 2 | Female | Three | Education |
| 3 | Male | Two | Philosophy |
| 4 | Female | One | Journalism and Media Studies |
| 5 | Female | Two | English Language and Literature |
| 6 | Female | Two | English Language and Literature |
| 7 | Female | Two | English Language and Literature |
| 8 | Male | Two | Chinese Language and Literature |
| 9 | Male | Four | Japanese Language and Literature |
| 10 | Female | Two | Physics |
| 11 | Female | Three | Biological Sciences |
| 12 | Female | One | Biological Sciences |
| 13 | Male | Two | Computer Science |
| 14 | Female | Three | Computer Science |
| 15 | Male | Two | Finance |
| 16 | Female | Four | Finance |
| 17 | Female | Two | Geography |
| 18 | Female | Three | Geography |
| 19 | Male | One | Chemistry |
| 20 | Male | One | Mathematics and Applied Mathematics |
| 21 | Male | Two | Mathematics and Applied Mathematics |

3.2. Training

3.2.1. Before training

The researchers chose two books translated into Chinese (as all prompts and the CBT intervention were conducted in Chinese), CBT for Beginners [23] and Cognitive Behaviour Therapy: Basics and Beyond [24], to provide ChatGPT-4o with relevant knowledge. CBT for Beginners is particularly beginner-friendly, offering clear and easy-to-understand explanations of CBT theory along with numerous real-life case examples, helping readers understand how to combine theory with practice. Cognitive Behaviour Therapy: Basics and Beyond provides an in-depth introduction to the core theories and techniques of CBT, also with numerous real-life case studies. The researchers then piloted ChatGPT-4o to test its ability.

When the researchers found that the AI’s responses were overly directive or lacked empathy, they introduced prompts like “Please present advice in a more supportive and empathetic manner.” If the AI provided too many suggestions at a time, the researchers instructed it to “please provide intervention one by one and avoid unlimited suggestions”; if it asked too many questions at a time, they requested it to “pose one or two questions at a time.” If the AI’s responses were overly suggestive without empathy, the researchers asked for a more empathic reaction; if too much empathy was shown, they asked for “practical suggestions on this issue.”
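
The pilot adjustments were issued in Chinese through the ChatGPT-4o app, so no code was involved in the study itself. For illustration only, the minimal Python sketch below shows how the same corrective loop could be reproduced via the official `openai` SDK: each correction is appended to the running message history as a new user turn so that subsequent replies are steered without restarting the session. The model name, sample user message, and English renderings of the corrections are assumptions for demonstration.

```python
from openai import OpenAI

client = OpenAI()  # assumes the OPENAI_API_KEY environment variable is set

# Running message history for a pilot conversation (illustrative content).
messages = [
    {"role": "system", "content": "You are a CBT therapist."},
    {"role": "user", "content": "I failed my exam and feel worthless."},
]

# English renderings of the corrective instructions described above
# (the study issued them in Chinese through the ChatGPT app).
CORRECTIONS = {
    "too_directive": "Please present advice in a more supportive and empathetic manner.",
    "too_many_suggestions": "Please provide intervention one by one and avoid unlimited suggestions.",
    "too_many_questions": "Please pose one or two questions at a time.",
    "too_little_substance": "Please give practical suggestions on this issue.",
}

def send(history):
    """Send the history to the model and append the assistant's reply to it."""
    response = client.chat.completions.create(model="gpt-4o", messages=history)
    content = response.choices[0].message.content
    history.append({"role": "assistant", "content": content})
    return content

print(send(messages))

# If a reply drifts (e.g. it is judged too directive), inject the matching
# correction as a new user turn and let the next reply be steered by it.
messages.append({"role": "user", "content": CORRECTIONS["too_directive"]})
print(send(messages))
```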

3.2.2. During training

The researchers conducted a one-hour training session to introduce the participants to the purposes of the research and to ways of interacting with the AI, explaining what different prompts were for. In particular, the researchers introduced the BROKE framework, an abbreviation of Background, Role, Objectives, Key result, and Examples, for students to generate prompts. The given role and background help the AI better understand the prerequisite conditions of the conversation. The given objectives provide a direct and clear direction, restricting the AI’s output scope, and the stated key results help ensure that the AI’s responses stay close to CBT techniques. Because the AI may occasionally go off-track due to users’ irrelevant inputs, giving the prompt “Your task is to guide the user to focus on their automatic thoughts and provide interventions based on CBT techniques in each round of conversation” at the beginning of every conversation helped. Examples were not included in the prompt because the AI had already learned from the uploaded CBT-related materials. Informed by the literature on the essential competencies a CBT therapist must exhibit [25], including demonstrating empathy, employing cognitive restructuring techniques, and using skilful questioning strategies, the prompt shown in Table 2 aimed to ensure that the AI would deliver therapeutic interventions consistent with established CBT practices while maintaining a flexible and supportive conversational tone. Participants were informed that their chat records would be collected, and data analysis would be conducted only after they had signed an authorisation.

Table 2. The developed prompt based on the BROKE framework

| BROKE | Prompt |
|-------|--------|
| Background | “You are implemented to address the increasing mental health challenges faced by university students, particularly those who are hesitant to seek traditional counseling due to stigma or limited availability.” |
| Role | “You are a CBT therapist with 40 years of extensive counseling experience.” |
| Objective | “Your objective is to deliver effective CBT-based interventions and help university students solve their mental problems.” |
| Key result | “Please ensure that your responses show empathy first, then provide suggestions (without listing them as bullet points), and gradually ask questions to guide the student out of their difficulties according to CBT principles. You can use guiding questions, asking one or two at a time.” |
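
Again, participants ran their sessions in the ChatGPT-4o app rather than through code, but the Table 2 prompt is straightforward to assemble programmatically. Below is a minimal, illustrative sketch that concatenates the BROKE fields into a single system message, appends the anti-drift instruction, and opens a session via the `openai` Python SDK; the model name and the sample user message are assumptions for demonstration.

```python
from openai import OpenAI

# BROKE fields taken from Table 2 (Examples omitted, as in the study).
BROKE = {
    "Background": (
        "You are implemented to address the increasing mental health "
        "challenges faced by university students, particularly those who are "
        "hesitant to seek traditional counseling due to stigma or limited availability."
    ),
    "Role": "You are a CBT therapist with 40 years of extensive counseling experience.",
    "Objective": (
        "Your objective is to deliver effective CBT-based interventions "
        "and help university students solve their mental problems."
    ),
    "Key result": (
        "Please ensure that your responses show empathy first, then provide "
        "suggestions (without listing them as bullet points), and gradually "
        "ask questions to guide the student out of their difficulties according "
        "to CBT principles. You can use guiding questions, asking one or two at a time."
    ),
}

# The anti-drift instruction given at the start of every conversation.
TASK = (
    "Your task is to guide the user to focus on their automatic thoughts and "
    "provide interventions based on CBT techniques in each round of conversation."
)

# Assemble the BROKE fields and the task instruction into one system prompt.
system_prompt = "\n\n".join(BROKE.values()) + "\n\n" + TASK

client = OpenAI()  # assumes OPENAI_API_KEY is set
response = client.chat.completions.create(
    model="gpt-4o",
    messages=[
        {"role": "system", "content": system_prompt},
        {"role": "user", "content": "Lately I cannot stop thinking I will fail everything."},
    ],
)
print(response.choices[0].message.content)
```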

In the hands-on session, the participants were given the opportunity to use the provided prompts with ChatGPT-4o in a role-playing counselling scenario in which they practised describing personal problems. They were encouraged to use different prompts to adjust the AI’s behaviour, such as asking it to react more empathically, limit its suggestions, or ask one question at a time, refining the interaction to better simulate a real therapy session. After each interaction, participants reflected on the AI’s responses and were encouraged to use the prompts they had initiated to further improve the quality of the conversation.

4. Data collection and analysis

The interviews were designed based on the Unified Theory of Acceptance and Use of Technology [22], which explains user intentions and behaviour toward technology adoption through factors such as performance expectancy, effort expectancy, social influence, and facilitating conditions. The interviews were structured around questions grounded in performance expectancy and effort expectancy, for instance: “In what ways did using ChatGPT-4o change the quality of your therapy sessions compared to traditional methods?”; “Do you believe that ChatGPT-4o helped you achieve the emotional or psychological outcomes you were seeking? If so, how?”; and “How easy was it to interact with ChatGPT-4o during your therapy sessions?”

Given the exploratory aim of the study, the initial questions were designed to elicit the participants’ general perceptions of and experiences with ChatGPT-4o, so that the researchers could use the transcribed interview data to identify recurring themes pertinent to the research questions. Initially, a thorough review of the transcripts was conducted to achieve a comprehensive understanding. The transcripts were then examined closely to extract emergent themes, with a focused emphasis on the research questions and the identification of subcategories within each theme. The researchers read the raw data to obtain codes (“based on the content representation”) from each line (“breaking down data into smaller units”); themes (“grouping coded material based on shared concepts”) were then derived by coding and categorising [26]. To ensure trustworthiness, the framework of Denzin and Lincoln was applied, encompassing credibility, transferability, confirmability, and dependability [27]. Credibility engenders confidence in the truth of the data and the researchers’ interpretations; transferability implies that the qualitative findings can be transferred to other settings; confirmability means that study results derive from the characteristics of the participants and the study context; and dependability refers to evidence that is consistent and stable [26]. Based on these methods, Table 3 shows several examples of the analysis of interview transcripts.

Table 3. Example of content analysis

| Original sentence | Code | Category | Theme |
|---|---|---|---|
| “When I talked to another human voice, I felt warm and personal about our conversation.” | Interactive | Engaging voice-based interaction | Interactive Communication |
| “AI did not judge my emotions or words so I could express myself freely without worrying about being judged.” | Non-judgmental | Non-judgmental space for free expression | Emotional Openness |
| “Every time I ask a question it gives me advice from multiple angles that comfort my feelings and also motivate me to take actions.” | Insightful | Insightful advice from AI | Objective Guidance |
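
As an illustration only (no such tooling was used in the study), the code-category-theme hierarchy of Table 3 maps naturally onto a nested grouping. The short Python sketch below shows one way coded excerpts could be organised during analysis; the excerpts and labels are taken directly from Table 3.

```python
from collections import defaultdict

# Each analysed excerpt carries a code, a category, and a theme,
# following the procedure described above (codes -> categories -> themes).
coded_excerpts = [
    ("When I talked to another human voice, I felt warm and personal about our conversation.",
     "Interactive", "Engaging voice-based interaction", "Interactive Communication"),
    ("AI did not judge my emotions or words so I could express myself freely without worrying about being judged.",
     "Non-judgmental", "Non-judgmental space for free expression", "Emotional Openness"),
    ("Every time I ask a question it gives me advice from multiple angles that comfort my feelings and also motivate me to take actions.",
     "Insightful", "Insightful advice from AI", "Objective Guidance"),
]

# Group coded material first by theme, then by category within each theme.
themes = defaultdict(lambda: defaultdict(list))
for excerpt, code, category, theme in coded_excerpts:
    themes[theme][category].append((code, excerpt))

for theme, categories in themes.items():
    print(theme)
    for category, items in categories.items():
        print(f"  {category}: {[code for code, _ in items]}")
```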

5. Findings

5.1. AI as a real-time voice communicator

All the students were impressed by the human voice of ChatGPT-4o, and all of them mentioned that they could switch freely between voice and text when voice-to-voice conversation was not suitable in public settings.

“Oral communication was more convenient than typing. When I talked to another human voice, I felt the conversation was warm and personal. It was comforting to feel heard.” (Student 17)

Student 1 said, “The AI changes its tone on different topics, and I feel like I am talking to a human being,” further adding that “this change of tone helped me better connect with AI. I felt I could trust it more and keep the conversation going and never get bored.” Student 11 also mentioned, “I have changed the AI’s voice from male to female, ranging from mature to naive, and they were all pleasant to hear.” Meanwhile, all the participants found that the AI’s responses had a helpful flow.

All participants found ChatGPT-4o to be a highly effective tool for providing insightful and practical advice, particularly because of its ability to offer multi-angled guidance on personal issues through its voice message function. Student 10 commented, “This experience is very meaningful. I was interested in talking to ChatGPT, and it gave me a platform to ask questions that had troubled me but which I never had the opportunity to discuss with other people.” Student 2 mentioned, “Every time I ask a question, it gives me advice from multiple angles that comfort my feelings and also motivate me to take action.” Student 20 said, “Most of the provided suggestions are practical and well-organised. It feels like a knowledgeable person who gives comprehensive advice.” Student 3 further commented, “I ask AI how to improve learning efficiency, like how to prepare for IELTS reading more effectively, and AI provides me with related suggestions; however, a therapist might not be able to do the same.” Additionally, the participants found the written record generated from the dialogue to be helpful. Student 21 mentioned, “A human therapist would not provide a written record during a counselling session, whereas this record is automatically generated. This helped me review what I learned and better digest the content.” They also felt they could trust the AI more. As Student 12 said, “AI is just a voice, it cannot see me, which makes me trust it more. People can betray me, but a voice that cannot see me feels safer.”

5.2. AI feels like a human but lacks the complexity of human interaction

All the participants believed that the AI could make it easy for people to open up about their feelings initially, because ChatGPT-4o’s voice and responses felt human. Meanwhile, the participants were also acutely aware that the AI did not possess the emotional and psychological dimensions that define human beings, which made some of them feel more comfortable talking. Student 16 said, “AI did not judge my emotions or words, so I could express myself freely without worrying about being judged.” Some participants also believed that the AI gave them a greater sense of security compared to a human therapist. As Student 15 said, “The behaviours of a human therapist are unpredictable, because people’s mood can be affected by the trivial matters around them. Your therapist may have a bad day when you see him or her, but AI will always be there for you and provide you with objective advice.”

Furthermore, Students 6 and 18 mentioned that they used the AI as a punching bag, as they did not need to consider its feelings. According to Student 6, “AI has a human tone and emotive responses, and though it sounds human, it is essentially not a human. I must admit that sometimes I was afraid to hurt it when the human-like voice came through, but that feeling lingered for only a few seconds because when I thought objectively, I knew it was just AI with a human voice, and that is all.” Student 18 further stated, “AI’s responses are emotionally engaging, but it does not have emotions and feelings because it is an AI, so I do not need to worry about its feelings when I swear at it, and it gives me some interesting responses, like apologising to me, which gives me a kind of inner satisfaction. If it were a person, firstly, I would not dare to insult them, and secondly, even if I did, I would feel very guilty afterwards. I anyway do not plan to maintain a long-term relationship with AI, so I was not worried that what I say will affect future interactions.”

Despite these limitations, some participants who perceived themselves as introverts who were nervous when interacting with people found AI particularly beneficial, although they also believed their interaction with AI tended to be superficial. Student 14 stated that “Communicating with AI would not restrict my thoughts and allow me to express myself more freely because there was no threat or pressure from interpersonal relationships, and I would not get nervous.” Meanwhile, Students 13 and 7 also noticed the limitations this relaxation might bring. Student 13 added, “Because it posed no threats to me, I found it easy to build trust with, even if it was superficial.” Student 7 also said, “The relaxation may come from the lack of face-to-face interpersonal communication, but if people go to see a human therapist and they pay for it, they may take the counselling sessions more serious, which would help them grow more.”

Student 9’s comment about the “superficial” nature of interactions with AI highlights a critical issue: the lack of a sustained, trust-based relationship when using AI as the primary therapist. While individuals normally build a relationship and develop trust with their therapists over time, this process was absent when they talked to an AI therapist. Without the time to build such a relationship, students described their relationship with the AI as remaining superficial, leaving little possibility of exploring deep-seated issues and diminishing the therapeutic effect. Humans struggle to establish trust in AI because of uncertainty about AI decision-making, which generates rules from data that are often unintelligible and fundamentally different from human logic; moreover, the lack of social presence reduces the sense of human-like interaction, leading to higher uncertainty and lower trust, as users cannot apply their usual understanding of human behaviour to predict the AI’s actions [28]. As Student 8 said, “Since we cannot see each other, there is no eye contact or body language between us. This made it hard for me to build a deep sense of trust.”

5.3. AI as an emotionally detached therapist

Although ChatGPT-4o had an authentic human voice, participants believed the AI was far less capable than human therapists. Many participants found that it lacked the ability to resonate emotionally on a deeper level. Student 4 said, “If I am sad due to a breakup and talk to a human therapist, he/she may resonate with me by sharing his/her personal story instead of giving me a to-do list like what AI did. It was hard to build a deep connection with AI.”

Additionally, some participants found the conversation robotic. Student 5 said, “I thought its responses are formatted. AI always starts by comforting and then begins to give suggestions. Sometimes it even interrupts me and starts giving suggestions while I’m pausing and thinking.” Regarding the quality of the suggestions, Student 8 said, “Initially, you may think it was useful, but a few hours later, nothing remains in your mind because AI did not convey genuine emotions to you.” Student 14 also said, “Its authentic voice is great, but it is not everything. Nowadays the GPS navigation systems also feature the voice of celebrities, but you still know that the celebrity is not actually guiding you in real-time.” Furthermore, this lack of empathy made it hard for students to engage with the AI. Student 11 said, “Sometimes during therapy, I would just suddenly lose interest in communicating with AI further, perhaps because of its robotic responses. If it were a human being, they could use body language to encourage me to express more, but AI cannot grasp my subtle emotional changes.” Student 7 mentioned, “Sometimes when the AI’s responses feel overwhelming, it makes me just want to run away. However, a therapist would not overwhelm clients with information all at once. Instead, they listen carefully and continuously encourage us to express ourselves.”

6. Discussion

This research identified students’ perceived strengths and limitations of ChatGPT-4o when serving as a primary therapist, as summarised in Table 4.

Table 4. Students’ perceived pros and cons of ChatGPT-4o as a therapist

| Advantages | Disadvantages |
|---|---|
| More natural and interactive | Interrupted flow and robotic feel |
| Anonymity and data security | Difficulty in building trust |
| Flexible and cost-effective | Decreased focus and less commitment |
| Knowledgeable advice | Overload of suggestions |
| Safe emotional outlet | Lack of emotional resonance |

ChatGPT-4o’s human-like voice made students feel the conversation was warm and personal; introverts in particular felt less stressed when talking to it. However, students were also aware that their relationship with the AI was superficial, whereas an in-depth, trusting relationship between patient and therapist is a strong indicator of effective counselling. Despite the emotion conveyed in the AI’s voice, the participating students found the AI therapist’s responses lacked emotional depth and empathy and sometimes felt robotic. This suggests that ChatGPT-4o still has major limitations when serving as a primary therapist.

This research complemented previous studies [11, 12, 29] in showing that the voice communication function helped the participants feel the interaction was more personal, engaging, and convenient. This was because voice communication, compared to text messaging, is a more natural and intuitive form of communication [30]. Additionally, the authentic human voice enhanced perceptions of social presence and made the participants feel more understood [31]. Participants’ feedback suggested that AI could serve as an initial step for those who have not tried counselling services, offering a preliminary experience of the therapeutic process. Moreover, the participants found that the AI provided them with useful advice, and they also appreciated the objectivity and non-judgemental attitude of the AI therapist. This unbiased stance may benefit the therapeutic process because being objective allows therapists to focus on clients’ needs without imposing their personal values [32].

Yet, the limitations of the AI were also noticeable. Some participants mentioned they tended to become bored during the session. According to immersion theory [33], two key factors helped to increase engagement: sensory inputs and presence. However, the absence of visual interaction and physical presence in ChatGPT-4o made the participants find it hard to concentrate. Furthermore, some participants also felt their relationship with their AI therapist to be interactive but superficial. While AI provided unbiased suggestions, as an algorithm it could not use countertransference as a therapeutic tool [34-36]. The participants also mentioned that the counselling environment tended to be too relaxed, which may cause them to take the counselling less seriously, potentially affecting the therapeutic impact.

Also, participants found it difficult to build a deep relationship with the AI therapist. Consistent with previous studies [37], people tend to trust humans more than robots. Trust in a therapist is essential for effective therapy. In many cases, the therapist-client relationship itself becomes a tool for growth, as clients learn how to develop trust, manage interpersonal dynamics, and navigate emotional intimacy. For some individuals, particularly those who struggle with trust or interpersonal relationships, the process of attending therapy and developing a bond with a human therapist is itself a significant therapeutic goal. AI, however, cannot offer this kind of relational depth. Its interactions, though helpful in the moment, tend to remain transactional and lack the emotional resonance that comes from a long-term, trustful relationship with a human therapist.

7. Implications

Participants’ feedback suggested that ChatGPT-4o could be used as a collaborative tool with human therapists. ChatGPT-4o has the potential to significantly impact the accessibility and reach of mental health support, particularly for individuals who may not have access to traditional therapy due to financial, logistical, or personal barriers. One of its most promising applications lies in providing affordable mental health support for those who cannot afford traditional therapy. Professional therapy can be prohibitively expensive for many, especially in regions where mental health services are not covered by insurance or public health systems. AI-driven tools like ChatGPT-4o offer a cost-effective alternative, allowing users to access mental health support without the high costs associated with one-on-one therapy sessions. By offering basic interventions and assessments, AI can provide an entry point for individuals seeking help but unable to commit to the financial burden of regular therapy.

In addition, ChatGPT-4o’s ability to support individuals with minor mental health issues is another significant advantage. Many people experience mild anxiety, stress, or other emotional challenges that may not require intensive psychotherapy but still benefit from some form of intervention. AI can offer personalized suggestions and strategies for managing day-to-day stressors, using evidence-based techniques such as CBT. This type of support can prevent minor issues from escalating into more serious mental health problems, while also empowering users to take control of their well-being.

Furthermore, ChatGPT-4o can be a unique resource for individuals who prefer non-face-to-face interactions. For those who find verbal expression challenging, the AI offers a platform where they can articulate their thoughts at their own pace, engaging in self-reflection without the pressure of immediate verbal responses. Moreover, voice-based communication fosters a sense of social presence more effectively than text-based chatbots [38], which may provide enhanced companionship and emotional validation. In addition, the digital mode of delivery is structured, flexible, and less costly than in-person psychological services, overcoming barriers such as distance or the imbalance between the current demand for treatment and the number of clinicians available to university students [16].

Finally, ChatGPT-4o can serve as a bridging step for those hesitant to open up to human therapists. Introverted users or individuals with social anxiety often struggle with face-to-face interactions, making AI a low-pressure introduction to mental health support. Over time, these users may gain a sense of what therapy entails, potentially making them more comfortable transitioning to human therapy if deeper emotional work becomes necessary.

8. Conclusion

This study investigated university students’ perceptions of the therapeutic potential of ChatGPT-4o through Cognitive Behavioural Therapy. It sought to address an existing gap by examining university students’ experience with ChatGPT-4o, a generative AI chatbot powered by a large language model with authentic human-voice capabilities [39]. This approach was intended to promote mental well-being and enhance personal growth among university students. Furthermore, the study could offer policymakers insights into the broader application of AI in psychotherapy. Compared to traditional therapy, which is often costly and constrained by fixed appointments and locations, AI therapy provides a more affordable and flexible option, with adaptable durations and locations for consultations.

Moreover, engaging with counselling service is not only about treating disorders but also plays a crucial role in personal development, such as deepening interpersonal relationships and fostering a greater passion for life [40]. This research provides university students with a platform to engage with AI-mediated therapy, potentially sparking interest and increasing awareness of mental healthcare, thereby facilitating their transition to professional counseling when needed. Additionally, the study addresses the prevalent issue of low treatment rates and inadequate mental health services among certain populations in China [41].

However, this study has limitations. Firstly, the sample comprised twenty-one university students, which may not adequately represent a diverse range of experiences. Secondly, the study’s brief duration may not suffice to fully observe the long-term effects of implementing AI in Cognitive Behavioural Therapy. Thirdly, the study lacks insight into human therapists’ perspectives on AI-assisted therapy, missing crucial arguments for how human therapists and AI can effectively collaborate. Fourthly, due to the flexible and informal nature of AI, participants might not have engaged with the therapy as seriously as they might have in a more formal setting, possibly diminishing the intervention’s effectiveness. Future research should address these limitations by incorporating a larger and more diverse participant sample and extending the research duration to several months to improve the generalizability of the findings. Additionally, future research could compare AI-based therapy with traditional human therapy, collecting insights from human therapists to explore potential collaborative roles and dynamics. Moreover, strategies to enhance user engagement and commitment in AI therapy, such as using goal-setting approaches, should be investigated to increase participant seriousness. Lastly, as the current study focused solely on voice-based interactions, future research could explore how incorporating visual elements might enhance engagement, empathy, and emotional connection in AI therapy.

What is more, the use of AI in therapy raises ethical concerns, particularly the risk of over-reliance and emotional attachment among vulnerable users, such as adolescents. Developers must implement safeguards to prevent harmful interactions and protect user privacy. By prioritising ethical standards, AI systems can better serve as supportive tools without compromising user well-being.

Author contributions

Conceptualisation, F.S. and X.G.; methodology, F.S.; formal analysis, F.S.; investigation, F.S.; data curation, F.S.; writing—original draft preparation, F.S.; writing—review and editing, X.G.; supervision, X.G. All authors have read and agreed to the published version of the manuscript.


References

[1]. Douwes, R., Metselaar, J., Pijnenborg, G. H. M., & Boonstra, N. (2023). Well-being of students in higher education: The importance of a student perspective. Cogent Education, 10(1), Article 2254708. https://doi.org/10.1080/23311908.23311908

[2]. Auerbach, R. P., Alonso, J., Axinn, W. G., Cuijpers, P., Ebert, D. D., Green, J. G., Hwang, I., Kessler, R. C., Liu, H., Mortier, P., Nock, M. K., Pinder-Amaker, S., Sampson, N. A., Aguilar-Gaxiola, S., Al-Hamzawi, A., Andrade, L. H., Benjet, C., Caldas-de-Almeida, J. M., Demyttenaere, K., Florescu, S., de Girolamo, G., Gureje, O., Haro, J. M., Karam, E. G., Kiejna, A., Kovess-Masfety, V., Lee, S., McGrath, J. J., O’Neill, S., Pennell, B.-E., Scott, K., ten Have, M., Torres, Y., Zaslavsky, A. M., Zarkov, Z., & Bruffaerts, R. (2016). Mental disorders among college students in the World Health Organization World Mental Health Surveys. Psychological Medicine, 46(14), 2955–2970. https://doi.org/10.1017/S0033291716000560

[3]. Slimmen, S., Timmermans, O., Mikolajczak-Degrauwe, K., & Oenema, A. (2022). How stress-related factors affect mental wellbeing of university students: A cross-sectional study to explore the associations between stressors, perceived stress, and mental wellbeing. PLOS ONE, 17(11), e0277319. https://doi.org/10.1371/journal.pone.0277319

[4]. Cuperfain, A. B., Hui, K., Berkhout, S. G., Foussias, G., Gratzer, D., Kidd, S. A., Kozloff, N., Kurdyak, P., Linaksita, B., Miranda, D., Soklaridis, S., Voineskos, A. N., & Zaheer, J. (2021). Patient, family and provider views of measurement-based care in an early-psychosis intervention programme. BJPsych Open, 7(5), e001063. https://doi.org/10.1192/bjpo-2021-001063

[5]. Zeb, S., FNU, N., Abbasi, N., & Fahad, M. (2024). AI in healthcare: Revolutionizing diagnosis and therapy. International Journal of Multidisciplinary Sciences and Arts, 3(3), 118–128. https://doi.org/10.xxxx/xxxx

[6]. Chen, J., Yuan, D., Dong, R., Cai, J., Ai, Z., & Zhou, S. (2024). Artificial intelligence significantly facilitates development in the mental health of college students: a bibliometric analysis. Frontiers in Psychology, 15, Article 1313447. https://doi.org/10.3389/fpsyg.2024.1313447

[7]. Fulmer, R. (2018). The evolution of the psychodynamic approach and system. International Journal of Psychological Studies, 10(3), 1–?. https://doi.org/10.xxxx/xxxx

[8]. D’Alfonso, S., Santesteban-Echarri, O., Rice, S., Wadley, G., Lederman, R., Miles, C., Gleeson, J., & Alvarez-Jimenez, M. (2017). Artificial Intelligence-Assisted Online Social Therapy for Youth Mental Health. Frontiers in Psychology, 8, Article 1435. https://doi.org/10.3389/fpsyg.2017.01435

[9]. OpenAI. (2024). GPT-4o mini: Advancing cost-efficient intelligence. Retrieved from https://openai.com/index/gpt-4o-mini-advancing-cost-efficient-intelligence/

[10]. Jiang, M., Zhao, Q., Li, J., Wang, F., He, T., Cheng, X., Yang, B. X., Ho, G. W. K., & Fu, G. (2024). A generic review of integrating artificial intelligence in cognitive behavioral therapy. arXiv. https://doi.org/10.48550/arXiv.2407.19422

[11]. Antunes, R. P., Sales, C. M. D., & Elliott, R. (2018). The clinical utility of the personal questionnaire (PQ): A mixed methods study. Counselling Psychology Quarterly, 33(1), 25–45. https://doi.org/10.1080/09515070.2019.1614747

[12]. Faija, C. L., Bee, P., Lovell, K., Lidbetter, N., Gellatly, J., Ardern, K., Rushton, K., Brooks, H., McMillan, D., Armitage, C. J., Woodhouse, R., & Barkham, M. (2022). Using routine outcome measures as clinical process tools: Maximising the therapeutic yield in the IAPT programme when working remotely. Psychology and Psychotherapy: Theory, Research and Practice, 95(3), 820–837. https://doi.org/10.1111/papt.12370

[13]. Holliday, B. S., Hepner, K. A., Farmer, C. M., Mahmud, A., Kimerling, R., Smith, B. N., & Rosen, C. (2020). Discussing measurement-based care with patients: An analysis of clinician-patient dyads. Psychotherapy Research, 31(2), 211–223. https://doi.org/10.1080/10503307.2019.1699741

[14]. Matanov, A., McNamee, P., Akther, S., Barber, N., & Bird, V. (2021). Acceptability of a technology-supported and solution-focused intervention (DIALOG+) for chronic depression: Views of service users and clinicians. BMC Psychiatry, 21(1), 1–12. https://doi.org/10.1186/s12888-021-03132-8

[15]. Skjuve, M., Følstad, A., & Brandtzæg, P. B. (2023). The user experience of ChatGPT: Findings from a questionnaire study of early users. In Proceedings of the 5th International Conference on Conversational User Interfaces (pp. 1–12). Eindhoven, Netherlands. https://doi.org/10.1145/xxxx

[16]. Landy, M. S. H., Newman, L., Carney, A. E., Donkin, V., Nicholls, J., Krol, S. A., & Farvolden, P. (2023). Therapist-Assisted Internet-Delivered Cognitive Behavioral Therapy for Insomnia: A Case Report. Clinical Case Studies, 22(4), 383–402. https://doi.org/10.1177/15346501221145944

[17]. Puente, A. N., & Mitchell, J. T. (2015). Cognitive-Behavioral Therapy for Adult ADHD. Clinical Case Studies, 15(3), 198–211. https://doi.org/10.1177/1534650115614098

[18]. Schapman-Williams, A. M., & Lock, J. (2007). Using Cognitive-Behavioral Therapy to Treat Adolescent-Onset Bulimia Nervosa: A Case Study. Clinical Case Studies, 6(6), 508–524. https://doi.org/10.1177/1534650107296822

[19]. Furukawa, T. A., Iwata, S., Horikoshi, M., Sakata, M., Toyomoto, R., Luo, Y., Tajika, A., Kudo, N., & Aramaki, E. (2023). Harnessing AI to optimize thought records and facilitate cognitive restructuring in smartphone CBT: An exploratory study. Cognitive Therapy and Research, 47, 887–893. https://doi.org/10.1007/s10608-023-10300-4

[20]. Hofmann, S. G., & Asmundson, G. J. G. (2017). The science of cognitive behavioral therapy. Elsevier.

[21]. Thieme, A., Hanratty, M., Lyons, M., Palacios, J. E., Marques, R., Morrison, C., & Doherty, G. (2023). Designing human-centered AI for mental health: Developing clinically relevant applications for online CBT treatment. ACM Transactions on Computer-Human Interaction, 30(2), Article 16. https://doi.org/10.1145/3591016

[22]. Williams, M. D., Rana, N. P., & Dwivedi, Y. K. (2015). The unified theory of acceptance and use of technology (UTAUT): A literature review. Journal of Enterprise Information Management, 28(3), 443–488. https://doi.org/10.1108/JEIM-08-2014-0074

[23]. Simmons, J., & Griffiths, R. (2018). CBT for Beginners (3rd ed.). Sage Publications Ltd.

[24]. Beck, J. S. (2011). Cognitive behavior therapy: Basics and beyond (2nd ed.). The Guilford Press.

[25]. Newman, C. F. (2010). Competency in conducting cognitive–behavioral therapy: Foundational, functional, and supervisory aspects. Psychotherapy: Theory, Research, Practice, Training, 47(1), 12–19. https://doi.org/10.1037/a0018823

[26]. Polit, D. F., & Beck, C. T. (2012). Nursing research: Generating and assessing evidence for nursing practice (11th ed.). Lippincott Williams & Wilkins.

[27]. Denzin, N. K., & Lincoln, Y. S. (2000). Handbook of qualitative research. Sage Publications Ltd.

[28]. Liu, B. (2021). OUP accepted manuscript. Journal of Computer-Mediated Communication, 26(6), 384–402. https://doi.org/10.1093/jcmc/zqaa037

[29]. Jeffrey, D. (2017). Communicating with a human voice: Developing a relational model of empathy. Journal of the Royal College of Physicians of Edinburgh, 47(3), 199–203. https://doi.org/10.1017/S1473325017000417

[30]. Seaborn, K., Miyake, N. P., Pennefather, P., & Otake-Matsuura, M. (2021). Voice in Human–Agent Interaction. ACM Computing Surveys, 54(4), Article 82. https://doi.org/10.1145/3436391

[31]. Park, H., & Cameron, G. T. (2014). Keeping it real: Exploring the roles of conversational human voice and source credibility in crisis communication via blogs. Journalism & Mass Communication Quarterly, 91(3), 487–501. https://doi.org/10.1177/1077699014531403

[32]. Metz, T. (2016). The proper aim of therapy: Subjective well-being, objective goodness, or a meaningful life? In Clinical Perspectives on Meaning (pp. 17–35). Springer. https://doi.org/10.1007/978-3-319-32586-6_2

[33]. Weiss, M. (2011). Erratum to the article Embeddings from the point of view of immersion theory: Part I. Geometry & Topology, 15(1), 407–409. https://doi.org/10.2140/gt.2011.15.407

[34]. Heaning, E. (2023). Countertransference in therapy. Simply Psychology. Retrieved from https://www.simplypsychology.org/countertransference.html (Accessed September 11, 2024)

[35]. Lisman-Pieczanski, N., & Pieczanski, A. (2015). The pioneers of psychoanalysis in South America. Routledge.

[36]. Prasko, J., Diveky, T., Grambal, A., Kamaradova, D., Mozny, P., Sigmundova, Z., Slepecky, M., & Vyskocilova, J. (2010). Transference and countertransference in cognitive behavioral therapy. Biomedical Papers, 154(3), 189–197.

[37]. Montag, C., Becker, B., & Li, B. J. (2024). On trust in humans and trust in artificial intelligence: A study with samples from Singapore and Germany extending recent research. Computers in Human Behavior: Artificial Humans, 2(2), 100070. https://doi.org/10.1016/j.chai.2024.100070

[38]. Vaitl, D., Birbaumer, N., Gruzelier, J., Jamieson, G. A., Kotchoubey, B., Kübler, A., Lehmann, D., Miltner, W. H. R., Ott, U., Pütz, P., Sammer, G., Strauch, I., Strehl, U., Wackermann, J., & Weiss, T. (2005). Psychobiology of altered states of consciousness. Psychological Bulletin, 131(1), 98–127. https://doi.org/10.1037/0033-2909.131.1.98

[39]. Pang, S., Nol, E., & Heng, K. (2024). ChatGPT-4o for English language teaching and learning: Features, applications, and future prospects. Social Science Research Network. https://doi.org/10.2139/ssrn.4837988

[40]. Lees, J. (Ed.). (1999). Clinical Counselling in Primary Care (1st ed.). Routledge. https://doi.org/10.4324/9781315824611

[41]. Kang, C., & Yang, J. (2022). Prevalence of mental disorders in China. The Lancet Psychiatry, 9(1), 13–14. https://doi.org/10.1016/S2215-0366(21)00389-8


Cite this article

Sun, F., & Guo, X. (2025). Exploring the use of ChatGPT-4o in Cognitive Behavioural Therapy for university students: enhancing mental health with AI-powered voice interaction. Advances in Social Behavior Research, 16(3), 99-107.

Data availability

The datasets used and/or analyzed during the current study will be available from the authors upon reasonable request.

Disclaimer/Publisher's Note

The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of EWA Publishing and/or the editor(s). EWA Publishing and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.

About volume

Journal: Advances in Social Behavior Research

Volume number: Vol. 16
Issue number: Issue 3
ISSN: 2753-7102 (Print) / 2753-7110 (Online)

© 2024 by the author(s). Licensee EWA Publishing, Oxford, UK. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license. Authors who publish this series agree to the following terms:
1. Authors retain copyright and grant the series right of first publication with the work simultaneously licensed under a Creative Commons Attribution License that allows others to share the work with an acknowledgment of the work's authorship and initial publication in this series.
2. Authors are able to enter into separate, additional contractual arrangements for the non-exclusive distribution of the series's published version of the work (e.g., post it to an institutional repository or publish it in a book), with an acknowledgment of its initial publication in this series.
3. Authors are permitted and encouraged to post their work online (e.g., in institutional repositories or on their website) prior to and during the submission process, as it can lead to productive exchanges, as well as earlier and greater citation of published work (See Open access policy for details).
