Volume 117
Published in September 2025
Volume title: Proceedings of the 6th International Conference on Educational Innovation and Psychological Insights
Music, a universal language understood across the world, has a powerful capacity to communicate the rich feelings and emotions of human beings. Musical melody stimulates physiological reactions, such as an accelerated heartbeat or relaxed muscles, which in turn shape emotional states. This paper therefore focuses on the influence of music on emotions, examining the question from several important angles. Drawing on related research in music and neuroscience, it reviews the process through which music influences emotions and provides an exploratory analysis of music's application in therapy. A deeper understanding of these factors is of considerable significance: it helps people use music skillfully as an effective tool to improve their emotional state and mental health, both in work settings and in daily life.

Deep-synthesis technology, an artificial intelligence technique used primarily to generate audiovisual content, has been misused in ways that systematically erode the trust ecosystem of social media. This study innovatively unpacks social media trust into three dimensions (platform trust, information trust, and user trust) and, using a survey-experiment design (n = 522), examines the differentiated erosive effects of deep-synthesis content on social media platforms. Taking Douyin (the Chinese version of TikTok) as the context, we employ five categories of deep-synthesis videos as stimuli and control for topic-related confounds. The findings show that deep-synthesis content exerts a negative impact on social media trust: information trust is the most severely damaged, followed by platform trust, while user (interpersonal) trust exhibits a lagging and comparatively weak effect. These results reveal the differentiated pathways through which technological alienation disrupts trust mechanisms and provide a theoretical basis for platform governance and user-level cognitive interventions.

With the continuing rise in psychological stress among young people and the widespread adoption of artificial intelligence technologies, chatbots have gradually become a new channel through which youth express emotions and seek emotional support. Drawing on the ABC attitude model, this study applies grounded theory to semi-structured interview data from 17 young users aged 15–34, exploring the structure of their attitudes and their behavioral responses when using chatbots for emotional support. The findings show that young users exhibit both instrumental endorsement of and emotional dependence on chatbots, while remaining vigilant and skeptical about the chatbots' empathic capacity, response quality, and privacy and security, forming a contradictory attitude in which "dependence and vigilance" coexist. On this basis, users develop behavioral regulation strategies such as setting boundaries and controlling usage frequency. The article constructs an interactive mechanism of "cognition–ambivalent attitude–behavioral intention," revealing a pattern of reflexive dependence in youths' practices of digital emotional support and providing theoretical references and practical implications for the design of AI affective products and for youth mental-health services.