1. Introduction
In the age of digital and social media, personalized recommendation algorithms have become the driving force behind how users encounter and respond to content, particularly on short video platforms such as Douyin and TikTok. Powered by artificial intelligence and machine learning, these algorithms recommend content tailored to each user in real time, making platforms highly interactive and, for many users, addictive. Although personalization strengthens user experience and drives platform growth, it also raises critical questions about content diversity, data privacy, and algorithmic fairness.
This study examines how personalized recommendation algorithms on short video platforms shape users' behavior, experience, and perceptions. Combining a literature review with original data from a user survey, the paper addresses the following questions: How do these algorithms shape user engagement and screen time? To what degree do they sway opinions and restrict access to alternative perspectives? And how knowledgeable and capable are users in controlling their algorithmic experience?
Through this analysis, the paper contributes to wider debates about the social responsibility of platform design, the ethics of algorithmic decision-making, and the balance between personalization and the public good in digital media ecosystems.
2. Literature review
The prevalence of personalized recommendation algorithms has transformed how content is consumed on short video platforms. These systems rely on artificial intelligence and deep learning to curate content that closely matches the interests of individual users, maximizing engagement and platform retention [1]. Two principal algorithmic strategies dominate the landscape: collaborative filtering, which recommends content based on similarity among users, and content-based filtering, which matches recommendations to item attributes. Most applications employ hybrid models that mitigate weaknesses such as the cold-start problem, in which the limited data available about new users undermines recommendation accuracy [2]. Guy et al. further show how people and tags can shape recommendation systems [3]. Their research underscores the value of exploiting social networks and tagging systems to improve recommendation accuracy by introducing social dynamics and user-provided metadata into algorithmic systems. This strategy deepens personalization because recommendations draw not only on individual user behavior but also on the collective signals embedded in social interactions.
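To make the three strategies above concrete, the following is a minimal, illustrative sketch (not drawn from the paper or from any cited platform's actual system) of a toy hybrid recommender: it blends user-user collaborative filtering over a ratings matrix with content-based tag overlap, and falls back to the content signal alone when a user has too little history (the cold-start case). All names, data shapes, and the 0.7/0.3 blend weights are invented for illustration.

```python
# Toy hybrid recommender: collaborative + content-based filtering,
# with a content-only fallback for cold-start users.
from math import sqrt

def cosine(u, v):
    """Cosine similarity between two equal-length rating vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = sqrt(sum(a * a for a in u))
    nv = sqrt(sum(b * b for b in v))
    return dot / (nu * nv) if nu and nv else 0.0

def hybrid_scores(ratings, item_tags, user, liked_tags, min_history=2):
    """Score unseen items for `user`.

    ratings:   {user: {item: rating}}
    item_tags: {item: set of tags}
    liked_tags: tags the user has expressed interest in
    """
    seen = set(ratings.get(user, {}))
    items = set(item_tags)
    # Content-based score: fraction of an item's tags matching the
    # user's stated interests.
    content = {i: len(item_tags[i] & liked_tags) / max(len(item_tags[i]), 1)
               for i in items - seen}
    if len(seen) < min_history:        # cold start: no reliable history,
        return content                 # so rely on item attributes alone
    # Collaborative score: other users' ratings, weighted by how
    # similarly they rated the items this user has already seen.
    collab = {}
    for i in items - seen:
        num = den = 0.0
        for other, r in ratings.items():
            if other == user or i not in r:
                continue
            common = seen & set(r)
            sim = cosine([ratings[user][c] for c in common],
                         [r[c] for c in common]) if common else 0.0
            num += sim * r[i]
            den += abs(sim)
        collab[i] = num / den if den else 0.0
    # Hybrid: a simple weighted blend of the two signals.
    return {i: 0.7 * collab[i] + 0.3 * content[i] for i in content}
```

For a user with history, the collaborative term dominates; for a brand-new user it is skipped entirely, which is exactly the weakness hybrid models are meant to paper over.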
These algorithmic methods greatly increase screen time and user satisfaction. Platforms curate content with the highest probability of engagement, with real-time feedback loops steering delivery toward material users are likely to interact with [4]. This hyper-personalization, however, has disadvantages. Researchers have coined the term "rabbit hole" effect to describe how users are exposed to the same type of content repeatedly, with little exposure to outside sources or perspectives [5]. This trend feeds the filter bubble effect, which cements users' pre-existing beliefs and personal preferences and can create digital echo chambers [6].
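The feedback-loop dynamic above can be simulated in a few lines. The sketch below is an assumption-laden toy model (not any real platform's algorithm): an epsilon-greedy loop over content categories, where each "watch" updates the estimated engagement of the served category. Because the system mostly exploits whichever category currently looks best, early engagement compounds and the simulated feed narrows toward one category, which is the rabbit-hole dynamic in miniature. The category names and engagement probabilities are invented.

```python
# Toy epsilon-greedy feedback loop over content categories, illustrating
# how exploitation of engagement signals narrows a feed over time.
import random

def simulate_feed(true_engagement, rounds=1000, eps=0.05, seed=0):
    """true_engagement: {category: probability a serving gets engagement}."""
    rng = random.Random(seed)
    cats = list(true_engagement)
    est = {c: 0.0 for c in cats}    # running mean engagement per category
    n = {c: 0 for c in cats}
    served = {c: 0 for c in cats}
    for _ in range(rounds):
        if rng.random() < eps:      # explore: occasionally serve diverse content
            c = rng.choice(cats)
        else:                       # exploit: serve the best-looking category
            c = max(cats, key=lambda k: est[k])
        reward = 1.0 if rng.random() < true_engagement[c] else 0.0
        n[c] += 1
        est[c] += (reward - est[c]) / n[c]   # incremental mean update
        served[c] += 1
    return served

served = simulate_feed({"dance": 0.6, "news": 0.5, "cooking": 0.4})
# With low exploration, the highest-engagement category comes to dominate
# the simulated feed, crowding out the other two.
```

Raising `eps` (more forced exploration) is the simulation's analogue of the diversity-promoting interventions discussed later in the literature review.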
There are also concerns regarding algorithmic bias. Platform recommendation systems have been shown to privilege dominant narratives over niche or under-represented material. Peralta et al. found in 2021 that biased filtering can distort public discourse and feed polarization in online spaces [7]. Shin et al. in 2022 described the threat of disinformation spreading through biased algorithmic amplification, endangering democratic discourse and critical thought [8].
Fairness and privacy are also pressing ethical issues. Short video platforms collect enormous amounts of user data, including behavioral signals, location, and device metadata, to sharpen algorithmic predictions. Although this makes content targeting more effective, it raises serious data privacy concerns [1]. At the same time, the playing field is uneven for creators: algorithms favor viral or already-popular content, making it difficult for new or niche creators to find an audience [9]. To counter this, researchers propose explainable AI and reinforcement learning models that enhance content diversity and give users more control over what they view [2] [4].
In short, the literature highlights an intricate duality: recommendation algorithms make content delivery more efficient and engaging, yet they also raise substantive concerns about fairness, bias, content diversity, and user autonomy. These observations ground this paper's examination of how users respond to and experience these systems.
3. Research questions
This research explores the effect of personalized recommendation algorithms on users' experience, opinion formation, and perceptions of fairness and privacy on short video platforms. Informed by the literature and survey findings, the study addresses the following questions:
1. In what ways do personalized recommendation systems influence user engagement and screen time on short video platforms?
2. To what extent do these algorithms influence users' perceptions, opinions, and content diversity?
3. To what extent are users knowledgeable about algorithmic systems, and what are their concerns about privacy, fairness, and control?
4. Methodology
To investigate these questions, this study adopted a mixed-method strategy incorporating secondary and primary data. The secondary data were drawn from a comprehensive literature review of academic sources on the architecture and ramifications of personalized recommendation systems, including work on algorithmic design and fairness [1] [2], user engagement and filter bubbles [5] [6], and ethical issues such as privacy, fairness, and transparency [7] [9].
For the primary data, we distributed a structured online survey via the Wenjuanxing platform, yielding 70 usable responses. The survey targeted users of short video platforms, especially Douyin and comparable apps, with questions on usage behavior, knowledge of recommendation systems, satisfaction with personalized content, and privacy concerns. The sample skewed toward young adults aged 18–24 (58.57%) and was majority female (55.71%). Most respondents held or were pursuing university-level education (60%), and 45.71% were students.
The survey combined Likert-scale and multiple-choice items, covering time spent on short video platforms, views about content variety, the influence of recommendations on opinions or purchases, and privacy concerns. It also asked whether respondents had tried to adjust their content recommendations and whether they perceived those efforts as effective.
Combining the literature with new survey findings offers a holistic perspective on user experience and the social issues surrounding personalization.
5. Findings and discussion
This research draws on an analysis of the 70 survey responses together with the extant literature to examine how personalized recommendation systems influence the behavior and attitudes of short video platform users. Dominant themes in the data include increased screen time, a narrow content range, and concerns about privacy, control, and algorithmic bias. These findings are consistent with issues highlighted in the academic literature, forming a coherent narrative around the algorithm-user relationship.
5.1. Personalized approaches raise screen time
The survey found that 71.43% of respondents agreed or strongly agreed that the recommendation systems of short video platforms made them spend more time on the app than they originally intended. This supports Zhang's contention that algorithmic feedback loops enhance retention by maximizing engagement [1], and echoes Cosse's "rabbit hole effect," in which users are fed the same type of content repeatedly and therefore use the app for longer [5]. The survey data thus suggest that while personalization captures attention, it can also encourage compulsive behavior.
5.2. Content diversity is a widespread concern
More than half of users (58.57%) reported that the content recommended to them was too repetitive, and 61.43% agreed that algorithms restrict their exposure to varied content. This is consistent with the filter bubble and echo chamber literature [6] [7], which cautions that recommendation algorithms tend to reinforce users' existing preferences, limiting exposure to novel ideas or points of view.
5.3. Strong algorithmic influence is perceived by users
87.15% of users indicated that suggested content had influenced their opinions or consumption habits. This reinforces the persuasive power of personalized systems, as noted by Kim [6], and raises questions about the ethics of content manipulation and opinion formation.
5.4. Mixed awareness and limited user control
While over 50% of users claimed they understood how recommendation systems work, only 40% reported ever adjusting their algorithm settings. Moreover, 56.67% said they were unsure whether those adjustments were even effective. This reflects concerns about algorithmic opacity [9] [4] and the need for explainable AI to improve user trust and agency.
5.5. Privacy and data use are key concerns
77.14% of respondents expressed concerns about data privacy. Zhang and Guo et al. caution that large-scale data gathering, though useful for improving recommendations, can erode user trust, particularly when transparency is lacking [1] [2].
5.6. Algorithmic fairness and creator inequality
Several users complained that music already popular with large fan bases takes precedence over work by lesser-known artists. The literature confirms that algorithmic amplification rewards viral trends at the expense of lesser-known voices [9], underscoring the need for fairness-oriented algorithm design.
6. Conclusion
This study set out to examine the effect of personalized recommendation algorithms on users' experience of short video platforms and on societal perceptions of these platforms. The research finds a two-sided narrative: these systems do enhance engagement and content relevance, yet they also introduce problems of content diversity, algorithmic bias, and user autonomy.
Most respondents reported higher screen time and acknowledged the repetitiveness of suggested content, findings consistent with the "rabbit hole" and "filter bubble" effects debated in the recent literature. Even though more than half of the participants said they were familiar with how the recommendation algorithm works, many were skeptical of their capacity to regulate or reshape it effectively, echoing concerns about transparency and control over technology. Privacy likewise proved to be an overriding concern, with more than 77% of users reporting unease about data-gathering practices, highlighting the ethical dimension of personalization in online environments.
Nonetheless, this study has several limitations. First, the sample size is relatively small and demographically skewed toward younger users, limiting generalizability. Second, the study relies on self-reported data, which may not fully capture actual user behavior. Future research could incorporate platform-based behavioral data and qualitative interviews for deeper insight.
In summary, while personalized recommendation algorithms now shape much of users' experience on short video platforms, future development should prioritize transparency, fairness, and content diversity in order to create an ethical and empowering digital media space.
References
[1]. Zhang, R. (2024). The investigation and discussion related to recommendation systems in video social platforms. Advances in Computer Science Research, 573–579. https://doi.org/10.2991/978-94-6463-540-9_57
[2]. Guo, Y., Wang, M., & Li, X. (2017). An interactive personalized recommendation system using the hybrid algorithm model. Symmetry, 9(10), 216. https://doi.org/10.3390/sym9100216
[3]. Guy, I., Zwerdling, N., Ronen, I., Carmel, D., & Uziel, E. (2010). Social media recommendation based on people and tags. Proceedings of the 33rd International ACM SIGIR Conference on Research and Development in Information Retrieval. https://doi.org/10.1145/1835449.1835484
[4]. Anandhan, A., Shuib, L., Ismail, M. A., & Mujtaba, G. (2018). Social Media Recommender Systems: Review and open research issues. IEEE Access, 6, 15608–15628. https://doi.org/10.1109/access.2018.2810062
[5]. Cosse, C. (2024). Recommendation systems of short video platforms: Auditing algorithms of short format video platforms to understand the rabbit hole effect on YouTube Shorts. Master's thesis, Delft University of Technology.
[6]. Kim, S. A. (2017). Social media algorithms: Why you see what you see. Georgetown Law Technology Review, 2(1), 147–154. https://heinonline.org/HOL/Page?handle=hein.journals/gtltr2&div=12&g_sent=1&casa_token=XITJcllkg7cAAAAA:1GHsoep1i2TB1UAwO25uNkId3D20cpe4gJKjLzaNem0irHNMfhi_LXkh4vNBPUrTemFVbU4joxU&collection=journals
[7]. Peralta, A. F., Kertész, J., & Iñiguez, G. (2021). Opinion formation on social networks with algorithmic bias: Dynamics and bias imbalance. Journal of Physics: Complexity, 2(4), 045009. https://doi.org/10.1088/2632-072x/ac340f
[8]. Shin, D., Hameleers, M., Park, Y. J., Kim, J. N., Trielli, D., Diakopoulos, N., Helberger, N., Lewis, S. C., Westlund, O., & Baumann, S. (2022). Countering algorithmic bias and disinformation and effectively harnessing the power of AI in media. Journalism & Mass Communication Quarterly, 99(4), 887–907. https://doi.org/10.1177/10776990221129245
[9]. Figueiredo, C., & Bolaño, C. (2017). Social Media and algorithms: Configurations of the lifeworld colonization by new media. The International Review of Information Ethics, 26. https://doi.org/10.29173/irie277
Cite this article
Yin, S. (2025). Personalized Recommendation Algorithms on Short Video Platforms: User Experience, Ethical Concerns, and Social Impact. Lecture Notes in Education Psychology and Public Media, 90, 89–93.
Data availability
The datasets used and/or analyzed during the current study will be available from the authors upon reasonable request.
Disclaimer/Publisher's Note
The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of EWA Publishing and/or the editor(s). EWA Publishing and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
About volume
Volume title: Proceedings of the 3rd International Conference on Global Politics and Socio-Humanities
© 2024 by the author(s). Licensee EWA Publishing, Oxford, UK. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license. Authors who publish this series agree to the following terms:
1. Authors retain copyright and grant the series right of first publication with the work simultaneously licensed under a Creative Commons Attribution License that allows others to share the work with an acknowledgment of the work's authorship and initial publication in this series.
2. Authors are able to enter into separate, additional contractual arrangements for the non-exclusive distribution of the series's published version of the work (e.g., post it to an institutional repository or publish it in a book), with an acknowledgment of its initial publication in this series.
3. Authors are permitted and encouraged to post their work online (e.g., in institutional repositories or on their website) prior to and during the submission process, as it can lead to productive exchanges, as well as earlier and greater citation of published work (See Open access policy for details).