From Algorithmic Gazing to Cognitive Closure: The Invisible Discipline of the Information Cocoon on Human Cognitive Constructs

Research Article
Open access

Yuyue Xie 1*
  • 1 Macau University of Science and Technology    
  • *corresponding author 1220020688@student.must.edu.mo
Published on 19 August 2025 | https://doi.org/10.54254/2753-7048/2025.NE26084
LNEP Vol.116
ISSN (Print): 2753-7048
ISSN (Online): 2753-7056
ISBN (Print): 978-1-80590-331-4
ISBN (Online): 978-1-80590-332-1

Abstract

Algorithmic technology has fully penetrated everyday life, yet academic perceptions of algorithmic threats remain relatively optimistic: research has yet to establish exactly how the information cocoon constructed by algorithms exerts its irreversible influence on the public, or where the public's understanding of algorithms falls short. Through an in-depth analysis of algorithmic technology, the information cocoon, filter bubbles, cognitive construction, and algorithmic resistance behavior, this paper explores the deep logic behind the information cocoon's operation. Starting from the commercial nature of algorithms, it examines how the information cocoon disciplines human cognitive construction; explains how content filtering and collaborative filtering jointly build a personalized filtering system; analyzes the three major paradoxes of the "closed-loop" mechanism between algorithms and users; discusses the adverse effects of the cocoon effect on the construction of human cognition; and considers how individuals can resist it. Logically, algorithms cannot achieve invisible regulation without a push from humans themselves.

Keywords:

Algorithmic Technology, Information Cocoon, Filtering Bubbles, Cognitive Construction, Algorithmic Resistance Behavior

1.  Introduction

The 21st century is an era inseparable from data and algorithms. With the continuing development of mobile data and network technology, new media platforms have become an important arena for contemporary young people's daily life and socialization, and even for the construction of their worldviews and values. Through precise calculation and targeted pushes, platforms quietly create an invisible "algorithmic gaze" that the individual's sense of self, identity, and values constantly and subconsciously cater to and obey. Algorithms carefully shape a unique "information cage" for each individual, known as the "information cocoon". In the past, the vast majority of individuals were trapped in this prison without realizing it, allowing themselves to be shaped by algorithmically filtered values and leaving the current generation facing information overload and increasingly extreme views; today, more and more people are defying the algorithm's discipline, recognizing the trap of the information cocoon and trying to rebel against it. Existing research on the information cocoon effect focuses on users' "algorithmic resistance behavior", but pays less attention to the chain of changes the effect brings about at both the personal and the social level, as well as to the intrinsic psychological contributors that make the phenomenon possible. The purpose of this paper is to fill this gap in the academic discussion of the driving forces behind the formation of the information cocoon, to help individuals enrich the cognitive basis of their understanding of the field, to deepen understanding of its operating mechanism and behavioral motivation, and to remain alert to its harms. The paper mainly adopts the literature research method to explore the inner mechanism of algorithmic recommendation systems and the operating logic of two different filtering modes. It clarifies the impact of the information cocoon on the cognitive construction of individuals and of society as a whole, and criticizes and warns against the harm it may cause.

2.  The perplexing veil of information cocooning under algorithmic construction

2.1.  The manipulation of homogeneity behind the mask of commercialization

In the academic community, "algorithmic technology" is broadly defined as a series of steps taken to solve a particular problem or achieve a particular result. However, the term has taken on greater and more critical significance in the ever-changing digital era. According to the "Statistical Report on the Status of China's Internet Development" released by the China Internet Network Information Center on January 17, 2025, China had 1.108 billion Internet users as of December 2024, with an Internet penetration rate of 78.6% [1]. In other words, few people in China today live without access to the Internet, and algorithms have long since permeated every aspect of an individual's social life.

The initial purpose for which Internet platforms built algorithms was to use technical means to carry forward the commercial value of traditional media. Historically, from the end of the 1990s to the beginning of the 2000s, traditional media such as print faced a huge industry crisis: users' access to information was quietly shifting from "passive reception" to "active search", and traditional media could no longer satisfy users' appetite for information. Coinciding with the advent of the Internet era, the era of algorithmically driven media integration thus unfolded.

Nowadays, the Internet's popularity in society keeps growing, and the scope and frequency of its use continue to expand. Words like "convenient", "easy to use", and "fast" have long been the eye-catching selling points of Internet platforms. As time passes, the initial role and purpose of algorithms and platforms has been lost on a portion of users, and the Internet is increasingly seen as an irreplaceable tool for quickly improving oneself, absorbing information, and narrowing the knowledge gap. While algorithms and platforms do bring many positive benefits, forgetting their origins harbors great danger. The homogenized information cocoon constructed by algorithms in the age of big data is the algorithm's punishment for those who have lost their way.

2.2.  The critical path by which algorithms realize the information cocoon

From information homogenization to information personalization, the channels through which algorithms build cocoons have become increasingly diverse. In 2011, the American scholar Eli Pariser first proposed the "information finds me" thesis in his book The Filter Bubble: What the Internet Is Hiding from You [2]. The theory reveals how algorithmic recommendation technically reshapes the way people access information; the dominant reshaping technique is commonly referred to in today's academia as the "filter bubble", i.e., during information seeking, people tend to filter out the parts of the available information that do not fit their existing beliefs or opinions [3]. Research on the personalized recommendation techniques behind the filter bubble usually divides them into two major branches: content filtering, whose core recommendation logic is the user's own behavior, and collaborative filtering, which infers recommendations from the similarity of user clusters.

Siti Hashim et al. describe the content filtering mechanism as a recommendation algorithm that analyzes user activity and profile data to provide personalized recommendations for content matching a user's interests and preferences [4]. That is, based on the content characteristics of items and the user's interest preferences, the system analyzes items' descriptive information (e.g., text, labels, categories) together with the user's historical behavior to build a user preference model, and then recommends items similar to the user's preferences. The "Allow app tracking" prompt that pops up on first launch of most apps is one typical manifestation of content filtering in daily life. Shopping platforms in particular use content filtering as their dominant algorithm: when a user has browsed a commodity on a platform such as Douyin and then leaves, notifications about discounts or lower prices for the same commodity immediately pop up on other platforms (e.g., Taobao, Jingdong, Pinduoduo). By exploiting real-time information and lower prices, these platforms skillfully draw users away from their competitors, complete the transaction, and achieve their business objectives while expanding their own competitiveness.
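To make the mechanism concrete, the following is a minimal Python sketch of content-based filtering under the assumptions described above: items carry tag vectors, a user profile is averaged from the interaction history, and unseen items are ranked by cosine similarity. The catalogue, tag scheme, and function names are hypothetical illustrations, not any platform's actual implementation.

```python
import numpy as np

# Hypothetical item catalogue: each item is a binary vector over tags.
TAGS = ["discount", "electronics", "books", "math", "music"]
ITEMS = {
    "calculator":  np.array([1, 1, 0, 1, 0], dtype=float),
    "abacus_book": np.array([0, 0, 1, 1, 0], dtype=float),
    "headphones":  np.array([1, 1, 0, 0, 1], dtype=float),
    "novel":       np.array([0, 0, 1, 0, 0], dtype=float),
}

def cosine(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity; returns 0.0 when either vector is all zeros."""
    norm = np.linalg.norm(a) * np.linalg.norm(b)
    return float(a @ b) / norm if norm else 0.0

def recommend(history: list[str], k: int = 2) -> list[str]:
    """Average the user's history into a profile, rank unseen items by similarity."""
    profile = np.mean([ITEMS[name] for name in history], axis=0)
    candidates = [n for n in ITEMS if n not in history]
    return sorted(candidates, key=lambda n: cosine(profile, ITEMS[n]), reverse=True)[:k]

# A user who bought the abacus book is shown items sharing its tags first.
print(recommend(["abacus_book"]))  # -> ['novel', 'calculator']
```

The key point the sketch illustrates is that the system never needs other users: everything is derived from the item features and this one user's history, which is exactly why content filtering tends to reproduce existing interests.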

Unlike content filtering, the collaborative filtering mechanism operates mainly on memory-based and model-based techniques [5]: it pushes content through the behavior of "similar users", without in-depth analysis of item content or feature extraction; its core is to capture the similarity of user clusters. Collaborative filtering divides into two main types: "user-user collaborative filtering" and "item-item collaborative filtering". Music software is a typical case of user-user collaborative filtering, relying on precise calculation of similarity between users. Mainstream music apps such as NetEase Cloud Music and QQ Music usually recommend songs through a "guess what you like" feature: the platform collects, organizes, and summarizes data from a huge number of users and finds, for example, that most users who like R&B also like rap, and that such regularities hold for most groups. Hence the phenomenon that "the algorithm prioritizes rap songs on the recommendation page for users who listen to a lot of R&B". Typical item-item collaborative filtering is also very common: after a user happens to buy the book "Bead Math Skills" on a shopping platform, similar products such as "math learning tools" or dot cards (a classic American number game) keep appearing in the recommendation column on the platform's home page, a result reached through similarity calculations between items.
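The item-item variant can likewise be sketched in a few lines. The toy interaction matrix below is hypothetical (echoing the bead-math example above); real systems of the kind surveyed in [5] operate at vastly larger scale, but the core computation, similarity between item columns of a user-item matrix, is the same in spirit.

```python
import numpy as np

# Toy user-item interaction matrix (rows: users, columns: items, 1 = interacted).
ITEMS = ["bead_math_book", "math_tools", "dot_cards", "novel"]
R = np.array([
    [1, 1, 1, 0],   # user 0: math book, tools, dot cards
    [1, 1, 0, 0],   # user 1: math book, tools
    [0, 0, 0, 1],   # user 2: only the novel
    [1, 0, 1, 0],   # user 3: math book, dot cards
], dtype=float)

def item_similarity(r: np.ndarray) -> np.ndarray:
    """Cosine similarity between item columns of the interaction matrix."""
    norms = np.linalg.norm(r, axis=0)
    norms[norms == 0] = 1.0                  # guard against division by zero
    return (r.T @ r) / np.outer(norms, norms)

def recommend_for(user: int, k: int = 1) -> list[str]:
    """Score unseen items by their summed similarity to the user's items."""
    sim = item_similarity(R)
    seen = R[user] > 0
    scores = sim[:, seen].sum(axis=1)        # aggregate over items the user has
    scores[seen] = -np.inf                   # never re-recommend owned items
    top = np.argsort(scores)[::-1][:k]
    return [ITEMS[i] for i in top]

# A buyer of the math book and dot cards is recommended math tools, not the novel.
print(recommend_for(3))  # -> ['math_tools']
```

Note that no item features appear anywhere: the recommendation emerges purely from co-occurrence across users, which is why the method stalls on new items or sparse data (the cold-start problem discussed below).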

However, whether it is content filtering, whose core recommendation logic is user behavior but which only exposes users to content similar to their existing interests and lacks diversity, or collaborative filtering, whose core logic is user-cluster similarity but which is susceptible to data sparsity and cold-start problems, relying on a single filtering logic inevitably has many limitations. To overcome the limitations of single filtering methods, researchers have proposed hybrid filtering methods. Feng Zhao et al. state: "Resources in cloud computing platforms such as Amazon, Google AppEngine, and Microsoft Azure are a natural fit to remedy the lack of local resources in mobile devices, which creates a new space of mobile search to improve the availability of cloud resources. The hybrid filtering mechanism is proposed to eliminate irrelevant or less relevant results for personalized mobile search, which combines content-based filtering and collaborative filtering. The former filters the results according to the mobile user's feature model generated from the user's query history, and the latter filters the results using the user's social network, which is constructed from the user's communication history" [6]. The short-video platform of the new era is a typical representative of hybrid filtering in layered operation. Taking Douyin as an example, its infinitely scrolling attention-allocation mode clearly reflects the platform's algorithmic logic of ever more accurate feeding through the joint action of "users" and "items". Apart from the user's own choices, the platform supplies content almost exclusively through algorithmic recommendation: content filtering recommends further content from bloggers the user has liked, commented on, or reposted, along with similar content on topics the user has engaged with, while collaborative filtering recommends content or accounts favored by other users with similar preferences. A uniquely rich "interest cage" is thus built by the algorithm all the more efficiently and firmly.
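A simple way to picture hybrid filtering is as a weighted blend of the two scores sketched above. The code below is an illustrative assumption, not the method of Zhao et al. [6] (who combine query-history and social-network signals); it merely shows how a content score and an item-item collaborative score over the same hypothetical catalogue can be fused with a single weight alpha.

```python
import numpy as np

ITEMS = ["bead_math_book", "math_tools", "dot_cards", "novel"]
# Binary tag features per item: [math, toy, fiction] (hypothetical).
F = np.array([[1, 0, 0], [1, 1, 0], [1, 1, 0], [0, 0, 1]], dtype=float)
# User-item interactions (rows: users, columns: items).
R = np.array([[1, 1, 1, 0], [1, 1, 0, 0], [0, 0, 0, 1]], dtype=float)

def cos_matrix(m: np.ndarray) -> np.ndarray:
    """Row-wise cosine similarity matrix."""
    norms = np.linalg.norm(m, axis=1, keepdims=True)
    norms[norms == 0] = 1.0
    unit = m / norms
    return unit @ unit.T

def hybrid_scores(user: int, alpha: float = 0.5) -> np.ndarray:
    """alpha weights the content score against the collaborative score."""
    seen = R[user] > 0
    profile = F[seen].mean(axis=0)                         # content profile
    content = (F @ profile) / (np.linalg.norm(F, axis=1)
                               * np.linalg.norm(profile) + 1e-12)
    collab = cos_matrix(R.T)[:, seen].sum(axis=1)          # item-item CF score
    collab = collab / (collab.max() + 1e-12)               # normalize to [0, 1]
    scores = alpha * content + (1 - alpha) * collab
    scores[seen] = -np.inf                                 # exclude owned items
    return scores

user = 1  # bought the math book and math tools
print(ITEMS[int(np.argmax(hybrid_scores(user)))])  # -> 'dot_cards'
```

Even in this toy form, the design choice is visible: when one signal is weak (a new item with no interactions, or an item with no useful tags), the other can still rank it, which is precisely the robustness hybrid systems seek.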

3.  The specificity of the information cocoon: the paradoxical presentation of the “closed-loop” mechanism in algorithms and users

Strictly speaking, however, algorithms are not the only culprits in the emergence of the information cocoon; users themselves are also major culprits. In 1975, Foucault observed in Discipline and Punish that discipline "makes" the individual: "It is a particular technique of power that sees the individual as both an object and a tool of manipulation. It is not the kind of power that is so smug that it thinks itself all-powerful because it is so mighty. It is a humble and suspicious power, a calculated and persistent mechanism of operation." In other words, power is essentially accomplished through individual self-regulation [7]. The "creation" of information cocoons by algorithms can aptly be characterized by this statement. The information cocoon is so impenetrable that the average person can hardly detect it. Fundamentally, the emergence of such a phenomenon confirms that behind the cocoon stand individuals who have acquiesced to the power of algorithms; that is, individuals are in fact actively participating in their own continuous domestication.

3.1.  Selective attention and cognitive catering

At the first level, the algorithm captures the individual's "selective attention": people usually only care about what they themselves choose and about information that pleases them physically and mentally. Reading massive amounts of behavioral data and then calculating and inferring the content a specific user expects to see is one of the algorithm's main means of building the cocoon; this is the algorithm's "personalized filtering". The underlying logic is that the algorithm and the user share the same channel while each meets a different target need. At the algorithmic level, this approach helps platforms capture users' attention very smoothly. From the user's point of view, in today's high-pressure social environment, a channel that satisfies one's need for trivial, leisurely information is hard to refuse, and the information cocoon happens to offer a uniquely personalized construction, further reinforcing the "win-win".

3.2.  Circle formation and the echo chamber effect

On another level, algorithms use the platform to attract different users into different interest clusters, thereby enhancing user stickiness, which is precisely the reverse side of the individual's "circle effect". The "social circle" is not natural: it usually refers to a group of people with similar interests, values, or identities who, after repeated shared emotional connections, such as reacting to some event with common anger or common joy, are upgraded from an "interest group" into an "emotional community". Such an emotional community often has very strong emotional bonds; as the information cocoon within the group is reinforced, members readily develop a strong sense of internal identity and belonging, while their resistance to, and isolation from, the outside world is likely to deepen. Academics describe this role of the information cocoon as an "echo chamber". From the perspective of evolutionary and basic psychology, the core drive behind circling behavior stems from human instinct and is an outward manifestation of the human need for belonging, security, and group identity. The cocoon's grasp of this instinct precisely relieves the profound loneliness and helplessness individuals feel in a chaotic society. As a result, it is almost impossible for an individual to break away voluntarily from the security cocoon the algorithm has built for them: whether through external technical means or the individual's internal psychological needs, this becomes another insurmountable closed loop between algorithm and user.

3.3.  Big data collection and the spiral of silence

Beyond this, the constant collection of big data by algorithms is in fact one of the main strategies for building the algorithm-user closed loop. Viewed macroscopically, big data implies not only a vast body of information but also a rich user base. When massive numbers of users and texts pile onto a platform, the individual inevitably becomes a small boat adrift. With floods of intricate information pouring in, whether to speak or remain silent when encountering different statements or opinions has become a daily struggle for contemporary Internet users. By the principles of Noelle-Neumann's "spiral of silence" theory, it is easy to conclude that in complex opinion climates, individuals in social interactions tend not to express minority opinions for fear of isolation and instead follow the mainstream expression. Yet users often forget that algorithms are a key tool for steering and even controlling the online climate. Under the algorithm's construction, the individual accepts the selected social trends and currents as social reality, and after round upon round of such regulation, almost completely surrenders control of his or her own cognition.

4.  Cognitive regulation and warning in the information cocoon: the implicit manipulation of cognitive constructs by closed-loop mechanisms

Scholar Mengqi Duan once mentioned in her research article, “Recommendation algorithms are constantly digging into the deepest depths of users' interests to refine their personal profiles, which in fact hinders the delivery of complete information, inhibits users' desire to explore the unknown, and potentially restricts the autonomy of choice and free will of human beings by intelligent algorithms [8].”

4.1.  The blunting and polarization of individual cognitive ability

Regarding algorithmic technological alienation, what individuals need to understand is that algorithms are not neutral: their very design implies a particular value orientation. The information cocoon constructed under such value tendencies is all the more likely to confine individuals until they lose the ability to weigh multiple viewpoints, and gradually form self-reinforcing cognitive biases. Algorithms are essentially a special way of capturing commercial value, not instruments for human benefit, and they control the audience's perception by controlling how information is filtered. "Most definitions of algorithmic bias and fairness encode decision-maker interests, such as profits, rather than the interests of disadvantaged groups (e.g. racial minorities): bias is defined as a deviation from profit maximization [9]." In today's fast-paced, convenience-oriented era, individuals all too easily develop a strong reliance on algorithms, reflected in letting AI replace work, generating solutions to problems with a single click, or self-diagnosing illness through apps such as Baidu and Xiaohongshu. On the one hand, this is a sign of the rapid development of the times; on the other, it is very likely a latent threat that gradually weakens the individual's capacity for independent thought. As in biological evolution, the unfit are eventually eliminated: the dulling of individual cognitive ability will not only leave human beings gradually lost in a rich cognitive world, but may even produce the dangerous polarization of ideas. Under the spiral of the masses, individuals mistakenly equate consensus within the cocoon with correct cognition and with society's general cognition, and thus the picture of self-perception is distorted and the picture of social perception misread, step by step.

4.2.  Group cognitive fracture and the breakdown of social dialogue

At the social level, the process of regrouping individuals is a process of modularizing society as a whole. An undesirable modularization may widen the gaps in how different groups perceive the same facts, and may even produce intergenerational, class, or regional conflicts that are difficult or impossible to mediate, making public dialogue ever harder to achieve. The information cocoon uses its powerful personalization to customize differentiated "factual bases" for different social groups, thereby enclosing individuals in the cocoon all the more firmly. During U.S. elections, for example, algorithms formulate different information feeds for groups with different attributes, and there are numerous cases in which supporters and opponents receive completely different bases of support on their respective platforms. "Algorithmic decision-making systems are increasingly used throughout the public and private sectors to make important decisions or assist humans in making these decisions with real social consequences. While there has been substantial research in recent years to build fair decision-making algorithms, there has been less research seeking to understand the factors that affect people's perceptions of fairness in these systems [10]." Avoiding cognitive civil wars and algorithmic dictatorship, and refusing to become puppets under the algorithm's control, should be a new social consensus.

4.3.  Individual countermeasures against algorithmic discipline

Nowadays, given the increasingly extensive power individuals hold, they can already mount defenses against algorithms at many levels without relying on governments and related policies. First, individuals must improve their algorithmic literacy, which can be approached by understanding the principles, recognizing algorithmic traces, and understanding bias: learn how recommender systems and personalized ads work, and be clear that algorithms model and predict user behavior by collecting user data (clicks, browsing, searches, location, purchases, etc.); notice every "guess what you like" style operation in an algorithm's output; and understand that algorithms are tools shaped by data and by their designers' intent, which can be biased by race, gender, geography, and so on. Second, individuals should proactively control data input by changing what the algorithm can collect: "privacy settings", critically reviewing and adjusting the privacy settings of all apps, websites, operating systems, and browsers, and turning off unnecessary, non-core permissions such as location, microphone, contacts, and photos; "minimizing account association", avoiding logging in to large numbers of third-party websites or apps with the same account and using different email addresses for different purposes; and "browser privacy restrictions" and similar behavioral limits.

5.  Conclusion

This study has discussed the chain of changes at both the individual and the social level brought about by the information cocoon effect, as well as the inherent psychological contributors to its realization. It fills a timely gap in the scholarly discussion of the individual's own contribution to the formation of the information cocoon, and helps individuals recognize the specific ways in which algorithmic technology and the information cocoon achieve implicit regulation of human cognitive construction. It not only enriches the cognitive foundation for understanding the field of the "information cocoon", but also deepens understanding of algorithms and of the cocoon's operating mechanism and behavioral motivations, and plays a positive, timely role in alerting the public to the cocoon's potential hazards.

The research shows that in the process by which algorithmic technology builds the information cocoon and thereby achieves stealth regulation of human cognitive construction, human self-discipline has also "contributed". Even today, large numbers of people do not, or cannot, recognize the potentially sinister threat behind algorithms, and if this persists it is very likely to have irreversible negative effects on individuals and even on society as a whole. There are now many effective paths by which the public can "escape" or "resist" the information cocoon: users' "digital literacy", platforms' "decentralization", and even "inverse algorithm mechanisms" under development are all good strategies to help human beings shed the information cocoon and algorithmic discipline as soon as possible. However, human beings must also understand that the most fundamental way to truly break free of algorithmic discipline is to start from consciousness itself and to avoid being "made" by power; that is the beginning of self-control.


References

[1]. Ministry of Industry and Information Technology of the People's Republic of China. (2025, January 17). China's netizens reach 1.108 billion people, Internet penetration rate rises to 78.6%. Retrieved July 22, 2025.

[2]. Pariser, E. (2020). The Filter Bubble: What the Internet Is Hiding from You (Chinese edition). People's University of China Press, Beijing, 95-115.

[3]. Aicher, A., Kornmüller, D., Minker, W., & Ultes, S. (2023). Self-imposed filter bubble model for argumentative dialogues. In Proceedings of the 5th international conference on conversational user …, 2023.

[4]. Hashim, S., & Waden, J. (2023). Content-based filtering algorithm in social media. Wasit Journal of Computer and Mathematics Science, 2(1), 14–17.

[5]. Fareed, A., Hassan, S., Belhaouari, S. B., & Halim, Z. (2023). A collaborative filtering recommendation framework utilizing social networks. Machine Learning with Applications, 14, 100495.

[6]. Zhao, F., Yan, F., Jin, H., Yang, L. T., & Yu, C. (2017). Personalized Mobile Searching Approach Based on Combining Content-Based Filtering and Collaborative Filtering. IEEE Systems Journal, 11(1), 324-332. https://doi.org/10.1109/jsyst.2015.2472996

[7]. Foucault, M. (2019). Surveiller et punir [Discipline and Punish]. SDX Joint Publishing Company, Beijing, 173.

[8]. Mengqi, D. (2023). Discipline and Resistance: User Autonomous Awakening and Resistance Practices under Algorithmic Recommendations. Radio & TV Journal, (09), 133-136. doi: 10.19395/j.cnki.1674-246x.2023.09.009.

[9]. Kasy, M. (2024). Algorithmic bias and racial inequality: a critical review. Oxford Review of Economic Policy, 40(3), 530–546.

[10]. Wang, R., Harper, F. M., & Zhu, H. (2020). Factors Influencing Perceived Fairness in Algorithmic Decision-Making: Algorithm Outcomes, Development Procedures, and Individual Differences (Version 1). arXiv.


Cite this article

Xie,Y. (2025). From Algorithmic Gazing to Cognitive Closure: The Invisible Discipline of the Information Cocoon on Human Cognitive Constructs. Lecture Notes in Education Psychology and Public Media,116,24-31.

Data availability

The datasets used and/or analyzed during the current study will be available from the authors upon reasonable request.

Disclaimer/Publisher's Note

The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of EWA Publishing and/or the editor(s). EWA Publishing and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.

About volume

Volume title: Proceeding of ICIHCS 2025 Symposium: Exploring Community Engagement: Identity, (In)equality, and Cultural Representation

ISBN:978-1-80590-331-4(Print) / 978-1-80590-332-1(Online)
Editor:Enrique Mallen, Nafhesa Ali
Conference date: 29 September 2025
Series: Lecture Notes in Education Psychology and Public Media
Volume number: Vol.116
ISSN:2753-7048(Print) / 2753-7056(Online)

© 2024 by the author(s). Licensee EWA Publishing, Oxford, UK. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license. Authors who publish this series agree to the following terms:
1. Authors retain copyright and grant the series right of first publication with the work simultaneously licensed under a Creative Commons Attribution License that allows others to share the work with an acknowledgment of the work's authorship and initial publication in this series.
2. Authors are able to enter into separate, additional contractual arrangements for the non-exclusive distribution of the series's published version of the work (e.g., post it to an institutional repository or publish it in a book), with an acknowledgment of its initial publication in this series.
3. Authors are permitted and encouraged to post their work online (e.g., in institutional repositories or on their website) prior to and during the submission process, as it can lead to productive exchanges, as well as earlier and greater citation of published work (See Open access policy for details).
