Name: Zhuoya Gui
ID: 530163378

Image source: https://www.vox.com/technology/2018/10/1/17882340/how-algorithms-control-your-life-hannah-fry
Do you often find yourself browsing the same kind of content on social media? Food? Travel? Makeup? Pets? … What do you do when you come across content or topics you do not usually pay attention to? Swipe away quickly, or open it out of curiosity? Do you often feel that social media has become the person who knows you best – the one who can always accurately predict your preferences and needs? Are we falling into a trap set by social media? Have we really been “kidnapped” by algorithms?
Some of you may say YES, while some of you may say NO.
My answer is NOT REALLY.
In this blog, I will clarify a common misunderstanding of the claim that “algorithms lead to filter bubbles”. Along the way, I will take the Little Red Book (Xiaohongshu) as an example to explore how algorithms work on social media. I will also discuss what we can do to avoid falling into the predicament of “filter bubbles”.
Context
In the era of intelligent communication, algorithmic recommendation technology built around artificial intelligence has been widely adopted, but the “information cocoon” effect has become a hidden worry.
According to Terry Flew (2021), algorithms are the established guidelines and processes used for tasks such as computation, data processing, and automated reasoning.
“Information cocoons” were first proposed by Sunstein in Infotopia: How Many Minds Produce Knowledge (2006). Sunstein noted that when sharing information in networks, individuals tend to selectively focus on content that aligns with their preferences and makes them feel good, rather than seeking out comprehensive information. Over time, this behaviour can produce echo chambers in which people are confined to a limited perspective, like a silkworm wrapped in its cocoon.
Later, in 2011, the concept of the “filter bubble” was born. It was coined by Eli Pariser, a well-known American Internet activist, and became widely known through his book “The Filter Bubble: What the Internet Is Hiding from You” and a nine-minute TED talk. Pariser argues that algorithms, typified by search engines, filter diverse information by learning user preferences, creating a personalized information world for each user while building a “wall” around them, leaving them inside an “Internet bubble” that impedes the exchange of different viewpoints.

Image Source: Eli Pariser: Beware of ‘filter bubbles’ online https://www.youtube.com/watch?v=B8ofWFx525s
Strictly speaking, the “information cocoon” focuses on individuals’ information-seeking behaviour and carries an obvious personal bias, while the “filter bubble” focuses on the “filtering” of information performed by algorithmic techniques. However, as two important concepts describing the narrowing of information caused by information preferences, both point to recommendation algorithms built around information efficiency as the main cause of the phenomenon.
Although the two terms have different origins and contexts, they are gradually being used interchangeably. In today’s information age, both “information cocoons” and “filter bubbles” describe the phenomenon in which the information Internet users receive (news, product recommendations, ideas, etc.) gradually becomes homogeneous under the intervention of algorithms.
In discussions of the social impact of intelligent recommendation technology, the “information cocoon” is one of the topics of greatest concern (Yu, 2021). As Yu notes, people worry that personalized recommendations based solely on user preferences intensify the homogenization of the information users encounter: people see only what they want to see and hear only the views they agree with, eventually becoming someone who hears nothing but their own voice in a “secret room”. This, in turn, makes public information dissemination, the integration of social opinions, and the building of social consensus even harder.
Is the algorithm in social media really the prime culprit behind the phenomenon of filter bubbles?
Yu (2021) argued that it is untenable to treat the algorithm as the prime culprit of the “information cocoon”. He pointed out that Sunstein proposed the “information cocoon” out of concern that new technologies would reduce the diversity of political information and deepen polarization in the context of American two-party politics. In fact, the “information cocoon” Sunstein proposed is a metaphor, a kind of hypothesis (Yu, 2021).
At the same time, in my opinion, people’s concern stems from misunderstandings about both algorithms and themselves on social media. Below, I will explore how algorithms work on social media platforms, taking the Little Red Book as an example, and examine the relationship between users and “filter bubbles” in order to clear up these misunderstandings.
Misunderstanding #1 – Algorithms
Algorithms narrow information.
The Little Red Book (Xiaohongshu) is a popular social media app among young people in China. According to Shi (2022), by the end of 2021 the Little Red Book had over 43 million registered accounts, and daily note exposures exceeded 10 billion, making it a mega cyber community. On its recommendation page, the Little Red Book mixes graphic-and-text posts and video posts in an infinite-scroll waterfall layout. Unlike TikTok’s single-column layout, the Little Red Book’s main recommendation page uses two columns, letting users take in more information in a shorter time. The platform’s algorithmic system gathers initial feedback from user actions such as selecting, browsing, and dwelling. At the same time, it scans the content published by creators, both graphics and videos, frame by frame and pushes it to target groups. Repeating this cycle, the algorithm continually refines a more personalised profile of the user and delivers content more accurately.

Image Source: Layout of The Little Red Book Screenshot by Zhuoya Gui
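To make this feedback loop concrete, here is a minimal Python sketch of how implicit feedback signals might update an interest profile that then re-ranks the next batch of notes. The class, action weights, and tags are all illustrative assumptions for this blog, not the Little Red Book’s actual implementation.

```python
from collections import defaultdict

# Assumed weights for implicit feedback signals (illustrative values only).
ACTION_WEIGHTS = {"impression": 0.0, "click": 1.0, "dwell": 0.5, "like": 2.0}

class InterestProfile:
    """Hypothetical per-user interest profile over content tags."""

    def __init__(self):
        self.tag_scores = defaultdict(float)

    def update(self, note_tags, action):
        """Strengthen interest in a note's tags based on the user's action."""
        for tag in note_tags:
            self.tag_scores[tag] += ACTION_WEIGHTS.get(action, 0.0)

    def score(self, note_tags):
        """Score a candidate note by how well its tags match the profile."""
        return sum(self.tag_scores[tag] for tag in note_tags)

profile = InterestProfile()
profile.update(["food", "travel"], "click")
profile.update(["food"], "like")

candidates = {"note_a": ["food"], "note_b": ["pets"], "note_c": ["travel"]}
ranked = sorted(candidates, key=lambda n: profile.score(candidates[n]), reverse=True)
print(ranked)  # notes matching reinforced tags float to the top
```

Each action nudges the tag weights upward, so the more a user clicks and likes food notes, the more food notes rise to the top of the next refresh – which is exactly the dynamic the “filter bubble” worry is about.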
Some scholars, including Shi (2022), believe that such an algorithmic system, by holding the initiative, gradually narrows the information users can obtain. The overexposure effect then grows more and more serious within the feedback loop of “system recommends – user responds – system recommends again”. Overexposure means that similar recommended content appears repeatedly on the timeline, causing user fatigue, which in the long run reduces user activity and retention (Gao et al., 2022). In fact, this outcome runs counter to the platform’s goals, and platforms will not allow it to happen.

Image Source: https://zhuanlan.zhihu.com/p/530441122
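Gao et al. (2022) model this kind of exposure effect formally; what follows is only a toy sketch of the intuition, in which each repeated exposure to a tag exponentially damps that tag’s score so over-recommended content sinks. The decay rate and class names are hypothetical.

```python
import math
from collections import Counter

class FatigueAwareRanker:
    """Toy re-ranker: repeated exposure to a tag decays its score,
    so over-recommended content sinks and fresher topics surface."""

    def __init__(self, decay=0.5):
        self.exposures = Counter()
        self.decay = decay

    def effective_score(self, note_tags, base_score):
        # Each prior exposure to a tag damps the score by a factor of exp(-decay).
        fatigue = sum(self.exposures[tag] for tag in note_tags)
        return base_score * math.exp(-self.decay * fatigue)

    def record_exposure(self, note_tags):
        for tag in note_tags:
            self.exposures[tag] += 1

ranker = FatigueAwareRanker()
for _ in range(3):
    ranker.record_exposure(["food"])  # "food" has already been shown three times

print(ranker.effective_score(["food"], base_score=2.0))  # heavily damped
print(ranker.effective_score(["pets"], base_score=1.0))  # unaffected, now outranks food
```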
At the 2019 Alibaba Cloud Summit, Guo Yi, head of the Little Red Book’s real-time recommendation team, presented the recommendation system’s architecture, its offline processing, and other real-time computing scenarios in the recommendation business. Guo (2019) explained that the Little Red Book runs a comprehensive online-to-offline recommendation system. Once the algorithm recommends content to a user, the user’s interactions with the notes – exposures, likes, clicks, and so on – are collected to build user profiles, which serve as training samples for prediction models. These prediction models are then integrated back into the online recommendation algorithm, forming a closed loop. Additionally, algorithm engineers and strategy engineers analyze the generated reports to adjust the recommendation strategy, which is then applied to online recommendation. Real-time stream processing and other supporting systems also underpin the Little Red Book’s recommendations.

Image Source: https://baijiahao.baidu.com/s?id=1640908092642090901&wfr=spider&for=pc
Translated by Zhuoya Gui
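The closed loop Guo (2019) describes can be caricatured in a few lines of Python: an online step scores candidates with the current model and logs feedback, and an offline step turns those logs into an updated model that is handed back to serving. The function names, the simulated clicks, and the learning rule are placeholder assumptions, not Xiaohongshu’s real pipeline.

```python
import random

def online_serve(weights, candidates):
    """Online: rank candidates with the current model and log user feedback."""
    ranked = sorted(candidates, key=lambda c: weights.get(c["tag"], 0.0), reverse=True)
    # "appeal" stands in for the user's true click probability (a simulation).
    return [{"tag": c["tag"], "clicked": random.random() < c["appeal"]} for c in ranked[:2]]

def offline_train(weights, logs, lr=0.1):
    """Offline: nudge tag weights toward observed click behaviour."""
    new_weights = dict(weights)
    for event in logs:
        target = 1.0 if event["clicked"] else 0.0
        current = new_weights.get(event["tag"], 0.0)
        new_weights[event["tag"]] = current + lr * (target - current)
    return new_weights  # "deploying" = handing these back to online_serve

candidates = [{"tag": "food", "appeal": 0.8}, {"tag": "pets", "appeal": 0.3}]
weights = {}
for day in range(5):                 # each iteration closes the loop once
    logs = online_serve(weights, candidates)
    weights = offline_train(weights, logs)
print(weights)                       # model drifts toward observed preferences
```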
Yu (2021) argued that, from a market-oriented perspective, a shopping mall manager would not want customers to focus on a single category of products on every visit. Likewise, algorithmic platforms aim to avoid narrowing user interests down to a single focus. Even from a commercial standpoint, then, algorithms are not designed to compress the information space, but rather to gradually uncover untapped potential for information consumption through continuous updates and iterations. For instance, collaborative filtering algorithms, which generate recommendations based on the preferences of similar users, can expose users to a wide range of content they may never have considered (Yu & Fang, 2019).
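As a simple illustration of the collaborative filtering idea just mentioned, the toy example below recommends to one user the items that similar users liked, even when the target user never browsed that topic. The ratings data and scoring rule are invented for demonstration, not taken from any real platform.

```python
import math

# Invented user-item ratings for demonstration.
ratings = {
    "alice": {"food": 5, "travel": 4},
    "bob":   {"food": 4, "travel": 5, "pets": 5},
    "carol": {"makeup": 5, "pets": 2},
}

def cosine(u, v):
    """Cosine similarity between two sparse rating vectors (dicts)."""
    shared = set(u) & set(v)
    if not shared:
        return 0.0
    dot = sum(u[i] * v[i] for i in shared)
    norm = math.sqrt(sum(x * x for x in u.values())) * math.sqrt(sum(x * x for x in v.values()))
    return dot / norm

def recommend(target, ratings):
    """Score unseen items by the similarity-weighted ratings of other users."""
    scores = {}
    for other, their_ratings in ratings.items():
        if other == target:
            continue
        sim = cosine(ratings[target], their_ratings)
        for item, rating in their_ratings.items():
            if item not in ratings[target]:
                scores[item] = scores.get(item, 0.0) + sim * rating
    return sorted(scores, key=scores.get, reverse=True)

print(recommend("alice", ratings))  # "pets" surfaces via Bob's similar taste
```

Here Alice never interacted with pet content, yet it is recommended to her because Bob, whose tastes overlap with hers, liked it – the opposite of narrowing her information world.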
The discussion above shows that algorithms are neither monolithic nor dictatorial, and that the claim that algorithms narrow information holds only in a limited sense. Today, hybrid recommendation systems that combine multiple algorithms and draw on ever larger data are the mainstream on social media platforms. In this era of diversified algorithms, it is therefore irrational and imprecise to rashly label algorithms the culprit behind “filter bubbles”.
Misunderstanding #2 – Users
Users are always passive in the face of “filter bubbles” caused by algorithms.
Lazarsfeld, Berelson, and Gaudet (2017) observed in The People’s Choice that audiences are not indiscriminate when exposed to mass communication: they are more willing to engage with content that is consistent with or close to their own attitudes, and they tend to avoid content that opposes or conflicts with them. It follows that when social media algorithms recommend all kinds of information to us, we retain the right to choose – and most of the time, we choose what we prefer to read and watch.
Similarly, Yu and Wang (n.d.) proposed the concept of “user-led” information cocoons, pointing out that technologies such as catalogues and search engines can be classified as “user-initiated customization”, which gives users the power to actively choose customized media content.
In addition, social platforms are abundant nowadays. Users have a wide range of choices and channels for obtaining information and are unlikely to rely on a single platform. Beyond social platforms, books, movies, family, friends, and more can also serve as our sources of information.
Therefore, from the user’s point of view, an absolute “information cocoon” is difficult to form, and users always hold the initiative when selecting information.
From the discussion above, we can conclude that algorithms are limited in their ability to generate “information cocoons”, and that users can choose whether or not to engage when facing “filter bubbles”. The causes of their formation are multi-faceted. So how do we avoid “information cocoons”? In the following part, I will explore ways to avoid “filter bubbles” along three dimensions: users/individuals, platforms/algorithms, and governments.
Ways to avoid “filter bubbles”
First of all, to shake off the influence of the “information cocoon”, the most critical step is for users to exercise their own initiative. Users need to improve their data literacy: on the one hand, try an “information diet” and be wary of content the system recommends automatically; on the other hand, actively learn about other fields, break out of self-enclosure, and maintain the balance and timeliness of their information.
Secondly, social media platforms should actively explore diverse forms of algorithms so as to avoid creating “information cocoons” while still providing personalized recommendations. Yu (2021) offers a good example: Toutiao, a comprehensive information platform in China whose primary goal is to connect people with information and to distribute high-quality, diverse content efficiently and accurately, thereby promoting the value creation of information. The platform’s algorithm has undergone significant adjustments and upgrades, becoming better adapted to societal needs.
Last but not least, the government plays an indispensable role in guidance, supervision, and governance. It should establish a sound supervision mechanism for online civility and deepen the governance of the online ecosystem. For example, in 2022, four departments, including the Cyberspace Administration of China, jointly issued the “Regulations on the Administration of Algorithm Recommendations for Internet Information Services”, which regulates algorithmic recommendation in Internet information services through clear laws and regulations covering information service specifications, user rights protection, and the supervision and management system.
Conclusion
Even as we worry about the negative impact of algorithms, we should think rationally about the concepts of the “information cocoon” and the “filter bubble”. Although the psychology of selective exposure is difficult to avoid, we can always maintain curiosity and a thirst for knowledge. In many cases, what really closes the mind and narrows the horizon is not technology or algorithms, but ourselves.
References
Flew, T. (2021). Regulating Platforms. Cambridge: Polity, pp. 79–86.
Gao, C., Lei, W., Chen, J., Wang, S., He, X., Li, S., Li, B., Zhang, Y., & Jiang, P. (2022). CIRS: Bursting filter bubbles by Counterfactual Interactive Recommender System. arXiv. Retrieved from https://arxiv.org/abs/2204.01266
Guo, Y. (2019). How does Xiaohongshu achieve efficient recommendations? Decrypting the big data computing platform architecture behind it.
Illing, S. (2018, October 1). How algorithms are controlling your life. Vox. Retrieved April 11, 2023, from https://www.vox.com/technology/2018/10/1/17882340/how-algorithms-control-your-life-hannah-fry
Lazarsfeld, P. F., Berelson, B., & Gaudet, H. (2017). The People’s Choice. Columbia University Press.
The Cyberspace Administration of China. (2022). Regulations on the Administration of Algorithm Recommendations for Internet Information Services. Retrieved from http://www.gov.cn/zhengce/2022-11/26/content_5728941.htm
Shi, L. (2022). The Phenomenon of “Filter Bubble” in Xiaohongshu APP and the Way of “Breaking the Bubble”. Weipuzixun.
Sunstein, C. R. (2006). Infotopia: How Many Minds Produce Knowledge. Oxford University Press.
Pariser, E. (2011, May 2). Beware online “filter bubbles” [Video]. YouTube. Retrieved from https://www.youtube.com/watch?v=B8ofWFx525s
Yu, G. (2021). Algorithm recommendation and the solution to the “information cocoon”.
Yu, G., & Fang, K. (2019). Does algorithm recommendation necessarily lead to the “information cocoon” effect?
Yu, X., & Wang, J. (n.d.). Re-understanding the “information cocoon”: A study on the coexisting mechanism of instrumental rationality and value rationality in the smart media era. Retrieved from http://qxcm.tsjc.tsinghua.edu.cn/pc/folder125/2022-07-01/13AVwNyKrs9zJ9aA.html