The Relationship Between Algorithms and Information Cocoons

(Free photo from PIXABAY,  https://pixabay.com/illustrations/social-distancing-social-distance-5030026/)


Abstract

With the popularization of algorithmic technology, the “information cocoon” (Sunstein, 2006) has become a widely discussed concern. In the past, many scholars have examined a series of problems associated with information cocoons, including hindered personal development and trends toward group polarization and social division. While these problems may indeed exist in our society, algorithmic technology does not necessarily lead to information cocoons, because people receive information through diverse channels, their social relationships are complex, and algorithms themselves are improving. Additionally, the effects of information cocoons may not be entirely negative. People should learn to train algorithms, broaden their information channels, and improve their media literacy.

The concern about the information cocoons

The arrival of big data has brought an “information explosion,” which requires the public to filter and select from massive amounts of information. However, people’s time, energy, and abilities are limited, while digital media needs to capture audience attention at an ever faster pace. As a result, algorithmic recommendation technology has emerged, driven by both public demand and commercial interests.

The “information cocoon” is one of the effects raised by algorithms. During information dissemination, users pay attention only to the areas that appeal to them and bring them pleasure, because their information needs are not all-encompassing. Over time, they may confine themselves within a “cocoon” of limited information.

Previously, scholars and critics identified many negative impacts of information cocoons. Firstly, at the individual level, they argued that the “information cocoon” inhibits a person’s overall development. Because of algorithmic recommendation technology, users with similar interests are easily grouped together, leading most members of a group to believe the information they are exposed to. In fact, that information does not always match reality. Most users become immersed in this seemingly “open” but actually enclosed environment, lacking depth of critical thinking.

Secondly, at the group level, scholars pointed out that information cocoons may lead to group polarization. Groups form through differentiation and clustering on the internet. Once an “information cocoon” takes shape, members of the group easily become immersed in their own opinions and reject the reasonable views of others. Over time, people living in an “information cocoon” can develop a narrow-minded, blindly confident mentality, which may even evolve into extreme behavior.

From my perspective, we need to treat group polarization dialectically. On the one hand, polarization can increase group cohesion, which is genuinely helpful when the group’s ideas align with the mainstream ideology. On the other hand, it can lead to increasingly extreme and erroneous judgments and decisions.

Take the negative case as an example. If you are a follower of Donald Trump, you are more likely to be shown news that supports him; if you are a follower of Biden, the news you see will likely favor him instead. In this way, people feel a great sense of support because so many viewpoints share their stance, and opinions within the group are constantly reinforced.

What’s more, once members break away from their internal group and communicate with members of external groups, they often find it difficult to understand the ideas of the other side, and integrating multiple viewpoints is a major challenge. This is one of the reasons why it is so difficult to reach consensus between different political parties.

Thirdly, the information cocoons caused by algorithms can potentially isolate different groups from one another and exacerbate social polarization rather than unite them.

Groups are divided into in-groups and out-groups (Wikipedia, 2023). Psychologically, people tend to trust members of their in-group and are more willing to collaborate with them than with members of an out-group. To some extent, this produces higher cohesion and relatively fewer conflicts within the group. However, at the level of society as a whole, the “information cocoon” effect significantly reduces communication between different groups, and opportunities to exchange ideas are lost.

During the US election, politicians often expressed their views and posted advertisements through algorithm-driven social media. Many netizens could not accept the other side’s positions, further exacerbating the conflict between the two parties. Many demonstrations and protests erupted during the election, some of which even escalated into violent incidents.

For example, according to reports, some extremist organizations supporting Trump used social media to gather crowds to protest at state government buildings, and even attempted to storm the Capitol (Farivar, 2021).

Algorithms do not necessarily lead to the “information cocoons”

Although the conclusions and examples above may well be real, they are not comprehensive. The “information cocoon” effect is not a normal state of media use for all users; it is shaped by multiple factors, including individual, social, and technological ones. The argument above overestimates the negative impact of algorithms and implies a “technological determinism” perspective. It also ignores the complexity of human decision-making and merely links technology to communication phenomena, resulting in an oversimplified conclusion.

To begin with, according to Walter Lippmann, the information environment is a pseudo-environment, which is not equivalent to the real world. Digital information does influence people’s thoughts and behaviors, but information can be received both online and offline. Since people’s information intake is not supplied solely by algorithms, algorithmic recommendation does not necessarily produce information cocoons or the series of negative effects that come with them.

(Free photo from PIXABAY,  https://pixabay.com/photos/playing-with-phone-cell-phone-smart-5103221/)

In addition, human social relationships are complex. People live in different cultural, social, economic, and political environments, all of which lead to different values, ideas, and behaviors. Algorithms are therefore just one of many factors that affect people’s thoughts and behaviors, and they do not necessarily result in information cocoons.

Diversity and individual differences should also be taken into account. Everyone has different personalities, interests, experiences, and educational backgrounds, which affect how they select and receive information both online and offline.

What’s more, human beings are highly intelligent, so we are not completely passive receivers of information.

For example, well-educated people may think more critically and insightfully about events instead of merely accepting what the algorithm recommends. We can also actively seek out multiple reports from different institutions or parties.

Thirdly, the results, as well as their accuracy, may vary across different algorithm systems. Recommendation algorithms are typically divided into four types: content-based algorithms, collaborative filtering algorithms, network structure algorithms, and hybrid algorithms.

Traditionally, content-based algorithms were quite popular. They recommend information similar to what a user has liked in the past. Such a system cannot discover a user’s potential interests because it only recommends content related to past behavior. For example, if you previously watched videos about games, it will only recommend more gaming content; it fails to surface your other possible interests, such as singing or debating.
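The narrowing tendency described above can be sketched in a few lines of code. This is only a toy illustration, not any platform’s actual system: the item names and hand-made topic vectors are invented for the example, and real recommenders use far richer features. The point is that because the user profile is built purely from past likes, the top recommendation always stays closest to what the user already watched.

```python
# Toy content-based recommender (illustrative sketch only).
# Items are described by hypothetical topic vectors; the user profile is
# the average of the vectors of liked items, so recommendations stay
# close to past interests -- the "narrowing" effect described above.
import math

# Hypothetical catalog: item -> feature vector over (games, music, debate)
items = {
    "game_review":   [1.0, 0.0, 0.0],
    "speedrun_clip": [0.9, 0.1, 0.0],
    "song_cover":    [0.0, 1.0, 0.0],
    "debate_final":  [0.0, 0.1, 0.9],
}

def cosine(a, b):
    # Cosine similarity between two feature vectors
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0

def recommend(liked, k=2):
    # Build the user profile as the mean of liked item vectors
    dim = len(next(iter(items.values())))
    profile = [0.0] * dim
    for name in liked:
        for i, v in enumerate(items[name]):
            profile[i] += v / len(liked)
    # Rank unseen items by similarity to the profile
    scores = {n: cosine(profile, v) for n, v in items.items() if n not in liked}
    return sorted(scores, key=scores.get, reverse=True)[:k]

print(recommend(["game_review"]))  # the top pick is the other gaming item
```

After liking only a gaming video, the highest-scoring recommendation is the other gaming item, while the music and debate items score zero: the system has no way to guess interests the user never expressed.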

However, algorithmic technology is constantly improving, and collaborative filtering algorithms help to alleviate the problem of information becoming too narrow. Their main principle is to infer what the current user may be interested in from the past behaviors of similar users. This recommendation mode breaks away from the “personal daily news” style. For example, if both A and B have read and liked article X, the system will also recommend article P, which A has read, to B. In practice, two or more algorithm systems are often combined to meet users’ needs for both content recommendation and interest discovery.
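The A/B example above can be sketched as a minimal user-based collaborative filter. Again, this is a hedged illustration, not a production algorithm: the user histories are invented, and Jaccard overlap stands in for the more sophisticated similarity measures real systems use. Because A and B both liked article X, A’s other read, P, is recommended to B even though P resembles nothing in B’s own history.

```python
# Toy user-based collaborative filtering (illustrative sketch only).
# Mirrors the example in the text: A and B both liked article X, and A
# also read article P, so P is recommended to B.

likes = {
    "A": {"X", "P"},
    "B": {"X"},
    "C": {"Q"},
}

def jaccard(a, b):
    # Overlap between two users' histories (0 = disjoint, 1 = identical)
    return len(a & b) / len(a | b) if a | b else 0.0

def recommend(user, k=1):
    mine = likes[user]
    scores = {}
    for other, theirs in likes.items():
        if other == user:
            continue
        sim = jaccard(mine, theirs)   # how alike the two users are
        for item in theirs - mine:    # items the neighbor has that user lacks
            scores[item] = scores.get(item, 0.0) + sim
    # Highest accumulated similarity wins
    return sorted(scores, key=scores.get, reverse=True)[:k]

print(recommend("B"))  # ['P']
```

Note that P reaches B through a similar *user* rather than similar *content*, which is exactly why collaborative filtering can pull a reader slightly outside their existing cocoon.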

So far, we have discussed the relationship between algorithms and the “information cocoons”. On the one hand, algorithms may lead to filter bubbles, which can result in restricted personal development, group polarization, social division, and other problems. On the other hand, algorithms do not necessarily produce filter bubble effects, because human social relationships are complex, information channels are diverse, and algorithmic technology itself is constantly improving.

The impacts of the information cocoons are not always negative

The premise of the information cocoon effect is that people always choose content related to their interests and filter out the rest. However, most discussions overlook a crucial factor: the quality of the keywords you type into the search engine. From my perspective, a well-crafted search does not narrow one’s perspective; it deepens one’s knowledge in a particular field or in several fields. In other words, if you spend every day on entertainment such as celebrity gossip or pornographic content, it will consume a great deal of your time and probably yield nothing. But if you make a brilliant choice—that is, you know exactly which materials on the internet benefit your future study or career and simply tell the algorithm so—the outcome is quite different.

If you are not clear about which fields you want to be exposed to, then even the best algorithmic technology cannot solve this problem for you. We must be aware that algorithms cannot replace human thinking in all respects, given their programmatic execution mode, especially in matters of values, ethics, and emotions. When facing complex problems, they may not handle them as flexibly as humans. In other words, what we need to do is “train” the algorithm system so that it serves us by pushing content that not only matches our interests but also benefits our academic or career development, rather than surrendering all decision-making power to algorithms.

How should people combat the information cocoon effect

Firstly, learn how to train algorithms. For example, if you are a student who wants to improve your spoken English on Bilibili, you can enter keywords such as “pronunciation,” “English speaking,” and “pronunciation techniques” in the search box, then select certain videos to watch and like. The system will then recommend a large number of videos related to English learning, covering not only speaking but also reading and writing. By receiving this kind of information over an extended period, your knowledge of the field will increase greatly. If you are concerned about breadth of knowledge, you can also search for videos in other fields, follow several accounts in those fields, and like their videos; the algorithm will then push content from different fields at the same time.
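The “training” strategy above can be modeled as a toy preference profile, where each deliberate like nudges a per-topic weight and the feed follows the weights. This is only a hypothetical model of how such systems behave, not Bilibili’s actual algorithm; the topic names are invented for the example. It shows why a single like in an unrelated field is enough to keep that field in the recommendation mix.

```python
# Toy model of "training" a recommender with likes (illustrative only).
# Each like nudges a per-topic interest weight; the feed is then
# ordered by weight, so deliberate likes steer what gets recommended.
from collections import defaultdict

weights = defaultdict(float)  # topic -> accumulated interest weight

def like(topic, step=1.0):
    # One deliberate like increases the weight of that topic
    weights[topic] += step

# Repeatedly liking English-learning videos pushes that topic to the top...
like("english_speaking")
like("pronunciation")
like("english_speaking")
# ...while a single like in another field keeps the feed broader.
like("history")

ranked = sorted(weights, key=weights.get, reverse=True)
print(ranked)  # english_speaking ranks first, but history stays in the mix
```

The takeaway matches the advice in the text: the user, not the system, decides which topics accumulate weight, so a cocoon built this way is a chosen specialization rather than an imposed one.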

Secondly, receive information from diverse sources. In addition to the personalized information recommended by algorithms, one should actively seek information through different channels: following different media outlets, reading books from different fields, attending lectures and social practice activities, and so on.

Thirdly, improve media literacy. Media literacy, a concept proposed by the American Center for Media Literacy in 1992, refers to an individual’s ability to select, understand, question, evaluate, create, and respond to the various kinds of information presented in the media (Center for Media Literacy, 2019). We ought to learn how to effectively filter information, evaluate its reliability and value, and avoid being misled by low-quality or false content. When facing comments from groups with the same interests, we should also maintain critical thinking and not blindly follow the herd.

Conclusion

Technology is a double-edged sword. On the one hand, algorithms do carry the potential for an “information cocoon” effect, with negative consequences for individuals, groups, and society. On the other hand, we should move beyond the stereotype of the “information cocoon” and reconsider the important roles humans play in the process. We need to reconsider the value and status of human beings and learn how to better use technology to adapt to societal development, rather than rejecting or complaining about it. At the same time, people need to maintain an open-minded attitude, engage with diverse information and perspectives, and form more objective, comprehensive, and diverse views. By integrating human intelligence, human-centric algorithms can not only offer people a varied and well-reasoned information environment but also pave a constructive path for the progress of algorithmic technology.

Reference

Farivar, M. (2021). Researchers: More Than a Dozen Extremist Groups Took Part in Capitol Riots. VOA. https://www.voanews.com/a/2020-usa-votes_researchers-more-dozen-extremist-groups-took-part-capitol-riots/6200832.html

Center for Media Literacy. (2019). Media Literacy: A Definition and More. Medialit.org. https://www.medialit.org/media-literacy-definition-and-more

Sunstein, C. R. (2006). Infotopia: How Many Minds Produce Knowledge (Annotated ed.). Oxford University Press, USA.

Wikipedia. (2023). In-group and out-group. https://en.wikipedia.org/wiki/In-group_and_out-group
