Nowadays, humans have created many technologies to help ourselves: computers, algorithms, AI, virtual reality, and so on. We live surrounded by these buzzwords, but do we really understand them? This post explains what an algorithm is, how personalization works, what the possible consequences may be, and what relevant regulations have been put in place.
What is an algorithm?
According to Terry Flew (2021), algorithms are sets of guidelines and procedures created for various tasks, including computation, automated reasoning, and data processing. To put it simply, an algorithm is a method of data processing: take data in, transform it, and output a result. Computers never do anything people have not told them to do, so all they do is follow the algorithm. Just imagine how you would tell a five-year-old boy to get dressed in the morning (Denny, 2020).

From the perspective of a computer, input refers to the data required to arrive at a decision. Let's go back to getting dressed. When we are about to dress in the morning, the first thing to check is the weather. Is it sunny, raining, snowing, or windy? If it is sunny, we might choose a t-shirt or a dress; if it is raining, we put on a raincoat, and so on. In general, what we wear depends on the weather. To apply this idea to an algorithm, each kind of weather is represented as a number and matched to a specific dressing style, which is also a number. That is the first step of an algorithm.
Now, the second step is the core of the algorithm: computing. Let's continue choosing clothes for your little nephew. Temperature also matters for dressing, so we can tell him "if the temperature is below 20 degrees, put the hoodie on" or "if the temperature is higher than 30 degrees, do not wear a sweater, put the t-shirt on". Clearly, temperature is an element that can be represented as numbers. In computing terms, the algorithm filters out data that does not fit the requirements and produces consistent results.
The last step is output, what we see in the end: we have finally chosen a suitable outfit for your nephew. If he is not satisfied with it, the process is repeated. In computing, the output can take many visual forms, such as animation, special effects, and 3D modeling.
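The three steps above can be written down as a small program. This is a minimal sketch in Python; the weather labels, temperature thresholds, and outfit names are simply the illustrative ones from the example, not any real system.

```python
def choose_outfit(weather: str, temperature: float) -> str:
    """Input: the weather and temperature. Transform: apply the
    dressing rules. Output: a suggested outfit."""
    # Step 2 (transform): filter out options that do not fit the rules.
    if weather == "raining":
        return "raincoat"
    if temperature < 20:
        return "hoodie"
    if temperature > 30:
        return "t-shirt"
    # Step 3 (output): a default when no rule fires.
    return "sweater"

print(choose_outfit("sunny", 32))    # → t-shirt
print(choose_outfit("raining", 15))  # → raincoat
```

The inputs (step 1) are the function's arguments; the chain of `if` checks is the computing step; the returned string is the output. Repeating the process when the nephew is unhappy would just mean calling the function again with adjusted rules.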
Simply speaking, an algorithm is a three-step process for computing data, but in reality it is far more complex. Algorithms have been implemented in many aspects of our lives, such as GPS navigation, facial recognition on our cellphones, traffic lights, and so on. A small button we press every day may involve a huge amount of computation, so we rely on algorithms and benefit from them.
What is personalization?
When it comes to the algorithms we use every day, personalization must be the one we most frequently consume but seldom notice.
Personalization, which can also be referred to as customization, involves adapting a product or service to meet the specific needs of individuals or groups of people based on their characteristics or preferences. It has ancient roots in the field of rhetoric, where communicators aimed to be responsive to their audience’s needs. However, the rise of mass communication during the industrialization era resulted in a decline in the practice of personalizing messages. Despite this, the growing number of mass media outlets that rely on advertising as their main source of revenue has led to a renewed interest in personalization. In order to attract customers, these media outlets have made efforts to gather information on the specific demographic (e.g., age, gender, income) and psychographic characteristics (e.g., values, interests, attitudes) of their readers and viewers (Turow, 2010).
When we combine personalization with algorithms, both show powerful capabilities: sorting information accurately and precisely targeting specific people.
For example, Netflix applies personalization algorithms to provide the most relevant movie and TV show recommendations to its users. It analyzes each user's viewing history and recommends similar titles based on genre, actors, directors, and other factors.
Personalization algorithms are also implemented in e-commerce. Amazon, for instance, uses a variety of them to provide a customized shopping experience. It uses collaborative filtering to recommend products based on the purchase history and browsing behavior of other users with similar interests, and content-based filtering to recommend products based on the user's previous purchases, search history, and wish list. According to Chen et al. (2022), proponents of personalized algorithms argue that they can help consumers make faster and more cost-effective purchasing decisions by analyzing their preferences.
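The collaborative-filtering idea can be sketched in a few lines. This is a toy illustration, not Amazon's actual system: the users and products are invented, and the similarity measure (cosine similarity over sets of purchases) is one common textbook choice.

```python
from math import sqrt

# Toy purchase history: user -> set of products bought (all invented).
history = {
    "alice": {"kettle", "teapot", "mug"},
    "bob":   {"kettle", "teapot", "coffee_grinder"},
    "carol": {"laptop", "mouse", "keyboard"},
}

def similarity(a: set, b: set) -> float:
    """Cosine similarity between two users' purchase sets."""
    if not a or not b:
        return 0.0
    return len(a & b) / sqrt(len(a) * len(b))

def recommend(user: str) -> list:
    """Score products bought by similar users but not yet by `user`."""
    scores = {}
    for other, items in history.items():
        if other == user:
            continue
        sim = similarity(history[user], items)
        for item in items - history[user]:
            scores[item] = scores.get(item, 0.0) + sim
    return sorted(scores, key=scores.get, reverse=True)

print(recommend("alice")[0])  # → coffee_grinder, via the similar user bob
```

Because alice and bob share two purchases, bob's coffee grinder outranks anything bought by the dissimilar user carol. Real systems work the same way in principle, just over millions of users and with far more sophisticated scoring.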

In the music industry, personalized streaming has become a popular way to reach audiences. Companies like Spotify also use a blend of collaborative and content-based filtering techniques to offer tailored music suggestions. Spotify analyzes each user's listening history and recommends similar songs and artists based on genre, mood, tempo, and other factors.
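The content-based half of that blend can be sketched too. Here each track is reduced to an invented feature vector (roughly genre match, mood, tempo, already scaled to 0–1), and the recommender simply picks the unheard track whose features are closest to the one just played. The track names and numbers are made up for illustration.

```python
from math import sqrt

# Toy catalogue: track -> (genre, mood, tempo) features in [0, 1].
tracks = {
    "song_a": (1.0, 0.8, 0.6),
    "song_b": (0.9, 0.7, 0.5),
    "song_c": (0.1, 0.2, 0.9),
}

def cosine(u, v):
    """Cosine similarity between two feature vectors."""
    dot = sum(x * y for x, y in zip(u, v))
    norm = sqrt(sum(x * x for x in u)) * sqrt(sum(y * y for y in v))
    return dot / norm if norm else 0.0

def recommend_similar(listened: str) -> str:
    """Return the unheard track most similar to the one just played."""
    target = tracks[listened]
    candidates = {t: cosine(target, f)
                  for t, f in tracks.items() if t != listened}
    return max(candidates, key=candidates.get)

print(recommend_similar("song_a"))  # → song_b, the closest feature match
```

Unlike collaborative filtering, this needs no data about other users, only a description of the content itself, which is why services typically blend both techniques.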
In conclusion, personalization has a long history in communication and has experienced a resurgence with the rise of mass media outlets. Besides, the use of personalization algorithms has become increasingly prevalent in various industries, including entertainment, e-commerce, and music streaming. These algorithms can provide accurate recommendations based on user data, improving the user experience and potentially leading to more efficient purchasing decisions.
However, the use of personalization algorithms has also raised concerns about privacy and the potential for filter bubbles, where individuals are only exposed to information and products that align with their existing preferences.
What is a filter bubble?
A filter bubble arises when personalization algorithms use data about past behavior, preferences, and choices to provide tailored recommendations. If a user consistently engages with content that confirms their existing beliefs, the algorithm uses that information to recommend more content of a similar nature, effectively sealing a filter bubble around the user.

The idea of the filter bubble grows out of the "information cocoons" described by Cass Sunstein in 2006, who warned that algorithms might harm the public sphere and critical thinking. In 2010, Eli Pariser expanded on this idea and introduced the concept of the "filter bubble" for personalized search. Pariser argued that the Internet's personalization algorithms could damage civil discourse and reduce ideological diversity. Those who agree with him suggest that the filter bubble endangers democracy, particularly around major events such as presidential elections. In short, critics of personalization argue that it lets people see only information that confirms their existing beliefs or opinions while sheltering them from opposing viewpoints.
For example, if a user consistently engages with news articles that have a particular political bias, the algorithm will recommend more articles of that bias, effectively creating a filter bubble around the user. Over time, the user may become more entrenched in their existing beliefs, as they are only exposed to information that confirms those beliefs.
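This reinforcement loop can be simulated in a few lines. The sketch below is a deliberately naive model, with invented topics and an invented boost factor: every click makes the clicked topic weigh more in the next round of recommendations, so exposure to everything else steadily shrinks.

```python
def step(weights: dict, clicked: str, boost: float = 1.5) -> dict:
    """After each click, re-weight topics the way a naive personalized
    feed would: the clicked topic gains, and the distribution is
    renormalized so all weights still sum to 1."""
    updated = {t: w * (boost if t == clicked else 1.0)
               for t, w in weights.items()}
    total = sum(updated.values())
    return {t: w / total for t, w in updated.items()}

# Start with a balanced feed, then let the user click one side only.
weights = {"left_news": 0.5, "right_news": 0.5}
for _ in range(10):
    weights = step(weights, "left_news")

print(round(weights["left_news"], 2))  # → 0.98: the feed has narrowed
```

Ten identical clicks are enough to push one topic from half the feed to nearly all of it, which is exactly the entrenchment the example describes. Real feeds use far richer signals, but the feedback structure is the same.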
Concerns about personalized algorithms also exist in marketing, where consumer decision quality may be harmed by narrowing the options consumers can see. For instance, a study by Sihua Chen et al. (2022) on the relationship between AI recommendations and information cocoons suggests that AI recommendations can influence the strength of the relationship between consumer preferences and information cocoons. Specifically, the study found that higher levels of AI recommendation can lead to stronger information cocoons and lower-quality consumer decisions.
Arguments about filter bubbles
While algorithms are often blamed for creating filter bubbles, the reality is more complex than the idealized model presented in public discourse. As Borgesius et al. (2016) suggest, it is nearly impossible for anyone to be completely isolated in an absolute information cocoon, because users have agency in shaping their own online experiences. They actively seek out and engage with a variety of sources and perspectives, and algorithms simply help them navigate and filter the abundance of information available online. In other words, the Internet is not the only way people absorb knowledge. Individuals can also learn about current events through conversations with colleagues, friends, or family members who hold differing opinions. Such conversations expose individuals to diverse perspectives and information, enabling them to view an event or news item more comprehensively.
What's more, a significant amount of diverse information is still available online, and algorithms can actually help users discover and access it. For instance, algorithms can suggest content that is related to a user's interests but not necessarily aligned with their existing beliefs or opinions.
Furthermore, filter bubbles are not created solely by algorithms; they are also shaped by users' own behavior and preferences. Users tend to seek out information that confirms their existing beliefs, and this behavior can produce filter bubbles regardless of the presence of algorithms. In essence, while algorithms can play a role in shaping users' online experiences, they are not solely responsible for the formation of filter bubbles; users also have agency in seeking out diverse perspectives and information.
As discussed above, the filter bubble is a controversial issue concerning the dark side of personalized algorithms. As AI and algorithms now enable people to seek information in many different ways, it has not been proven that personalization necessarily creates filter bubbles. Even so, we need to stay aware of the impact of personalized algorithms.
Therefore, it is important for companies that use personalization algorithms to recognize these potential effects and take steps to mitigate them, such as including diverse viewpoints in recommendations and giving users the option to explore different perspectives. Governments and regulatory bodies should also address related issues such as data privacy, transparency, and algorithmic fairness.
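The first mitigation mentioned here, including diverse viewpoints in recommendations, can be sketched as a simple re-ranking step. The example below is a hypothetical design, not any platform's real method: it reserves every third slot in a relevance-ranked list for an item outside the user's dominant category.

```python
def diversify(ranked: list, category: dict, every: int = 3) -> list:
    """Re-rank a relevance-ordered list so that every `every`-th slot
    holds an item from outside the user's dominant category.
    `category` maps each item to its category label."""
    dominant = category[ranked[0]]          # the user's usual fare
    same = [i for i in ranked if category[i] == dominant]
    other = [i for i in ranked if category[i] != dominant]
    result = []
    while same or other:
        # Give an out-of-bubble item the reserved slot when one exists.
        if len(result) % every == every - 1 and other:
            result.append(other.pop(0))
        elif same:
            result.append(same.pop(0))
        else:
            result.append(other.pop(0))
    return result

articles = ["a1", "a2", "a3", "a4", "b1", "b2"]
cats = {"a1": "left", "a2": "left", "a3": "left", "a4": "left",
        "b1": "right", "b2": "right"}
print(diversify(articles, cats))  # → ['a1', 'a2', 'b1', 'a3', 'a4', 'b2']
```

The trade-off is explicit: relevance is slightly sacrificed at the reserved positions in exchange for guaranteed exposure to another perspective, which is the kind of design choice the paragraph asks companies to consider.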
What regulations have been put in place?
Currently, there are no specific regulations that directly address the issue of filter bubbles caused by personalization algorithms. However, some countries have implemented or proposed regulations to address the spread of misinformation and promote diverse viewpoints.
- France: In 2019, France passed a law that requires social media platforms to remove hate speech within 24 hours of receiving a complaint. In 2021, France also proposed a new law that would require social media platforms to remove certain types of content that are deemed harmful or illegal, such as child pornography and hate speech.
- Australia: In 2021, Australia passed a law that requires social media platforms to pay news organizations for the use of their content. This law is intended to address the power imbalance between social media platforms and news organizations, and to promote diverse viewpoints by ensuring that news organizations are fairly compensated for their content.
- Germany: In 2017, Germany passed a law that requires social media platforms to remove hate speech and other illegal content within 24 hours of receiving a complaint. The law also requires social media platforms to report on their efforts to combat illegal content.
- United States: The US government has proposed several bills and regulations in recent years to address the spread of misinformation and promote diverse viewpoints. For example, the Honest Ads Act would require social media platforms to disclose information about political ads, while the PACT Act would require social media platforms to take steps to combat the spread of misinformation.
There have been some efforts by countries and organizations to address the issue of filter bubbles, including the implementation of regulations that require social media platforms to disclose information about political ads and the targeting criteria used for personalized content. However, there is still much work to be done to ensure that individuals have access to a diverse range of perspectives and information.
This post only scratches the surface of algorithms, but it shows that while we have benefited greatly from them, the new technologies built on algorithms are not entirely harmless. There is still a long way to go in regulating how they are used and what consequences their use brings.
References
Agence France-Presse in Paris. (2019). France online hate speech law to force social media sites to act quickly. Retrieved from https://www.theguardian.com/world/2019/jul/09/france-online-hate-speech-law-social-media
Borgesius, F. J., Trilling, D., Möller, J., Bodó, B., de Vreese, C. H., & Helberger, N. (2016). Should we worry about filter bubbles? Internet Policy Review, 5(1). https://doi.org/10.14763/2016.1.401
Burns, J. (2017). Germany To Social Media Sites: Remove Hate Speech In 24 Hours Or Face $57 Million Fines. Retrieved from https://www.forbes.com/sites/janetwburns/2017/06/30/germany-now-allows-up-to-57m-in-fines-if-facebook-doesnt-remove-hate-speech-fast/?sh=2228f977761d
Choudhury, S. R. (2021). Australia passes new media law that will require Google, Facebook to pay for news. Retrieved from https://www.cnbc.com/2021/02/25/australia-passes-its-news-media-bargaining-code.html
Chen, S., Qiu, H., Zhao, S., Han, Y., He, W., Siponen, M., Mou, J., & Xiao, H. (2022). When more is less: The other side of artificial intelligence recommendation. Journal of Management Science and Engineering, 7(2), 213-232. https://doi.org/10.1016/j.jmse.2021.08.001
Denny, J. (2020). What is an algorithm? How computers know what to do with data. Retrieved from https://theconversation.com/what-is-an-algorithm-how-computers-know-what-to-do-with-data-146665
Flew, T. (2021). Regulating Platforms. Cambridge: Polity. (p. 109)
Lau, T. (2020). The Honest Ads Act Explained. Retrieved from https://www.brennancenter.org/our-work/research-reports/honest-ads-act-explained
Sunstein, C. R. (2006). Infotopia: How many minds produce knowledge. Oxford: Oxford University Press.
Turow, J. (2010). The daily you. Yale University Press.