Will algorithms know us better than we know ourselves? Exploring the algorithms of digital platforms

Source: Shutterstock / Who is Danny

The moment we pick up a smartphone or computer and step into Internet life, algorithms become an unavoidable high wall in our daily lives. Because they are invisible and intangible, most people find it difficult to understand exactly how they operate (Flew, 2021; Noble, 2018). Yet as digital platforms come to occupy a very important part of life, and even shape its direction, the existence of ‘algorithms’ becomes easy to recognize.

What are these algorithms we’re all inseparable from?

An algorithm is a set of rules and procedures used for tasks such as computing, data processing, and automated reasoning; it helps streamline these activities and increase efficiency (Flew, 2021). According to Gillespie (2014), algorithms need not be software: in the broadest sense, they are encoded procedures for transforming input data into a desired output, based on specified calculations.

These definitions may still feel abstract. Before going further, we should first recognize that we live in a digital age: every time we log in, browse, or open a web page, we leave traces that can be converted into data. An algorithm by itself is meaningless; it only plays its role when connected with big data (Gillespie, 2014). And as time passes and computer technology advances, algorithms become more and more efficient, handling ever more voluminous data (Flew, 2021).
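To make Gillespie’s definition a little more concrete, here is a minimal, hypothetical Python sketch: an encoded procedure that transforms input data (the traces a user leaves behind) into a desired output (a ranked list of topics). The topic labels and the simple counting rule are invented purely for illustration.

```python
# A minimal, hypothetical illustration of an "encoded procedure": input data in,
# a specified calculation in the middle, a desired output out. Nothing here is
# taken from any real platform.

from collections import Counter

def rank_topics(browsing_traces: list[str]) -> list[tuple[str, int]]:
    """Transform raw input traces into an ordered output, most-viewed topic first."""
    counts = Counter(browsing_traces)   # specified calculation: count views per topic
    return counts.most_common()         # desired output: topics ranked by frequency

# Example: the traces a user leaves behind become data the procedure can act on.
traces = ["dance", "cooking", "dance", "travel", "dance", "cooking"]
print(rank_topics(traces))   # [('dance', 3), ('cooking', 2), ('travel', 1)]
```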

Source: Moxee Marketing (2021)

How do digital platforms know so much about us?

Data collection lies at the core of the digital platform business model. Platforms draw on users’ detailed information, including their interests, preferences, tastes, and behaviors (van Dijck et al., 2018, cited in Flew, 2021). Platforms such as TikTok or Instagram usually provide their services free of charge, but in exchange they collect personal information and convert it into useful data. When registering for a social media account, users must accept a contract that allows their personal information to be shared; otherwise they cannot use the platform. Usually we ignore the content of the contract and simply click “accept”, and this is the prerequisite for our information to be captured (Flew, 2021). From here, the algorithm can begin to get to know us.

Source: Lionel Bonaventure/Agence France-Presse — Getty Images

TikTok currently has the highest average time spent per user (Simon, 2024), so let’s take the algorithm behind it as an example. Although ByteDance keeps the exact workings of its algorithm secret, the basic mechanism is known: TikTok collects information about user interactions, such as the videos you like and repost, the accounts you follow, and the comments you send. If you watch dance videos, for instance, the recommendation system will tailor entertainment videos to you on that basis, then analyze your later behavior to target even more precisely the types of dance videos you like. Information about device and account settings is also collected, such as your language choice, region, and device model. Even the scenarios in which you use TikTok are useful data, for example what kinds of videos people like to watch while commuting or travelling, giving the platform a better idea of users’ interests and preferences (Latermedia, 2020, cited in Bhandari & Bimo, 2022).

When using TikTok, users do not have to follow other accounts or post content; simply swiping through videos is already an interactive experience. The algorithm tracks those interactions, including how long we stay and what we watch. After using TikTok for a while, we may notice that it has learned our preferences. This is because TikTok’s recommendation system contains a real-time learning mechanism: by capturing and analyzing the data users leave behind, it can respond quickly. For example, when a user clicks on a certain video, TikTok quickly updates the user’s “favorite library” based on this information and then immediately recommends similar videos based on the change. In a survey interview by Bhandari and Bimo (2022), one respondent said he felt bored when he first started using TikTok, but once the algorithm started to “understand him”, the content began to become interesting.
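As an illustration of this feedback loop, and emphatically not of TikTok’s actual system, here is a toy Python sketch in which every interaction immediately updates a preference profile, and the next recommendation is drawn from that updated profile. The categories, weights, and update rule are all assumptions made for the example.

```python
# A toy sketch of a real-time learning loop. This is NOT TikTok's real
# recommendation system; the weighting and category labels are invented.

preferences: dict[str, float] = {}   # the user's evolving "favorite library"

def record_interaction(category: str, watch_seconds: float, liked: bool) -> None:
    """Update the preference profile as soon as an interaction is observed."""
    signal = watch_seconds / 10 + (5 if liked else 0)   # assumed weighting
    preferences[category] = preferences.get(category, 0.0) + signal

def recommend(candidates: dict[str, str]) -> str:
    """Immediately pick the candidate video whose category best matches the profile."""
    return max(candidates, key=lambda vid: preferences.get(candidates[vid], 0.0))

record_interaction("dance", watch_seconds=42, liked=True)
record_interaction("news", watch_seconds=3, liked=False)
print(recommend({"video_a": "news", "video_b": "dance"}))   # video_b
```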

The algorithm is a double-edged sword

When we wander through the dizzying ocean of information on digital platforms, it is undeniable that algorithms bring us convenience and a better experience. Social media platforms like YouTube and content platforms like Netflix use algorithms to create personalized recommendations, allowing us to find content we are interested in more quickly and improving the interactive experience; users also engage more when the content presented to them is genuinely interesting. Furthermore, algorithms play a crucial role in filtering harmful content and protecting users. Twitter’s algorithms, for instance, identify and delete malicious or illegal posts to ensure a safe user experience and platform security.

However, discussing the algorithms of digital platforms raises doubts as much as it celebrates the convenience they bring. The following sections therefore focus on the concerns algorithms cause and explore potential solutions.

Concerns raised by algorithms

Our data appears to be shared between different digital platforms. For example, we may see advertisements on our Instagram homepage for products we browsed on Amazon, and when we search for a series on YouTube, Netflix may automatically recommend it to us. As mentioned before, the operation and prediction of algorithms are inseparable from data. By signing the terms of service and privacy agreement during registration, we agree to allow our data to be shared with third parties unless we stop using the platform (Cohen, 2018, cited in Flew, 2021).

Source: Rafael Henrique/SOPA Images/LightRocket – Getty Images

Recently, media reports based on court documents alleged that Meta allowed Netflix to access Facebook users’ direct messages for almost a decade, in breach of privacy and competition rules. Maximilian Klein and Sarah Grabert, two U.S. citizens, filed a major antitrust lawsuit claiming that Netflix and Facebook had a ‘special relationship’ that allowed Netflix to better tailor its advertisements using Facebook data. Meta and other major social media companies typically hold exclusive data of this kind. At present, algorithms are rarely used for socio-political goals; they are mainly used for commercial purposes (Just & Latzer, 2017). This is exactly what we fear: our private information being sold to third parties for profit without our knowledge. Platforms originally collect our data with the stated aim of improving the user experience, tracking our behavior in order to refine and enhance their algorithms.

Source: Christina Animashaun/Vox

In the digital age, information overload has become an increasingly serious problem. Algorithms on digital platforms provide personalized services and determine the content we see; TikTok’s For You page, for instance, displays different videos to different users. This filtering influences our perception of the world to some extent (Bozdag, 2013). The causes of algorithmic bias are complex: the design of the algorithm itself, biases in the decision-making process, and the underlying assumptions and values embedded in it can all introduce bias, because algorithms are created by people who carry their own perspectives and beliefs (Noble, 2018).

E-commerce platforms like Alibaba make this sense of unfairness in the consumption experience especially tangible. When we search for a specific product, we are unknowingly nudged to reinforce our existing consumption habits: the things we buy become more and more expensive, and it becomes increasingly difficult to see product recommendations that break out of our own styles and habits. Yet genuinely independent choice presupposes that we can actually see products that go beyond those habits. The algorithm quietly deprives consumers of the chance to see a wider range of products, leaving them with fewer real choices, which is clearly at odds with fairness. Most digital platforms are commercial, and their algorithmic selection mechanisms are still not transparent enough (Pasquale, 2015); many consumers are often unaware that these seemingly “intimate” algorithms exist at all.
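A hypothetical sketch can show how such a feedback loop narrows what a shopper ever gets to see. The tiny catalogue and the “only show items priced at or above the average past spend” rule below are invented for illustration, not drawn from Alibaba or any real platform.

```python
# A hypothetical feedback loop: recommendations track past spending upward,
# so cheaper or different items gradually disappear from view.

catalogue = {"budget shoes": 30, "mid-range shoes": 80,
             "designer shoes": 200, "designer bag": 300}

def recommend(history: list[float], top_n: int = 2) -> list[str]:
    """Only surface items priced at or above the shopper's average past spend."""
    avg = sum(history) / len(history)
    visible = {name: price for name, price in catalogue.items() if price >= avg}
    return sorted(visible, key=visible.get)[:top_n]

history = [80.0]
for _ in range(3):
    picks = recommend(history)
    print(picks)                            # budget items never reappear
    history.append(catalogue[picks[-1]])    # the shopper buys the pricier suggestion
```

Run once, the budget items vanish after the first purchase and the average spend only climbs, which is the narrowing effect described above.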

Algorithms are not only used by well-known social media platforms; many companies and governments also use them to make decisions. Nevertheless, the basis of these decisions is often unclear. For instance, a bank may use an algorithm to decide whether to grant a customer a loan, yet the customer often does not know what the decision was based on or when it was made. For most users, transparency is crucial to increasing participation in, and trust toward, the platform (Shin & Park, 2019).

Shin and Park (2019) argue that transparency is subjective: what looks like a transparent recommendation system to the algorithm provider may not appear so to its users. If transparency is an important issue, it should therefore be judged from the user’s perspective. When users feel less worried about handing their data to service providers, the platform can offer better services and attract more users, creating a positive cycle.

Governance of Algorithms

The concerns described above, together with the wide application of algorithms in modern life, make their regulation and governance necessary. The Internet is global in reach, so activities on the Internet inevitably affect individuals in the real world (Weber, 2010), and in this digital age every one of us is a stakeholder in algorithms. Governing algorithms faces considerable challenges: it must balance the differing interests of individual users, corporations, and governments (Weber, 2010), while the rapid development of digital technologies means that regulators must constantly update their regulatory frameworks to keep up with new changes (Flew, 2021).

Firstly, transparency is crucial: digital platforms should disclose how their algorithms work, including the factors that influence decisions and the data used to train them. Secondly, strengthening data privacy regulation is an important means of protecting user data from algorithmic misuse, including giving users more control over their data and requiring platforms to obtain explicit consent. Furthermore, enforcing anti-discrimination laws is essential to prevent algorithmic discrimination, ensuring that algorithms do not treat certain groups unfairly on the basis of race, gender, or social status. Finally, international cooperation is key to developing consistent regulatory standards and addressing transnational challenges, while continuous monitoring and adaptation keep regulation in step with technological developments.

Try to open the black box as a user

As we begin to gain even a superficial understanding of how algorithms work, they seem less intimidating, and we have grounds to believe that algorithms will never know us better than we know ourselves. Although algorithms are built by experts, the data is ultimately provided by users. Algorithms mechanically analyze and interpret our preferences and make choices for us, but they often fail to account for the fact that people’s preferences change over time (Swart, 2021). It is important for Internet users to clearly recognize the existence of algorithms and sharpen their insight into them: the more we use digital platforms, the more we can think about how their algorithms work and what specific values they represent (Swart, 2021).

Source: Twitter (2023)

At the same time, we can pay closer attention to how heavily used platforms such as TikTok or Twitter explain the logic behind their recommended content. In 2023, Twitter published a blog post to help users understand, as far as possible, how the recommended content on their home timeline is generated. Although the post uses some technical vocabulary, it still improves our understanding of Twitter’s recommendation algorithm. It states that the For You timeline now consists of roughly 50% content from people we follow and 50% from accounts we don’t, although this may vary from user to user. The post also explains how Twitter judges the relevance of tweets from authors we don’t follow, drawing on the recent tweets of the people you follow and the content liked by users who share your interests (Twitter, 2023).
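Based only on what that blog post describes, a simplified sketch of the blending idea might look like the following: half the candidates from accounts you follow, half from accounts you don’t, all ordered by a relevance score. The candidate names and scores are placeholders, not Twitter’s real ranking model.

```python
# A simplified, assumed sketch of blending in-network and out-of-network
# candidates into one timeline. The relevance scores here are made up.

import itertools

def build_for_you(in_network: list[tuple[str, float]],
                  out_of_network: list[tuple[str, float]],
                  size: int = 10) -> list[str]:
    """Take the top half of each candidate pool, then order everything by score."""
    top_in = sorted(in_network, key=lambda t: t[1], reverse=True)
    top_out = sorted(out_of_network, key=lambda t: t[1], reverse=True)
    half = size // 2
    mixed = itertools.chain(top_in[:half], top_out[:half])
    return [tweet for tweet, _ in sorted(mixed, key=lambda t: t[1], reverse=True)]

timeline = build_for_you(
    in_network=[("friend_tweet_1", 0.9), ("friend_tweet_2", 0.4)],
    out_of_network=[("stranger_tweet_1", 0.8), ("stranger_tweet_2", 0.2)],
    size=4,
)
print(timeline)   # ['friend_tweet_1', 'stranger_tweet_1', 'friend_tweet_2', 'stranger_tweet_2']
```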

Conclusion

Algorithms are present in every aspect of modern life, offline as well as online, and it is important to pay attention to and understand their impact. On the one hand, the smooth and secure operation of algorithms requires cooperation between platforms and governments to strike the best balance between user privacy, security, and convenient services. On the other hand, we should remain vigilant and skeptical toward algorithms, since they often lie beyond our control (Gillespie, 2014). As digital platform users, we must also build our algorithmic literacy to prepare for unforeseen challenges.

Reference list:

Bhandari, A., & Bimo, S. (2022). Why’s everyone on TikTok now? The algorithmized self and the future of self-making on social media. Social Media + Society, 8(1). https://doi.org/10.1177/20563051221086241

Bozdag, E. (2013). Bias in algorithmic filtering and personalization. Ethics and Information Technology, 15, 209–227. https://doi.org/10.1007/s10676-013-9321-6

Coglianese, C., & Lehr, D. (2019). Transparency and algorithmic governance. Administrative Law Review, 71(1), 1–56. https://www.jstor.org/stable/27170531

Flew, T. (2021). Regulating Platforms. Polity. https://bookshelf.vitalsource.com/books/9781509537099

Gillespie, T. (2014). The relevance of algorithms. In Media technologies: Essays on communication, materiality, and society (pp. 167–194). MIT Press. https://doi.org/10.7551/mitpress/9042.003.0013

Just, N., & Latzer, M. (2017). Governance by algorithms: Reality construction by algorithmic selection on the Internet. Media, Culture & Society, 39(2), 238–258. https://doi.org/10.1177/0163443716643157

Noble, S. U. (2018). Algorithms of oppression: How search engines reinforce racism. New York University Press. https://ebookcentral.proquest.com/lib/usyd/detail.action?docID=4834260

Pasquale, F. (2015). The black box society: The secret algorithms that control money and information. Harvard University Press. http://www.jstor.org/stable/j.ctt13x0hch

Shin, D., & Park, Y. J. (2019). Role of fairness, accountability, and transparency in algorithmic affordance. Computers in Human Behavior, 98, 277–284.

Swart, J. (2021). Experiencing algorithms: How young people understand, feel about, and engage with algorithmic news selection on social media. Social Media + Society, 7(2). https://doi.org/10.1177/20563051211008828

Twitter. (2023). Twitter’s recommendation algorithm. https://blog.x.com/engineering/en_us/topics/open-source/2023/twitter-recommendation-algorithm

Weber, R. H. (2010). Introduction. In Shaping internet governance: Regulatory challenges. Springer. https://doi.org/10.1007/978-3-642-04620-9_1
