Algorithms play an important role in distributing the personalised content that shapes our ideology and behaviour. While consuming the information that algorithms expose us to, we may all have felt a slight concern about how much the algorithmic system knows about us. Not only can algorithms know us remarkably well by analysing and learning from the data collected from us, they also construct our social reality and play a role in internet governance. A recent issue of concern is that social reality is increasingly shaped and constructed by algorithmic selection on the Internet across various life domains (Just & Latzer, 2017, p. 246). Through individualisation centred on capital-driven values, algorithms produce social orders that sustain the capitalist system, functioning as institutions and governance mechanisms (Just & Latzer, 2017, p. 244).
But what does this concern mean for ordinary people like us?
In this blog post, I want to cultivate an understanding of algorithms as a form of institution that shapes our agency both in cyberspace and in socio-cultural and political reality. In the first part, I reflect on my transcultural experience in relation to algorithmic individualisation and “filter bubbles”, discuss algorithmic selection’s impact on political polarisation, and consider its potential for building safer online spaces for marginalised communities when platforms are held accountable. In the second part, I reflect on how the algorithms of the Chinese social media platform Douyin function as intermediaries in building platform nationalism, and how the authorities make use of algorithms’ governance features to exercise ideological control over social media.
Personalised Reality and Social Fragmentation
Based on data analysis and self-learning, algorithms construct a personalised experience for each user; this personalisation is the central goal of the algorithmic process and its results (Just & Latzer, 2017, p. 247). Algorithmic selection shapes the construction of individuals’ realities and, as a result, affects collective consciousness, reproducing culture, norms, and values and thereby shaping social orders in today’s societies (Just & Latzer, 2017, p. 246).
For example, algorithmic selection facilitates the distribution of memes. It contributes to personalising the processes of meme replication, adaptation, and remixing, shaping audiences’ sense of humour and modes of self-expression (Nissenbaum & Shifman, 2017). The languages, signs, and texts selected by algorithms convey individualist values, affecting the maintenance and reproduction of social orders. Because algorithms amplify existing trends of fragmentation and individualisation in the course of personalisation, and predominantly convey and reinforce commercialisation as the dominant value (Just & Latzer, 2017, pp. 248–251), each person’s reality is customised and shaped to sustain the capitalist ideology of individualisation and commercialisation.
As a result of algorithmic selection, individuals may be exposed only to information and perspectives that resonate with their existing beliefs and biases, and may rarely encounter opposing views or diverse opinions. This phenomenon is known as the “filter bubble”, which can reinforce existing beliefs and limit individuals’ exposure to different discourses and their critical evaluation of information (Andrejevic, 2019).
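To make this feedback loop concrete, below is a minimal, hypothetical sketch of similarity-based recommendation. The topic vectors, weights, and update rule are invented purely for illustration and do not represent any real platform’s system; the point is only that ranking content by similarity to past engagement, and then learning from that engagement, narrows what a user sees over time.

```python
# A toy model of personalised recommendation and the "filter bubble" loop.
# Illustrative sketch only, not any platform's actual algorithm.
import numpy as np

rng = np.random.default_rng(0)

# 200 hypothetical posts, each described by weights over 5 topics.
posts = rng.random((200, 5))

# The user starts with a mild preference for topic 0.
profile = np.array([0.6, 0.1, 0.1, 0.1, 0.1])

def recommend(profile, posts, k=10):
    """Rank posts by cosine similarity to the user's profile."""
    sims = posts @ profile / (
        np.linalg.norm(posts, axis=1) * np.linalg.norm(profile)
    )
    return np.argsort(sims)[::-1][:k]

for step in range(20):
    feed = recommend(profile, posts)
    # Assume the user engages with the top-ranked post; the profile
    # then drifts further towards the content they were already shown.
    profile = 0.9 * profile + 0.1 * posts[feed[0]]

# After repeated rounds the profile is dominated by the starting
# preference: the user sees ever more of what they already liked.
print(profile.round(2))
```

In this toy loop, the system never shows the user anything dissimilar to their history, so the profile converges on its own starting bias, which is the filter bubble in miniature.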

The “filter bubble” effect is quite apparent to me in terms of how individualisation divides information. I was a heavy social media user, especially during the COVID lockdown, a time when I had just discovered TikTok and Instagram and was new to my transcultural environment. I began building a new online identity based on certain music subcultures. Meanwhile, with no chance to socialise and no supporting network in reality, social media was the only resource through which I could form connections with people and gain an understanding of Australian young people’s social life and popular culture. My online identity and behaviours, such as my interests, the language I use, my sense of humour in English, and the social skills I thought would work everywhere with my peers in Australia (which, I soon discovered, was not the case), were based on the content that algorithms exposed me to.
I built my connections and community on TikTok and Instagram. However, when I extended parts of my online identity and relationships into reality, the slang and jokes from online culture functioned playfully within my friend group; yet whenever I stepped out of my social comfort zone and socialised with people whose lifestyles and identities were distinctly different from mine, I carried the preconceived idea that they might not consume the same internet culture as me. I reminded myself not to use the cyber language I had internalised, and not to make jokes referencing viral memes, since not everyone is “chronically online”, and what is viral to me might not have the same exposure and meaning for people from other online communities.
Hence, rather than a projection of a coherent aspect of reality, the transcultural experience I gained from social media’s popular culture was a fragment, constituted by the process of algorithms reproducing my existing values and interests. Algorithmic selection played a vital role in shaping an individualised online space and communities that resonated with my ideology, and it influenced my identity construction and social behaviour. It also substantially constrained my transcultural agency during lockdown, since it reinforced the limitation and alienation of individuals.
Algorithms and Political Polarisation
Beyond the mundane, individual dimension of the “filter bubble” effect, algorithmic individualisation can also have political consequences. When people only access information that confirms their ideology, or only communicate with like-minded people, the fragmentation and individualisation of audiences can have detrimental consequences for democracy (Katz, 1996; Mancini, 2013, as cited in Just & Latzer, 2017, p. 248), and “filter bubbles” may endanger two preconditions of democratic systems: unplanned encounters and shared experience (Sunstein, 2007, as cited in Just & Latzer, 2017). Moreover, since social media algorithms tend to prioritise the distribution of polarising, controversial, false, and extremist content (Andrejevic, 2019, p. 45), politically polarised misinformation such as fake news and conspiracy theories is widely distributed by social media algorithm systems, disrupting social cohesion and stability.
For instance, TikTok’s algorithms contribute to the spread of conspiracy theories. TikTok is a short-video platform whose mechanics and algorithm system are designed to encourage user participation, creating conditions where videos are relational and affective, which is the vital driver of information circulation (Grandinetti & Bruinsma, 2022, p. 10). TikTok’s algorithm attempts to capture user desire and amplify affective attention: “the circulation of conspiracy across digital platforms is driven by an affective charge that replaces trust in expertise and symbolic efficiency”. Conspiracy theories do not require knowledge or understanding of the underlying reality; rather, they capture a tone that aligns with the fears and anxieties of the audience (Grandinetti & Bruinsma, 2022, p. 6).
The algorithm-driven distribution of conspiracy theories exploits the affective attention arising from users’ emotions and concerns. Conspiracy TikTok conveys strongly politically oriented misinformation that can be used as a tool to delegitimise opposing political views and ideas, harming social democracy. Polarising and extreme ideas draw enormous attention from users, and hence they fall within algorithmic selection’s priority. Since the goal of conspiracy theory is “to exercise the power of control and mastering its desire for political order” (Fenster, 2008, as cited in Grandinetti & Bruinsma, 2022, p. 6), conspiracy TikTok serves the platform’s political polarisation as a game of power that disturbs our social sensibilities and the development of democracy.
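This prioritisation can be illustrated with a simplified, hypothetical scoring function. The weights below are invented, not TikTok’s; they show how a metric that counts every reaction as engagement, whether approving or outraged, will tend to surface polarising content.

```python
# A simplified, hypothetical engagement-based ranking.
# It illustrates the logic described above, not TikTok's actual system.
from dataclasses import dataclass

@dataclass
class Video:
    title: str
    likes: int
    comments: int
    shares: int
    watch_time_sec: float  # average watch time per view

def engagement_score(v: Video) -> float:
    # Weights are invented; the key point is that the metric is
    # indifferent to whether attention is approving or outraged.
    return v.likes + 2 * v.comments + 3 * v.shares + 0.1 * v.watch_time_sec

feed = [
    Video("calm explainer", likes=120, comments=10, shares=5, watch_time_sec=30),
    Video("polarising conspiracy clip", likes=90, comments=400, shares=150, watch_time_sec=45),
]

# The conspiracy clip wins the ranking purely on affective engagement,
# despite having fewer likes.
for v in sorted(feed, key=engagement_score, reverse=True):
    print(f"{engagement_score(v):7.1f}  {v.title}")
```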

Platform algorithms are structured within social power relations: to maintain maximum market value, algorithmic services such as search engines bias information towards the powerful side (Noble, 2018). Although algorithm systems’ capitalisation and individualisation can reinforce existing biases, and algorithm systems tend to take measures to discipline and control endangered individuals (Just & Latzer, 2017, p. 248), from my perspective, algorithm systems may have the potential to build safer spaces for marginalised communities if platforms and algorithm companies can be held accountable. Algorithmic selection applications perform the functions of search, aggregation, observation/surveillance, prognosis/forecast, filtering, recommendation, scoring, content production, and allocation (Just & Latzer, 2017, p. 240), all of which could help identify and connect individuals with shared experiences and identities, filter harmful content out, and better distribute their content to form online communities.
TikTok is a vital platform for the LGBTQ community to connect and share resources, yet TikTok’s biased algorithm system has failed to protect its members from hate speech and bullying, as the platform’s lack of regulatory boundaries “make it an ideal hotbed for violent and extremist content” (Weimann & Masri, 2020). A more transparent policy for the algorithmic regulation of hate speech, together with further machine learning developed to identify hate speech, could benefit endangered individuals and their online communities. Since transparency might produce complexity that defeats understanding, and “transparency is not just an end in itself, but an interim step on the road to intelligibility” (Pasquale, 2015, p. 8), it is important to keep questioning platforms’ accountability.
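As a thought experiment on what an intelligible moderation policy could look like, here is a deliberately minimal sketch. Real hate-speech detection relies on trained machine-learning classifiers rather than a hand-written list; the placeholder terms and rule below are hypothetical. The point is that every decision can be traced to a stated rule, which is the kind of intelligibility Pasquale calls for.

```python
# A deliberately minimal, transparent moderation sketch. Real hate-speech
# detection uses trained classifiers; the block list and rule here are
# hypothetical, chosen only to show what an *intelligible* policy could
# look like: each removal can be traced back to a stated rule.
BLOCKED_TERMS = {"<slur-1>", "<slur-2>"}  # placeholder terms, not real data

def moderate(text: str) -> dict:
    """Return a decision plus a human-readable reason."""
    hits = [t for t in BLOCKED_TERMS if t in text.lower()]
    if hits:
        return {"action": "remove", "reason": f"matched blocked terms: {hits}"}
    return {"action": "allow", "reason": "no blocked terms matched"}

print(moderate("an ordinary supportive comment"))
```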
Algorithms as Intermediaries
Civic disposition refers to the attitudes, values, and behaviours that individuals hold towards their communities, and to their responsibilities as citizens in pursuit of the common goal of a better society (Novitasari, 2018). In the previous part, I discussed the filter bubble effect and algorithmic selection’s role in political polarisation, both of which harm the democratic development of civic disposition. While algorithms are a threat to civic disposition in a democratic context, algorithmic selection can also be used to shape political propaganda, encouraging users to practise a “civic disposition” that strictly serves the authority’s ideology.
Providers of algorithmic selection services are usually active as intermediaries and market-makers between two demand sides (Just & Latzer, 2017, p. 251); they also “push the platformization of markets and modify power structures, leading the mass media to lose ground in the construction of realities” (Just & Latzer, 2017, p. 254). ByteDance, the internet technology company headquartered in Beijing that provides TikTok’s algorithm services, also powers the operation and algorithms of a series of applications in China, such as Douyin. ByteDance has aligned itself with the Chinese government’s dominant ideology for its own survival and profit, and it functions as an intermediary between Douyin’s platform market, the government, mass media, and the other power relations it engages with.
Users of Douyin must keep their activities within the authority’s regulations and align with the CCP’s ideology, and Douyin has formally stated that content posted on the platform should align with China’s core socialist values (Chen, Valdovinos Kaye & Zeng, 2021, p. 107). The algorithm provider thus sustains the platform’s markets and the CCP’s regime at the same time. Chen, Valdovinos Kaye, and Zeng explored Douyin’s platform algorithm and its role in shaping platform patriotism in their article “#PositiveEnergy Douyin: constructing ‘playful patriotism’ in a Chinese short-video application”. They noted that “positive energy” content is facilitated by Douyin’s tight content monitoring, which relies largely on ByteDance’s proprietary algorithms and monitors: “ByteDance may have inadvertently, or perhaps intentionally, transformed Douyin into a digital enclave for the state to promote its positive energy brand of patriotism” (Chen, Valdovinos Kaye & Zeng, 2021, p. 107).
Based on Chen, Valdovinos Kaye, and Zeng’s research, Chinese state institutions have adopted popular internet culture to intervene in social media users’ engagement and content creation, promoting “playful patriotism” content that is light and informal in order to engage audiences (Chen, Valdovinos Kaye & Zeng, 2021). Videos of “positive energy” and “playful patriotism” that promote socialist values, posted by governmental accounts such as state media and state institutions as well as educational institutions’ official accounts, constituted more than half of the “positive energy” content and reached significant exposure. Douyin also promotes grassroots role models and family content advocating harmony and micro-patriotism to shape the public’s ideology. In doing so, Douyin has altered the power and role of Chinese national mass media in the construction of patriotic reality. Furthermore, algorithms are autonomous actors and policy-makers (Just & Latzer, 2017, p. 252); in ByteDance’s case, where its algorithm services are designed and responsible for promoting “positive energy” content, the algorithms become factors that reinforce social norms and structures, impacting individuals’ agency. To pursue content exposure, individual content creators are expected to serve the dominant social values; they may need to self-moderate, self-censor, and sacrifice their authenticity in exchange for content visibility. Thus, while governments engage with digital companies and social media platforms to achieve political goals (Chen, Valdovinos Kaye & Zeng, 2021, p. 113), individuals on Douyin are encouraged to practise their civic disposition in line with the platform’s values and under strict monitoring.

Photo: an example of “positive energy” content on Douyin, from Chen, Valdovinos Kaye & Zeng (2021), “#PositiveEnergy Douyin: constructing ‘playful patriotism’ in a Chinese short-video application”, Chinese Journal of Communication, 14(1), 97–117.
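Conceptually, the combination of monitoring and promotion that the authors describe can be sketched as a visibility score. To be clear, this is not ByteDance’s actual system; the tag, weights, and boost factor below are invented solely to illustrate how moderation plus algorithmic promotion could shape what becomes visible.

```python
# A conceptual sketch of how a visibility boost for "approved" content
# could work. NOT ByteDance's actual system: the flags, weights, and
# boost factor are invented purely to illustrate the mechanism the
# research describes, i.e. moderation plus promotion shaping exposure.
def visibility_score(base_engagement: float,
                     is_state_aligned: bool,
                     flagged_by_moderation: bool) -> float:
    if flagged_by_moderation:
        return 0.0          # censored content never surfaces
    score = base_engagement
    if is_state_aligned:
        score *= 1.5        # hypothetical boost for "positive energy"
    return score

# A less engaging but "aligned" video can outrank a more engaging one.
print(visibility_score(100.0, is_state_aligned=True,  flagged_by_moderation=False))  # 150.0
print(visibility_score(120.0, is_state_aligned=False, flagged_by_moderation=False))  # 120.0
```

Under such a scheme, creators who want visibility face a structural incentive to align with the promoted values, which is exactly the self-moderation pressure described above.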
Conclusion
Algorithmic selection is not just a technology that facilitates receiving information and makes our online experience efficient and convenient. Algorithms are agents with biases and values, and they can be understood as forms of institutions that shape our perspectives and social behaviour. Centred on personalisation and commercialisation, algorithms expose social media users to filter bubble effects in daily and social dimensions, making people less likely to encounter diverse information or attend to different perspectives. Algorithmic selection also facilitates the distribution of false information such as conspiracy theories, shapes political polarisation, and harms individuals’ agency of civic disposition and social democracy. As intermediaries, algorithm providers operate between multi-sided markets and influence the role and power of mass media. By incorporating state social values and partnering with governments, algorithm providers such as ByteDance in China facilitate forms of civic disposition that align with the state’s dominant ideology, reflecting algorithms’ role in shaping our agency and behaviour within the structure of social power relations.
Reference List
Andrejevic, M. (2019). Automated Media. Routledge.
Chen, X., Valdovinos Kaye, D. B., & Zeng, J. (2021). #PositiveEnergy Douyin: constructing “playful patriotism” in a Chinese short-video application. Chinese Journal of Communication, 14(1), 97–117. https://doi.org/10.1080/17544750.2020.1761848
Grandinetti, J., & Bruinsma, J. (2022). The Affective Algorithms of Conspiracy TikTok. Journal of Broadcasting & Electronic Media. Advance online publication. https://doi.org/10.1080/08838151.2022.2140806
Just, N., & Latzer, M. (2017). Governance by algorithms: reality construction by algorithmic selection on the Internet. Media, Culture & Society, 39(2), 238–258. https://doi.org/10.1177/0163443716643157
Nissenbaum, A., & Shifman, L. (2017). Internet memes as contested cultural capital: The case of 4chan’s /b/ board. New Media & Society, 19(4), 483–501. https://doi.org/10.1177/1461444815609313
Noble, S. U. (2018). Algorithms of oppression: How search engines reinforce racism. New York University Press.
Novitasari, N. (2018). Social Media Influence on the Millennial Generation’s Civic Disposition. Journal of Moral and Civic Education (Online), 2(2), 64–76. https://doi.org/10.24036/885141222201899
Pasquale, F. (2015). Introduction: the need to know. In The Black Box Society: The Secret Algorithms That Control Money and Information (pp. 1–18). Harvard University Press. http://www.jstor.org/stable/j.ctt13x0hch.3
Weimann, G., & Masri, N. (2020). Research Note: Spreading Hate on TikTok. Studies in Conflict and Terrorism. Advance online publication. https://doi.org/10.1080/1057610X.2020.1780027