Social Divides: The Societal Effects of Douyin’s Algorithms

Exploring How Tailored Content Shapes Perceptions and Reinforces Information Cocoons

(维基媒体项目贡献者, 2017)

Exploring the Impact of Douyin on Social Norms and User Behavior

In the realm of social media, few platforms have shaped cultural trends as significantly as TikTok, particularly its Chinese counterpart, Douyin. Its sophisticated algorithm, more than just a technical marvel, plays a critical role in moulding user behaviour by aligning content with users’ perceived preferences, often based on basic demographic information. This method not only confines users to their informational bubbles—limiting exposure to differing viewpoints—but also subtly directs the social dialogue, particularly around sensitive topics like gender roles.

This raises an ethical question: how does this balance between user engagement and algorithmic responsibility play out in real life?

Algorithms at the Heart of Social Media: Power and Influence

Digital platforms, particularly social media apps like Douyin, are fundamentally changing the social landscape by shaping user behaviour through sophisticated algorithmic manipulation. This manipulation is not only about tailoring content to increase user engagement but also carries profound socio-economic implications. As Terry Flew argues, these platforms have moved from simple intermediaries like search engines to dominant socio-economic structures. This transformation is pivotal because the algorithmic filtering and personalization of content on these platforms can lead to the creation of information cocoons—spaces where users are exposed primarily to content that aligns with their existing beliefs and preferences (Flew, 2021).

(sherman.1521, 2020)

This algorithmic personalization based on user profiles can profoundly impact societal trends and individual behaviours, reinforcing stereotypes and perpetuating echo chambers. For example, platforms might push content based on gender, potentially influencing users’ perceptions and behaviours subtly and overtly. Such targeted content delivery is increasingly sophisticated, leveraging vast amounts of user data to predict and influence what users will find engaging (Pasquale, 2015).
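To make this concrete, the sketch below shows one way a feed ranker could condition on a demographic profile. Everything here is an illustrative assumption—the field names, the topics, the affinity table—and says nothing about any platform’s actual code; real systems learn such associations from behavioural data at vast scale. The basic effect is the same, though: two users with identical candidate pools receive differently ordered feeds.

```python
# Hypothetical sketch of demographic-conditioned feed ranking.
# Field names, topics, and affinity weights are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class Post:
    post_id: str
    topic: str
    base_score: float  # a global popularity signal, e.g. engagement rate

# Assumed (gender, topic) -> engagement multiplier; a real system would
# learn these associations from behavioural data rather than hard-code them.
AFFINITY = {
    ("female", "relationship_advice"): 1.4,
    ("male", "relationship_advice"): 0.7,
    ("female", "gaming"): 0.8,
    ("male", "gaming"): 1.3,
}

def rank_feed(posts, gender):
    """Order candidate posts by demographic-weighted predicted engagement."""
    return sorted(
        posts,
        key=lambda p: p.base_score * AFFINITY.get((gender, p.topic), 1.0),
        reverse=True,
    )

candidates = [Post("a", "relationship_advice", 0.6), Post("b", "gaming", 0.6)]
# Identical candidates, different orderings per inferred gender.
print([p.post_id for p in rank_feed(candidates, "female")])  # ['a', 'b']
print([p.post_id for p in rank_feed(candidates, "male")])    # ['b', 'a']
```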

Moreover, these platforms’ capacity to shape public opinion and individual cognition is intertwined with significant ethical and governance questions. They raise concerns about privacy, consent, and the extent of influence platforms should morally hold over public discourse. The governance of these algorithms, often opaque to the public and regulators, poses challenges in ensuring they serve the public good while respecting individual autonomy (Flew, 2021).

This background sets the stage for a deeper examination of specific cases like Douyin in China, where the intersection of algorithmic content curation and societal values highlights the complex role these platforms play in shaping cultural and social realities.

As we consider the implications of such technologies, it becomes clear that while they offer unprecedented connectivity and services, they also necessitate a critical examination of their impact on personal and collective freedom. This dual aspect of digital platforms as both enablers and controllers of social dynamics is a crucial point of discussion for policymakers, technologists, and users alike (Pasquale, 2015).

Navigating the Gender Divide: Exploring Douyin’s Algorithmic Influence

Douyin’s algorithms are transforming the social fabric, emphasizing content that users are likely to engage with, thereby shaping their perceptions and reinforcing existing beliefs. These digital strategies, which use extensive data to predict and affect user behaviour, come with their own set of ethical dilemmas, including issues of privacy and the extent of their influence over public discourse. 

(中国数字时代, 2022)

The recent viral discussion about TikTok, particularly its Chinese version, Douyin, underscores growing concerns regarding the so-called “information cocoon,” in which users experience highly personalized content and comments based on demographic and behavioural data. The evolution of digital platforms from simple intermediaries to dominant socio-economic structures complicates their generalization and necessitates a nuanced understanding of their impacts across social, economic, and cultural dimensions (Flew, 2021). This phenomenon was highlighted when different users observed distinctly different comments under the same video.

Users discovered a peculiar pattern on Douyin: the platform’s algorithm appeared to segregate comments on videos by the viewer’s gender. This was notably observed in a video of a couple arguing. Male users predominantly saw comments from other males, whereas female users encountered remarks from females. This division sparked curiosity and led to several experiments by users, one of whom modified her profile to reflect middle-aged interests and noticed a swift shift in the content and comments the algorithm recommended, transitioning her digital experience from one age demographic to another (差评, 2023).
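A minimal, hypothetical sketch of the reported behaviour might look like the following. The function and field names are invented for illustration and say nothing about Douyin’s actual implementation; the point is only how few lines it takes to produce gender-segregated comment sections.

```python
# Hypothetical sketch of the gender-segregated comment sections users
# reported; names and logic are invented for illustration only.

def select_comments(comments, viewer_gender, k=10):
    """Fill the visible slots with same-gender comments first."""
    same = [c for c in comments if c["author_gender"] == viewer_gender]
    other = [c for c in comments if c["author_gender"] != viewer_gender]
    return (same + other)[:k]

comments = [
    {"text": "He was clearly in the wrong.", "author_gender": "female"},
    {"text": "She overreacted.", "author_gender": "male"},
    {"text": "They both need to communicate.", "author_gender": "female"},
]
# The same video yields a different visible comment section per viewer.
for c in select_comments(comments, "female", k=2):
    print("female viewer sees:", c["text"])
for c in select_comments(comments, "male", k=2):
    print("male viewer sees:", c["text"])
```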

The strategic intention behind such algorithmic adjustments is clear: to maximize user engagement by tailoring content that reinforces pre-existing beliefs and interests. This strategy, effective in increasing time spent on the app and boosting potential ad revenue, raises ethical questions concerning transparency and the reinforcement of societal divides, such as gender disparities.

Douyin’s approach to content curation and comment visibility, driven by demographic data, not only encapsulates users within ‘information cocoons’ but also risks perpetuating gender stereotypes by curating content that appears endorsed by specific genders. The inability of users to control how comments are sorted—whether by popularity or time—further strips them of agency and clouds public discourse, skewing perceptions and potentially moulding societal norms under the guise of algorithmic neutrality.

These practices resonate with Shoshana Zuboff’s concept of “Surveillance Capitalism,” where personal experiences are transformed into behavioural data, commodified to predict and influence user behaviour (Möllers, Wood, & Lyon, 2019). This paradigm shift, highlighted in works like “Governance by algorithms,” underscores a significant challenge to traditional notions of autonomy and privacy, suggesting that platforms like Douyin are not just passive intermediaries but active participants in shaping social dynamics (Just & Latzer, 2019).

As we delve deeper into the algorithmic manipulation on social platforms like Douyin, it becomes crucial for users to remain vigilant and critical of the content they consume. Understanding the mechanisms behind algorithmic decision-making can empower users to navigate digital spaces more consciously, advocating for transparency and a balanced discourse that bridges rather than widens societal divisions.

Algorithms Trap Users in Information Cocoons

A significant surge in user dissatisfaction has emerged, fuelled primarily by a growing understanding of how heavily algorithms influence content curation on digital platforms. This development has ushered in the era of the “information cocoon,” a term that encapsulates the feeling of being enclosed within a digital bubble.

(大海和鱼, 2021)

The core of the issue lies in how these algorithms operate, subtly creating echo chambers by personalizing content to a degree that not only reaffirms users’ existing beliefs but also potentially distorts their perception of reality. This personalization, while initially designed to enhance user engagement and satisfaction by delivering content that is presumably more relevant, has led to unintended consequences. Users are increasingly finding themselves trapped in loops of similar content, which significantly limits their exposure to diverse viewpoints and ideas.
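The loop dynamic is easy to demonstrate. The toy simulation below assumes a user who reliably engages with one topic and a system that multiplies that topic’s weight on every engagement; every constant is an arbitrary assumption, yet a feed that starts perfectly balanced ends up dominated by a single topic.

```python
# Toy simulation of the recommendation feedback loop behind an
# "information cocoon"; every constant here is an arbitrary assumption.
import random

random.seed(0)
topics = ["politics", "sports", "beauty", "tech"]
weights = {t: 1.0 for t in topics}  # the feed starts perfectly balanced

def recommend():
    return random.choices(topics, weights=[weights[t] for t in topics])[0]

for _ in range(200):
    shown = recommend()
    if shown == "beauty":      # assume the user only engages with one topic
        weights[shown] *= 1.1  # ...and the system rewards that engagement

share = weights["beauty"] / sum(weights.values())
print(f"'beauty' now takes {share:.0%} of the recommendation probability")
```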

A particularly poignant example of these concerns can be seen in how platforms like Douyin handle user interactions. The platform’s method of curating comments and content based on demographic data has raised serious questions about transparency and the ethical implications of such practices. This manipulation of user experiences highlights a broader issue with algorithmic governance and its role in shaping public discourse.

Douyin’s New Features Aim to Boost Autonomy and Ethical Engagement

Douyin’s page for adjusting preferences (差评, 2023)

Douyin has recently made significant adjustments to its platform to enhance user autonomy and address ethical concerns about its algorithms. In a pivotal move, the social media giant introduced a new “Content Preferences” setting. This feature allows users to adjust how content is recommended to them, offering the ability to keep the system’s default settings, dial down or intensify content suggestions, or switch off personalized recommendations entirely. These changes reflect Douyin’s commitment to giving users more control over their viewing experience. Additionally, Douyin rolled out a “not interested” button and an “optimize recommendations” feature. These tools were developed to fine-tune the accuracy of the platform’s algorithms, particularly when users choose non-personalized content options. The intention behind these features is to ensure that users still discover engaging and relevant content without feeling manipulated by the platform’s algorithms (赵, 2022).
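How such a toggle might interact with ranking can be sketched as a simple blend between a global popularity signal and a user-specific one. The setting names below mirror the options described above, but the blending weights and scoring logic are assumptions made for illustration, not Douyin’s code.

```python
# Hypothetical sketch of a "Content Preferences" toggle gating
# personalization; setting names mirror the article, but the blending
# weights and scoring logic are assumptions, not Douyin's code.
from enum import Enum

class Personalization(Enum):
    OFF = "off"                  # fall back to broadly popular content
    REDUCED = "reduced"          # dial down the personal signal
    DEFAULT = "default"
    INTENSIFIED = "intensified"

BLEND = {  # assumed fraction of the score taken from the personal signal
    Personalization.OFF: 0.0,
    Personalization.REDUCED: 0.4,
    Personalization.DEFAULT: 0.8,
    Personalization.INTENSIFIED: 1.0,
}

def score(popularity, personal_fit, setting, not_interested=False):
    """Blend a global popularity signal with a user-specific one; a
    'not interested' mark suppresses the post outright."""
    if not_interested:
        return 0.0
    w = BLEND[setting]
    return w * personal_fit + (1 - w) * popularity

# With personalization off, only global popularity matters.
print(round(score(0.9, 0.1, Personalization.OFF), 2))      # 0.9
print(round(score(0.9, 0.1, Personalization.DEFAULT), 2))  # 0.26
```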

These updates are Douyin’s response to criticism regarding the platform’s potential to create information bubbles and reduce user agency. In his analysis, Terry Flew points out that “generalizing about platforms is becoming increasingly difficult” due to their complex role in socio-economic organization (Flew, 2021). Douyin’s strategic adjustments underscore the ongoing challenge digital platforms face in balancing user engagement with transparency in algorithmic processes, highlighting a nuanced approach to addressing user feedback and ethical concerns.

Do these changes give users a real choice?

Shoshana Zuboff, in her work on surveillance capitalism, points out that these types of platform adjustments are often superficial, masking the deeper structural dynamics at play, in which “instrumentarian power” works through the instrumentalization of human behaviour for modification and monetization (Möllers, Wood, & Lyon, 2019).

These adjustments, in other words, might still be deeply embedded within a business model that fundamentally seeks to commodify user attention. They include more granular control over content preferences and the introduction of features like the “not interested” button, and are ostensibly designed to mitigate the ‘information cocoon’ effect by giving users tools to influence the algorithm’s output. Yet such forms of control often grant users a sense of agency without any substantive change in the commodification of personal data for profit. This dynamic is characteristic of what Zuboff terms ‘surveillance capitalism,’ in which personal data is commodified in a new form of market based not on the traditional exchange of goods but on the extraction and control of data for behaviour modification (Möllers, Wood, & Lyon, 2019).

Other Factors Influencing Douyin’s Strategy

The broader implications of Douyin’s strategy are multifaceted, particularly in light of China’s low birth rates and the government’s intent to avoid deepening gender antagonism (Minzner, 2024). The platform’s algorithmic decisions must navigate complex socio-political terrain, where the manipulation of content to defuse gender conflict can align with broader national goals of stabilizing societal structures and supporting demographic strategies. However, such manipulation risks exacerbating societal divisions or suppressing essential discourses on gender and equality. Terry Flew discusses the evolution of digital platforms and their role as complex socio-economic structures that challenge traditional regulatory frameworks. He suggests that digital platforms, by their design and scale, inherently possess the capacity to shape societal norms and behaviours in profound ways that are often hidden from public scrutiny (Flew, 2021).

China’s fertility rate continues to decline (Kirkegaard, 2024)

The dual focus on enhancing user engagement while managing content to align with broader social policies presents a strategic challenge for Douyin. While it may bolster short-term engagement metrics, the long-term societal impacts of such strategies, including potential reinforcement of gender stereotypes or the marginalization of critical social movements, are significant. These strategies necessitate a delicate balance between commercial interests and ethical considerations, highlighting the need for more transparent and accountable algorithmic governance in digital platforms.

Managing Algorithms for Equitable Discourse

This exploration of Douyin’s algorithmic practices underscores the substantial influence these mechanisms have on user behaviour and societal norms in China. By tailoring content using demographic insights, Douyin not only boosts user interaction but also moulds social trends, which could exacerbate gender disparities and cement existing stereotypes. The platform’s recent initiatives to refine content curation show an attempt to address criticisms about user independence, yet they might still favour commercial gains over true user empowerment.

As we look ahead, the potential long-term consequences of such algorithmic control include heightened social fragmentation and a decline in the richness of perspectives shared online. With social media platforms like Douyin playing a pivotal role in shaping public discourse, the transparency of these algorithms and the empowerment of users stand out as paramount concerns.

To mitigate these effects, it is crucial for users to proactively adjust their platform settings to better control the content they encounter and to question the validity of the information they consume. Advocating for more stringent regulations on algorithm transparency and user influence is vital. By fostering awareness and advocating for stronger accountability from social media companies, users can help cultivate a digital environment that supports diverse and equitable dialogue, expanding our collective understanding of the world.

References

差评 (2023). 男性和女性的评论区不一样,算法连这也不放过了? [Men and women see different comment sections: does the algorithm not spare even this?]. [online] 36kr.com. Available at: https://36kr.com/p/2426760371627010

Flew, T. (2021). *Regulating Platforms*. Cambridge: Polity.

Just, N., & Latzer, M. (2019). Governance by algorithms: reality construction by algorithmic selection on the Internet. *Media, Culture & Society, 39*(2), 238-258.

Minzner, C. (2024). China’s Population Decline Continues. Council on Foreign Relations. Available at: https://www.cfr.org/blog/chinas-population-decline-continues [Accessed 13 Apr. 2024].

Möllers, N., Wood, D. M., & Lyon, D. (2019). Surveillance capitalism: An interview with Shoshana Zuboff. *Surveillance & Society, 17*(1–2), 257–266. https://doi.org/10.24908/ss.v17i1/2.13238

Pasquale, F. (2015). The Need to Know. In *The Black Box Society: The Secret Algorithms That Control Money and Information* (pp. 1-18). Cambridge: Harvard University Press.

赵云合 (2022). 抖音算法大调整 用户掌握选择权 [Douyin’s major algorithm adjustment: users gain the power to choose]. [online] m.ebrun.com. Available at: https://m.ebrun.com/496452.html [Accessed 14 Apr. 2024].

Picture references:

大海和鱼 (2021). 算法个性化推送的弊端:信息茧房效应 [The drawbacks of personalized algorithmic recommendation: the information cocoon effect]. Zhihu.

Kirkegaard, J. (2024). The birth rate of Chinese women is falling. PIIE.

sherman.1521 (2020). How to Break out of Your Social Media Echo Chamber. WIRED.

维基媒体项目贡献者 (2017). 字節跳動旗下短片社交軟體 [ByteDance’s short-video social app]. [online] Wikipedia.org. Available at: https://zh.wikipedia.org/zh-cn/%E6%8A%96%E9%9F%B3.

中国数字时代 (2022). ‘性别对立’,是专属于女性的罪名吗? [Is ‘gender antagonism’ a charge reserved exclusively for women?]. 中国瞭望.
