Content Moderation in Bilibili: Towards Relatively Democratic Online Community Governance

☑️ I have read and agree to the Terms of Service and Privacy Policy

Have you actually read them? Are you clear about what kind of agreement you have entered into with the platform?

Accepting the user agreement is a compulsory step when signing up for, or logging in to, a social platform. Normally, little attention is paid to the specifics of these terms, as they are written in lengthy and complex legal language, and users can only accept them or stop using the platform (Ammori, 2014, as cited in Myers West, 2018; Suzor, 2019). Although the purpose of these terms is to define the legal relationship between the operator and the user, and to clarify the rules of use and the operator’s responsibilities, the operating company has almost complete decision-making power when it comes to setting these rules (Suzor, 2019).

Image by Gerd Altmann (from: Pixabay)

What We Can See, What We Can Express

You pretty much only see what they want you to see.

Take personalised content recommendation as an example: the platform collects data such as users’ browsing histories, search records, and clicking behaviours, then analyses this data to learn users’ interests and preferences so that suitable content can be pushed to them. This approach helps to improve user stickiness (Suzor, 2019). Such user data is also an important basis for operators to improve and upgrade platform services (Flew, 2021). To address users’ concerns about data leakage, operators make privacy-protection commitments in their user agreements.
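
To make the idea concrete, here is a minimal, purely illustrative sketch of interest-based ranking under simple assumptions; the tag-counting profile and scoring rule are my own stand-ins for the browsing, search, and click signals mentioned above, not any platform’s actual recommender.

```python
from collections import Counter

def build_interest_profile(watch_history):
    """Count how often each content tag appears in a user's history.

    `watch_history` is a list of tag sets, e.g. {"gaming", "music"},
    standing in for browsing, search, and click data.
    """
    profile = Counter()
    for item_tags in watch_history:
        profile.update(item_tags)
    return profile

def rank_candidates(profile, candidates):
    """Rank candidate items by how strongly their tags overlap the profile."""
    def score(item_tags):
        return sum(profile.get(tag, 0) for tag in item_tags)
    return sorted(candidates, key=score, reverse=True)

# Hypothetical usage: the more gaming content a user has watched,
# the higher gaming-tagged candidates are ranked.
history = [{"gaming", "esports"}, {"gaming"}, {"music"}]
candidates = [{"cooking"}, {"gaming", "review"}, {"music", "live"}]
print(rank_candidates(build_interest_profile(history), candidates))
```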

To what extent do you feel you are being surveilled?

Although the intermediary nature of social platforms allows them to be virtually unaccountable for users’ behaviour (Flew, 2021; Suzor, 2019), operating companies do screen and filter the content posted on their platforms in order to provide a better user experience (Gillespie, 2018a, as cited in Flew, 2021; Suzor, 2019). The means include algorithmic screening and manual review, which identify and remove harmful information such as pornography, hate speech, and incitement to violence. However, Suzor (2019) argues that unless a platform imposes a visible penalty on content users have themselves posted, such as removing it or banning their account, most users will not realise that they are being monitored and regulated, and will continue to treat the platform as a relatively private space for online socialising.
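
As a rough, hypothetical sketch of the two-stage pattern described above, automated screening followed by manual review, consider the following; the keyword list and queue are illustrative placeholders for a real classifier and review workflow, not any platform’s actual pipeline.

```python
# Toy two-stage moderation: automated screening flags suspicious posts,
# and flagged posts are queued for human review instead of being published.
SUSPECT_KEYWORDS = {"hate", "violence", "obscene"}  # placeholder for a classifier

def automated_screen(post_text: str) -> bool:
    """Return True if the post should be held for human review."""
    words = set(post_text.lower().split())
    return bool(words & SUSPECT_KEYWORDS)

def moderate(posts: list[str]) -> tuple[list[str], list[str]]:
    """Split posts into those published immediately and those held for review."""
    published, review_queue = [], []
    for post in posts:
        (review_queue if automated_screen(post) else published).append(post)
    return published, review_queue

published, review_queue = moderate([
    "lovely cat video",
    "a post that incites violence",
])
print(published)     # ['lovely cat video']
print(review_queue)  # ['a post that incites violence']
```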

Nonetheless, given the enormous user bases of today’s platforms, large numbers of people are still affected to varying degrees, and the content moderation mechanisms of many platforms come under constant criticism and censure.

Contradictory And Confusing Content Review

When your published content is removed, or your account is banned, flagged, or reported, do you feel that you are being maliciously targeted? There are reasonable grounds for such a thought, and there is no denying that the possibility exists. But let’s start by looking at the problems within social media platforms’ moderation mechanisms themselves.

High error and low appeal rates

Suzor (2019) points out that a large digital platform has to host a huge amount of content, and, especially on sites requiring real-time interactivity, content is commonly reviewed only after publication. The low-cost moderation systems platforms tend to adopt inevitably produce frequent errors (Suzor, 2019) and lack the flexibility needed for complex situations. For instance, a political activist on Facebook (which enforces a real-name policy) was repeatedly reported and blocked because her non-English name could be misread in English (Suzor, 2019). Yet users struggle to contest what may be erroneous decisions: many feel that they are limited in what they can say, that the platform provides no clear appeal process, or that no helpful response ever comes back (Suzor, 2019). Overall, the feedback system still leaves much to be desired.

Insufficient transparency and inconsistent criteria in the decision-making process for moderation

Moderators usually do not give users a clear explanation of why specific content was removed or an account banned, even when it is technically feasible to do so; YouTube, for example, still does not give clear reasons for deletion (Whalen, 2020). According to Table 1 (Whalen, 2020, p. 173), the proportion of YouTube videos deleted without a stated cause (1.1%) is more than double that of videos deleted for explicit rule violations (0.5%), and a further 3.6% of videos are rendered unavailable because the uploading account was terminated without a known reason. In addition, the subjectivity of manual review creates its own problems (Suzor, 2019). On a site accessed globally by users from diverse cultural and political backgrounds, differences or conflicts between the personal values, political stances, and cultural perceptions of moderators and users can produce inconsistent moderation outcomes, triggering anger and frustration.

Facebook, for example, suspended the account of Celeste Liddle, an Aboriginal Australian feminist, for posting photos of semi-nude women in ceremonial body paint, while allowing Esquire magazine to post a fully naked photo of Kim Kardashian with sensitive parts covered. The decision was questioned as racially discriminatory, yet Facebook offered no apology and told those who wished to share Liddle’s speech that they should not use the photos (Suzor, 2019).

Table 1. Short categorisation of availability of videos (90,492,976 total videos) (Whalen, 2020, p. 173).

Pressures from stakeholders

Platforms as intermediaries are subject to pressure from various parties: users, advertisers, governments and regulators, non-governmental organisations and civil society, and many others. The relationships between these stakeholders are complex, and their demands and goals may be mutually supportive or conflicting. Operators must balance these different interests to maintain the platform’s healthy development and user satisfaction, and as a result they keep strengthening speech controls (Suzor, 2019) to adapt to the changing external environment. However, because operators keep their more detailed moderation standards and codes of practice secret from the outset, users’ ability to perceive and understand those standards is further limited (Myers West, 2018).

Taking these factors together, users are often confused about platforms’ censorship criteria, and can only guess at the logic and motivation behind moderators’ handling of offending speech, producing the speculations referred to as “folk theories” (Eslami et al., 2015, and Kempton, 1986, as cited in Myers West, 2018; Suzor, 2019). The consequence is that users trust the platform less and feel excluded from online communities such as Twitter, which has been described as a “global town square” (Myers West, 2018; Suzor, 2019). Moreover, they are unable to learn the platform’s rules of use or correct their violations (Myers West, 2018).

User Participation in Online Community Governance – The Case of Chinese Video Platform Bilibili

Meet Bilibili

Bilibili occupies a leading position among China’s video-sharing platforms (Yin & Zhang, 2023), and its main user group is the highly digitalised Generation Z: Bilibili’s official data show that users under the age of 24 account for almost 80% of its total users (Zhang, 2024). This group is characterised by a strong sense of morality, a pursuit of individuality, a readiness to express themselves, and respect for diversity and innovation (Yin & Zhang, 2023; Zhang, 2024). They also show great loyalty to the platform, which is a strong competitive advantage for Bilibili (Yin & Zhang, 2023). Furthermore, the platform focuses on building a commercially valuable online cultural community around professional user-generated content (PUGC), with such videos accounting for 85.5% of total streams (Yin & Zhang, 2023).

Establishment of the Disciplinary Committee

In 2017, Bilibili introduced a mechanism for user participation in community governance by establishing a disciplinary committee (Chen & Yang, 2023). According to Chen and Yang (2023), the committee is formed on a voluntary basis from senior users of the platform, who typically show a more positive attitude towards, and greater willingness to maintain, community order. Under this system, the committee collects and evaluates content reported by users; each case is voted on and publicly discussed by more than 500 members, so that a fuller picture of it can be gained and a fairer decision made (Chen & Yang, 2023).

The Content Moderation Process

According to Figure 1 (Chen & Yang, 2023), content on Bilibili is divided into three main categories. A brief explanation first: bullet comments, also called danmu, are comments that appear over the video frame. Users can leave public comments at any point during playback, and each comment is displayed on screen for a few seconds before disappearing. Users can also set a danmu’s font colour and size, and choose whether it scrolls across the frame from right to left or stays static at a specific position. Bullet comments therefore have a significant impact on the video-watching experience.
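
To give a concrete sense of the format, the sketch below models the attributes a bullet comment might carry, as just described; the class and field names are illustrative assumptions, not Bilibili’s actual data model or API.

```python
from dataclasses import dataclass
from enum import Enum

class DanmuMode(Enum):
    SCROLLING = "scrolling"  # moves across the frame from right to left
    STATIC = "static"        # pinned at a fixed position for a few seconds

@dataclass
class Danmu:
    """A bullet comment overlaid on the video frame (illustrative model)."""
    text: str
    video_time_s: float  # playback position at which the comment appears
    colour: str          # font colour chosen by the user, e.g. "#FFFFFF"
    font_size: int       # font size chosen by the user
    mode: DanmuMode      # scrolling or static display

# Hypothetical example: a white scrolling comment posted at 12.5 seconds.
comment = Danmu("amazing shot!", 12.5, "#FFFFFF", 25, DanmuMode.SCROLLING)
print(comment)
```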

As Figure 1 shows, Bilibili’s content review happens both before and after publication. Pre-release review is handled by an AI system and the platform’s internal screening team; for different content types they carry separate responsibilities and also collaborate on double reviews. The committee, by contrast, only comes into play after videos are released, and its remit is limited to content that has been reported. Moreover, its decision-making process is supervised by the screening team.

Figure 1. Bilibili’s content management and review process (Chen & Yang, 2023).

Chen and Yang (2023) add further detail: when committee members vote on randomly assigned reports, more than 60% of the votes are needed to rule a piece of content a violation, and a decision to ban an account requires more than 50% of the votes. The open discussion then centres on views submitted anonymously by committee members in response to the reported cases (Chen & Yang, 2023).
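
As a rough illustration of the voting thresholds reported by Chen and Yang (2023), the sketch below encodes the decision rule; the function and key names are my own, and the real committee process is of course richer than a single tally.

```python
def committee_decision(total_votes: int, violation_votes: int, ban_votes: int) -> dict:
    """Apply the reported thresholds: more than 60% of votes to rule content a
    violation, more than 50% of votes to ban the offending account."""
    if total_votes <= 0:
        raise ValueError("at least one vote is required")
    return {
        "is_violation": violation_votes / total_votes > 0.60,
        "ban_account": ban_votes / total_votes > 0.50,
    }

# Hypothetical tally from a reported case reviewed by 500+ members.
print(committee_decision(total_votes=520, violation_votes=330, ban_votes=200))
# -> {'is_violation': True, 'ban_account': False}  (330/520 ≈ 63%, 200/520 ≈ 38%)
```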

Discussion

Bilibili has achieved a certain degree of democracy in community administration. Although user involvement is limited, it still makes the operational process of content control more transparent and open. On the one hand, users can better understand and learn the rules, which reduces the suspicion and misunderstanding caused by unexplained sanctions and helps keep a good community atmosphere. On the other hand, it strengthens users’ sense of participation and belonging as community members (Chen & Yang, 2023), as well as their commitment to maintaining community order. In addition, user participation makes moderation more rigorous and fair, further reducing the chance of mistaken deletions caused by algorithmic loopholes or the review team’s subjective judgements.

However, the implementation of user participation in platform governance still needs to take into account the market environment and specific national circumstances.

The theories and examples in the previous sections reflect irreconcilable conflicts between stakeholders that ultimately narrow the space for discourse (Suzor, 2019). In the case of China, however, where the landscape of online public opinion is highly regulated and controlled by the ruling party, Bilibili, in its quest for sustainable development, has complied with the governing party’s demands, assisting with political propaganda and the spread of official ideology on the platform (Chen & Yang, 2023). The official account of China’s Communist Youth League (CYL), for example, employs a language style that matches the platform’s tone and resonates with the younger generation (Chen & Yang, 2023). Positive interactions between the party-state and users have developed with Bilibili’s facilitation, an approach that Chen and Yang (2023) refer to as “bidirectional mediation”.

Although every platform faces a different, dynamic, and complex external environment, the overall direction of improving platform governance continues to favour giving users clearer and more specific policies and rules, so that they can use their own initiative to reduce violating content at the source and ease the platform’s monitoring burden.

Citations

Chen, Z., & Yang, D. L. (2023). Governing Generation Z in China: Bilibili, bidirectional mediation, and online community governance. The Information Society, 39(1), 1–16. https://doi.org/10.1080/01972243.2022.2137866

Flew, T. (2021). Issues of Concern. In Regulating platforms (pp. 72–79). Polity Press.

Myers West, S. (2018). Censored, suspended, shadowbanned: User interpretations of content moderation on social media platforms. New Media & Society, 20(11), 4366–4383. https://doi.org/10.1177/1461444818773059

Suzor, N. P. (2019). Who Makes the Rules? In Lawless: The Secret Rules That Govern our Digital Lives (pp. 10–24). Cambridge University Press.

Whalen, R. (2020). Understanding content moderation systems: New methods to understand internet governance at scale, over time, and across platforms. In Computational Legal Studies (pp. 166–189). Edward Elgar Publishing. https://doi.org/10.4337/9781788977456.00013

Yin, J., & Zhang, H. (2023). Analysis on Competitive Strategy of Bilibili. Highlights in Business, Economics and Management, 11, 108–111. https://doi.org/10.54097/hbem.v11i.7953

Zhang, S. (2024). Differentiation Strategy of Bilibili Platform. Highlights in Business, Economics and Management, 30, 151–154. https://doi.org/10.54097/rrvews20
