Facebook’s governance dilemma under multilingualism

In August 2018, Reuters discovered a shocking post on Facebook. The post read in Burmese: “Kill all the Kalars that you see in Myanmar, none of them should be left alive.” However, the English translation provided by Facebook was: “I shouldn’t have a rainbow in Myanmar.”

This post appeared against the backdrop of repeated outbreaks of ethnic conflict in Myanmar. About 700,000 Rohingya have fled the country because of military oppression and ethnic violence (Stecklow, 2018). At the same time, Facebook has been abused to spread hatred against them; “Kalar”, the term used in the post, is an ethnic slur directed at the Rohingya.

The post clearly violated the Facebook Community Standards by calling for violence and criminal behaviour against the Rohingya (Facebook Community Standards, 2024). However, the machine translation Facebook provided bore little resemblance to the original content, because its system struggles to interpret Burmese text (Reports, 2018), and it completely failed to capture the hate and violence intended in the post. As a result, the platform could not detect and address the offensive content promptly.

Evidently, Facebook faces challenges in dealing with multilingual content, and in detecting and addressing hate speech across varied social and linguistic contexts (Sinpeng, Martin, Gelber, & Shields, 2021).

Image 1: Hate speech against the Rohingya on Facebook (Source: Reuters)

Facebook’s global reach has grown rapidly over the last several years, especially in Asian countries such as Indonesia, the Philippines, and Myanmar, where it has become a major internet provider through its Free Basics program, collaborating with local telcos to provide low-cost access (Reports, 2018). Today, more than 2 billion Facebook users around the world communicate freely across physical boundaries in their own languages (Facebook Community Standards, 2024).

It cannot be ignored that online hate speech has surged globally alongside Facebook’s boom. Social media platforms allow users to generate and spread information and ideologies quickly (Kalsnes & Ihlebæk, 2020). However, platforms like Facebook are often abused to disseminate hate speech and extremist viewpoints in nations marked by political instability and by racial, religious, or gender-based discrimination. The hate speech attacks against the Rohingya are just the tip of the iceberg, with similar cases emerging in places such as the Philippines and India. As UN investigators have said, “Facebook has been a valuable tool for those seeking to spread hate against Muslim minorities” (Reports, 2018).

Image 2: Hate speech against the Rohingya on Facebook (Source: Reuters)

In the age of digital communication, with a few exceptions such as the authoritarian regimes of China and Vietnam, most nation-states in Asia have been sluggish in regulating online speech, although historically they have been the primary regulators of public discourse (Sinpeng et al., 2021). In particular, social media platforms like Facebook face unique moderation and regulation challenges in multilingual and multicultural contexts. How to respond better to online hate speech has therefore become a pressing issue.

Where is the line between online hate speech and free speech? What challenges do platforms face when dealing with multilingual content? How can hate speech be effectively identified and removed in different social and cultural contexts? What regulatory frameworks should platforms adopt for different countries’ laws and value systems? How do differences in users’ digital literacy pose a challenge? And do platforms have a duty to moderate?

We may find some useful answers by examining these questions about hate speech on digital platforms and its regulation.

Where is the borderline between hate speech and free speech

“Exclusion, hate, and prejudice are not new. What’s new, however, is that much of this has now gone online.”

— Dr. Babak Bahador, How to combat the internet’s hate problem, TED Talk, 2021

Social media platforms, by far the most participatory form of mass discourse, have long transcended borders and boundaries, becoming important venues for people to communicate, connect, and speak freely. However, hate speech has also proliferated on these platforms because of their tolerance of content (Howard, 2019).

The borderline between online hate speech and free speech has become increasingly blurred in the digital age, and drawing the line between them has always been controversial in every linguistic context, not only in the Asia-Pacific region. Beyond the case described at the start of this blog, in which hundreds of thousands of people in Myanmar were ethnically cleansed after communication that had been circulating on Facebook beforehand, there have been many other hate-driven attacks against particular ethnic groups: over 50 Muslims killed in Christchurch, New Zealand; African-Americans killed in a church in Charleston, South Carolina; Jews killed in a synagogue in Pittsburgh; and Hispanics killed in El Paso, Texas (Bahador, 2021).

Image 3: A sign is seen near Christchurch after the mosque attacks (Source: Reuters)

It is not simple to draw the line between hate speech and free speech, though. Framed one way, those who oppose bans on hate speech appear to be the true guardians of free speech, while those who support such bans seem either antagonistic toward free speech or at least willing to infringe on it for the sake of other values (Howard, 2019).

To resolve this debate, we must first clearly define free speech and hate speech:

  • Free speech: a basic moral requirement for everyone to express and communicate;
  • Hate speech: attacks or insulting statements directed at a specific group of people with the intention of destroying their dignity and rights, and which may lead to actual acts of violence or social exclusion.

It follows that the key question in the debate is not whether we should infringe on free speech to eliminate hate speech and its associated violence, but whether hate speech should be considered protected by the right to free speech at all. Free speech should certainly be respected and protected, but we must also recognise that it should be restricted and condemned when it endangers the dignity, safety, and rights of others. Prohibiting hate speech does not restrict people’s ability to express themselves; instead, it upholds social order, respects individual rights, and ensures people’s safety (Howard, 2019).

Facebook is facing challenges when dealing with multilingual content

(1) Difficulty in effectively identifying hate speech when dealing with multilingual content

“As use expands and Facebook enters every country and region, there are places with languages and cultures we don’t understand.” — Chris Cox (2013)

Owing to differences in language and culture, as well as a lack of translation technology, Facebook faces real difficulty in recognising and removing hate speech. As Cox (2013) admitted, even though Facebook has become a major internet provider in most Asian countries, it does not know the local languages and cultures of many regions well enough.

Facebook did not hire a single Burmese-speaking employee before rushing into Myanmar, even though the platform had become one of the country’s main news sources (Sablosky, 2021). Flaws in its translation features have caused many issues: the errors lie not only in user-generated content but also in the Burmese translations of Facebook’s own published internal guides.

The first Burmese release of its Community Standards implementation guidelines was garbled in many passages, with the English phrase “We take our role in keeping abuse off our service seriously” rendered in Burmese as “We take our role seriously by abusing our services”.

In response, a Facebook spokesperson simply said: “We’re working on ways to improve the quality of translations, and in the meantime, we’ve turned off this feature in Myanmar” (Stecklow, 2018).

(2) Platform regulatory framework is always influenced by local laws and interpretations

Beyond the fact that various nations have different expectations and value systems for Internet communications, the meanings of phrases are also shaped by subjective interpretation (Sinpeng et al., 2021). As a result, how key terms are understood can vary across linguistic contexts.

For example, the term “offence” in Hindi is a noun denoting crime, meaning it constitutes an infringement of the law. The meaning of “offence” in English contexts is broader, however: it can refer not only to illegal behaviour but also, in certain contexts, to impolite expression.

In addition, platform rules on content control are often influenced by local laws, so no universal regulations apply everywhere when social media platforms respond to and control hate speech.

We saw that German law restricts speech that “violates the human dignity of another person by insulting, maligning or defaming a section of the population”, while New Zealand prohibits “threatening, abusive or insulting speech that is likely to provoke hostility towards another person” (Sablosky, 2021).

To address this challenge effectively, Facebook needs more local policy experts with strong political, cultural, and linguistic ties, who can better understand local social and cultural contexts and so recognise the true meanings of key terms in different settings.

(3) Lack of specific legislation against hate speech in the Asia-Pacific region

There is a divergence worldwide over whether hate speech should be legally protected or prohibited. In the UK, inciting racial or religious hatred is a criminal offence (Brown, 2016), and many developed democracies, including Australia, Denmark, France, Germany, India, South Africa, Sweden, and New Zealand, have similar laws addressing this issue. However, nations led by the United States often prioritise free speech over other political values.

Yet there is a lack of clear legislation on hate speech in the Asia-Pacific region; indeed, Facebook’s regulation of hate speech is more comprehensive than the majority of legislation in this region.

Despite Facebook’s rules expressly prohibiting “violent or dehumanizing speech” that targets minority groups or compares them to animals, Reuters found at least a thousand instances of hate speech on Facebook originating from Myanmar, including derogatory remarks about Muslims and Rohingya people: calling them dogs, maggots, and rapists, suggesting they be fed to pigs, and calling for them to be shot or exterminated (Reports, 2018).

Image 4: Derogatory remarks made about Muslims and Rohingya people on Facebook

In fact, when free speech conflicts with other societal norms, it is crucial to provide clear normative guidance to ensure the dignity and safety of marginalized citizens.

Do platforms have a duty to moderate

Many social media companies have adopted a laissez-faire attitude towards extremist content on their platforms for a long time.

As early as 2013, scholars warned Facebook about the platform’s potential for inciting racism and hatred towards Muslims, especially the Rohingya people in Myanmar. David Madden, a tech entrepreneur, delivered a speech at Facebook headquarters in 2015 highlighting how the platform was being used to fuel hate; yet these warnings were ignored.

Take, for instance, the case of Philippine leader Rodrigo Duterte, whose supporters often use Facebook to vilify opponents. Journalist Maria Ressa has long been vocal about her desire for Facebook to act against fake news; she even discussed it with Mark Zuckerberg at the 2017 F8 conference. However, Facebook did not take any action until 2018.

In September 2021, Frances Haugen, a former Facebook employee, accused the company of being indifferent to hate speech, violence, and misinformation on its platform, and of failing to act on internal reports to improve the safety of potentially vulnerable users.

Gillespie (2018) argues that moderation has played a critically important role for online platforms as their worldwide presence has grown; in his view, moderation is, in fact, the essential and defining activity of platforms. Moreover, Facebook has undergone a revolutionary shift into a kind of national communication infrastructure, reshaping how organisations are structured and sustaining continuous user engagement.

Consequently, a sensible future direction for tackling online hate speech in multilingual environments is this: freedom should remain the social norm, while coercion requires further justification.


  Bahador, B. (2021). How to combat the internet’s hate problem [Video]. In TED Talks. https://www.ted.com/talks/babak_bahador_how_to_combat_the_internet_s_hate_problem 

  Facebook Community Standards. (2024). Transparency Centre. Retrieved April 14, 2024, from https://transparency.fb.com/en-gb/policies/community-standards/ 

  Howard, J. W. (2019). Free speech and hate speech. Annual Review of Political Science, 22, 93–109. https://doi.org/10.1146/annurev-polisci-051517-012343 

  Kalsnes, B., & Ihlebæk, K. A. (2020). Hiding hate speech: Political moderation on Facebook. Media, Culture & Society, 43(2), 326–342. https://doi.org/10.1177/0163443720957562 

  Reports, S. (2018, August 15). Why Facebook is losing the war on hate speech in Myanmar. Reuters. https://www.reuters.com/investigates/special-report/myanmar-facebook-hate/ 

  Sablosky, J. (2021). Dangerous organizations: Facebook’s content moderation decisions and ethnic visibility in Myanmar. Media, Culture & Society, 43(6), 1017–1042. https://doi.org/10.1177/0163443720987751 

  Sinpeng, A., Martin, F. R., Gelber, K., & Shields, K. (2021, January 1). Facebook: Regulating hate speech in the Asia Pacific. https://hdl.handle.net/2123/25116.3 

  Stecklow, S. (2018, September 6). Facebook removes Burmese translation feature after Reuters report. Reuters. https://www.reuters.com/article/idUSKCN1LM202/ 
