Criminals Shielded by Social Media Privacy Rights

Navigating the balance between privacy and security in cyberspace

Figure 1: Everything to Know About the Nth Room Case in Cyber Hell (Source: Netflix, 2022)

“He already had my face, my voice, my personal information. I was afraid that he would threaten me with that information if I said I would quit.”

– from one of the underage victims of the “Nth Room”

The speaker is one of the underage victims of the “Nth Room” case, one of the most serious sexual exploitation cases in South Korean history.

Between late 2018 and 2020, South Korea was rocked by the “Nth Room” scandal. The operators of the “Nth Room”, named for its multiple chat rooms labeled with ordinal numbers (Room 1, Room 2, and so on), used Telegram to distribute vile sexual exploitation content. Victims were coerced into producing footage of rape, sexual abuse, and other forms of exploitation, and the videos and images were then sold to paying members. At least 74 victims were identified, and more than 260,000 users accessed these chat rooms and made payments.

Following the incident, Telegram remained silent in response to South Korean law enforcement demands for information on those uploading illegal videos. Transactions were conducted in virtual currencies and chat logs were regularly purged, making evidence collection and tracking incredibly difficult.

As stated in its “Frequently Asked Questions” section, Telegram (n.d.) takes a hands-off approach unless publicly available sticker sets, channels, or bots are suspected of illegal activity. All chats and group chats on Telegram are treated as private, and the platform declines to process any requests related to them.

The “Nth Room” case in South Korea sheds light on how emerging technology can be abused by perpetrators to commit cybercrimes, exploiting the protection of digital privacy rights to very nearly evade capture. Cybercrime has become a global issue.

How can we strike a balance between protecting the privacy rights of digital platform users and governing behavior on the Internet? Can the two really coexist? Take Telegram, a social media communication platform that claims to be “either unsafe for everyone or safe for everyone”: what role does it play in such cybercrimes? Are digital platforms neutral? To what extent should regulation be implemented? Who should be held responsible for such digital crimes? Is there a natural contradiction between protecting user privacy and combating crime?

Perhaps the answer is not necessarily binary.

What is privacy?

Privacy is a fundamental human right. The right to privacy “constitutes an absolute imperative for…individual[s]” (Eissen, 1967, cited in De Meyer, 1973) and is enshrined in international human rights treaties (UNODC, n.d.).

The definition of privacy varies, but overall, the concept of the right to privacy primarily focuses on three aspects (Cooley, 1907; Fried, 1970; Janis, Kay and Bradley, 2000; Maras, 2009):

– the right to be free from observation and interference;

– the ability to keep one’s thoughts, beliefs, identity, and actions confidential;

– the right to choose and control when, what, why, where, how, to whom, and to what extent information about oneself is revealed.

The right to choose and control information about oneself, covering both physical and online activity, links privacy rights to information and data protection.

What is the relationship between privacy and security?

It’s worth noting that security and privacy are closely intertwined. Safeguarding privacy serves as a method to ensure security, while sometimes compromising certain aspects of privacy can also contribute to enhanced security.

Imagine this,

without data privacy, someone who is intrigued by a selfie you have shared online could easily get hold of your address and contact details, which can lead to offline harassment and stalking.

In turn,

when user privacy is 100% guaranteed and an anonymous user verbally abuses and attacks you online, there are no consequences beyond reporting and banning their account. This can further encourage illegal online behavior.

Figure 2: Everything to Know About the Nth Room Case in Cyber Hell (Source: Netflix, 2022)

Let’s discuss these two points separately.

Protecting privacy serves as a means to achieve security by ensuring individuals the dignity, autonomy, and freedom to live and express themselves without fear or coercion. This is particularly crucial in the cyber world, where data protection is the core of security. The security of users is significantly influenced by how their data is accessed, collected, deleted, modified, and disclosed. Moreover, data protection facilitates online anonymity to some extent, creating a zone of privacy for individuals and groups to freely express opinions and exercise freedom of expression without arbitrary or unlawful interference or attacks (A/HRC/29/32, para. 16).

On the other hand, online anonymity can embolden some people to direct cruel, discriminatory, racist, hateful, and other harmful speech at others, behavior they might curb through self-monitoring if their personal information were public. Taking Telegram as an example, criminals took advantage of its encryption technology and anonymity features to boldly carry out criminal activities. This made it incredibly difficult for the South Korean authorities to identify the masterminds behind the scenes, as their “privacy rights” were protected by Telegram.

What role does Telegram play in such cybercrimes?

In fact, Telegram staunchly defends the core principle of “protecting user privacy to achieve security”; indeed, it is one of its most extreme advocates.

Telegram, developed by Pavel Durov and his brother Nikolai, was born to protect user privacy. It is an encrypted social communication platform that supports features such as secret chats, self-destructing messages, and periodic account deletion, making it difficult for governments to monitor users. Technically, Telegram has several main social modules: sticker sets, channels, bots, chats, and group chats that accommodate up to 200,000 members. Telegram divides them into two categories: sticker sets, channels, and bots are public, while chats and group chats are private. Regarding the question “There’s illegal content on Telegram. How do I take it down?” it states,

Figure 3: Illegal Report (Source: Telegram)

“All Telegram chats and group chats are private among their participants. We do not process any requests related to them.”

All cloud chat data is stored heavily encrypted, and once a secret chat is enabled, its data can only be accessed on the original device, cannot be forwarded, is not stored on Telegram’s servers, and supports self-destruction. Telegram emphasizes that the two most important aspects of its privacy philosophy are protecting private conversations from eavesdropping by third parties (governments, employers) and protecting personal data from misuse by third parties (such as marketers and advertisers) (Telegram, n.d.).
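To make the idea of end-to-end encryption concrete, the sketch below shows, in Python, why a relaying server only ever sees ciphertext: the message is encrypted on the sender’s device with the recipient’s public key and can only be decrypted with the recipient’s private key, which never leaves that device. This is a minimal illustration using the PyNaCl library with invented key names, not Telegram’s actual MTProto protocol.

# A minimal end-to-end encryption sketch (illustrative only; requires PyNaCl:
# pip install pynacl). Telegram's real protocol, MTProto, is more complex.
from nacl.public import PrivateKey, Box

# Each participant generates a key pair on their own device;
# only the public keys are ever shared via the server.
alice_private = PrivateKey.generate()
bob_private = PrivateKey.generate()

# Alice encrypts on her device using her private key and Bob's public key.
alice_box = Box(alice_private, bob_private.public_key)
ciphertext = alice_box.encrypt(b"meet at the usual place")

# The server relays only this ciphertext; without a private key it cannot
# recover the plaintext. Bob decrypts on his own device.
bob_box = Box(bob_private, alice_private.public_key)
assert bob_box.decrypt(ciphertext) == b"meet at the usual place"

In a scheme like this, handing server data to investigators yields nothing readable, which is exactly the property that both protects ordinary users and frustrated the Nth Room investigation.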

Rejecting the South Korean police’s request for user data on the main perpetrators of the Nth Room case (the FBI also joined the investigation at the time) was neither the first nor the last time Telegram has refused to cooperate. Telegram has refused to provide decryption keys for monitoring users to multiple national governments, even in investigations into serious crimes such as terrorism, and has been blocked by judicial order in several countries.

Claiming that “either everyone is safe or no one is safe,” Telegram has given criminal groups a new way to communicate through links and the use of cryptocurrencies. Felicity Gerry, an Australian criminal lawyer and professor at Deakin University who studies human trafficking, stated that “The idea of organised crime keeping that activity a secret is not new. But the use of technology is enabling that to be done in new ways, both to carry out the crime and to prevent investigations” (Lowy Institute, 2020).

“Is the risk of terrorism more important than 100% privacy? This is a large-scale debate about societal values,” Pavel Durov pointed out in an interview (CNN, 2016). “It should be decided by the people of the relevant countries, not by me.” Addressing questions about terrorists communicating through his software and whether the software should be held accountable, he added, “Terrorists also use iPhones, Android phones, Microsoft chips, so saying that we or any technology company should be responsible for this is misleading.” In his view, the dual nature of technological progress will always exist.

The developers of Telegram seem to lean more towards describing their platform as an intermediary, a communication service, a provider of encryption to protect user privacy, and an observer guarding privacy in the private domain where illegal activities occur. However, is the platform really just a neutral service provider?

The definition of digital platforms

Let’s first take a look at the definition of platforms. As Terry Flew notes in his book “Regulating Platforms” (2021), given the diversity and complexity of platforms, there are currently three influential definitions. Among them, one focuses on whether platforms have regulatory responsibilities.

In Custodians of the Internet, Tarleton Gillespie (2018) places the practice of moderation at the core of platforms’ operations. For Gillespie (2018), platforms are online sites and services that:

– host, organize, and circulate users’ shared content or social interactions;   

– do so without having produced or commissioned (the bulk of) that content;   

– beneath that circulation of information, build on an infrastructure for processing data for customer service, advertising, and profit;

– and moderate the content and activity of users through logistics of detection, review, and enforcement.

He sees the promise of moderation as one of the commodities that platforms offer, and he insists that platforms cannot survive without moderation.

By this definition, it is clear that although platforms prefer to portray themselves as mere service providers with no stance of their own, they do have a responsibility to moderate the culture within their platforms.

How are platform regulatory responsibilities defined?

As digital platforms become crucial components of contemporary culture, their power to shape that culture cannot be ignored. Platforms therefore need to take on a degree of responsibility for content moderation and maintenance. However, questions arise: should platforms be held accountable for all of their users’ illegal behavior? Or are platforms treated as more innocent than they really are?

In this regard, an important provision concerning platforms’ legal immunity is Section 230 of the Communications Decency Act of 1996 (CDA 230), codified in Title 47 of the United States Code. Its passage and the subsequent legal history supporting its constitutionality are considered crucial to the development of the Internet in the early 21st century. Combined with the Digital Millennium Copyright Act (DMCA) of 1998, Section 230 provides Internet service providers with a safe harbor: they can operate as content intermediaries without being held responsible for the content they carry, so long as they take reasonable measures to remove or block access to unlawful content.

CDA 230 to some extent represents platform neutrality. Its authors, Christopher Cox and Ron Wyden, argued that interactive computer services should be treated as distributors, not responsible for the content they distribute, as a means of protecting the then-evolving Internet. However, CDA 230 immunity is not unlimited: platforms need to respond to reports of illegal content rather than ignore them.

It must be acknowledged that the strong protection of freedom of speech provided by CDA 230 is becoming increasingly difficult to justify. Further, there is a strong moral argument that respected companies should not tolerate this type of abusive behavior (Suzor, 2019).

For platforms, this presents a very contradictory situation. They want to be seen as merely providing technology for people to communicate and share their ideas and content, protecting users’ privacy so they can communicate safely. At the same time, in order to maintain a good community environment, they need to plan, regulate, and control their networks through access to user data. In practice, most platforms go further, engaging in excessive collection and use of user data.

Is Trading Privacy for Security Always the Optimal Solution?

Is combating crime and protecting data privacy necessarily contradictory? Not necessarily.

Figure 4: GDPR & ePrivacy Regulations (Source: Dennis van der Heijden, n.d.)

“There are usually technical solutions that can prevent misuse without compromising personal privacy,” says Marit Hansen, the Data Protection Commissioner for the German state of Schleswig-Holstein. According to her, there may not necessarily be a conflict between privacy and security.
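One family of such technical solutions is hash matching: a platform checks uploads against a list of digests of already-identified abusive material, without reading, storing, or disclosing anything else about a user’s content. The Python sketch below is a simplified, hypothetical illustration (the digest list and function names are invented for the example); real deployments use perceptual hashes such as PhotoDNA so that re-encoded copies are still detected.

import hashlib

# Hypothetical list of digests of known illegal material, supplied by a
# hotline or law enforcement. Only digests are stored, never the content.
KNOWN_ABUSE_DIGESTS = {
    "0000000000000000000000000000000000000000000000000000000000000000",  # placeholder
}

def digest(content: bytes) -> str:
    # Fingerprint an upload without retaining or exposing the upload itself.
    return hashlib.sha256(content).hexdigest()

def should_flag(upload: bytes) -> bool:
    # Flag only exact matches against the known-abuse list; everything else
    # passes through without being inspected further.
    return digest(upload) in KNOWN_ABUSE_DIGESTS

print(should_flag(b"an ordinary holiday photo"))  # False: nothing is revealed about it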

The GDPR provides a relatively well-developed solution.

The GDPR (General Data Protection Regulation) is a comprehensive data protection regulation enacted by the European Union (EU). It came into effect on May 25, 2018, and aims to protect the personal data and privacy of EU citizens. The GDPR is considered the strictest data protection law in the world: it sets a high benchmark for personal data protection and serves as a model for updates and reviews of data protection laws worldwide (“Countries with GDPR-Like Data Privacy Laws,” n.d.).

Moreover, within the GDPR we can also find a sound approach to serious cybercrimes:

According to the GDPR (2018), in specific circumstances law enforcement authorities can access the personal data of otherwise anonymous users for investigatory purposes. It must be emphasised that the GDPR stipulates that the processing of personal data must comply with the principles of lawfulness, fairness, and transparency, and must satisfy the necessity and proportionality requirements of a specific purpose. Where the purpose is preventing, investigating, detecting, or prosecuting criminal offences, enforcing criminal penalties, or maintaining public security, law enforcement authorities can legitimately process personal data under its provisions. In carrying out these tasks, they must adhere to the GDPR’s other requirements, such as the principles of data minimisation and purpose limitation, and must implement appropriate technical and organisational measures to ensure the security and privacy of personal data.
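As a concrete illustration of such technical and organisational measures, the sketch below shows one common pseudonymisation technique in Python: direct identifiers are replaced with keyed hashes (HMACs), so moderators and analysts only ever handle pseudonyms, while re-identification remains possible for authorities who lawfully obtain the key. The key name and record fields are invented for the example; this is not language from the GDPR itself.

import hmac
import hashlib

# Hypothetical secret key. In practice it would be held by a designated
# controller and released only under a lawful, documented procedure.
PSEUDONYMISATION_KEY = b"replace-with-a-securely-stored-key"

def pseudonymise(user_id: str) -> str:
    # Replace a direct identifier with a keyed hash (a stable pseudonym).
    return hmac.new(PSEUDONYMISATION_KEY, user_id.encode("utf-8"), hashlib.sha256).hexdigest()

# Moderators and analysts only ever see the pseudonym in flagged records...
record = {"user": pseudonymise("alice@example.com"), "action": "upload", "flagged": True}

# ...while re-identification requires the protected key, for example when a
# court order names a candidate account during a criminal investigation.
def matches(candidate_id: str, pseudonym: str) -> bool:
    return hmac.compare_digest(pseudonymise(candidate_id), pseudonym)

Under a design like this, day-to-day moderation satisfies data minimisation, and the extra step needed for re-identification maps onto the necessity and proportionality test described above.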

“We support freedom of speech and peaceful protest, but there is no place for terrorism on the platform. Our success in countering ISIS proves that you don’t have to sacrifice privacy for security. You can, and should, enjoy both.”

So declared Pavel Durov (2019) on Platform X. The balance between privacy and security will eventually find its answer.

References

Countries with GDPR-Like Data Privacy Laws. (n.d.). insights.comforte.com. Retrieved from https://insights.comforte.com/countries-with-gdpr-like-data-privacy-laws

CNN. (2016, February 23). Telegram founder: We’ll happily comply with terrorism investigations. *CNN*. Retrieved from https://edition.cnn.com/2016/02/23/europe/pavel-durov-telegram-encryption/

De Meyer, J. (1973). The Right to Respect for Private and Family Life, Home, and Communications in relations between individuals, and the Resulting Obligations for States Parties to the Convention. In A. H. Robertson (Ed.), *Privacy and Human Rights*. Manchester University Press.

Flew, T. (2021). *Regulating platforms*. Polity Press.

GDPR Info. (2018). Article 4 GDPR. GDPR Portal. Retrieved from https://gdpr-info.eu/art-4-gdpr/

Gillespie, T. (2018). *Custodians of the Internet: Platforms, Content Moderation, and the Hidden Decisions That Shape Social Media*. Yale University Press. https://doi.org/10.12987/9780300235029

Kim, R. (2022, May 27). Everything you need to know about the Nth Room case in ‘Cyber Hell’. *Netflix Tudum*. Retrieved from https://www.netflix.com/tudum/articles/everything-to-know-about-the-nth-room-case-in-cyber-hell

Lowy Institute. (2020, April 10). The “Nth Room” Case and Modern Slavery in the Digital Space. *The Interpreter*. Retrieved from https://www.lowyinstitute.org/the-interpreter/nth-room-case-modern-slavery-digital-space

Nocut News. (2020, March 24). [인터뷰] “저는 ‘박사방’ 중학생 피해자입니다” [Interview: “I am a middle school student victim of the ‘Doctor’s Room’”]. Retrieved from https://www.nocutnews.co.kr/news/5314308

Suzor, N. P. (2019). *Lawless: The Secret Rules That Govern our Digital Lives*. Cambridge University Press.

Tech Learn Easy. (n.d.). Telegram: One-to-one super secured video calling options for Android and iOS. Retrieved from https://techlearneasy.com/telegram-secured-video-calling/

Telegram. (n.d.). There’s illegal content on Telegram. How do I take it down? Frequently Asked Questions. Retrieved from https://telegram.org/faq#q-there-39s-illegal-content-on-telegram-how-do-i-take-it-down

UNODC. (n.d.). Privacy: What it is and why it is important. United Nations Office on Drugs and Crime. Retrieved from https://www.unodc.org/e4j/en/cybercrime/module-10/key-issues/privacy-what-it-is-and-why-it-is-important.html

United Nations. (2015). *Report of the Human Rights Council on its twenty-ninth session* (A/HRC/29/32). Retrieved from https://undocs.org/A/HRC/29/32

Van der Heijden, D. (n.d.). GDPR & ePrivacy Regulations. Retrieved from https://www.convert.com/GDPR/
