Cryptography: A Privacy Lock for the Digital Age

(Source: INTERNXT)

With the advent of the datafication era, the evolution of web technologies and the reach of major social media operators have brought about new modes of contact networks and data inclusion. The mobilization of health data, the use of cloud-based technologies, and the upgrading of search engines are all positive changes brought about by the data age (Nissenbaum, 2015). However, there is also controversy and protest behind this. The role of digital technologies in the online civic space is particularly important, yet in the web 2.0 period of the internet age it seems to be dominated by digital authoritarianism. Our understanding of the boundaries of individual rights, and of the extent to which personal information can be controlled, has been limited. The poor environment in which powerful parties continue to take advantage of user privacy has to be taken seriously.

This blog will begin with the operating principles of the operators behind social media networks, describing why the public’s right to know about the use of their personal data has spun out of control, and how the regulation and governance of the Internet has shifted. It will then suggest future regulatory models for the online space by describing the rationale of third-party platforms like Callisto, which use data encryption and emerging technologies to protect user privacy.

(Source: 3CR)

Before exploring privacy and digital rights, we should understand how social media networks work and why they need our private information. In the era of big data, the public’s private information has been turned into data. Essentially, what social media platforms need is to collect data and analyze the vast network of information behind it in order to create greater possibilities for profit and to expand their user reach. A cycle of social media operations is at work here. First, the platform attracts the attention of users and potential users with ads and buzz messages, because modern, customized content is essentially the core commodity offered by online platforms (Flew, 2021). They use data on click frequency and visitor numbers to infer the content each person is broadly interested in, and then facilitate specific interactions through filtered displays of content, e.g., comments, tags in photos, and interactions resulting from re-shares of content.

User interaction is fundamental to this cycle, while the type of content posted and the degree of its display after filtering is a technical setting in social media operations (Suzor, 2019). Because of these underlying rules, it is the platform that determines what users can do. Finally, platforms collect personal and private data through this process: for example, the user’s social circle, browsing preferences, movement patterns, and daily rhythms. Ultimately, algorithmic technology predicts and shapes user behavior, constantly providing us with relevant content that we like and promoting our engagement with and loyalty to the platform. Behind this cycle is an indirect or direct violation of our privacy rights, showing that commercial interest is at the core of the overall operation of the company behind the platform.

Digital Rights and Privacy Online:

Digital rights are the fundamental human rights that exist in online networks and the digital world. In contrast to the real-life right to privacy, which concerns control over personal information and selective access to it by others, digital rights also include the right to access information and the right to participate in digital culture. Digital rights need to ensure that individuals enjoy the same protections online as they do offline (Goggin et al., 2017).

Where does all the privacy leak from?

Privacy leaks begin precisely with users’ indifference and subconscious consent regarding their own behavior. Recall that after entering any website or social media network, we are asked to “accept all cookies?” and whether we “agree to be bound by the terms and conditions listed”. Does anyone actually read these terms and conditions, or do anything to protect their privacy beyond the basic settings? I believe the vast majority of people ignore the potential consequences of this step and choose to trust the platform provider completely. However, it is exactly this behavior that leads users to unknowingly give digital platforms the right and permission to violate our digital rights (Ramokapane et al., 2019). We are becoming accustomed to letting others use our information.

(Source: NPR)

The reason for this phenomenon is that, unlike the constitutional constraints and rules that govern society offline, in the web 2.0 era the operators of digital platforms have near-absolute power to set the rules, almost always in the service of commercial interests. The opacity of policy and of the content-review process is like a black box hidden from users. Terms and conditions of service are almost always presented to users as vague concepts in cumbersome, complex legal text. Before entering the platform, the user either accepts the onerous constraints listed or is denied access to the service (Flew, 2021). In effect, the user is given almost no rights in the terms of service and has no way to know whether the complex terms will be freely adjudicated without their knowledge (Suzor, 2019). From the moment user information is collected with permission, the ways it is misused are untraceable.

The need for an Internet governance and regulatory model is reflected in three main points:

  • To balance datafication and the freedom and privacy rights of Internet citizens
  • To ensure that privacy and data can coexist under proper use and management to create a good online environment
  • To promote the long-term positive development of the Internet and society, and to ensure transparency and accountability in policy implementation

In the digital era, as the Internet becomes more diverse, the boundaries of individual rights and privacy are gradually being blurred. The Internet is not limited to a small area but spans the entire world, and the continuing imbalance is extremely threatening to most areas, from digital media to politics. The diverse products of the Internet have revealed their monitoring and investigation of users. In social media, web tracking, data surveillance, and information capture are all deployed for specific purposes enabled by the development of technological systems. Users’ online behavior, identity construction, and other signals are the raw material social media refines to improve prediction systems and gauge consumer sensitivity (Nissenbaum, 2015). The way online media collect, track, and disseminate information about users derives from how users’ online identities and self-expressions are perceived in the different contexts in which they are located. Instead of maintaining users’ expectations and trust, online technologies choose to trade in sensitive information.

In the big data revolution, the tracking and predictive technologies brought about by datafication and data monitoring are the main conflict threatening digital rights such as control over electronic personal information. How to support the accuracy of data while protecting users’ privacy, and how to define the encryption, collection, and reliable use of data, are challenges that remain difficult to measure.

(Source: Clinical Lab)

Consider an example everyone has experienced first hand in recent years. During COVID-19, data surveillance was classified as necessary to contain the spread of the epidemic. Since the outbreak and throughout the long-lasting crisis, people around the world have been “legally” deprived of their privacy and freedom for an extended period. According to Sharma and Bashir (2020), in a survey of 50 apps developed for COVID-19 in the Google Play store, more than half required access to users’ mobile devices, whether common device access (contacts, media, camera, network access, etc.) or highly sensitive personal information (health data, geolocation, email addresses, etc.). Only a very small portion of the apps collecting user data anonymized it and ensured the encryption of online transmissions (Sharma & Bashir, 2020). Most disturbing is that such applications were given public trust and access with open government permission. It is unknown and unpredictable whether all this data will continue to be monitored and collected by governments to generate information networks and maps of activity after the special period is over.

Thus, the emphasis on individual rights and privacy, on public awareness, and on the definition of digital rights is essential to the freedom of Internet citizens. Under current regulations, it is difficult to truly realize the basic rights of online citizens in the digital age. However, a third-party platform called Callisto has cleverly demonstrated one of the best ways to achieve the coexistence of privacy and data without relying on government or technology-company regulation.

(Source: NPR)

What is Callisto and how does it work?

Callisto aims to create a platform for college campuses that supports victims of sexual assault and harassment in their search for solace and justice. The background to its creation is that perpetrators of sexual assault at universities are often repeat offenders, and no victim wants to be the first to come forward with a risky accusation. Victims choose to bear the painful memories themselves out of fear of the perpetrator and of damage to their reputation, thus avoiding secondary victimization. Based on this situation, the platform offers a way to create balance in an unfair social structure by helping victims join forces and reducing the potential for intimidation in the pursuit of justice (Bellos, 2022).

The novel technologies used by Callisto to protect privacy are called “privacy-enhancing technologies”, or PETs (Bellos, 2022). The platform’s key server combines several cryptographic components, including Shamir secret sharing, oblivious pseudo-random functions (OPRFs), symmetric encryption, and public-key encryption (Rajan et al., 2018). Cryptography is the most trusted key to protecting user privacy. Callisto asks victims to enter information such as personal details and identifying details about the perpetrator, and promises that its technologies and services anonymize and encrypt this highly sensitive information. When the same perpetrator’s name appears twice, the platform connects the two strangers to build trust without revealing any personal information (Bellos, 2022). Each will then have a third-party attorney, vetted by the platform, to host the case. This legal counsel is also the only third party who can decrypt the key. Remarkably, even the legal advisor cannot see the details of the incident or the identity information until two entries are matched to the same perpetrator. Access requires triple authentication (Rajan et al., 2018). Neither Callisto’s employees, data analysts, legal requests in litigation investigations, nor hackers who maliciously steal information can decrypt entry data at any stage of the process; all they end up with is a string of garbled code. The ultimate goal of such strict protection is to bring healing and justice to victims in the shadows.
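The matching idea behind Callisto can be illustrated with a toy 2-out-of-n Shamir secret sharing scheme: a single report (one share) reveals nothing about the secret, but two reports tied to the same perpetrator (two shares on the same line) allow it to be reconstructed. This is a minimal sketch only, not Callisto’s actual protocol, which additionally uses OPRFs and encryption (Rajan et al., 2018); all names and parameters here are illustrative.

```python
import random

# Toy 2-out-of-n Shamir secret sharing over a prime field.
# One share alone is statistically independent of the secret;
# any two shares recover it by interpolating the line at x = 0.

P = 2**127 - 1  # a Mersenne prime used as the field modulus

def make_shares(secret, n):
    """Split `secret` into n shares; any 2 recover it, 1 reveals nothing."""
    slope = random.randrange(1, P)  # random line through (0, secret)
    return [(x, (secret + slope * x) % P) for x in range(1, n + 1)]

def reconstruct(share1, share2):
    """Lagrange interpolation at x = 0 from two points on the line."""
    (x1, y1), (x2, y2) = share1, share2
    inv = pow(x2 - x1, -1, P)            # modular inverse of (x2 - x1)
    slope = ((y2 - y1) * inv) % P
    return (y1 - slope * x1) % P

# e.g. the secret could be a key that decrypts a victim's contact details
secret = 123456789
shares = make_shares(secret, 5)
assert reconstruct(shares[0], shares[3]) == secret
```

In the real system the line’s slope is derived from the perpetrator’s identifier via an OPRF, which is what makes two independent reports about the same person land on the same line without the server ever learning that identifier.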


Drawing on the operation of PETs in Callisto and how the platform as a whole works, encryption should be built into regulation. To avoid profit-oriented platforms and unpredictable government regulation, a third-party data-management body or regulator should be established to support the key-based protection of personal data. At the same time, to improve transparency and accountability in policy implementation, platforms should change from the start in order to raise awareness now: first, terms of use should be simplified into clearer language; second, users should be informed, and given concrete feedback, whenever a feature collects or shares data.

(Source: Educba)

In the transition from Web 2.0 to Web 3.0, the core idea is a focus on the ownership rights of the individual user. Supported by technologies such as personalization and artificial intelligence, the concept of the empowered individual gives control and ownership to the user and supports the digital economy. On this basis, it fundamentally rebalances the tense relationship between digital rights and the privacy of most users. Web 3.0 also aims to make the Internet more open and decentralized, so users will no longer be overly dependent on the Web for their personal data and information. Distributed ledger technology promises to shift the big-data model of web 2.0 toward users taking back ownership of their personal data (Vermaak, 2023). Encrypted-data regulation by individuals and by third-party institutions not involved in its use is shaping the future of Internet governance and privacy protection.
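The tamper-evidence property that makes distributed ledgers attractive for data ownership can be sketched in a few lines: each record commits to the hash of the one before it, so rewriting an earlier entry breaks every later link. This is a minimal toy illustration of hash chaining only, not a full distributed ledger (there is no network, consensus, or signing), and the record contents are invented for the example.

```python
import hashlib
import json

def make_block(data, prev_hash):
    """Append-only record that commits to its predecessor's hash."""
    body = {"data": data, "prev": prev_hash}
    digest = hashlib.sha256(
        json.dumps(body, sort_keys=True).encode()
    ).hexdigest()
    return {**body, "hash": digest}

# A tiny chain of hypothetical consent records.
genesis = make_block("user grants app read-access", "0" * 64)
second = make_block("user revokes read-access", genesis["hash"])

# Tampering with the first record no longer matches the link
# stored in the second, so the rewrite is detectable.
forged = make_block("user grants app FULL access", genesis["prev"])
assert forged["hash"] != second["prev"]
```

The same chaining principle, replicated across many independent machines, is what lets a ledger record who may access a user’s data without any single operator being able to quietly rewrite the history.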

To emphasize: the main problem is that our online space differs from real-life constitutional control, and there is no clear definition of digital platforms and digital rights, so privacy is constantly used without individuals’ knowledge. Privacy rights have been continually violated in the age of dataism, and it is hard to define what digital rights and privacy are amid vague, complicated terms of service and the black box of data. Thus, privacy needs to be protected and digital rights need to be clarified to balance the use of information.


Flew, Terry (2021) Regulating Platforms. Cambridge: Polity, pp. 72-79.

Suzor, Nicolas P. 2019. ‘Who Makes the Rules?’. In Lawless: the secret rules that govern our lives. Cambridge, UK: Cambridge University Press. pp. 10-24.

Sharma, T., & Bashir, M. (2020). Use of apps in the COVID-19 response and the loss of privacy protection. Nature Medicine, 26(8), 1165–1167.

Ramokapane, K. M., Mazeli, A. C., & Rashid, A. (2019). Skip, skip, skip, ACCEPT!!!: A study on the usability of smartphone manufacturer provided default features and user privacy. Proceedings on Privacy Enhancing Technologies,2019(2), 209–227.

Bellos, A. (2022, October 29). Can a new form of cryptography solve the internet’s privacy problem? The Guardian. Retrieved April 15, 2023, from

Rajan, A., Qin, L., Archer, D. W., Boneh, D., Lepoint, T., & Varia, M. (2018). Callisto: A cryptographic approach to detect serial predators of sexual misconduct. Proceedings of the 1st ACM SIGCAS Conference on Computing and Sustainable Societies.

Nissenbaum, H. (2015). Respecting context to protect privacy: Why meaning matters. Science and Engineering Ethics, 24(3), 831–852.

Goggin, G., Vromen, A., Weatherall, K., Martin, F., Webb, A., Sunman, L., & Bailo, F. (2017). Executive summary and digital rights: What are they and why do they matter now? In Digital Rights in Australia. Sydney: University of Sydney.

Vermaak, W. (2023, March 24). What is Web 3.0? Decentralized internet explained. CoinMarketCap Alexandria. Retrieved April 15, 2023, from
