Confirming privacy governance principles – WhatsApp’s 2021 policy changes


WhatsApp’s privacy policy was updated in early 2021 to inform users that some data would be shared with Facebook, WhatsApp’s parent company, which stated that the data would be used to help Facebook provide better friend suggestions, display more relevant ads, and improve the user experience of Facebook products. WhatsApp assures users, however, that messages and calls remain encrypted end-to-end, meaning that only the sender and receiver can see their contents. According to Sur and Goswami’s research (2021, p. 165), some respondents feared the privacy policy changes would affect them and were seeking alternatives such as Signal and Telegram. According to a blog post cited by Malekhosseini, Hosseinzadeh and Navi (2018, p. 1143), WhatsApp has well over a billion users worldwide, who exchange billions of messages, images, and videos every day. Meanwhile, according to a recent survey on popular perceptions of WhatsApp, the words “social,” “private,” and “security” are frequently associated with the app (Caetano et al., 2018; Zarouali, Brosius, Helberger and de Vreese, 2021, p. 253). Taking WhatsApp’s 2021 policy revision as an example, we will therefore summarise several digital stewardship concepts with practical implications, based on a discussion of privacy, security, and digital rights.

Privacy, security and digital rights:

One thing we need to examine before moving on is the notion of privacy itself. “Privacy” is founded on the belief that a private individual should be “let alone,” unnoticed and undisturbed by others (Warren & Brandeis, 1890; Marwick & Boyd, 2018, p. 1158). Privacy can further be defined as the right to be alone, the ‘right to show oneself selectively to the world,’ control over personal information, or even freedom from others’ judgement (Politou, Alepis, Virvou, & Patsakis, 2021, p. 8). Its key quality, according to Hiranandani (2011, p. 1092), is the individual’s ability to manage the flow of information to, about, or from her. I will not attempt a perfect definition here; rather, I want to emphasise that we can use this essential feature – control over information – as a coordinate for understanding the concept of privacy throughout this work. At the same time, privacy is a social artefact that must be understood within a specific social context (Politou, Alepis, Virvou, & Patsakis, 2021, p. 8). Because of ideological and policy disparities between countries, the privacy challenges confronting modern cultures are complex (Kenny & Korba, 2002, p. 648). Marwick and Boyd (2018, p. 1159) extend this notion of cultural variation across countries and languages to include differences within communities, based on sensitivity to context, subject location, and any given dynamic. Hence, the structure of context is a tool that can help us grasp and frame the issue in more detail. Contexts are shaped by the distinctive attributes of particular media, systems, or platforms and their material features (Nissenbaum, 2018, p. 836). This implies a structure that goes beyond individual interests: regardless of whether content is intercepted, exposed, or used via mediation techniques, the context should be “respected” (Nissenbaum, 2018, p. 849).
Finally, in this study privacy is viewed as a process of information flow within a given social context, in which we must examine not only the influence of the specific social setting in which it occurs, but also the systemic processes that surround the transmission itself.

Most people report having no control over their online privacy (Goggin et al., 2017, p. 1), and networked technologies are complicating these dynamics to the point where most people must choose between disclosure, concealment, and connection (Marwick & Boyd, 2018, p. 1158). This indicates a trend in which privacy and security concerns become increasingly intertwined in a data-driven society. Sur and Goswami (2021, p. 160) summarise cybersecurity concepts as follows: govern, defend, detect, and respond. This necessitates appropriate technical or organisational safeguards when handling personal information to prevent accidental, unauthorised, or unlawful access, use, alteration, disclosure, loss, destruction, or damage (Akinsanmi and Salami, 2021, p. 2). In this context, it is widely assumed that privacy and security are incompatible. Akinsanmi and Salami (2021, p. 4), on the other hand, argue that a balance between privacy and health security can be achieved. Given states’ emphasis on security, Hiranandani (2011, p. 1096) observed that, far from being in conflict with security, privacy actually strengthens governments’ ability to defend the common good.

Privacy safeguards an individual’s personal sphere, where personal speech and activities can be freely expressed. The right to privacy is therefore at the heart of all fundamental freedoms and serves as the foundation for other human rights and liberties (Hiranandani, 2011, p. 1092). When contemplating privacy, then, we must evaluate it in connection with human rights. According to Petley (2017, p. 87), freedom of political debate lies at the core of the concept of a democratic society that pervades the entire Convention. This empowers individuals to safeguard their right to be themselves, as well as their right to exist independently of the state (Hiranandani, 2011, p. 1092). As discussed above, many of the privacy challenges we confront stem from the Internet’s networked digital domain (Nissenbaum, 2018, p. 835). Personal electronic devices can store an unprecedented quantity of information about a person in ways that benefit their daily lives (Losavio and Keeling, 2014, p. 198). In line with our definition of privacy, the internet is now the primary location through which information flows are transmitted, and so human rights connected to privacy may need to grow with the times and expand into the digital sphere.

Case analysis:

Returning to the WhatsApp example, one of the more contentious modifications in the 2021 terms is that some data would be shared with Facebook, WhatsApp’s parent firm: user account information (such as phone numbers and profile names), device information, and transactional data. Combined with our account of privacy, this indicates corporate control over the transmission of people’s information flows, and it is critical that we develop standards to govern this process. Such governance, according to Filimowicz (2022, p. 59), defines the roles, responsibilities, and processes associated with data generation, collection, processing, and protection. In this section, we attempt to summarise four principles in the context of this specific case, to serve as practical guidance.

According to Marwick and Boyd (2018, p. 1158), the technology sector frequently portrays its products as an exchange of personal information by those eager to disclose it in return for benefits. Third-party data gathering, as Mayer and Mitchell (2012) put it, “supports free material on the web and promotes online innovation, but it comes at the expense of privacy” (p. 413; Filimowicz, 2022, p. 63). According to WhatsApp, the shared data will be used to help Facebook provide better friend suggestions, display more relevant adverts, and improve the user experience of Facebook products – framing the arrangement as just such an exchange. In the context of this mistrust, we might begin by proposing a concept to help the two parties reach an agreement. According to Access Now (2019, p. 12), independent authorities and effective enforcement procedures should be developed. This can be summed up as an accountability principle: comprehensive processes for handling personal information, ensuring accountability, and punishing those who misuse their authority through unchecked surveillance must be put in place (Hiranandani, 2011, p. 1097).

On the other hand, there is some fear that this behaviour produces a chilling effect. Without additional information and openness, WhatsApp’s arrangement may give clients a false sense of security (Kenny and Korba, 2002, p. 267). Customers may be unaware of these brands’ promotional aims, and as a result consumers’ personal information may be sold to other companies, merged, or repurposed (Zarouali, Brosius, Helberger and de Vreese, 2021, p. 253). The notion of transparency becomes critical in this situation. According to Access Now (2019, p. 4), governments and policymakers must guarantee that data protection rules are negotiated in a transparent, open, and inclusive manner. The basic cause of this phenomenon, according to Filimowicz (2022, p. 63), is that the industry’s business model is predicated on privacy asymmetries. This also emphasises the connection between the principles: an accountable institution would surely play a part in this transparency.

Companies like WhatsApp frequently respond to these issues with a lengthy and unassailable privacy clause. WhatsApp did the same with this policy modification, offering an “opt-out” policy to anyone who wants to continue using the service. This system, which is “built on the value of data and information” (van Dijck, 2014, p. 199), takes advantage of consumers’ vulnerabilities, shortcomings, and emotions (Filimowicz, 2022, p. 69). In most cases, users attempt to strike a balance between privacy concerns and information sharing (Malekhosseini, Hosseinzadeh and Navi, 2018, p. 1162), which suggests that users retain some control over the situation. Albesher and Alhussain’s study (2021, p. 251), on the other hand, found that the average social media user does not adjust his or her security settings. As a result, the notion of autonomy in this study emphasises usability in practice: whether programmes provide their users with adequate information to make informed decisions about who can access and share their data (Albesher and Alhussain, 2021, p. 251).

This final principle might be viewed as the strongest form of assurance. According to Access Now (2019, p. 10), in the face of privacy issues we must establish protection mechanisms in the event of leakage, which can also be read as a meaningful response to WhatsApp’s new behaviour. In their earlier WhatsApp research, Mars, Morris and Scott (2019, p. 525) highlighted one precaution users should be aware of: as an added safeguard, people are removed from WhatsApp groups when they leave the department. This can be viewed as a specific practice of forgetting. The same applies to autonomy: consent must be easily withdrawn rather than inferred through inaction (Politou, Alepis, Virvou, & Patsakis, 2021, p. 20). Together, these can be summarised as a right of withdrawal and a right to be forgotten.


Despite emphasising, in the definition of privacy at the outset, the importance of social contextual variation, several of the conclusions summarised in this study are still based on Western thinking. Even in an Australian study, there was and still is no universal support for North American-style free expression (Goggin et al., 2017, p. 2). We must remember that self-determination is a basic First Nations principle, and that the dominant Western logic of privacy does not always promote Indigenous cultural values and traditions, but rather functions as a means of reconciliation under distinctly Western ideals (Marwick and Boyd, 2018, p. 1163). At the same time, the emergence of COVID-19 has placed mobile apps at the centre of efforts to contain the virus’s propagation, since they can satisfy the demand for rapid contact tracing (Politou, Alepis, Virvou, & Patsakis, 2021, p. 165). To guard against an “unknown and fast changing” virus, certain technologies acquire a vast quantity of personal information, including location, travel, and personal health data (Akinsanmi and Salami, 2021, p. 1); they are used to monitor, track, and regulate the virus’s spread. Several of these safeguards, however, encourage a trade-off between privacy and security. In the post-epidemic period, we must pay attention to new developments in privacy issues as well as the adaptation of regulatory principles.


Through explanation and illustration, this essay seeks to glean some digital governance principles from WhatsApp’s 2021 modifications to its privacy policy. First, we discuss privacy, security, and digital rights. We do not seek to define privacy directly, but rather highlight its qualities to aid comprehension, and we demonstrate how it is intertwined with security and digital rights in today’s society. From the concrete case we then derive accountability, transparency, feasible autonomy, and the right to be forgotten. We conclude by emphasising that the framework of this paper still corresponds, to a considerable extent, to a Western model of discussion. In terms of geography, we may need to consider distinctions in social context, notably national and regional boundaries. In terms of time, the changes brought about by COVID-19 were undeniably tremendous, and debates must be centred on the specific social environment of the post-epidemic era.

(Word count: 2089)

Reference list:

Access Now. (2019). Data Protection Guide for Lawmakers.

Akinsanmi, T., & Salami, A. (2021). Evaluating the trade-off between privacy, public health safety, and digital security in a pandemic. Data & Policy, 3.

Albesher, A. S., & Alhussain, T. (2021). Evaluating and Comparing the Usability of Privacy in WhatsApp, Twitter, and Snapchat. International Journal of Advanced Computer Science & Applications, 12(8), 251–259.

Filimowicz, M. (2022). Data Privacy in Digital Advertising: Towards a Post-Third-Party Cookie Era. In Privacy. Taylor & Francis Group.

Goggin, G., Vromen, A., Weatherall, K., Martin, F., Webb, A., Sunman, L., & Bailo, F. (2017). Digital Rights in Australia.

Hiranandani, V. (2011). Privacy and security in the digital age: contemporary challenges and future directions. The International Journal of Human Rights, 15(7), 1091–1106.

Kenny, S., & Korba, L. (2002). Applying digital rights management systems to privacy rights management. Computers & Security, 21(7), 648–664.

Losavio, M., & Keeling, D. (2014). Evidentiary Power and Propriety of Digital Identifiers and the Impact on Privacy Rights in the United States. The Journal of Digital Forensics, Security and Law, 9(2), 197–204.

Malekhosseini, R., Hosseinzadeh, M., & Navi, K. (2018). Evaluation of users’ privacy concerns by checking of their WhatsApp status. Software, Practice & Experience, 48(5), 1143–1164.

Mars, M., Morris, C., & Scott, R. E. (2019). WhatsApp guidelines – what guidelines? A literature review. Journal of Telemedicine and Telecare, 25(9), 524–529.

Marwick, A. E., & Boyd, D. (2018). Understanding Privacy at the Margins: Introduction. International Journal of Communication (Online), 1157–1165.

Nissenbaum, H. (2018). Respecting Context to Protect Privacy: Why Meaning Matters. Science and Engineering Ethics, 24(3), 831–852.

Politou, E., Alepis, E., Virvou, M., & Patsakis, C. (2021). Privacy and Data Protection Challenges in the Distributed Era (Vol. 26). Springer International Publishing AG.

Petley, J. (2017). Human Rights and Press Law. In The Routledge Companion to Media and Human Rights (1st ed., pp. 83–94). Routledge.

Sur, S., & Goswami, K. (2021). Digital Privacy: Case Study Analysis on WhatsApp Privacy Policy Changes. International Journal of Applied Science and Engineering, 9(2), 157–167.

Zarouali, B., Brosius, A., Helberger, N., & de Vreese, C. H. (2021). WhatsApp marketing: A study on WhatsApp brand communication and the role of trust in self-disclosure. International Journal of Communication, 15, 252–276.
