Introduction
Nowadays, an increasing number of individuals use the internet to visit websites and download countless mobile applications. According to Petrosyan (2023), as of January 2023 there were 5.16 billion internet users worldwide, accounting for 64.4 percent of the global population. As internet services become more widely available, online privacy has become one of the biggest concerns of internet users (Flew, 2021).
This blog will discuss the definition of privacy and use Zoom as an example to help readers identify potential privacy risks. Even when these hazards are identified, however, consumers do not simply give up free online services to protect their privacy. In this unfair privacy-service deal, the consumer is a “contract taker”.
What is privacy?
A typical definition of privacy is “the right to be let alone” (Warren & Brandeis, 1890). Historically, the term “privacy” has been used to indicate an intrinsic human right, although it may be constrained in practice by competing powers, obligations, and norms (Flew, 2021).

In the context of the internet, as advances in internet technology have made our lives more convenient and colorful, the definition of privacy has evolved.
For example, we can pay with Apple Pay instead of carrying bank cards or cash. When we open social media apps, we can view the images our friends share at any time and from any location, and send text messages to friends and family without paying extra bills. When we open Google Maps, we know exactly how to reach our destination, which saves us time. However, these ubiquitous conveniences come at a cost, and that cost is our privacy. Recall that every time we open a freshly downloaded mobile app, we must first register for an account and provide our personal information before the app’s functions can be used. In the internet era, privacy has taken on economic properties and has been reinterpreted as a commodity that can be exchanged for perceived benefits (Campbell & Carlson, 2002). Users’ privacy is like a coin that can be exchanged for free online services.
Zoom
Zoom is an online communication platform. Its main services include online meetings, team chat, virtual workspaces, VoIP phone systems, and more. Zoom is commonly used in study and work environments. Since the outbreak of COVID-19, large numbers of students have taken classes remotely via video conferencing apps, and employees have worked from home. Zoom has undoubtedly been a winner. In December 2019, the number of Zoom users was 1 million. Four months later, the number of people attending virtual meetings via Zoom each day had risen to 300 million (Aiken, 2020). Today, even though people no longer consciously wear masks or keep social distance, Zoom has not vanished from public view. Just last week, I took an online course via Zoom.
Is Zoom, which we frequently use, safe? Does it invade our privacy?
Privacy and security concerns behind Zoom
Zoom has had numerous security and privacy concerns.

A security bug in the Mac Zoom client, discovered in 2020, allowed any website to add a visitor to a Zoom meeting without the host’s authorization (Ellis, 2021). This means that such websites could read information such as location, time, and meeting details. If you installed Zoom on your Mac, you could be spied on by third-party websites.
In addition, Zoom has leaked users’ personal data to other companies. In April 2020, Zoom was sued for allegedly disclosing user data to firms such as Facebook without fully informing users. When a user joined a Zoom conference call, Zoom sent Facebook a report containing a variety of personal information about the user, such as the device used and the user’s location (Brooks, 2020).
We “give up” our privacy
It is obvious that Zoom may raise security and privacy concerns. Not only Zoom, but most mobile apps, particularly social media platforms, have a significant risk of potential privacy leakage.
According to Surfshark (2022), the personal data of two internet users was compromised every second in the first quarter of 2022.

So, after knowing this, will we refuse to use these applications in order to protect our own privacy?
For most people, the answer is probably no. There is a privacy paradox here. According to Acquisti and Gross (2006), users are concerned about their privacy but take little action to protect it. This might be due to the privacy trade-off. The term “privacy trade-off” refers to the trade-off between free online services and user privacy (Flew, 2021). When deciding whether to disclose information, users are making a dynamic trade-off between the benefits and costs of providing personal information (Dinev & Hart, 2006). When the available benefits outweigh the costs of disclosing personal information, users will voluntarily give up some of their privacy in order to access a high-value online service.
For example, when I open Zoom to attend an online meeting, I must enter my name, email address, and cell phone number. Only after providing this information can I access the online conference room. All the personal information I submit is genuine, because the virtual meetings are used for school classes or business meetings. In other words, in exchange for the Zoom online meeting experience, I provide real personal information. It can be viewed as a “transaction”: a user or consumer gives personal information to a website or application in exchange for a free service (Norberg et al., 2007). If we imagine it as a traditional transaction, the user’s personal data acts as the currency.
An unfair “deal”
This trade is not fair.
Wottrich et al. (2018) argue that users face privacy trade-offs while using applications, yet they do not appear to be capable of making well-considered trade-offs. First, most consumers prioritize the service itself, and the cost of privacy is frequently pushed to the back burner. In particular, new users who are eager to access a service tend to focus on the app’s functionality rather than the personal data processing behind it (Aiken, 2020).
Second, there is an information gap between users and service providers. Even if users recognize that submitting personal information on a website may be risky, they do not know what information is collected and processed, or for what purposes. Users may be unaware of how releasing the information would affect their privacy (Li & Unger, 2012). The use of data is purposefully hidden. Users participate without knowing exactly what they are participating in, which Campbell and Carlson (2002) consider participation without informed consent. Because the end use of personal information is invisible, users cannot effectively evaluate the cost of privacy they have to pay.
Finally, users have no other options. It is extremely common for users to be unable to access a website or application unless they provide personal information. Users have to disclose part of their personal information in order to access the service. Campbell and Carlson (2002) describe the sense of “losing out” that merchants nurture in consumers: if consumers refuse to self-disclose, they forfeit the product or service that would have been provided to them and are excluded from the market’s reward. As Davies (1998) noted, merchants induce in consumers an “illusion of voluntariness” to conceal the fact that they are conducting surveillance.
As a result, in the privacy trade-off it seems that we actively give up our privacy by providing personal information. But in reality, we have no real choice in this unfair deal. As Gandy (1996) noted, the consumer is always a “contract taker”, not a “contract maker”.
Stop being naive! We are constantly at a disadvantage.
We have always been at a disadvantage in the unfair privacy-service deal, which has never changed.
(1) A Failed Democratic Experiment
In 2009, Facebook attempted to turn users into “contract makers”. Mark Zuckerberg, Facebook’s CEO, promised that any changes to the platform’s policies would be based on user feedback, and that users could directly participate in changes to its terms of service. However, changes would be adopted only if more than 30 percent of active users took part in the vote, meaning that at least 300 million people had to approve or reject a provision, which is nearly impossible. Facebook seemed to give users the chance to switch from “contract taker” to “contract maker”, but it set conditions for democratic revision of the terms that were extremely difficult to meet.
I have read some articles that praised Facebook for this democratic experiment. Leetaru (2019) argues that Facebook used to be a democracy, and that it was our failure to actively participate in voting that turned the firm into a dictatorship. In my opinion, this is the sense of “losing out” that I mentioned previously. In this situation, the market’s reward is democratic rights: after an invalid vote, we lost the ability to engage in changes to the terms of service. But we seem to forget that the threshold Facebook set was impossible to reach.
In any case, the result is that we are still “contract takers”.
(2) Do privacy terms mean greater security?
Have you ever noticed the privacy policies on a website or mobile application? Did you thoroughly read the privacy statement before agreeing to the terms? Will you believe a website because of its privacy policies?
A brutal truth is that privacy terms are written not for the user but for the merchant. Terms of service grant operators a wide range of rights, with the goal of protecting the company’s legitimate interests (Suzor, 2019).
Furthermore, privacy terms are typically ambiguous and confusing, and the rules are frequently too complicated (Turow & Nir, 2000). According to Cranor (2012), if a person read the privacy policies of every website he or she logged into in a year, it would take 244 hours. That is an astonishing number! Obviously, we do not have the time and energy to read the privacy policy of every new website we visit.
A related video: “How tech companies deceive you into giving up your data and privacy”.
In addition, most online privacy statements change frequently without user consent (Steinfeld, 2016). At the bottom of Zoom’s privacy statement, issued in February 2023, there is a note concerning modifications: Zoom will update the privacy statement regularly and post updated versions on its official website as soon as possible. That means we only agreed to the original privacy statement; we must spend more time reading each amended policy. And we do not know which sections of the Zoom privacy policy were changed. It is difficult for most users to discern the changes, since they are not marked on Zoom’s privacy page.
Also, Zoom’s Privacy Policy (2023) states, “We will notify you when there are important changes to the Privacy Statement, and you can choose whether or not to continue using our products.” If we consider the privacy modification process as a chain (proposing an update, confirming it, posting it, and notifying the user), there is no doubt that the user sits at the end of the chain. Only two options are in front of us: accept the terms or discontinue the service. Suppose we have been using Zoom for online virtual meetings for a year and are comfortable with it. One day, we receive an unexpected notification of an important change in the privacy policy and must decide whether to accept it. What would you choose? I believe most people would ignore the notification and continue using Zoom: because we have already enjoyed the service, the sense of “losing out” is stronger. As a result, we once again agree to the stipulations.
Conclusion
In conclusion, in the digital age privacy has taken on economic properties and can be used as currency in exchange for internet services. After a privacy trade-off, it seems that users actively give up their privacy in exchange for the online services they require. However, the privacy-service trade is unfair: users are always at a disadvantage, rule-receivers with no initiative.
References
Acquisti, A., & Gross, R. (2006, June). Awareness, information sharing, and privacy on the Facebook. Paper presented at the 6th Privacy Enhancing Technologies Workshop, Cambridge, England.
Aiken, A. (2020). Zooming in on privacy concerns: Video app Zoom is surging in popularity. In our rush to stay connected, we need to make security checks and not reveal more than we think. Index on Censorship, 49(2), 24–27. https://doi.org/10.1177/0306422020935792
Petrosyan, A. (2023, April 3). Global digital population 2022. Statista. https://www.statista.com/statistics/617136/digital-population-worldwide/
Brooks, K. J. (2020, April 1). Zoom sued for allegedly sharing users’ personal data with Facebook. CBS News. https://www.cbsnews.com/news/zoom-app-personal-data-selling-facebook-lawsuit-alleges/
Campbell, J. E., & Carlson, M. (2002). Panopticon.com: Online surveillance and the commodification of privacy. Journal of Broadcasting & Electronic Media, 46(4), 586–606.
Cranor, L. F. (2012). Necessary but not sufficient: Standardized mechanisms for privacy notice and choice. Journal on Telecommunications & High Technology Law, 10(2), 273.
Davies, S. (1998). Re-engineering the right to privacy. In P. Agre & M. Rotenberg (Eds.), Technology and privacy: The new landscape (pp. 143–165). MIT Press.
Dinev, T., & Hart, P. (2006). An Extended Privacy Calculus Model for E-Commerce Transactions. Information Systems Research, 17(1), 61–80. https://doi.org/10.1287/isre.1060.0080
Ellis, W. (2021, October 20). Zoom’s Privacy Issues Affecting Australian Businesses. Privacy Australia. https://privacyaustralia.net/zoom-privacy-issues-affect-australian-business/
Flew, T. (2021). Regulating Platforms. John Wiley & Sons.
Gandy, O. (1996). Coming to terms with the panoptic sort. In D. Lyon & E. Zureik (Eds.), Computers, surveillance, and privacy (pp. 132–155). University of Minnesota Press.
Leetaru, K. (2019, April 13). Facebook Was A Democracy 2009-2012 But We Didn’t Vote So It Turned Into A Dictatorship. Forbes. https://www.forbes.com/sites/kalevleetaru/2019/04/13/facebook-was-a-democracy-2009-2012-but-we-didnt-vote-so-it-turned-into-a-dictatorship/?sh=1da16c50657a
Li, T., & Unger, T. (2012). Willing to pay for quality personalization? Trade-off between quality and privacy. European Journal of Information Systems, 21(6), 621–642. https://doi.org/10.1057/ejis.2012.13
Norberg, P. A., Horne, D. R., & Horne, D. A. (2007). The privacy paradox: Personal information disclosure intentions versus behaviors. Journal of Consumer Affairs, 41(1), 100–126. https://doi.org/10.1111/j.1745-6606.2006.00070.x
Steinfeld, N. (2016). “I agree to the terms and conditions”: (How) do users read privacy policies online? An eye-tracking experiment. Computers in Human Behavior, 55(B), 992–1000. https://doi.org/10.1016/j.chb.2015.09.038
Privacy statement. (2023, February 24). Zoom. https://explore.zoom.us/en/privacy/
Surfshark. (2022, April 13). Data breach statistics by country: Q1 2022. Surfshark. https://surfshark.com/blog/data-breach-statistics-by-country
Suzor, N. P. (2019). Lawless : the secret rules that govern our digital lives. Cambridge University Press.
Turow, J., & Nir, L. (2000, May). The internet and the family, 2000: The view from parents, the view from kids (Report Series No. 33). Annenberg Public Policy Center.
Warren, S. D., & Brandeis, L. D. (1890). The right to privacy. Harvard Law Review, 4(5), 193–220.
Wottrich, V. M., van Reijmersdal, E. A., & Smit, E. G. (2018). The privacy trade-off for mobile app downloads: The roles of app value, intrusiveness, and privacy concerns. Decision Support Systems, 106, 44–52. https://doi.org/10.1016/j.dss.2017.12.003