Privacy Paradox: The Tug of War Between Privacy and Rights

In the early years of digital platforms, users were drawn to their freedom and openness and paid little attention to privacy issues (Russell, 2024b). As digital platforms have become an integral part of our daily lives, perceptions of privacy have begun to change. Privacy has gradually evolved from a simple notion of “private spaces to practice personal activities” (Erica Longfellow, 2001) into a complex human right.

In recent years, a growing number of data breaches and misuses of personal information has raised privacy awareness (Joanne Hinds, 2020). The growth of digital platforms has pushed privacy issues to the forefront of debates about the Internet. Yet despite growing concern about privacy breaches, people continue to use digital platforms in their daily activities – illustrating the privacy paradox. This blog explores the privacy paradox: the complex process by which individuals weigh the benefits of using digital platforms against potential privacy risks. By examining TikTok’s case and the reasons behind the privacy paradox, we aim to ease users’ fears about using platforms and give them a clearer understanding of their digital rights.

Privacy Paradox 

Users rely heavily on these platforms while remaining worried about privacy breaches. This mismatch between individuals’ stated desire for privacy and their actual behaviour is what we call the privacy paradox (Susanne Barth & Menno D.T. de Jong, 2017). The “privacy paradox” suggests that although Internet users are concerned about privacy breaches, this concern rarely translates into action (Taddicken, 2014).

Beresford et al. (2012) conducted a field experiment in which participants were asked to purchase identical DVDs from one of two competing shops. The first shop asked for income and date of birth; the second asked for favourite colour and year of birth. When prices were the same, participants bought from the two shops in almost equal numbers. When the first shop’s price was set one euro lower, almost all participants chose the cheaper shop, even though it asked for more personal information (Alastair R. Beresford & Dorothea Kübler, 2012). In a privacy questionnaire administered after the experiment, 95% of participants indicated they were interested in privacy issues, and 75% indicated they were very interested. Beresford’s experiment is a vivid example of the privacy paradox: users tend to share private information in exchange for commodity value and personalised services, even when they are aware of the privacy risks (Barth & De Jong, 2017).

Users are willing to sacrifice privacy

TikTok, owned by the Chinese company ByteDance, is currently one of the world’s most popular platforms. In recent years it has come under scrutiny in several countries over privacy issues and the possible sharing of user data with the Chinese government. In March 2024, the United States passed legislation restricting TikTok (Wikipedia contributors, 2024).

“TikTok’s popularity shows no signs of waning, even as some users voice concerns about the app’s potential threat.”

(Ivana Saric, 2023)

According to a survey by the Pew Research Centre (a nonprofit think tank), more than 40% of Americans who believe TikTok poses a threat to national security still use the app (Pew Research Center, 2023). Young consumers, in particular, are afraid of missing out on the connections TikTok provides (Ivana Saric, 2023). In TikTok’s case, users recognise that their privacy is at risk but still choose to use the platform for its social connection and entertainment value. This decision reflects a privacy trade-off: users judge that the benefits of TikTok outweigh the privacy risks.

Why does it happen?

This privacy paradox highlights a contradiction: our privacy concerns do not match our actions. Why do we continue to share personal information even as we worry about privacy breaches? Here are five reasons behind the privacy paradox:

  • Irrational decision-making: Users do not always follow a rational decision-making process when disclosing their personal information; instead, they rely on heuristics, i.e. mental shortcuts that simplify the problem (Alessandro Acquisti & Jens Grossklags, 2006). Humans have limited cognitive energy and cannot rationally assess every possible risk (Sakhhi Chhabra, n.d.).
  • Cognitive bias: One common cause of the privacy paradox is individuals’ cognitive biases about themselves, which give them unrealistic optimism. “Optimism bias is a cognitive bias that causes someone to believe that they are less likely to experience negative events” (Wikipedia Contributors, 2023). This overconfidence leads people to overestimate the potential benefits and underestimate the risk of information breaches (Baek, Kim, and Bae, 2014).

“Why would anyone want to misuse my information, I’m not so important. What can happen to me as an individual?”

(The Privacy Paradox: Why Do People Share Their Data?, 2023c)
  • Information asymmetry: Information asymmetry between platforms and users often shapes privacy decisions. Many users do not fully understand how companies use the data they share (The Privacy Paradox: Why Do People Share Their Data?, 2023c). Bandara et al. (2017) found that information asymmetry is a major issue in the current digital market. For example, you must agree to Zoom’s privacy policy before using the service, yet it is difficult for users to read such complex and lengthy policies in full. These policies include clauses on “collecting personal and meeting data”, which carry potential privacy risks (Bluvshtein, 2022).
  • FOMO: Fear of missing out on social interactions (FOMO) played a significant role in the privacy paradox. FOMO is characterized by a desire to maintain a constant connection with what others are doing (Przybylski et al., 2013). The widespread popularity of TikTok created a form of social connection. Although users may recognize privacy concerns, they still choose to stay on the platform due to the influence of social connections with peers and celebrities.
  • Violation of contextual integrity: Helen Nissenbaum developed a theory of privacy, “contextual integrity”, in her book Privacy in Context: Technology, Policy, and the Integrity of Social Life (Privacy in Context: Technology, Policy, and the Integrity of Social Life, n.d.).

According to the theory of contextual integrity (CI), privacy norms prescribe information flows with reference to five parameters — sender, recipient, subject, information type, and transmission principle.

(Helen Nissenbaum, 2019)

This theory helps explain the privacy paradox. Users assume that platforms process their information in ways consistent with their expectations; in reality, platforms may leak or misappropriate it. Such behaviour clearly violates contextual integrity, because passing information on to others breaches the confidentiality principle governing its transmission. Violations of contextual integrity feed the privacy paradox.
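Nissenbaum’s five parameters can be modelled quite directly. The sketch below is a hypothetical illustration (the flows, norms, and function names are my own, not from the theory itself): an information flow is a tuple of the five parameters, and a flow violates contextual integrity when no norm the user expects permits it.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class InformationFlow:
    """One information flow, described by Nissenbaum's five parameters."""
    sender: str
    recipient: str
    subject: str
    information_type: str
    transmission_principle: str

def violates_contextual_integrity(flow: InformationFlow, norms: set) -> bool:
    # A flow is a violation if no expected contextual norm permits it.
    return flow not in norms

# The norm a user plausibly expects: viewing history stays with the platform.
expected_norms = {
    InformationFlow("user", "platform", "user", "viewing history", "confidentiality"),
}

expected = InformationFlow("user", "platform", "user",
                           "viewing history", "confidentiality")
# The flow that actually happens when data is passed to a third party.
actual = InformationFlow("platform", "third-party advertiser", "user",
                         "viewing history", "confidentiality")

print(violates_contextual_integrity(expected, expected_norms))  # False
print(violates_contextual_integrity(actual, expected_norms))    # True
```

The mismatch between the two results is exactly the gap the paradox lives in: the flow users consent to is not the flow that occurs.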

We need referees: Coping with the privacy paradox

1. Legal Framework

An effective way to meet the challenge of the privacy paradox is to develop a comprehensive legal framework. The European General Data Protection Regulation (GDPR) is a comprehensive data protection law designed to protect the data privacy rights of all individuals within the European Union. The GDPR establishes a set of strict data processing principles that ensure transparency in how platforms store, process, and transmit data, and that give individuals more control over their data. Here are the GDPR’s principles related to personal data protection:

  • The principle of data minimization: According to Article 5 of the GDPR, only the minimum amount of data necessary to complete the processing activity may be collected, avoiding the collection of unnecessary personal data (Art. 5 GDPR – Principles Relating to Processing of Personal Data, 2021).
  • Consent requirements: Under Article 7 of the GDPR, individuals must explicitly consent to data processing, and data processors must be able to prove that individuals have consented to the processing of their data.
  • Transparency requirements: Articles 12 to 14 of the GDPR require data processors to provide individuals with transparent information, in particular details about the collection, use and processing of data.
  • Rights of data subjects: Articles 15 to 20 of the GDPR detail the rights of individuals, including access to their data, rectification of inaccurate data, erasure of data under certain conditions (the “right to be forgotten”), restriction of processing, and data portability.
  • Data Protection Impact Assessment (DPIA): Article 35 of the GDPR requires an assessment to be carried out before processing data that may pose a high risk to the rights of individuals.
  • Appointment of a Data Protection Officer (DPO): Articles 37 to 39 of the GDPR require some organisations to appoint a Data Protection Officer. The DPO is responsible for monitoring compliance with the GDPR and acts as a liaison between the organisation and supervisory authorities.
  • Cross-border data transfers: Articles 44 to 50 of the GDPR set out the conditions for cross-border data transfers, requiring any transfer of personal data to a country outside the EU to ensure that the receiving country provides adequate data protection.
  • Penalties for non-compliance: Articles 83 and 84 of the GDPR provide for fines for non-compliance to ensure that organisations comply with the law.
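The data-minimization principle above is easy to picture in code. The sketch below is a minimal, hypothetical illustration (the purposes, field names, and `minimise` helper are invented for this example, not part of the GDPR): each processing purpose declares the fields it needs, and everything else is dropped before storage.

```python
# Hypothetical mapping from a processing purpose to the fields it
# actually requires -- the data-minimization idea of GDPR Art. 5.
PURPOSE_FIELDS = {
    "shipping": {"name", "address"},
    "newsletter": {"email"},
}

def minimise(record: dict, purpose: str) -> dict:
    """Return a copy of the record containing only the fields
    required for the stated processing purpose."""
    allowed = PURPOSE_FIELDS[purpose]
    return {k: v for k, v in record.items() if k in allowed}

signup = {
    "name": "Ada",
    "email": "ada@example.com",
    "address": "1 Main St",
    "date_of_birth": "1990-01-01",  # collected, but needed by no purpose
}

print(minimise(signup, "newsletter"))  # {'email': 'ada@example.com'}
print(minimise(signup, "shipping"))    # name and address only
```

A field like `date_of_birth` that no declared purpose requires never reaches storage at all, which is the point of the principle.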

2. Empowering Users with Control

However, relying on legal provisions alone is not enough to resolve the privacy paradox; increasing users’ control over their privacy is equally important in this tug of war (Amiseq, 2024). Here are some simple steps:

  • Setting permissions in apps: For example, in WeChat, open “Settings” and go to “Personal Information and Permissions” to manage system permissions. In Instagram, find “Your information and permissions” in Settings, go to “Save search history”, and change the default retention period to a set number of days; after that period, the system clears the records automatically.
  • Turn off remote access: Disable remote access on your digital devices (Tim Miller & Jeannie Marie Paterson, 2020). For example, on Windows, open “System and Security”, go to “Remote Settings” under “System”, and, under Remote Desktop, select “Do not allow connections to this computer” to block remote access.
  • Regular cookie clearing: Configure your browser to clear cookies and browsing data when you close it. For example, Google Chrome’s settings let users delete or block existing cookies. Chrome also offers Incognito mode, which deletes your browsing history and clears cookies on your device after you close all Incognito windows (How Google Uses Cookies – Privacy & Terms – Google, n.d.).

3. Emerging Technologies

The role of blockchain

The rise of blockchain technology offers a way to balance transparency and data protection. With blockchain, data can be made publicly available without revealing personal information: information is encrypted and dispersed throughout the network, providing a high level of security and protection. Blockchain can help prevent privacy breaches while supporting platforms’ compliance with the GDPR (Md Mehedi Hassan Onik & Chul-Soo Kim, 2019). Here are three of its strengths:

  • Efficient storage management: Thanks to the chain data structure, users can delete the personal information they have stored on the chain while the remaining information stays linked together.
  • Security: Another characteristic of blockchain is that data stored on the chain cannot be modified undetected: every participant on the chain is notified of changes, and modified data will not pass verification because the nodes cross-verify the chain.
  • Transparency and authentication: Everything related to personal data is transparent, as the blockchain stores data and broadcasts every modification to everyone on the chain, secured with encrypted keys.
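The tamper-evidence described above comes from each block committing to the hash of the block before it, so changing any stored record breaks verification. The minimal sketch below is an illustration of that mechanism only (the record strings and function names are hypothetical, and a real blockchain adds consensus, signatures, and networking on top):

```python
import hashlib
import json

def block_hash(contents: dict) -> str:
    # Hash the block's contents, including the previous block's hash,
    # so each block commits to the entire chain before it.
    payload = json.dumps(contents, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

def append_block(chain: list, data: str) -> None:
    prev = chain[-1]["hash"] if chain else "0" * 64
    block = {"data": data, "prev_hash": prev}
    block["hash"] = block_hash({"data": data, "prev_hash": prev})
    chain.append(block)

def chain_is_valid(chain: list) -> bool:
    for i, block in enumerate(chain):
        expected = block_hash({"data": block["data"],
                               "prev_hash": block["prev_hash"]})
        if block["hash"] != expected:
            return False  # block contents were modified after hashing
        if i > 0 and block["prev_hash"] != chain[i - 1]["hash"]:
            return False  # link to the previous block is broken
    return True

chain = []
append_block(chain, "Alice consents to data processing")
append_block(chain, "Bob revokes consent")
print(chain_is_valid(chain))           # True
chain[0]["data"] = "Alice consents to everything"  # tampering attempt
print(chain_is_valid(chain))           # False: the change is detected
```

Because every node can rerun `chain_is_valid` independently, a platform cannot quietly rewrite a consent record after the fact, which is the cross-verification property the bullet points describe.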


Digital giants act as gatekeepers of the digital world (Flew, 2021); they should comply with laws and regulations and collaborate with different sectors to achieve co-governance. The privacy paradox is a complex issue, and we need to find a balance between enjoying platforms’ benefits and protecting our digital rights. Despite privacy concerns, we remain willing to share our data because of trust, social habits, and technology (The Privacy Paradox: Why Do People Share Their Data?, 2023b). By understanding the reasons behind these issues and learning more about our digital rights, we can make better decisions about how our personal information is used.

Reference list:

1. Russell, R. (2024, February 7). Unveiling the digital dilemma: Navigating the controversial realm of privacy in the digital age. Medium.

2. Longfellow, E. (2001). Public, private, and the household in early seventeenth-century England. Journal of British Studies.

3. Hinds, J. (2020). “It wouldn’t happen to me”: Privacy concerns and perspectives following the Cambridge Analytica scandal. International Journal of Human-Computer Studies.

4. Barth, S., & de Jong, M. D. T. (2017). The privacy paradox – Investigating discrepancies between expressed privacy concerns and actual online behaviour – A systematic literature review. Telematics and Informatics, 34(7), 1038–1058.

5. Taddicken, M. (2014). The “privacy paradox” in the social web: The impact of privacy concerns, individual characteristics, and the perceived social relevance on different forms of self-disclosure. Journal of Computer-Mediated Communication.

6. Beresford, A. R., Kübler, D., & Preibusch, S. (2012). Unwillingness to pay for privacy: A field experiment. Economics Letters, 117(1), 25–27.

7. Wikipedia contributors. (2024, March 31). Restrictions on TikTok in the United States. Wikipedia.

8. Saric, I. (2023, July 13). Why TikTok retains its popularity despite users having security concerns. Axios.

9. Pew Research Center. (2023, July 10). Majority of Americans see TikTok as a national security threat. Pew Research Center.

10. Acquisti, A., & Grossklags, J. (2006). What can behavioural economics teach us about privacy? ETRICS.

11. Chhabra, S. (n.d.). Why does privacy paradox exist? A qualitative inquiry to understand the reasons for privacy paradox among smartphone users. Journal of Electronic Commerce in Organizations.

12. Wikipedia contributors. (2023, December 20). Optimism bias. Wikipedia.

13. Baek, Y. M., Kim, E. M., & Bae, Y. (2014). My privacy is okay, but theirs is endangered: Why comparative optimism matters in online privacy concerns. Computers in Human Behavior, 31, 48–56. doi:10.1016/j.chb.2013.10.010

14. The privacy paradox: Why do people share their data? (2023c, July 3).

15. Bandara, R., Fernando, M., & Akter, S. (2017). The privacy paradox in the data-driven marketplace: The role of knowledge deficiency and psychological distance. Procedia Computer Science, 121, 562–567. doi:10.1016/j.procs.2017.11.074

16. Bluvshtein, C. (2022, September 26). The 20 most difficult to read privacy policies on the internet.

17. Przybylski, A. K., Murayama, K., DeHaan, C. R., & Gladwell, V. (2013). Motivational, emotional, and behavioural correlates of fear of missing out. Computers in Human Behavior, 29(4), 1841–1848. doi:10.1016/j.chb.2013.02.014

18. Nissenbaum, H. (2019). Contextual integrity up and down the data food chain. Theoretical Inquiries in Law, 20(1), 221–256.

19. Art. 5 GDPR – Principles relating to processing of personal data. (2021, October 22). General Data Protection Regulation (GDPR).

20. Amiseq. (2024, February 9). Navigating the privacy paradox: Balancing personalization and protection.

21. Miller, T., & Paterson, J. M. (2020, May 12). The privacy paradox: Why we let ourselves be monitored.

22. How Google uses cookies. (n.d.). Privacy & Terms – Google.

23. Onik, M. M. H., & Kim, C.-S. (2019). Privacy-aware blockchain for personal data sharing and tracking.

24. Flew, T. (2021). Regulating Platforms. Polity Press.

25. The privacy paradox: Why do people share their data? (2023b, July).
