Protecting Your Digital Self: The Importance of Data Privacy in the Information Age

What’s your first thought when you hear about social media? Do you spend most of your day browsing? Does it ever occur to you that your privacy is at risk while you pass the time, meet new people, stay up to date on current events, and gather social information (Quinn, 2016)? Researchers have sought to understand how the use of these platforms intersects with its effects, and to identify potential impacts on relationships, social goals, and valued outcomes such as privacy and sociality (Quinn, 2016). When I asked the people around me, the vast majority were concerned about data leakage but did not know how to protect their data; only a few said they did not care about their data being compromised. To protect user data, governments have issued a series of laws and regulations, but these still contain omissions that leave users exposed. In my opinion, users should understand the importance of data privacy and the serious consequences of data leakage, and strengthen their awareness of data protection. In addition, governments should give more weight to users’ security concerns and strengthen data protection laws accordingly.

What is privacy?

To protect data privacy, we first need to understand what privacy is and why it is important. Privacy is a fundamental right that is necessary for autonomy and the protection of human dignity, and it serves as the foundation for many other human rights (What is privacy? 2017). It entails the right to control access to personal information, the right to choose how personal information is collected, used, and disclosed, and the right to be free of unwanted intrusion, surveillance, or interference in personal matters (What is privacy? 2017). Since the advent of technologies such as the global positioning system (GPS) and social networking on the Internet, privacy has gained increasing attention in the legal system (Acquisti et al., 2013). Data leaks at government agencies might give hostile countries access to sensitive information; breaches at businesses could give rivals access to confidential information; and breaches at schools could give criminals access to student data for identity theft. A breach at a hospital or doctor’s office can expose sensitive medical information to anyone who might misuse it. Data privacy is therefore essential for safeguarding sensitive information, fostering trust, and complying with the law. Privacy rules enable us to assert our rights in the face of significant power imbalances (What is privacy? 2017). Prioritizing data privacy and taking appropriate precautions to protect sensitive information should be a top priority for all parties: individuals, businesses, and governments.

Nissenbaum’s privacy framework helps explain how privacy works in different social contexts. It is a privacy principle that derives privacy norms from the context in which personal information is shared or used (Doyle, 2011). Privacy in Context has three parts: (1) “Information Technology’s Power and Danger,” (2) “Critical Assessment of Prevalent Approaches to Privacy,” and (3) “The Framework of Contextual Integrity” (Doyle, 2011). According to Nissenbaum’s theory, it is critical to consider the context in which personal information is shared or used, the norms and expectations of the group involved, and the transmission principles that govern these flows (Doyle, 2011). This means that privacy expectations may vary depending on the context: what is considered private in one context may not be in another. Nissenbaum also argues that privacy is shaped by these contexts as a social norm, rather than being an absolute right or virtue (Doyle, 2011). Privacy expectations are thus not just a matter of individual preference but are influenced by the broader social and cultural context in which they arise. By considering these factors, individuals and organizations can make informed decisions about how to collect, use, and share personal information in a way that respects individuals’ privacy rights and upholds social norms of trust, respect, and confidentiality.

The Facebook-Cambridge Analytica scandal: 50 million profiles revealed


Individuals and organizations face ongoing challenges in protecting data and mitigating security threats as society becomes increasingly digital (Hinds et al., 2020). One prominent data protection case is the Facebook-Cambridge Analytica scandal (Cadwalladr & Graham-Harrison, 2018). In 2018, it came to light that the political consulting firm Cambridge Analytica had improperly collected data from millions of Facebook users to influence the 2016 US presidential election. Facebook gave a third-party app access to the information of people who had taken a personality test on the platform, but the app also gathered data about those users’ Facebook friends, in violation of Facebook’s terms of service. Cambridge Analytica then used this information to produce carefully targeted political ads to sway voters. The controversy provoked a public outcry, as many individuals were concerned about the exploitation of their personal information, and Facebook was heavily criticized and accused of failing to safeguard its users’ privacy.

According to research, many people are unaware that the information in their news feeds is curated based on their preferences and opinions. This also implies that they are unaware of fake news, filter bubbles, and, most likely, targeted advertising that may influence their opinions or political preferences (Hinds et al., 2020). According to Communication Privacy Management (CPM) theory, conflict can arise when privacy boundaries are unclear and people feel their expectations of privacy have not been honored (i.e., their privacy has been violated). This directly affects how much users trust a platform and whether they will continue to use it. A platform’s protection of data privacy therefore safeguards not only its users’ rights and interests but also its own reputation.

Research on users’ attitudes towards privacy

Source: Surfshark, survey of internet users from AU, CA, DE, UK, US

How private do you feel online? Let’s look at how internet users view privacy concerns. 90% of respondents said they completely or somewhat agree that online privacy is important to them, while 32% said service quality is more important than privacy (Surfshark, 2022).


Nevertheless, when it comes to taking control of such data, less than half of respondents are aware of the right to be forgotten (the right to have your private data removed from internet searches or directories) (Surfshark, 2022). This knowledge gap, relative to broader privacy awareness, widens with age. Respondents are aware that there are dangers on the internet: 70% said they are at least somewhat concerned about online security (Surfshark, 2022). About half of respondents say they feel secure when using the internet, but a third report having already experienced data breaches (Surfshark, 2022).


In terms of privacy-enhancing tool use, 63% of respondents use at least one antivirus. Only 39% use an adblocker (perhaps those desperate pleas to turn them off are effective), and 36% use a password manager (Surfshark, 2022).
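Password managers exist largely to generate and store credentials too strong to remember. As a minimal sketch of what “strong” means here, the snippet below uses Python’s standard-library secrets module (the helper name generate_password and the chosen length are my own illustration, not taken from any particular tool) to build a cryptographically random password:

```python
import secrets
import string

def generate_password(length: int = 16) -> str:
    """Return a cryptographically strong random password.

    `secrets` draws from the operating system's secure randomness
    source; the `random` module is NOT suitable for credentials
    because its output is predictable.
    """
    alphabet = string.ascii_letters + string.digits + string.punctuation
    return "".join(secrets.choice(alphabet) for _ in range(length))

print(generate_password(20))
```

Dedicated password managers add encrypted storage and per-site uniqueness on top of generation like this, which is why using one matters more than memorizing any single strong password.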

From this study, we can draw the following conclusions: there is growing concern that the information gathered by government agencies and corporate organizations may result in the leakage of personal information (Shamsi & Khojaye, 2018). Users’ reactions and behaviors towards information leakage vary, but people are aware of the problem of personal information leakage and try to avoid it (Yi et al., 2020).

There is another interesting perspective on sharing private information. According to the literature, consumers’ intentions to return to a website are influenced by privacy concerns, trust, and attitudes towards a company’s website (Chellappa & Pavlou, 2002; Belanger et al., 2002; Eastlick et al., 2006). Take email and business marketing, for example: characteristics of the receivers, such as their perceptions of privacy risk, and characteristics of the sender, such as the sender’s reputation, both influence attitudes towards direct mail (De Wulf & Vergult, 1998). According to Godin (1999), marketing becomes more individualized and effective when customers consent to share information and receive commercial solicitations (Tezinde et al., 2002). This means that a marketer’s attitude and approach directly affect whether consumers are willing to share private information for future interactions.

How to protect digital rights?

To exercise their digital rights, users can take various steps to preserve their privacy and ensure that their personal data is handled properly, including:

1. Requesting the deletion of their personal information: Consumers have the right to request that their information be removed from a company’s database. This ensures that their data will not be used for purposes they did not authorize or that are no longer relevant.

2. Opting out of data collection: Users can opt out of certain data-gathering activities, such as targeted advertising or data sharing. This enables them to limit how much personal information businesses gather and process.

3. Maintaining data privacy awareness and education: Users can keep up to date on data privacy issues and educate themselves on best practices for safeguarding their personal information, for example by reading privacy policies, staying current on data protection regulations, and staying alert to potential data breaches.
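To make the first step concrete, the sketch below builds the kind of structured deletion (“right to erasure”) request a user or an automated tool might submit. The field names are hypothetical placeholders, since every company defines its own request format, but the elements (who is asking, which right is being exercised, and when) are common to most data-subject requests:

```python
import json
from datetime import date

def build_erasure_request(user_id: str, contact_email: str) -> str:
    """Build a JSON payload for a data-deletion request.

    All field names below are hypothetical; real services define
    their own data-subject-request formats.
    """
    payload = {
        "request_type": "erasure",       # i.e., the "right to be forgotten"
        "subject_id": user_id,           # how the service identifies the requester
        "contact_email": contact_email,  # where confirmation should be sent
        "date_submitted": date.today().isoformat(),
        "scope": "all_personal_data",    # could instead name specific data categories
    }
    return json.dumps(payload)

body = build_erasure_request("u-12345", "alice@example.com")
```

In practice, many services accept such requests only through a web form or support email, but the same information is what they will ask for.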

What are the problems with privacy regulation, and how can it be improved?

Given the many data leakage incidents that occur, governments should review the gaps in laws and regulations and take the lead in monitoring data privacy. The main protection law is the European General Data Protection Regulation (GDPR), which went into effect on May 25, 2018 (Herrle & Hirsh, 2019). Its goal was to harmonize European privacy and data protection laws while also helping EU citizens better understand how their personal information was being used and encouraging them to file complaints if their rights were violated (Herrle & Hirsh, 2019). Given the GDPR’s origins as a citizen-focused regulation, its impact on individuals, in Europe and elsewhere, is an important benchmark for understanding its successes and shortcomings (Herrle & Hirsh, 2019). However, the GDPR’s overall lack of precision in how data subjects’ rights are defined in relation to artificially intelligent algorithmic systems renders it “toothless” in this area (Herrle & Hirsh, 2019). Governments can therefore develop new privacy laws and regulations that address emerging technologies and practices, such as artificial intelligence, the Internet of Things, and biometric data, to cover these currently unregulated situations.

With its expansive scope, the GDPR has encountered comparable logistical and operational problems (Aho & Duffield, 2020). According to the European Union, businesses appear to be treating the GDPR more like a legal conundrum, in an effort to maintain their current business practices, rather than changing how they operate to better safeguard the interests of customers (EDPS, 2019, p. 5). In this situation, governments should enforce and improve existing laws, strengthen relevant data protection laws, and establish a comprehensive data protection system alongside the GDPR. As an illustration, on May 21, 2019, Canada published a draft of its proposed Digital Charter, which states that “the content of personal information given by individuals shall be controlled, including the subject and the purpose of the utilisation of personal information” (Mcmillan.ca, 2019). The EU’s General Data Protection Regulation, which took effect in May 2018, provides in Chapter III that “data subjects must have the right to access, correct, delete, and process data,” and that “data collectors shall provide relevant information when collecting data” (Intersoft Consulting, 2018).

In the digital age, it is crucial for people, businesses, and governments to be aware of the threats and take precautions to safeguard privacy. Protecting private data is not only a matter of regulatory compliance but also an ethical responsibility for companies and organizations that handle sensitive information. By understanding users’ attitudes towards privacy, organizations can develop more effective privacy policies and practices that align with users’ expectations and needs, leading to better data protection.


References

Acquisti, A., John, L. K., & Loewenstein, G. (2013). What is privacy worth? The Journal of Legal Studies, 42(2). Retrieved April 8, 2023.

Aho, B., & Duffield, R. (2020). Beyond surveillance capitalism: Privacy, regulation and big data in Europe and China. Economy and Society. Taylor & Francis Online. Retrieved April 16, 2023.

Cadwalladr, C., & Graham-Harrison, E. (2018, March 17). Revealed: 50 million Facebook profiles harvested for Cambridge Analytica in major data breach. The Guardian. Retrieved April 15, 2023.

Doyle, T. (2011). Review of Helen Nissenbaum, Privacy in Context: Technology, Policy, and the Integrity of Social Life. The Journal of Value Inquiry, 45, 97–102. Retrieved April 16, 2023.

European Data Protection Supervisor (EDPS). (2019). Annual Report 2018. Retrieved April 16, 2023.

Hinds, J., Williams, E. J., & Joinson, A. N. (2020). “It wouldn’t happen to me”: Privacy concerns and perspectives following the Cambridge Analytica scandal. International Journal of Human-Computer Studies. Retrieved April 15, 2023.

Huang, L., Zhou, J., Lin, J., & Deng, S. (2022). View analysis of personal information leakage and privacy protection in big data era—based on Q method. Aslib Journal of Information Management, 74(5), 901–927. Retrieved April 15, 2023.

IT. (2017). “The survey showed that 79.0% of respondents felt their personal information had been leaked.” Retrieved April 15, 2023.

Herrle, J., & Hirsh, J. (2019, July 9). The peril and potential of the GDPR. Centre for International Governance Innovation. Retrieved April 16, 2023.

Quinn, K. (2016, March 1). Why we share: A uses and gratifications approach to privacy regulation in social media use. Taylor & Francis Online. Retrieved April 15, 2023.

Shamsi, J. A., & Khojaye, M. A. (2018). Understanding privacy violations in big data systems. IT Professional, 20(3), 73–81. Retrieved April 15, 2023.

Surfshark. (2022, June 6). How people see their privacy in 2022. Retrieved April 15, 2023.

What is privacy? (2017, October 23). Privacy International. Retrieved April 9, 2023.
