
Introduction
At present, big data has gradually become a part of our lives, often without us even noticing. Big data enables better collection, analysis, storage and communication of information, allowing individuals to easily access the resources they require. However, have you ever considered how much of your personal data is really private? The harsh reality is that as big data continues to develop, issues related to network privacy and security are increasingly emerging. Many of the privacy issues we face today arise from digital networks, particularly the Internet and its various platforms, including mobile systems, email services, social networks, cloud providers, and the Web itself (Nissenbaum, 2018). These challenges pose significant risks to users’ personal information, including the potential for data breaches and the financial loss and psychological harm that can follow.
The Invisible Hand: Who Collects Your Data Online and How?
A great deal of your data is collected online, and most people do not realise how much is being taken. Here are some examples.
Big platforms like Google, Facebook, Amazon, TikTok, and YouTube collect your data. They do this to make your experience more personal, show you ads, improve their services, and see what you do on their sites. Other apps, like games, fitness trackers, weather apps, and messaging apps, also collect data. They track things like where you are and how you use the app. They often sell this data to advertisers or data companies. Governments in the USA and UK also gather data through surveillance programs and company requests for national security and public safety purposes.

Your data is collected online through a variety of largely invisible methods. One common way is through cookies and trackers: small files saved in your browser that record what you do on websites. Companies use them to learn what you like. Even if you block cookies, they can still identify you through a technique called device fingerprinting, which combines details such as your screen size, the browser you use, your operating system, and the fonts you have installed.
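The fingerprinting idea above can be sketched in a few lines of Python. This is an illustrative toy, not any tracker’s real implementation: the attribute names and values are made up, and real fingerprinting scripts combine many more signals (canvas rendering, time zone, plugins, and so on).

```python
import hashlib

def device_fingerprint(attributes: dict) -> str:
    """Combine browser/device attributes into one stable identifier.

    Illustrative only: real trackers use far more signals than these.
    """
    # Sort keys so the same attributes always yield the same hash,
    # regardless of the order they were collected in.
    canonical = "|".join(f"{k}={attributes[k]}" for k in sorted(attributes))
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()[:16]

# Hypothetical visitor attributes, for illustration only.
visitor = {
    "screen": "1920x1080",
    "browser": "Firefox 126",
    "os": "Windows 11",
    "fonts": "Arial,Calibri,Segoe UI",
}

print(device_fingerprint(visitor))  # same device, same ID: no cookie needed
```

The point is that no file is ever stored on your machine: the identifier is recomputed from your device’s characteristics on every visit, which is why blocking cookies alone does not stop tracking.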
Apps also ask for permission to use parts of your phone, such as your location, contacts, microphone, or camera. Often, they request more access than they need, and this data can be shared with third-party partners. When you create an account or log in, you hand over further details such as your name, email, device ID, and location.
What you do on social media gives companies even more information. Every post, like, comment, and message helps build a profile about you, which is used to target content and advertising. Sometimes, police or other organisations can also access this data. Apps and websites can also track your location using GPS or your IP address, which helps companies analyse foot traffic.
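To make the profile-building concrete, here is a minimal sketch of how posts, likes, and comments could be aggregated into a weighted interest profile. The action weights and topics are invented for illustration; real ad platforms use far richer models.

```python
from collections import Counter

def build_interest_profile(events):
    """Aggregate (action, topic) events into a weighted topic profile.

    Hypothetical weighting: posting signals stronger interest than liking.
    """
    weights = {"post": 3, "comment": 2, "like": 1}
    profile = Counter()
    for action, topic in events:
        profile[topic] += weights.get(action, 1)
    return profile

# A made-up activity stream for one user.
activity = [
    ("like", "fitness"), ("post", "travel"),
    ("comment", "travel"), ("like", "travel"),
]
# Ads and recommended content can then be ranked by the heaviest topics.
print(build_interest_profile(activity).most_common())
```

Even this toy version shows why a handful of casual interactions is enough to rank your interests and target content accordingly.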
Real-world Consequences of Weakened Privacy
Yet part of the concern about privacy is that people are becoming accustomed to their privacy being invaded. As a result, they may accept this intrusion and stop demanding their right to privacy, at least in areas that are frequently violated (McCloskey, 1980). Weakened privacy in the digital world can lead to serious real-world consequences for individuals, communities, and societies. A major risk is identity theft: when personal information such as Social Insurance numbers or banking details is stolen and used for fraud. Data breaches have exposed millions to this harm, often with long-term financial and emotional impacts.
Surveillance and profiling cause significant concerns as well. When companies or governments track online behaviour, they create profiles that can manipulate choices, influencing the products you see, the news you receive, and the political messages that target you. This can also make society more divided and even affect elections.
Marginalized groups may experience greater harm from weakened privacy, facing discrimination in hiring, insurance, or loans. In addition, location tracking and excessive data collection can put vulnerable people, such as survivors of abuse, at risk of violence or harassment.
Case in Point: The NSW Court Data Breach

In March 2025, a serious data breach happened at the New South Wales Department of Communities and Justice. About 9,000 sensitive court files were accessed from the NSW Online Registry. It was unclear whether the files had been downloaded or simply reviewed.
The breach was detected when technicians noticed unauthorized changes to data during routine maintenance. After an investigation, it was discovered that an account holder had accessed the system illegally. The account was then quickly closed to prevent further access.
This incident is particularly alarming for domestic violence survivors, as their personal information may have been exposed. Delia Donovan, CEO of Domestic Violence NSW, warned that the breach could cause serious harm to victim-survivors relying on the justice system for protection.
The report pointed out a few serious problems that require more attention:
- Security vulnerabilities in critical systems
The breach showed that the systems designed to protect private and sensitive data were not robust. The fact that technicians found the breach during “routine maintenance” rather than through active monitoring indicates a reactive approach to security: the system was not catching problems in real time, which is dangerous when it holds data for people who need protection.
- Insider threat dimensions
This was not an outside hacker; it came from someone already inside the system. That changes how we think about security: instead of only stopping outside threats, systems also need to monitor what insiders are doing. It is not clear whether the person misused their own access or someone else compromised their account. Either way, it is a serious problem.
- Risk amplification for vulnerable populations
For domestic violence survivors, this kind of data breach is not just about privacy; it can put them in real physical danger. The leaked files may include safe locations, daily routines, or other details that should never be shared. So, when assessing these risks, we need to think not just about privacy but also about safety and how a breach can affect individuals’ lives.
Case in Point: Amazon’s Tracking Lawsuit

In January 2025, a group of consumers filed a lawsuit against Amazon, alleging that the company was secretly tracking people through their phones and selling the data. According to the lawsuit, Amazon embedded special code, called the Amazon Ads SDK, into various apps, letting Amazon collect location data without users’ knowledge or consent. The data revealed sensitive personal information, including users’ residences, workplaces, and visited locations.
Amazon, as one of the largest e-commerce platforms globally, boasts a vast user base and significant traffic. By the end of 2024, Amazon’s active consumer accounts worldwide were estimated at approximately 310 million, and around 80% of these users are in the U.S. market, an estimated 230 million active users (Naveen, 2024). Such widely used software, closely tied to our daily lives, has experienced incidents of user privacy data breaches or misuse before. In 2023, Amazon agreed to pay over $30 million in fines to settle alleged privacy violations related to its Alexa voice assistant and Ring doorbell cameras, according to federal filings (Archie, 2023).
Public Awareness and the Privacy Paradox
“Together, we have interviewed and observed countless teens and young adults as they struggle to achieve privacy in a networked age. Many people choose to participate in social media, carry cell phones, and engage in other online activities knowing full well that their data are being collected, their actions are being monitored, and their online experiences are being algorithmically generated and personalised.” –Marwick & Boyd, 2018
Public awareness of online privacy has significantly evolved, with many recognising that their activities are tracked. However, a “privacy paradox” persists: users express concern yet continue to use data-collecting services without reading privacy policies or understanding how their data is collected (Norberg et al., 2007).

For users, it remains difficult to understand how profiles are inferred from their data, how data from different sources is combined, and what technical protection measures are available. A user’s decision to read a privacy notice is linked to their trust in such notices. Both reading and trust are affected by privacy concerns, the perceived comprehensibility of the notices, reliance on alternatives for risk reduction, and demographic factors like education, age, and online experience (Milne & Culnan, 2004). These knowledge gaps sustain the persistent disconnect between stated concerns and actual behaviour.
Who Really Benefits from Current Privacy Policies?

While privacy policies aim to protect users, big tech companies and data-driven businesses often benefit the most. Privacy is often a space where the needs of organisations and the rights of individuals come into conflict (Bélanger & Crossler, 2011). Companies draft complex privacy terms that comply with legal requirements yet allow them to use extensive user data with little resistance, and they have the resources to navigate regulations. More seriously, leaked personal data can be sold on to third parties, including criminal networks, who use it to defraud victims’ relatives and friends for profit.
Digital Policy and Governance: Are the Rules Enough?
Digital policy and online privacy governance aim to protect personal information. For example, California’s Consumer Privacy Act (CCPA) gives people more control over their own data. In practice, though, policymakers face several challenges in effective privacy governance. It is difficult to control how data moves between countries, and regulators often lack sufficient staff or tools to enforce the rules. New technology also changes fast, making it hard for the law to keep up. Governments need to keep people safe from cybercrime, and they often use data to achieve this; but when they collect too much data, it can harm people’s privacy. This raises hard questions about freedom and safety, and about balancing the need to use data with the need to protect individual rights.
The Path Forward: Enhancing Online Privacy
We are on a long and challenging road to enhancing online privacy. Privacy is essential for maintaining the various social relationships we wish to have with others (Rachels, 1975). For now, what we need to do spans technology, law, and public awareness, all in service of a better and safer online world.

One important step is stronger privacy laws worldwide. Many current rules are outdated or apply only in certain regions. We need clear rules that make companies take responsibility for how they collect, use, and share data, and governments in different countries can collaborate to balance innovation with public safety.
Another necessary step is for tech companies and platforms to build privacy protections into their products from the start. The way platforms are governed is crucial, as they shape how people communicate and influence public culture along with the social and political experiences of their users (Suzor, 2019). They should focus on protecting user data instead of collecting as much as they can.
Users also need to learn more about how our data is collected and used. We should know what we are sharing, who is receiving it, and what it might be used for. Without that knowledge, we cannot give meaningful consent or protect our own privacy.
Conclusion
Today’s era of big data is like a double-edged sword: it gives users a faster and more convenient way to obtain information, but it also carries the risk of personal privacy disclosure. Recent data breaches have exposed many vulnerabilities in the digital media industry and government databases. As a result, the call to protect our privacy has become one of the most important issues facing governments. Protecting online privacy is not just a technical issue; it is about personal freedom, security, and human dignity. The right to privacy should not be regarded as an absolute human right; rather, it is one that is grounded in specific social and legal contexts (Flew, 2021). As digital technology becomes more critical in our lives, we must demand stronger laws, more transparent policies, and tools that put control back into the hands of users.
References
Archie, A. (2023, June 1). Amazon must pay over $30 million over claims it invaded privacy with Ring and Alexa. NPR. https://www.npr.org/2023/06/01/1179381126/amazon-alexa-ring-settlement
Bélanger, F., & Crossler, R. E. (2011). Privacy in the digital age: A review of information privacy research in information systems. MIS Quarterly, 35(4), 1017–1041. https://www.jstor.org/stable/41409971
Boscaini, J. (2025, March 26). Identity of hacker behind NSW court website data breach unknown, police say. Abc.net.au; ABC News. https://www.abc.net.au/news/2025-03-27/nsw-hacker-data-breach-identity-unknown-police-say/105101486
Flew, T. (2021). Regulating platforms (pp. 72–79). Cambridge: Polity.
Marwick, A. E., & Boyd, D. (2018). Privacy at the margins| understanding privacy at the margins—introduction. International Journal of Communication, 12(1158).
McCloskey, H. J. (1980). Privacy and the right to privacy. Philosophy, 55(211), 17–38. https://doi.org/10.1017/S0031819100063725
Milne, G. R., & Culnan, M. J. (2004). Strategies for reducing online privacy risks: Why consumers read (or don’t read) online privacy notices. Journal of Interactive Marketing, 18(3), 15–29. https://doi.org/10.1002/dir.20009
Nissenbaum, H. (2018). Privacy in Context (p. 835). Stanford University Press.
Norberg, P. A., Horne, D. R., & Horne, D. A. (2007). The privacy paradox: Personal information disclosure intentions versus behaviors. Journal of Consumer Affairs, 41(1), 100–126. https://doi.org/10.1111/j.1745-6606.2006.00070.x
Rachels, J. (1975). Why privacy is important. Philosophy & Public Affairs, 4(4), 323–333. JSTOR. https://doi.org/10.2307/2265077
Ruby, D. (2023, June 8). 43+ Amazon Statistics 2023 (Users, Market Share & Trends). DemandSage. https://www.demandsage.com/amazon-statistics/
Schappert, S. (2025, January 29). Amazon created software to collect sensitive location data on its shoppers, lawsuit claims. Cybernews. https://cybernews.com/privacy/amazon-lawsuit-collecting-consumers-sensitive-location-data/
State of California Department of Justice. (2024, March 13). California Consumer Privacy Act (CCPA). State of California – Department of Justice – Office of the Attorney General. https://www.oag.ca.gov/privacy/ccpa
Suzor, N. P. (2019). Who makes the rules? In Lawless: The secret rules that govern our lives (pp. 10–24). Cambridge University Press.