
“Do you want to enable location tracking?” “Do you want to enable personalised ads?” “How do you want to manage your Google activity?” (Google, 2024).
Those questions should be familiar to anyone who has used Google before, which covers the majority of people on earth, and it is not only Google: virtually every legitimate digital platform asks its users similar questions about their privacy before or during their experience. The purpose of these questions is to draw boundaries around what user information, or data, can and can't be collected while people use the platform. Nevertheless, that brings us to more questions:
- “Why do we need to protect our privacy?”
- “Why do they need our information?”
- “How did they collect our information in the first place?”
- “Is there more to our online privacy that we don’t know yet?”
The more we dig, the deeper the hole is.
Maybe it’s all just our imagination, or maybe the less we know, the better. However, it is obvious that the relationship between us as platform users and the digital platform providers is unbalanced: they have more control over the data produced by us, the users, and the process of data collection remains opaque. For that reason, the only thing we can do, as people who are still reading this, is to study the current situation and analyse the possible outcomes to better protect ourselves from uncertainty.

“Why do we need to protect our privacy?” – The current explanation behind digital privacy
Before we take a deep dive into the controversies and concerns, we need to establish a fundamental engagement with the concept of digital privacy; what is digital privacy, and why do we need to protect it?
Digital privacy essentially refers to the protection of the information that represents us as users of the internet, information that can also be called data, from inappropriate or illegal use by other stakeholders in the digital world. The definition of digital privacy spans several layers surrounding that data, including how it is collected, analysed, stored, and presented on the internet (IEEE Digital Privacy, 2024).
On the other hand, digital privacy also concerns the type of data being collected. There is general data that each individual contributes or uploads online, such as contact details, personal information, or an IP address. There is also so-called big data, massive and anonymous information about interactions between the user and digital content, such as the time you spend on a video, or which tool you use to improve your experience on a website (Flew, 2021).
Big data is messy and unorganised before anything is done with it, so as audiences, it is easy for us to overlook or ignore what it is for. That is not the case for the owners of digital platforms, because of its potential for business model development, which will be explained later in this blog.
Nevertheless, digital privacy requires understanding from both the digital platforms and the people who use them. For digital platforms, it is their responsibility to establish and clarify how digital privacy applies to the user experience, but for users of a platform, or of the internet in general, Marwick & Boyd (2018) state,
“For many people, privacy is not simply the ability to restrict access to information, but the ability to strategically control a social situation by influencing what information is available to others, how this information is interpreted, and how it will be spread.” (p.1158).
It is almost a natural instinct for humans to feel more comfortable when they know they are in control, and in today's world, we are born with the right and privilege to have privacy in many respects.
However, Nissenbaum (2018) argues that: “The theory of contextual integrity holds the source of this anxiety to be neither in control nor secrecy, but appropriateness.” (p. 839).
Therefore, it is not that platform audiences want complete control over the data they produce, but that they want to understand the details of the decision-making process. Especially since audiences have learned that digital platforms monitor their data, the demand for better privacy governance has grown over the past decade.

“Why do they need our information?” – The current explanation behind digital privacy
To gain a different perspective on digital privacy, it is critical to understand why digital platforms need to collect user data in the first place, and as mentioned previously, big data is one of the reasons.
Big data might seem unrelated to improving the user experience on digital platforms, considering how confusing and unclear its sources are, but that is exactly how digital companies use it as a business model: to produce customised and automated content for audiences.
Van Dijck (2014) explains the idea of datafication in her article: real-time tracking and predictive analysis can be achieved by transforming social actions into online quantified data.
What she means is that the creators of digital platforms can collect big data from users about their behaviours and interactions with the platform, translate it into a pattern, and then into a structure and technology that provides customised and automated experiences based on that pattern, repeating the same process until the analysis of big data itself becomes automated. In other words: an algorithm.
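To make that loop concrete, here is a minimal, hypothetical sketch in Python. The event fields, function names, and recommendation rule are illustrative assumptions, not any real platform's system; it only shows the general shape of datafication: quantify actions, aggregate them into a pattern, then automate a decision from the pattern.

```python
from collections import Counter

def record_event(log, user, action, topic, seconds):
    """Quantify a social action as a data point."""
    log.append({"user": user, "action": action,
                "topic": topic, "seconds": seconds})

def build_pattern(log, user):
    """Aggregate one user's events into a behavioural pattern:
    total watch time per topic."""
    pattern = Counter()
    for event in log:
        if event["user"] == user and event["action"] == "watch":
            pattern[event["topic"]] += event["seconds"]
    return pattern

def recommend(pattern):
    """Automated decision: push more of whatever the pattern
    says the user watches longest."""
    return pattern.most_common(1)[0][0] if pattern else None

log = []
record_event(log, "alice", "watch", "cooking", 120)
record_event(log, "alice", "watch", "sports", 15)
record_event(log, "alice", "watch", "cooking", 300)

print(recommend(build_pattern(log, "alice")))  # prints "cooking"
```

The point of the sketch is that no step needs a human once the pattern exists: the same pipeline runs on every interaction, which is what Van Dijck means by real-time tracking and predictive analysis.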
With algorithms, digital platforms can embrace automation in the governance of the online environment: based on previously identified user patterns, they can locate harmful or illegal interactions between users and digital content in real time and prevent issues before they happen. Sounds wonderful, right?
But what happens if the convenience turns its back on us? Or, maybe the creator of this convenience isn’t on our side the entire time? Then, who’s responsible for the outcome?
You would likely say the creator of this convenience is responsible, which is partially correct, because they are the ones who built the system on top of big data. However, there are arguments surrounding the unbalanced relationship between audiences and platforms regarding digital privacy, more specifically data collection, that might reverse your answer into blaming yourself, for not being aware of that unequal relationship, or simply for not finishing this blog.

“How did they collect our information in the first place?” – Controversies regarding the unbalanced relationship between audiences and platforms
At the beginning of this blog, the three questions Google asks about location tracking, personalised ads, and activity management were obviously related to digital privacy and user data. Although those questions seem to address user privacy concerns adequately, the only type of data they cover is personal data, not big data.
One of the reasons behind this situation is public awareness of data. When digital privacy is mentioned, the first thing that comes to mind for the majority of people is personal information, which is reasonable considering its importance and the severe consequences if it is mishandled. Nevertheless, public understanding of big data has not grown at the same pace as the platforms' use of big data analysis, which has already created a gap between users and platform creators.
Flew (2021) elaborates on this: current Australian law covers privacy issues but only deals with personal data, and it is on the whole powerless when it comes to anonymous or aggregated data, like big data. The European Union's General Data Protection Regulation (GDPR) likewise left data collection and data processing largely ungoverned, focusing instead on data use because of public interest concerns, which supports the point above that general audiences' knowledge of digital privacy is still stuck at the stage of protecting personal data.
As a result, digital platforms have even more control over our knowledge of, and desire for, digital privacy. They can redefine what digital privacy means on their own platform by displaying, and allowing adjustment of, only a small portion of the data being collected.
Does it mean there are no legal actions or anything we can do to stop them from collecting our data?
The answer is that there is always a solution for anything, and in this circumstance, you can simply choose not to use digital platforms at all. Easy!
This is because the data collection process has already begun by the time the terms of service agreement appears, before you actually use the digital platform. The user can either accept that their data will be collected during their experience, or choose not to use the service at all. As Suzor (2019) notes, each digital platform's terms of service includes a clause indicating that your access to the platform can be terminated at any time, for any reason or no reason (p. 11).
This statement is the summary of the unbalanced relationship between digital users and digital platforms. Digital platforms do not give users a detailed account of their data collection and data processing, users are given no real opportunity to learn the concept of big data, and the creators of digital platforms do not need to explain their decision-making, or the true intentions behind their operations, to anyone.

“Is there more to our online privacy that we don’t know yet?” – The outcome of this imbalanced relationship
We have established that one of the reasons this imbalanced relationship exists is the audience's shallow knowledge of digital privacy and big data. So, the relationship should supposedly recover to an equal footing as more people come to understand their rights and responsibilities regarding big data as platform users. Nevertheless, there are too many uncertainties behind digital platforms, and it is possible that even the creators of digital platforms will lose control of their own work.
The most obvious outcome of this unbalanced relationship is a deteriorating connection between audiences and digital platforms. Although the definition of a digital platform only refers to infrastructure that allows interaction and engagement with content, the producers of digital platforms are required to establish a mutual connection with their users to ensure their satisfaction. That means platform creators need to build an environment in which users feel safe to engage online, including education and instruction on why, and how, to protect their digital privacy.
Nonetheless, there is nothing wrong with making the user experience more convenient; on the contrary, people hope to see more customised content delivered to them automatically, and in that process, user-generated data is indispensable and unavoidable for the future of digital platforms. Therefore, it is important to create workable and reliable platform rules to eliminate concerns and better address current issues, not only around privacy but also around the convenience built on our data.

Case study – The daughter of Baidu’s executive leaked user information on the internet
In March 2025, the famous Chinese search engine Baidu was reported to have leaked user data, such as phone numbers, to the public; the data belonged to ordinary people, a pregnant woman, and a K-pop star. The cause of the incident was the 13-year-old daughter of a Baidu executive named Xie Guangjun. The daughter claims she used her father's identity to access the database, and she did so because she got into an online argument with others. Those others have not been officially identified on record, but some evidence indicates they might be the people whose data she leaked on the internet, or people related to them (Jing, 2025).
Regarding the outcome, Baidu has updated its policy to a “zero tolerance” stance towards breaches of user privacy, with stringent data anonymisation and access control (Chan, 2025). This contrasts with the previous Baidu Privacy Policy (2023), which simply mentioned access control mechanisms to control workflow and confidentiality agreements with Baidu's employees.
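To illustrate what “data anonymisation and access control” can mean in practice, here is a small, hypothetical Python sketch. The role names, masking rule, and salted hash are illustrative assumptions, not Baidu's actual safeguards; the point is simply that a direct phone-number lookup by an unauthorised role should fail, and even an authorised lookup should return only a masked view.

```python
import hashlib

def mask_phone(number):
    """Anonymisation: keep only enough digits to be useful internally."""
    return number[:3] + "****" + number[-4:]

def pseudonymise(user_id, salt="per-deployment-secret"):
    """Replace a direct identifier with a salted hash."""
    return hashlib.sha256((salt + user_id).encode()).hexdigest()[:12]

# Hypothetical roles allowed to touch user records at all.
ALLOWED_ROLES = {"privacy-auditor", "incident-response"}

def lookup_record(records, user_id, requester_role):
    """Access control: refuse unauthorised roles, and return only
    the anonymised view even when the lookup is allowed."""
    if requester_role not in ALLOWED_ROLES:
        raise PermissionError("role not authorised for user data")
    raw = records[user_id]
    return {"id": pseudonymise(user_id),
            "phone": mask_phone(raw["phone"])}

records = {"u42": {"phone": "13800138000"}}
print(lookup_record(records, "u42", "privacy-auditor"))
```

Under a design like this, neither an executive's account nor anyone borrowing it could pull a raw phone number out of the database, which is exactly the gap the Baidu incident exposed.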
Overall, there are many mysteries and questionable details in this case study, including how the daughter gained access to her father's identity, or what she attempted to achieve. But setting aside questions about the daughter's behaviour or the parent's responsibility, it is absolutely ridiculous that the girl, or even the father as a company executive, could obtain user data without any permission from other authorities.
Throughout this blog, we have established that digital privacy today is without a doubt unbalanced against us, the users, and this case study not only reaffirms that statement but also provides the worst-case scenario, where the stakeholders of a digital platform treated our digital privacy as if it were nothing. The comparison between Baidu's current and 2023 privacy policies shows the negative outcome of each digital platform defining user digital privacy for itself: there can be many grey areas that require clarification, or standards, for better privacy protection.
Conclusion
It is probably too late to apologise for flooding you with questions and concerns. However, this is a topic that requires public awareness and attention if the industry is to change, and what we can do as users and citizens of this digital world is to educate ourselves about what comes with convenience.
AI Acknowledgment
I have used Grammarly to correct my wording and grammar in this assessment.
References
Center for Democracy & Technology. (2016). 10 Tips for Protecting Your Digital Privacy. [Image]. Center for Democracy & Technology. https://cdt.org/insights/10-tips-for-protecting-your-digital-privacy/
Chan, W. (2025, March 21). Baidu stresses strong data privacy measures after executive’s daughter doxes netizen. South China Morning Post. Retrieved April 9, 2025, from https://www.scmp.com/tech/big-tech/article/3303354/baidu-stresses-strong-data-privacy-measures-after-executives-daughter-doxes-netizen?module=perpetual_scroll_0&pgtype=article
Chowdhury. (2021). The Future Startup Dossier: How Baidu Has Become China’s Google. [Image]. Future startup. https://futurestartup.com/2021/07/08/the-future-startup-dossier-how-baidu-has-become-chinas-google/
Flew, T. (2021). Regulating platforms. Cambridge: Polity, pp. 79-86.
Google. (2024, Sep 16). Privacy Policy. https://policies.google.com/privacy?hl=en-US
IEEE Digital Privacy. (2024). What is digital privacy and its importance? https://digitalprivacy.ieee.org/publications/topics/what-is-digital-privacy-and-its-importance
Jing, S. (2025, March 20). Baidu exec’s teen daughter linked to doxing scandal using overseas data in online dispute. TechNode. Retrieved April 9, 2025, from https://technode.com/2025/03/20/baidu-execs-teen-daughter-linked-to-doxing-scandal-using-overseas-data-in-online-dispute/
Kron. (2021). 5 Differences Between Data Security and Data Privacy. [Image]. Kron. https://krontech.com/5-differences-between-data-security-and-data-privacy#
Lenz, G. (2020). Why digital privacy matters. [Image]. Gregor Lenz. https://lenzgregor.com/posts/digital-privacy/
Marwick, A., & Boyd, D. (2018). Understanding privacy at the margins: Introduction. International Journal of Communication, pp. 1157-1165.
Nissenbaum, H. (2018). Respecting context to protect privacy: Why meaning matters. Science and Engineering Ethics, 24(3), 831-852.
Siddiqui, S. F. (2022). 5 ways to safely store your data on a digital platform. [Image]. LinkedIn. https://www.linkedin.com/pulse/5-ways-safely-store-your-data-digital-platform-saniya-feroze-siddiqui
Srija, L. (2023). The Digital Age: Navigating The Risk And Rewards of Sharing Personal Information Online. [Image]. Medium. https://blog.startupstash.com/the-delicate-balance-of-privacy-in-the-digital-age-navigating-the-risks-and-rewards-of-sharing-c294014e80ae
Suzor, N. (2019). Who makes the rules? In Lawless: The secret rules that govern our lives (pp. 10-24). Cambridge, UK.
Van Dijck, J. (2014). Datafication, dataism and dataveillance: Big Data between scientific paradigm and ideology. Surveillance & Society, 12(2), 197-208. https://www.proquest.com/docview/1547988865?_oafollow=false&accountid=14757&pq-origsite=primo&sourcetype=Scholarly%20Journals