Who is controlling people’s privacy in the era of “big data”?

Platform-user relationship

Social media has always been a torn existence. From its inception, its founders wanted, on the one hand, to give people an easy way to communicate and to build substantial social networks through the Internet; on the other hand, they also wanted to maintain the privacy they had promised their users. However, with the rapid growth of the Internet, people's reliance on social media has grown exponentially. This development has generated massive amounts of data as people use social media, stirring a certain greed among platform owners. Platform companies began to extract data resources from users and then feed them back in the form of better services. Beneath this seemingly mutually beneficial business model lie many dangers. Gillespie (2017, pp. 254-278) suggests that the regulatory authority and responsibilities of digital platforms must be re-examined as they evolve; regulators should also consider whether these changes call for more regulation in general, or for specific rules designed to protect consumers.

Flew (2021, pp. 72-79) raises the point that the lines between platforms and infrastructure are now blurred. Indeed, digital platforms of all kinds are evolving into digital media themselves, perhaps because much of the digital media content in question is delivered on mobile devices. On the Internet there are four basic types of platform: the Web, social networks, e-commerce, and cloud computing. The Web cannot be separated out as a single technology through which we achieve our goals; it requires multiple technologies working together, with businesses, consumers, users, services and other components connected in a closed loop. The network effect suggests that the larger a digital platform becomes, the more users it attracts; the connections between those users give the platform still more room to grow, producing monopoly effects in certain areas.
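The network effect described above can be illustrated with a back-of-the-envelope calculation (a simplified, Metcalfe-style model of my own, not drawn from the cited texts): the number of possible connections between users grows quadratically with the size of the user base, which is one way to see why larger platforms pull further ahead.

```python
def possible_connections(n_users: int) -> int:
    """Number of distinct user-to-user links in a network of n_users people."""
    return n_users * (n_users - 1) // 2

# Doubling the user base roughly quadruples the potential connections,
# so growth compounds in the platform's favour.
for n in (1_000, 2_000, 4_000):
    print(n, "users ->", possible_connections(n), "possible connections")
```

On this simple model, a platform twice the size of a rival offers roughly four times the connective value, which is exactly the dynamic that pushes users toward the already-dominant platform.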

Does the Internet really protect privacy?

All of these trends are driven by data: platforms derive it algorithmically from consumer behaviour, and digital media collect information from users in order to give them better-personalised services through extensive data feedback. This is a ubiquitous business model in the current Internet era. However, such behaviour cannot help but make users worry that their privacy is being leaked through these interactions. Can the Internet really protect their privacy? For example, since 2012 Facebook has pursued a plan of cooperation with third-party application developers under a new business model: the developers share all the social activities of their users, and Facebook provides them with the social graph of those users. Of such business models, Flew argues that digital platforms can be seen as central to modern society's economic and cultural life, typical of 21st-century capitalism.

Figure 1: Smartphone “eavesdropping”

Nissenbaum identifies socio-technical systems as a major source of such privacy threats (Nissenbaum, 2018). While the development of information technology has made people's lives more convenient, it also creates the potential for personal privacy to be lost. Data anonymization is generally an effective means of preventing privacy violations, but existing anonymization principles rarely consider hostile attacks that exploit functional dependencies within data sets. She also points out that personal information is often leaked unexpectedly, sometimes without people even realising there is a problem. For example, one of the most common questions people ask is whether their phone is listening to their conversations. Almost everyone has had a similar experience: a user and their friends discuss an object or event, sometimes only in spoken conversation without touching an electronic device, and before long advertisements or recommendations for that very thing appear across various other apps.
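The anonymization idea mentioned above can be made concrete with a minimal sketch of a k-anonymity check, one common anonymization principle (the records, field names and threshold here are hypothetical illustrations, not taken from the cited texts): any row whose combination of quasi-identifiers appears fewer than k times is suppressed, so no individual can be singled out by those fields alone.

```python
from collections import Counter

def k_anonymize(records, quasi_identifiers, k=2):
    """Keep only rows whose quasi-identifier combination occurs at least k times."""
    key = lambda r: tuple(r[q] for q in quasi_identifiers)
    counts = Counter(key(r) for r in records)
    return [r for r in records if counts[key(r)] >= k]

rows = [
    {"zip": "4000", "age_band": "20-29", "diagnosis": "flu"},
    {"zip": "4000", "age_band": "20-29", "diagnosis": "cold"},
    {"zip": "4067", "age_band": "30-39", "diagnosis": "asthma"},  # unique combination
]
safe = k_anonymize(rows, ["zip", "age_band"], k=2)
print(len(safe))  # the uniquely identifiable row has been suppressed
```

As the text notes, even such measures are fragile: an attacker who knows a functional dependency in the data (say, that a certain zip code only contains one age band) can still re-identify people that a naive check like this one treats as safe.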

Facebook’s experiment with democracy

Facebook announced the "like" button at its launch event in August 2010. This button, now the most common control in social networking software, connects different users through an identical "like". It underpins Facebook's so-called "Open Graph", which links previously isolated islands of data and information through a common medium to form a more social, personalised networking system. It also cultivates a subconscious "like" habit in users, whose preferences and opinions about content leak out without their noticing.

Figure 2: Facebook CEO Mark Zuckerberg at the annual developer conference. Source: https://shorturl.at/qzJX2

Furthermore, in 2009 Facebook CEO Mark Zuckerberg declared that Facebook would be governed like a kind of digital nation, with users able to participate directly in shaping the site's terms of service. The reality turned out differently: in 2019 an eavesdropping scandal involving its users broke, and before that the company had been penalised several times over data-privacy issues. Facebook also moved to abolish the user vote altogether, clearing the way to collect more private data. Since then it has repeatedly strengthened its security and privacy measures to give users as safe a social environment as possible, yet a truly secure environment remains elusive. In recent years, more and more people have been asking how big-data repositories can be built without sacrificing too much personal privacy. Developers want nothing but data; how can we reach a win-win situation?

The hidden rules of platforms

It is hard to name a company that would dare to make the big gamble of promising its users good service without touching their privacy. What existing companies do instead is establish one rule after another, trying to gain partial access at a level of security acceptable to users. Suzor argues that the actual rules of social media platforms are hidden. Although legal regulations exist to protect users' rights, the platform companies also impose their own rules for using their software, and it is these companies that hold the primary power. Users must accept the terms because the surrounding social environment demands it: social, shopping and other apps have become a main component of people's lives. Social media platforms now carry a significant share of our communication with others; they sit at the centre of daily life, and stripping them away, even partially, has serious consequences. No one can escape: someone who leaves this social system will find it challenging to integrate into society and will struggle to get by.

As the Internet has become more and more entwined with the real world over the years, data has become an increasingly important asset on it. Search engines want to know what people are curious about; social media wants to connect people across the Web; consumer platforms want to know what people desire to buy. Data encompasses everything in life, big or small, and the Internet may understand us better than we understand ourselves. Since the beginning of the big-data era, the Internet has been recording constantly and highly efficiently, which means anyone's personal information can be harvested without their knowledge, making it very difficult to keep personal data safe online. Protecting privacy becomes an almost hopeless task when anything we do online is recorded and stays permanently in the digital world.

The balance of privacy and personalization

So how can platforms balance the need for user privacy with the desire for personalised advertising? Social networks certainly became smarter with the "Open Graph" system, but what is scarier is that people are so used to this kind of privacy leakage that they may not understand how the "like" button, a convenient feature, violates their privacy. In itself it is a good design, and its starting point was to build a better social system. The problem is that Facebook failed to inform users that it would gather information through this small button, which means the entire chain of data collection happened without users' knowledge and consent. The company did not explicitly announce the purpose of the collection, and users had no right to see the data or to ask for the collection to stop. As a result of this lack of clarity, a feature that could have improved the user experience of the software and optimised social networking was twisted into a drawback that leads to privacy breaches.
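What the paragraph above describes is, at its core, a missing gate: collect data only when the user has explicitly opted in to a stated purpose, and let the user revoke that permission at any time. A minimal sketch of such a gate follows (all class, function and purpose names here are my own hypothetical illustrations, not any real platform API):

```python
class ConsentRegistry:
    """Tracks which data-collection purposes each user has opted into."""

    def __init__(self):
        self._grants = {}  # user_id -> set of granted purposes

    def grant(self, user_id, purpose):
        self._grants.setdefault(user_id, set()).add(purpose)

    def revoke(self, user_id, purpose):
        self._grants.get(user_id, set()).discard(purpose)

    def allows(self, user_id, purpose):
        return purpose in self._grants.get(user_id, set())

def record_like(registry, user_id, item_id, ad_log):
    # Log the "like" for ad profiling only if the user consented to that purpose.
    if registry.allows(user_id, "ad_personalisation"):
        ad_log.append((user_id, item_id))

registry, ad_log = ConsentRegistry(), []
record_like(registry, "alice", "post-1", ad_log)   # no consent: nothing logged
registry.grant("alice", "ad_personalisation")
record_like(registry, "alice", "post-2", ad_log)   # consented: logged
registry.revoke("alice", "ad_personalisation")
record_like(registry, "alice", "post-3", ad_log)   # revoked: nothing logged
print(ad_log)
```

The point of the sketch is how little machinery informed consent actually requires: the "like" button could have worked exactly as before, with profiling data flowing only through an explicit, revocable grant.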

Figure 3: Facebook “like” icon

Platform governance and regulation

As Suzor notes, the governance of platforms is essential (Suzor, 2019, pp. 10-24). The various terms and conditions of these digital platforms profoundly affect their users' digital rights and public culture. In legal terms, terms of service are a two-way transaction between the consumer and the digital platform, and Facebook's attempt to build a democratic enterprise confirms just how fragile the democratic freedom of choice that platforms offer consumers is when specific social values are at stake. These platform companies wield almost absolute power: their terms of service can win users over by granting them certain "powers" without touching the law, and those powers can just as easily be taken back once the company has won people's hearts and minds. For large platform operators in particular, this one-sided control of rights is widespread, and in such an environment the terms of service function less as a regulation of users than as protection for the company's legal interests. Large platform companies set and enforce whatever they consider "fair" terms of service within the bounds of the law.

Are there not laws that restrict these companies and thus protect the privacy and security of their users? Data protection laws do exist in the U.S., and the first thing they make clear is that users must give informed consent to data collection. Data collection therefore needs to be regulated through multiple means: legal mandates alongside ethical and technical requirements. In practice the main safeguard remains technical, because technical measures can most effectively prevent illegal access to other people's personal information and give users more privacy protection. The law, however, is directed mostly at infringements of national interest; the protection of consumer rights still needs far more detailed regulation. No law prescribes how Facebook must operate its services; the law can only provide basic rules, which every user implicitly accepts whenever a new piece of software presents consent terms as the price of access.

Behavior of the masses

And what do most people do? Does anyone actually open those terms and conditions and read them, even a little? No; people numbly tick the boxes. Occasionally a conscientious piece of software will force the user to read the terms of service, automatically opening the page and requiring them to stay on it for ten seconds and scroll through it, yet even then people rarely read carefully. Given this state of affairs, to which people have long been accustomed, digital platform companies are all the more reckless in their hidden efforts to squeeze out users' privacy. While users assume that the lengthy terms and conditions are restrictions binding the company, the truth is that companies rigorously exploit loopholes in the law to protect their interest in obtaining more information.

Figure 4: For your security, don't just click the little "I agree" box; read the terms and conditions and try to understand how operators use personal information.

What is sadder still is that whatever people can find on the Web about digital permissions and digital privacy is itself presented under the control of digital platform companies, so that as mere users, people cannot effectively exercise their rights. More often than not, people are forced to ignore privacy concerns even when they are superficially aware of them, or to shift the blame elsewhere. For example, some people are wary, angry and unaccepting of surveillance cameras or tracking signals in public places as things that jeopardise their privacy. Yet on the Web itself, data collection by mobile apps, public social graphs and behavioural advertising are seen as ways of enhancing the Internet, praised for their merits while their flaws are ignored. Even the sources of privacy threats are socio-technical systems (Nissenbaum, 2018) associated with "big data".

More hidden risks ahead

Privacy issues are becoming ever more serious now that major software companies are releasing artificial-intelligence projects such as ChatGPT. Training a relatively mature AI system requires a large amount of data, and that data often involves users' private information. Everyday information leaks may not even be the main concern: biometric identifiers such as the sclera of the eye and the face may become essential identification tools in the future. Personally, I think the technical side of the problem can be solved. What matters is whether the major digital platform companies will govern their platforms responsibly and implement users' most essential rights over their information, including the right to know how it is used.


References

Facebook's privacy game – how Zuckerberg backtracked on promises to protect personal data. (2019). ComputerWeekly.com.


Farber, D. (2012). The Facebook vote and a nation-state in cyberspace. CNET.


Flew, T. (2021). Regulating Platforms. Cambridge: Polity, pp. 72-79.

Gillespie, T. (2017). 'Governance by and through Platforms', in J. Burgess, A. Marwick & T. Poell (eds.), The SAGE Handbook of Social Media. London: SAGE, pp. 254-278.

McCarthy, C. (2021). Facebook F8: One graph to rule them all. CNET.


Nissenbaum, H. (2018). Respecting context to protect privacy: Why meaning matters. Science and Engineering Ethics, 24(3), 831-852.

Suzor, N. P. (2019). 'Who Makes the Rules?', in Lawless: The Secret Rules That Govern Our Lives. Cambridge, UK: Cambridge University Press, pp. 10-24.

TechTarget (2019). Microsoft to apply CCPA protections to all US customers. TechTarget Security. https://www.techtarget.com/searchsecurity/news/252473945/Microsoft-to-apply-CCPA-protections-to-all-US-customers
