The Manipulation of Personal Privacy by Data Platforms: A Critical Analysis

I. Introduction

Today’s digital platforms have become an integral part of social life, and as a result the issue of personal privacy can no longer be avoided. The companies that operate these platforms hold extensive power to manipulate the privacy of the individual users on them. This paper first outlines the challenges to the right to privacy in the current era; it then examines how digital platforms obtain and manipulate users’ private information, focusing on the tendencies built into their terms of service and their data-collection practices, and illustrates these mechanisms with a case study; finally, it puts forward suggestions for curbing digital platforms’ manipulation of personal privacy.

II. Challenges to Privacy in the Digital Age

In the modern digital age, the right to privacy is facing an even greater test, and Scott McNealy’s statement from 1999 remains striking: “You have zero privacy anyway. Get over it.” This is an overly pessimistic view, but there is also some truth to it. Surveillance capitalism, a product of the digital age, treats personal data as raw material, and this has had a significant impact on modern notions of privacy. The sheer volume of private user data, the flow of data between third parties, and other data-driven features make the law difficult to enforce.

Terry (2021) argues that the existing regulatory framework is ill-equipped to address the privacy issues arising from datafication and data surveillance.

The maintenance of privacy is now subject to many challenges, and the potential manipulation of users’ privacy as they use digital platforms is a comprehensive illustration of the challenges to personal privacy in the digital age.

III. Strategies of Digital Platforms to Manipulate Users’ Privacy

1. Invasion of User Privacy by Digital Platforms’ Terms of Service

The terms of service provided by digital platforms are both users’ first point of actual contact with those platforms and the first step by which many platforms begin to exercise deep control over user privacy. Existing laws have not been able to constrain the unequal relationship between digital platforms and users with respect to private data. Suzor (2019) argues that platforms’ terms of service give them a great deal of power and discretion over the services they provide, while users receive little assurance of their privacy rights: the only option placed before a user is to take it or leave it. This unequal status at the outset leaves room for digital platforms to manipulate users’ privacy.

The nature of the terms of service also helps platforms control users’ private data. Suzor (2019) argues that most terms offered by digital platforms are characterised by vagueness, complexity and legalism. These features reduce the readability of the terms, making it practically impossible for users to read them in full, so users tend to skip reading and accept the terms directly. Many platforms include clauses authorising in-depth collection of users’ private data, and users do not really realise how their privacy is affected by accepting the terms of service.

2. Digital Platforms Collect Private Information in Ways That Do Not Comply with the Law

Beyond the information they are permitted to collect, there are many cases where digital platforms collect private information from users without their knowledge, and this, too, is a form of privacy manipulation. Platforms are authorised to collect data such as the basic information users provide. They also use cookies to identify users when they log in, but the same technology can be used to collect information unlawfully: a persistent, cross-site cookie may gather a great deal of a user’s private data without the user’s knowledge.
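The mechanism behind such cross-site tracking can be sketched in a few lines. The following is a minimal, purely illustrative Python simulation (the `Tracker` class, the cookie handling, and the page names are all invented for illustration, not any real platform’s code): a tracking domain embedded on many sites assigns one persistent identifier, so every visit accumulates under a single profile.

```python
import uuid

class Tracker:
    """Stands in for an ad/analytics domain embedded on many websites."""

    def __init__(self):
        self.profiles = {}  # cookie id -> list of visited pages

    def handle_request(self, cookie, page):
        """Called whenever a page embeds the tracker's resource."""
        if cookie is None:
            cookie = str(uuid.uuid4())  # first visit: assign a persistent id
        self.profiles.setdefault(cookie, []).append(page)
        return cookie  # the browser stores this id and resends it next time

tracker = Tracker()
cookie = None
# The same cookie is sent from every site that embeds the tracker,
# so a cross-site browsing history accumulates under one identifier.
for page in ["news.example/politics", "shop.example/shoes", "health.example/symptoms"]:
    cookie = tracker.handle_request(cookie, page)

print(tracker.profiles[cookie])
# The tracker now holds a browsing profile the user never knowingly shared.
```

The point of the sketch is that no single site hands over the full history; the profile emerges simply because one third party is present on all of them.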

Some digital platforms may also obtain user data indirectly, through third-party data brokers, or even through accounts closely associated with the user on the platform, thereby acquiring further information without obtaining the user’s consent.

3. Digital Platforms Use Users’ Privacy to Make Profits

Digital platforms profit from different forms of manipulation of users’ private data. Zuboff (2019) points out that Google profits from customers’ data in two ways. The first is indirect: the data is used to analyse user behaviour and improve the quality of the search engine, attracting more users. Better search quality gives users a more personalised experience, so in this case Google uses private data to serve users better and to generate revenue at the same time.

At the same time, it is important to note that the results platforms derive from user privacy serve mainly their own benefit, even at the expense of users’ interests. Zhen and Ren (2021) describe the phenomenon of “big data kills familiarity”, in which platforms offer different prices to different users based on their information. Most commonly, the more loyal users are quoted a higher price than average, which is typical price discrimination. In today’s digital society, however, platforms deliver prices through targeted pushes, so most of the time only the individual user can see the specific price and has no opportunity to compare it with the prices shown to others. Such price discrimination is therefore hard to detect: the individual’s privacy has been used for price discrimination, which is itself a consequence of the platform’s manipulation of private information.
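The logic of this kind of personalised pricing can be made concrete with a small sketch. The code below is purely illustrative and invented for this essay (the function name, the behavioural signals, and the markup figures are assumptions, not any platform’s actual algorithm): the base price is the same for everyone, but a loyal user judged unlikely to comparison-shop is quietly shown a higher figure.

```python
BASE_PRICE = 100.0

def personalised_price(base, orders_last_year, compared_prices_recently):
    """Return the price shown to one user, based on their behavioural data."""
    price = base
    if orders_last_year > 20 and not compared_prices_recently:
        price *= 1.15  # loyal, non-comparing user: quiet 15% markup
    elif orders_last_year < 3:
        price *= 0.90  # new user: discount to build the habit
    return round(price, 2)

# A loyal user and a new user see different prices for the same item,
# and neither can see the other's screen to notice the difference.
print(personalised_price(BASE_PRICE, 30, False))  # loyal user: 115.0
print(personalised_price(BASE_PRICE, 1, False))   # new user: 90.0
```

Because each price is delivered privately, the discriminated user has no baseline to compare against, which is exactly why the practice is hard to detect.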

The second way is direct profit: digital platforms sell users’ data to third parties. Google is not alone; many other platform companies monetise users’ private data, and cases abound, with Facebook a typical example. When platforms are able to manipulate users’ privacy in this way, surveillance capitalism is on full display, and personal information is monetised directly.

4. Results of Digital Platforms’ Manipulation of User Privacy

Survey data show that 57 per cent of respondents worry about their privacy being violated by companies (Goggin et al., 2017). As manipulation of users’ privacy by digital platforms occurs frequently, consumers’ trust in companies has been greatly reduced. This distrust breeds anxiety that consumers will be spied on when using platform services, leading them to reduce or even stop using those services. For platform companies, consumer distrust of their handling of private information lowers user stickiness and damages their reputation, so sound regulation to protect privacy is now very important.

IV. Case Study

[Image: Attendees at Uber’s Boston launch party enjoying ‘God View’ (photos via Uber’s Facebook page)]

Whenever Uber successfully launched its business in a new city, it would hold a large party to celebrate with local tech companies. At the celebration party in Chicago in 2011, Uber employees showed off an internal tool called “God View”, through which people present could see the real-time location of 30 Uber users, including users in the room. Julia Allison, who attended the party, even sent a message to an Uber user she knew telling him where he was; the user was so shocked that he cancelled his Uber service (Hill, 2014).

MacGann, a former Uber executive, also revealed that in 2014, when he emailed another Uber executive about a ride stuck in traffic, the executive replied that he already knew when MacGann would arrive, having checked via God View.

In 2016, Uber settled with the New York Attorney General over the misuse of users’ private data; along with a fine, it was required to strengthen its data security and limit employee access to user data.

God View was originally created to help Uber understand user destinations and run its operations more efficiently.

Like Google in the example above, Uber could analyse user information to improve its service and indirectly increase revenue. In the party incident, however, an Uber employee located a user without his knowledge and shared his coordinates in front of third parties (invited guests from tech companies), a clear violation of the contract with the user. Although in that case the information was used only for entertainment, MacGann’s experience shows the same casualness in how Uber employees handled private user information.

This casualness reveals the Uber platform’s disrespect for users’ private information, and such disrespect is the fundamental reason digital platforms manipulate users’ privacy. It leaves people no longer trusting the safety of Uber users’ private data, and even doubting whether Uber provides user data to other third-party merchants.

The God View incidents reflect both the platform’s lack of respect for users’ private information and the opacity of its handling of that information: users only realised their situation after being shown their own location data. Platform companies such as Uber should therefore strengthen the ethical awareness of their employees, implement more transparent privacy policies, and build a better relationship with consumers. Faced with such complexity, it is important to look at the situation from the user’s point of view; platforms that think only of manipulating users’ privacy for profit will be punished by both users and regulators.

V. How to Protect Users’ Privacy When Using Digital Platforms

Users themselves should raise their awareness of privacy in the digital age and remain vigilant toward the terms of service platforms provide. Even if they lack the legal literacy to understand those terms fully, they can seek help from others with the relevant expertise, and they should avoid displaying important private information on platforms.

Platforms themselves should change their dominant position in the relationship with users, show more respect for users’ privacy, and give customers more choice in the collection of private data, allowing users to decide what information may be collected rather than forcing a choice between agreeing and leaving at the outset. At the same time, platforms should be more transparent in their handling of user privacy in order to gain users’ trust.

Today’s laws and regulations still fail to fully address the protection of individual privacy, but regulators are paying more attention to the right to privacy. Nyst and Falchetta (2017) note that the right to privacy has received considerable attention in the wake of the Edward Snowden revelations, with the United Nations adopting a number of resolutions on the issue. In addressing privacy today, regulators should first focus on the priority issues: the relationship between platforms and users’ privacy should be emphasised, and legal provisions should be formulated to better balance the rights of both parties.

VI. Conclusion

When platforms manipulate personal privacy information, they may bring some convenience to users, but this convenience is only one of the results of platforms’ use of user information; seen and unseen consequences of privacy breaches are happening all around us.

If the manipulation of user information by digital platforms is not restricted, it will only worsen the relationship between the platforms and the users, affecting the stability of society as a whole. Nowadays, it takes a tripartite effort between platform companies, individuals and governments to ensure that user information is used by platforms within reasonable limits.

References

1. Flew, T. (2021). Issues of Concern. In T. Flew, Regulating Platforms (pp. 72–79). Polity.

2. Zuboff, S. (2019). Surveillance Capitalism and the Challenge of Collective Action. New Labor Forum, 28(1), 10–29. https://doi.org/10.1177/1095796018819461

3. Suzor, N. P. (2019). Who Makes the Rules? In Lawless (pp. 10–24).

4. Zhen, L., & Ren, C. (2021). Consumer Cognition and Behavior System Based on Big Data Technology. In The 2021 International Conference on Machine Learning and Big Data Analytics for IoT Security and Privacy (pp. 466–472). Springer International Publishing. https://doi.org/10.1007/978-3-030-89508-2_59

5. Nyst, C., & Falchetta, T. (2017). The Right to Privacy in the Digital Age. Journal of Human Rights Practice, 9(1), 104–118. https://doi.org/10.1093/jhuman/huw026

6. Goggin, G., Vromen, A., Weatherall, K., Martin, F., Webb, A., Sunman, L., & Bailo, F. (2017). Digital Rights in Australia.

7. Hill, K. (2014, October 4). ‘God View’: Uber Allegedly Stalked Users for Party-Goers’ Viewing Pleasure (Updated). Forbes. https://www.forbes.com/sites/kashmirhill/2014/10/03/god-view-uber-allegedly-stalked-users-for-party-goers-viewing-pleasure/?sh=784de0c63141

8. Nissenbaum, H. (2018). Respecting Context to Protect Privacy: Why Meaning Matters. Science and Engineering Ethics, 24(3), 831–852. https://doi.org/10.1007/s11948-015-9674-9
