Privacy in the digital age
According to Nissenbaum (2010), the right to privacy means that people can decide whether to share their personal information, with whom, and for what purpose, and that privacy is valuable and deserves to be protected.
In the context of the Internet, privacy issues become even more complex. When using online services or buying products online, users are often required to hand over personal information and accept platform agreements, and browsing the web leaves behind a trail of tracking cookies. The enormous amount of information available in the digital age, the trade-off between privacy concerns and access to online services, and the scope for businesses and governments to use personal data without the user’s knowledge all give new meaning to privacy (Flew, 2021).
Personal information stored and used on internet platforms is likewise valuable and deserves protection: users’ private data should not be stolen or exploited for commercial purposes without consent. So who bears more responsibility for protecting privacy, the company or the government? And what can consumers do to protect their own privacy? Didi will be used as a case study to analyze these questions.
Didi was fined USD 1.18 billion for data violations
China’s ride-hailing company Didi Global was exposed for privacy violations in 2021. Multiple security breaches disclosed the personal information of millions of users, including names, mobile phone numbers, and ID numbers. After a year-long investigation, the Cyberspace Administration of China (CAC) announced a fine of USD 1.18 billion against Didi on 21 July 2022, saying it had violated data security and personal information protection laws.
Didi broke the Cybersecurity Law, the Data Security Law, and the Personal Information Protection Law (PIPL) in eight respects. In summary, Didi illegally obtained user information in two main ways. The first was to collect information directly from users as they used the app, such as age, occupation, education, ID number, and city of residence. The other was to harvest information from users’ phones in the background, such as snapshots from photo albums, location data, clipboard contents, and lists of installed applications.
The problem was significant as it not only affected the privacy of Didi’s users but also exposed the vulnerabilities of the company’s data protection system. As the industry relies heavily on collecting and analyzing user data to improve services and achieve business goals, many users have become sceptical of ride-hailing apps.
The process by which platforms collect and use consumer information is not fully transparent
1. Users voluntarily store information on the platform.
According to George (2015), businesses collect and process personal data to form networks and use it for marketing and other consumer-facing activities, exchanging data through three main roles:
- Data collectors (sources)
- Data brokers
- Data users
Figure 2 illustrates how companies access personal data. The system is complex because many departments and intermediaries sit between the data collectors, data brokers, and data users, and consumers are rarely aware of any of them.
As can be seen, even though Didi publishes rules for the collection and use of user data on both web and mobile, consumers have no clear idea of who else will access their information or where it will be used beyond Didi itself. When users do not even know their information is being used illegally, exposing and punishing such privacy violations is difficult.
2. The platform secretly collects background information from users’ phones.
In addition to the personal information that users knowingly provide to Didi, the app also illegally requested phone permissions unrelated to its service and harvested clipboard contents, location information, and more, all without the user’s knowledge.
Usually, when an app wants access to the user’s location, microphone, camera, and so on, a pop-up window appears giving the user three options:
- Allow once
- Allow only while using the app
- Don’t allow
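The consent model behind these options can be sketched in a few lines of Python. This is a hypothetical illustration, not any platform’s real permission API: the class names and option labels are invented, and it only models how a compliant app would gate each data access on the user’s stored choice.

```python
# Hypothetical sketch: gating data access on a user's consent choice.
from enum import Enum

class Permission(Enum):
    ALLOW_ONCE = "allow once"          # consent expires after one use
    WHILE_USING = "allow while using"  # valid only while the app is open
    DENY = "don't allow"               # no access at all

class LocationService:
    """Grants a location read only when the stored consent permits it."""

    def __init__(self, choice: Permission):
        self.choice = choice
        self.used_once = False

    def can_read_location(self, app_in_foreground: bool) -> bool:
        if self.choice is Permission.DENY:
            return False
        if self.choice is Permission.ALLOW_ONCE:
            if self.used_once:
                return False
            self.used_once = True  # consume the one-time grant
            return True
        # WHILE_USING: allowed only while the app is in the foreground
        return app_in_foreground

svc = LocationService(Permission.ALLOW_ONCE)
print(svc.can_read_location(app_in_foreground=True))  # first use is allowed
print(svc.can_read_location(app_in_foreground=True))  # second use is blocked
```

The violation described in this section is equivalent to an app reading the location or microphone even when a check like this returns False, which is exactly what the user’s choice cannot prevent.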
By giving consumers more choices, platforms give them more power, allowing them to feel more control over their privacy, thus reducing the perception of risk (Bornschein et al., 2020).
As Figure 3 shows for the case of cookies, when consumers are merely informed and have no power to choose, they perceive higher risk (H2); if consumers can make a choice, however, the higher perceived power (H1) reduces their risk perception (H3).
In reality, however, these options only change the consumer’s perception of risk, not the risk itself. In Didi’s case, the app still secretly turned on the phone’s location function, microphone, and so on while running, or even when not running, and this mostly went unnoticed by users.
The more information collected about users, the more accurately consumers can be segmented, targeted, and measured, providing them with more tailored choices, increasing loyalty, and ultimately benefiting the business (George, 2015). While more personalized services are convenient for consumers, the illegal collection of sensitive information seriously violates user privacy. For example, Didi illegally accessed users’ photo albums, yet extra picture data does nothing to improve a ride-hailing service. What are those photos being used for? The question worries users and remains unanswered.
3. Platforms set rules that are unfair to consumers and also risky.
As analyzed above, online service platforms such as Didi inform users of their privacy policies and give them the right to choose (within limits). This does not mean users’ privacy is guaranteed, however, since the platform itself sets the policy and the data collection process is opaque. Consumers are simply told that “we are using your information legally and compliantly” with no way to verify the claim. As a result, violations of users’ privacy are often discovered only once they have produced serious consequences.
Figure 4 illustrates the mismatch between the amount of data consumers want to keep private and the amount of data businesses should or should not have access to. To achieve a balance, firms would have to access only the data that users want to make public; however, the amount of data available to them is, in fact, much larger, and consumers are often unaware of this situation (George, 2015).
Zeng et al. (2020) find that privacy concerns are ever-present. Although companies can use privacy policies to paper over issues arising from security breaches, unauthorised sharing, and misuse, they cannot fully address users’ inherent privacy concerns. Platforms set rules that are subject neither to consumer choice nor to review by an independent judicial body, although the relevant institutions can strike down unfair or overly restrictive rules (Suzor, 2019).
Given that the judiciary is potentially constrained in reviewing platform rules, can consumer privacy be better protected by enacting laws that severely punish such practices as excessive data collection and use?
What can the government do about privacy and security?
Various countries and regions have laws to regulate the illegal collection and use of personal information. These policies can serve as a warning to the companies concerned and help safeguard consumer privacy.
The Australian Privacy Act governs the privacy rights of users and how organisations and institutions must handle their information. For individuals, the Privacy Act provides a range of rights over personal information, including:
- Knowing why personal data is collected, what it is used for, and who ultimately uses it
- Not having to identify themselves
- Accessing their personal information
- Stopping unwanted direct marketing
- Correcting incorrect personal information
- Complaining if an organisation or institution mishandles their information
The limitations of collecting and processing information for an organisation or institution’s app can be seen in Figure 5.
This law clarifies the privacy rights of individuals so that people can understand and better protect their personal information. It also sets out restrictions on apps and warns companies to do a better job of regulating themselves.
However, as the Didi case shows, Australian users have expressed concerns about the viability of the law (Brookes, 2022). Even though some government officials have raised concerns about the security of Didi’s data collection from Australian users, Didi has stated that it complies with the relevant Australian laws. And because TikTok has already been exposed over the data security of Australian users, claims that the law does not help against illegal data collection have exacerbated dissatisfaction with it.
China’s Personal Information Protection Law (PIPL) came into force in 2021. The provisions of this law apply primarily to the processing of personal information and those who process personal information. Like many other countries’ privacy laws, the PIPL also sets out requirements and restrictions on collecting and processing personal information and lists various rights for individuals concerning their personal information. The law provides for a range of penalties, including an order to correct, a warning, confiscation of the proceeds of the offense, and suspension or termination of services. Fines may also be imposed in varying amounts depending on the seriousness of the situation.
As the public has seen, Didi was fined a hefty amount for violating the relevant laws, and its app was removed from app stores in mainland China. But the scale of its illegal over-collection was so large that many consumers lost trust in the app and switched to similar services.
This case shows that, for the most part, the government cannot oversee in real time how companies use private information. Substantial penalties, however, can pressure companies in the relevant industries to manage their own databases and processing practices.
The rapid pace of change in information technology, and the inability of regulations to keep up, can create regulatory loopholes. After problems are discovered, the government or judicial authorities also need time to collect evidence and investigate. During that time, are the offending companies still operating as usual? Is users’ private information still being over-used? The problem of legal lag also deserves attention.
Both companies and governments need to take responsibility for the privacy and security of their users
In the absence of legal regulation, relying solely on the rules a platform sets for itself cannot fully guarantee users’ privacy and security. Combining platform self-regulation with relevant laws to monitor and regulate platforms is therefore more effective in protecting privacy.
Users’ attitudes to privacy and security
Different users take different attitudes toward trading privacy for services. Some do not mind sharing their data, some are “forced” to share it to use a necessary app, and some firmly refuse. Among shared data, there is also a divide between highly and less sensitive information. According to D’Annunzio and Menichelli (2022), users hesitate less to share low-sensitivity information (age, gender, etc.), are warier of sharing high-sensitivity information (mobile numbers, browsing history, etc.), and are even willing to pay to protect the latter. Users’ trust in companies also matters: they are more likely to pay to protect their privacy on platforms they trust.
In the digital age, personal privacy is exposed and used in increasingly diverse ways, with or without the consumer’s knowledge, and both privacy security and user rights are being challenged. To better protect their privacy, users must first be vigilant about which apps they use, so that less of their information is collected by companies they do not trust. Nevertheless, individuals count for little against companies that use sophisticated technology to collect and exploit information. That is why government regulation, timely investigations, and penalties for violations will better discipline platforms and protect user privacy.
References
Bornschein, R., Schmidt, L., & Maier, E. (2020). The effect of consumers’ perceived power and risk in digital information privacy: The example of cookie notices. Journal of Public Policy & Marketing, 39(2), 135–154. https://doi.org/10.1177/0743915620902143
Brookes, J. (2022, September 5). Senator asks for Didi privacy probe as TikTok review begins. InnovationAus.com. https://www.innovationaus.com/senator-asks-for-didi-privacy-probe-as-tiktok-review-begins/
China fines Didi USD 1.18 billion for data violations. (2022, July 21). Simmons & Simmons. https://www.simmons-simmons.com/en/publications/cl5v6pa5m1psb0a87n8dzap97/china-fines-didi-usd-1-18-billion-for-data-violations
China’s Personal Information Protection Law (PIPL): Key Questions Answered. (2021, September 8). Morrison Foerster. https://www.mofo.com/resources/insights/210908-chinas-personal-information-protection-law
D’Annunzio, A., & Menichelli, E. (2022). A market for digital privacy: consumers’ willingness to trade personal data and money. Journal of Industrial and Business Economics, 49. https://doi.org/10.1007/s40812-022-00221-5
Flew, T. (2021). Regulating Platforms. Polity Press.
George, M. (2015). The Information Environment and the Privacy Problem. In Digital Privacy in the Marketplace (pp. 1–14). Business Expert Press.
Nissenbaum, H. (2010). Privacy in context: Technology, policy, and the integrity of social life. Stanford University Press.
OAIC. (2023, March 10). Rights and responsibilities. OAIC. https://www.oaic.gov.au/privacy/privacy-legislation/the-privacy-act/rights-and-responsibilities
Office of the Australian Information Commissioner. (2022). Australian Privacy Principles quick reference. OAIC. https://www.oaic.gov.au/privacy/australian-privacy-principles/australian-privacy-principles-quick-reference
Suzor, N. P. (2019). Who makes the rules? In Lawless: The secret rules that govern our lives (pp. 10–24). Cambridge University Press. https://doi.org/10.1017/9781108666428