
The term ‘privacy’ encompasses far more than adjusting an application’s settings. Privacy refers to the idea that individuals can keep their personal data, information and belongings away from scrutiny. There is no doubt that social media platforms are currently facing a huge crisis, driven by poorly managed privacy policies and weak consumer protection laws. A news report by NBC News (2018) asserted that Facebook’s careless management of how its data was obtained by app developers had thrown the business into its worst crisis in 14 years; read the article on Facebook’s privacy mishap here. Achieving privacy can be especially difficult for individuals in marginalised communities, as Marwick and Boyd (2018) argue. The digital divide is a major concern that this blog will explore alongside privacy rights and consumer data protection. The thesis of this blog is that data privacy is significant enough to require appropriate privacy laws, policies, and technical safeguards to preserve individuals’ privacy rights. Following this overview, the blog will walk you through privacy rights, digital privacy and its components, privacy imbalance, data protection, consumer protection, an analysis of the GDPR, the strengths and weaknesses of Australia’s Privacy Act, and a detailed account of the Australian Consumer Data Right.

Privacy is a human right to which every individual is entitled. According to the Universal Declaration of Human Rights (1948), “No one shall be subjected to arbitrary interference with his privacy, family, home or correspondence, nor to attacks upon his honour and reputation. Everyone has the right to the protection of the law against such interference or attacks.” With this right in mind, it is important to focus on the digital privacy rights framed in Australia: Australians are among the world’s most avid users of social media and mobile broadband, and the country ranks among the top ten for internet usage. The internet now frames both our professional and personal lives.
Now you may ask why privacy matters if we do not share our personal lives on social media. Well, the catch is that platforms like Facebook do much more than silently host the content everyone shares. The feed you scroll through is curated by platform algorithms that actively study your preferences and requirements. This is where privacy concerns come into play: any platform that handles user identity and information has a responsibility to protect users’ privacy rights. Sarikakis and Winter (2017) note that in 2010 Mark Zuckerberg, Facebook’s CEO, stated that people had become more comfortable revealing their private information online, challenging the “social norm” of privacy, which, in his opinion, had become outmoded. Privacy is thus a major concern, as Facebook collects and stores big data more extensively than almost any other platform.

In Australia, digital privacy is regulated by the Privacy Act 1988. The Act sets out principles for citizens, companies, government and the private sector on handling personal information. It establishes that every individual has the right to know which company, organisation or government body is collecting their personal information, how it will be used and disclosed, and with whom it will be shared. This places an onus on social media consumers to be adequately informed and digitally literate: the more concerned consumers are about the nature of the information collected and the way it is used, the more conscious they will become of their digital footprint.
This can be related to Nissenbaum’s revised ‘Principle of Respect for Context’: that ‘companies will collect, use, and disclose personal data in ways that are consistent with the context in which consumers provide the data’. Respect for context here means that consumers can reasonably expect how companies, organisations, and the government will collect, use and disclose their personal data. In this way, digital privacy becomes a two-way process. The theory of contextual integrity helps us understand privacy issues within a social context: Nissenbaum (2018) asserted that privacy concerns arise only when personal information is used and circulated without regard for context. This expectation of contextual integrity also helps explain why most of us ignore the terms and conditions page when signing up for an account on Facebook or other social media channels.
One social media platform that deserves particular mention in relation to digital privacy and data privacy is Facebook. Following the recent turmoil over the platform’s privacy issues, policymakers and the board of directors have subjected Facebook to a number of privacy laws and regulations. Facebook’s data policy states that the platform collects content, information and anything related to a specific account, including location, photos, and the other personal details required to sign up. However, despite these strict regulations and continuous updates, the platform has been severely criticised for its use of users’ personal information.

Consider, for example, the Cambridge Analytica scandal of 2018, in which users’ behavioural data was harvested to profile voters and build powerful election-targeting software. The Guardian reported that a whistleblower told the Observer that Cambridge Analytica, a company owned by hedge fund billionaire Robert Mercer and led at the time by Trump’s key adviser Steve Bannon, used personal information obtained without consent in early 2014 to build a system that could profile individual US voters in order to target them with personalised political advertisements. This exploitation not only called into question Facebook’s capacity to protect millions of user accounts but also raised concerns among Australian users.
At the beginning of this blog, we focused on the idea of marginalised people receiving imbalanced information. The interplay of choice and privacy is significant here: being in a position to decide what a platform is allowed to learn about you is a matter of privilege. Marwick and Boyd (2018) argue that the concept of “privacy” is based on the notion that a private individual is entitled to be “left alone”, free from being viewed or disturbed by others. In practice, however, the ability to acquire privacy frequently necessitates the power to make choices and construct systems that allow for such liberties.
The stark reality is that privacy can be difficult to attain. Economically marginalised people with little education in digital privacy, for example, are often forced to provide their personal data in order to obtain basic services from organisations and platforms. The authors note that marginalised people are frequently compelled to share their personal information because of power imbalances. This shows that Facebook and other social media platforms do not champion independence; they are strongly shaped by society’s power dynamics. Users from marginalised groups, such as LGBTQ+ individuals and disabled users, are often the victims of data mining, surveillance and harassment.
Privacy imbalance leads to data breaches, which leave user accounts vulnerable to targeted attacks; the Cambridge Analytica scandal compromised many Facebook user accounts and their personal data. The idea of privacy imbalance centres on the unequal power dynamics between the consumers of social media platforms and the platforms’ policymakers, with the policymakers holding most of the power.

Data from Statista show that, as of April 2018, 70.63 million user accounts in the US and 311,127 Australian accounts had been affected and compromised in the Cambridge Analytica scandal. This reflects a clear case of power imbalance: Facebook, a well-established organisation, failed to protect the big data it had left unregulated. The lack of control, privacy violations, discrimination against the marginalised, and absence of consent and transparency together produced a massive scandal.
For example, the lack of accountability and transparency over how data is used, and for what purpose, has created this gap between users and platform policymakers. Facebook is responsible for regulating, in a timely manner, the user data it collects every second in order to maintain a secure environment. Data breach incidents nevertheless occur through the loss of control over personal data and through third-party applications’ access to that same data. In the Cambridge Analytica scandal, The Guardian reported that the data was collected through an app called thisisyourdigitallife, built by academic Aleksandr Kogan through his company Global Science Research (GSR) in collaboration with Cambridge Analytica; hundreds of thousands of people were paid to take a personality test and agreed to have their data harvested for academic purposes. The app also collected the information of the test-takers’ Facebook friends, leading to the accumulation of a data pool tens of millions strong.

Another data breach case study worth considering is the Equifax breach. The breach exposed the personal data of 143 million people: hackers accessed names, Social Security numbers, birth dates, addresses and, in some instances, driver’s license numbers. The agency is responsible for storing user credentials that are deemed extremely sensitive. This is a reminder of the importance of content moderation within organisations to keep data safe from hackers and third-party applications. Flew, Martin and Suzor (2019) asserted that to prevent data breaches and stop hackers from accessing sensitive consumer information, content moderation must be regularly updated. The Equifax breach also underlines Karppinen’s (2017) point that rights are realised not merely through declarations but through practice, and that even the rights formally asserted for individuals remain under significant threat. It can therefore be said that content moderation and the regulation of policies are central to building a safe social environment for users.
To mitigate these issues, it is important to consider existing data protection principles. The GDPR (General Data Protection Regulation) adopted by the EU requires organisations to obtain users’ permission and consent before seeking their personal data and information, and asserts that personal data must be processed lawfully, fairly and in a transparent manner. It grants several rights, such as:
- The right to be informed
- The right of access
- The right to rectification
- The right to erasure
- The right to restrict processing
- The right to data portability
- The right to object
- Rights in relation to automated decision making and profiling
Find the GDPR report chapters here.
Facebook’s handling of unauthorised access to information and personal data has been heavily critiqued. The Guardian ran an article on the platform’s privacy problems: a bug affected up to 6.8 million users, a glitch publicly published over 14 million users’ information on the web, and hackers frequently accessed and stole information online. Even after Facebook announced that it was restricting third-party applications’ access to user information, the Cambridge Analytica breach still happened. One must therefore be digitally educated and aware of the various ways in which privacy can be compromised.
The threats to digital privacy on Facebook and other social media platforms can be reduced if users regularly review and adjust their privacy settings so that outside access to personal information is limited. It is also each user’s duty to be cautious about the information, such as credentials and passwords, that they share. Facebook, for its part, must limit third-party applications’ and websites’ access to user data. Regular monitoring and the rigorous use of two-factor authentication on user accounts can help bring this under control.
References:
Cadwalladr, C., & Graham-Harrison, E. (2018). Revealed: 50 million Facebook profiles harvested for Cambridge Analytica in major data breach. The Guardian. https://www.theguardian.com/news/2018/mar/17/cambridge-analytica-facebook-influence-us-election
Equifax Data. (2017). The Equifax Data Breach: What to Do. https://www.penncommunitybank.com/wp-content/uploads/2019/12/The-Equifax-Data-Breach_-What-to-Do-_-Consumer-Information.pdf
Facebook. (2023). Data Policy. https://www.facebook.com/about/privacy/previous#:~:text=We%20do%20not%20share%20information,unless%20you%20give%20us%20permission.
General Data Protection Regulation. (2023). General Data Protection Regulation. https://gdpr-info.eu/
Goggin, G., Vromen, A., Weatherall, K. G., Martin, F., Webb, A., Sunman, L., & Bailo, F. (2017). Digital rights in Australia. Digital Rights in Australia (2017) ISBN-13, 978-0. http://hdl.handle.net/2123/17587
Ho, V. (2018). Facebook’s privacy problems: a roundup. The Guardian. https://www.theguardian.com/technology/2018/dec/14/facebook-privacy-problems-roundup
Karppinen, K. (2017). Human rights and the digital. In H. Tumber & S. Waisbord (Eds.), The Routledge Companion to Media and Human Rights (pp. 95-103). Routledge. https://doi.org/10.4324/9781315619835-9
Marwick, A. E., & Boyd, D. (2018). Understanding privacy at the margins. International Journal of Communication (19328036), 12. https://web.p.ebscohost.com/abstract?direct=true&profile=ehost&scope=site&authtype=crawler&jrnl=19328036&AN=139171463&h=om1iv1yJKsiDoJwpzRkKAzYWMTitqa6U%2fBVmJSgZ4QEfNMt6PyFd2dO%2fnX%2bJWEC6Et%2bK%2bi2QKw3MVsaOKwLnoA%3d%3d&crl=c&resultNs=AdminWebAuth&resultLocal=ErrCrlNotAuth&crlhashurl=login.aspx%3fdirect%3dtrue%26profile%3dehost%26scope%3dsite%26authtype%3dcrawler%26jrnl%3d19328036%26AN%3d139171463
Newcomb, A. (2018). A timeline of Facebook’s privacy issues — and its responses. https://www.nbcnews.com/tech/social-media/timeline-facebook-s-privacy-issues-its-responses-n859651
Nissenbaum, H. (2018). Respecting context to protect privacy: Why meaning matters. Science and engineering ethics, 24(3), 831-852. https://nissenbaum.tech.cornell.edu/papers/Respecting%20Context%20to%20Protect%20Privacy%20Why%20Meaning%20Matters.pdf
Privacy Act 1988. (1988). Rights and Protection: Privacy. https://www.ag.gov.au/rights-and-protections/privacy
Sarikakis, K., & Winter, L. (2017). Social media users’ legal consciousness about privacy. Social Media+ Society, 3(1), 2056305117695325. https://journals.sagepub.com/doi/pdf/10.1177/2056305117695325
Statista. (2022). Countries most affected by the Cambridge Analytica Facebook scandal 2018. https://www.statista.com/statistics/831815/facebook-user-accounts-affected-cambridge-analytica-by-country/
Universal Declaration of Human Rights. (2023). United Nations. https://www.un.org/en/about-us/universal-declaration-of-human-rights#:~:text=Article%2012,against%20such%20interference%20or%20attacks.