What Can Australia Learn from Europe on Data Protection?

With Australia’s privacy law currently under review, there is a prime opportunity to implement strong data protection regulations against the novel threats of the digital age.

In the Privacy Act Review Report, released in February this year, the Attorney-General’s Department made 116 recommendations to update Australia’s privacy law.[1] Feedback on the report has now closed, with Attorney-General Mark Dreyfus to announce which recommendations the government will implement later this year.

Relevant recommendations regarding data protection range from implementing new limits on targeted advertising, to ensuring new rights to data erasure and to object to the collection of personal information.

These recommendations would help to align Australia’s laws with the European Union’s, where the importance of data protection has long been recognised. Here, the concept is upheld as a fundamental right, with strict regulations designed to ensure its fulfilment for all European citizens.

The EU’s legal order can therefore provide a useful model for Australia to look towards in designing its own regulations. There are several lessons Australia can draw upon regarding both what Europe has done right around data protection, as well as the challenges that still persist.

But Why Should Data Be Protected in the First Place?

Two major controversies over the last decade highlight just how tenuous our online privacy and personal data really are – and why they should be protected.

In 2013, whistle-blower Edward Snowden revealed how US and British intelligence agencies had breached the privacy of hundreds of millions of people by cracking online encryption to access their personal data.[2] Technology companies and internet service providers themselves were complicit in the practices.

Then, in 2018, we learnt of Cambridge Analytica’s large-scale data harvesting involving profiles on over 230 million Americans.[3] Inferences from the data, largely obtained via deceitful means, had been provided to political campaigns to identify, target, and influence voters online.

These high-profile scandals have left many of us with a sour taste in our mouths.

A 2017 report on digital rights, conducted by academics at the University of Sydney, showed, for example, that 57 per cent of Australians surveyed held concerns regarding corporations violating their privacy.[4] Forty-seven per cent were also concerned about privacy violations by government.

A general sense of distrust thus prevails. And at the heart of these concerns, as revealed by the Snowden revelations and Cambridge Analytica scandal, is the state of surveillance enabled by Big Data.

Big Data can be conceived of as the huge volume and variety of data produced from a multitude of sources.[5] In our increasingly ‘datafied’ world, businesses and government agencies can draw rich data about our lives from our online activity on social media, email, web browsers, apps, and search engines.[6]

Plus, data is being generated from more and more of our physical-world behaviours in the course of everyday use of fitness and sleep trackers, biosensors, GPS-enabled devices, Internet-of-Things devices, and countless other technologies.

In effect, we find ourselves under ubiquitous surveillance, in which data related to the routine and mundane activities of our day-to-day life are collected to create quantified versions of us all.[7]

The Quantified Self. Source: Technori [8]

So, why is this such a big deal? Well, for one, only a few entities control this data – and it’s making them increasingly powerful.

Large tech companies are collecting our data, selling it to companies to enhance advertising, and sharing it with government agencies for intelligence purposes.[9] As a result, these entities know a great deal about us, while at the same time we know very little about them.

Knowledge is power – and in this highly asymmetrical situation we find ourselves in, these entities hold substantial power over us all.[10]

Now, it’s what they’re doing with this power that’s really scary. Through Big Data analytics and artificial intelligence, it’s possible to draw inferences from our personal data about our characteristics, behaviours, preferences, and unmet needs.[11]

One’s gender, ethnicity, religious beliefs, and sexual orientation, for example, can all be inferred just from Facebook ‘likes’.[12]
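To make the mechanics concrete, here is a minimal sketch of how such inference works, using entirely synthetic data: a simple classifier trained on binary ‘like’ vectors learns to predict a hidden attribute for users who never disclosed it. The page-to-attribute correlations below are invented for illustration; real systems are far larger, but the logic is the same.

```python
# Hypothetical sketch: inferring an undisclosed attribute from 'likes'.
# All data is synthetic; the correlated pages are invented for illustration.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n_users, n_pages = 1000, 50

# Each row is a user; each column is 1 if the user 'liked' that page.
likes = rng.integers(0, 2, size=(n_users, n_pages))

# Pretend a hidden attribute happens to correlate with a handful of pages.
weights = np.zeros(n_pages)
weights[:5] = [2.0, -1.5, 1.0, 1.8, -2.2]
attribute = (likes @ weights + rng.normal(0, 1, n_users)) > 0

# A platform only needs labels for some users (e.g. those who disclosed)...
model = LogisticRegression().fit(likes[:800], attribute[:800])

# ...to infer the attribute for everyone else from their likes alone.
accuracy = model.score(likes[800:], attribute[800:])
print(f"Inferred attribute accuracy for undisclosed users: {accuracy:.0%}")
```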

The use of these inferences then plays a fundamental part in the business model Harvard professor and bestselling author Shoshana Zuboff has labelled “surveillance capitalism”.[13]

In surveillance capitalism, inferences are compiled into profiles used by businesses to target products and services at specific individuals and groups. The idea is that tailored, personalised targeting based on inferential predictions drawn from your personal data is more likely to be effective in achieving the businesses’ aims.   

The key concern for Zuboff, along with various other scholars, is how such targeting is used to shape our behaviour.[14] Our personal autonomy, they argue, is directly threatened, as we are nudged and manipulated in line with predicted outcomes gleaned from inferential analytics.

And the situation is only going to get worse. Zuboff warns that capitalism’s economic imperative for maximum profits is driving ever-more accurate predictions to a point where companies will eventually be able to manipulate all human actions towards their own desires.

To maintain our autonomy then, we must protect our data.

Europe Leads the Way on Data Protection

Over in Europe, they’ve recognised the seriousness of threats to personal data. Under EU law, data protection is held in high regard. 

It is legally enshrined as a fundamental right in Article 8 of the EU’s Charter of Fundamental Rights.[15] And the EU’s General Data Protection Regulation (GDPR), which governs how personal data may be collected and used, is often regarded as the ‘gold standard’ of data protection law.[16]

Through these instruments, individuals are granted certain rights, freedoms, and control over their personal data – data, that is, that can be linked back to the individual, such as their name, telephone number, street address, or passport number.

In this way, the EU goes some way in deterring and regulating the surveillance practices enabled by Big Data, providing several lessons Australia can draw upon in designing its own data protection law.[17]  

Importantly, and uniquely in European law, data protection is regarded as its own distinct right, separate from the right to privacy.

While still aiming to safeguard privacy, data protection provides more legal certainty than privacy law, as what breaches privacy is often context dependent.[18] No clear criteria exist to outline when an individual’s privacy has been violated; it must be assessed on a case-by-case basis.

As legal researcher Felix Bieker puts it, “As the true content of information is dependent on the context, and seemingly innocuous data can be highly compromising in certain situations, the protection of all personal data is the only way to ensure that such gaps in protection are averted.”[19]

The concept of data protection, then, is clearer in that it applies from the point personal data are processed, i.e., whenever the data are handled in any way.[20]

In delineating so clear a definition, the GDPR is able to assign an enforceable set of rights to individuals upon the processing of their data.[21] These include the right to know about information held on them, as well as the right to access and rectify the data, and the right to object to the processing of personal data in certain situations.

Several obligations, moreover, are placed on organisations collecting and processing personal data under the GDPR, which they must follow or else face heavy fines.[22]

Consent of the person concerned, for one, must be gained before their data can be processed. Consent is then also needed for any automated decisions affecting an individual based on the processing of their data.

Meanwhile, the collection of sensitive data – such as that which reveals ethnic origin, political opinions, religious beliefs, sex life, and health – is prohibited in order to protect individuals from discrimination.

Collection of data must also be restricted to the minimum amount necessary for processing purposes – and these purposes must be specific, explicit, and legitimate.

Further, personal data may be kept for no longer than is required for processing purposes. And individuals must also be informed of the purposes for which their data are processed.

Obligations for organisations collecting and processing data are enshrined in the GDPR’s principles. Source: Cyber-Duck [23]
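In engineering terms, these obligations translate into concrete data-handling rules. The sketch below is a hypothetical illustration of how a collector might enforce consent, purpose limitation, data minimisation, and retention before storing a record – the purposes, field names, and retention windows are invented, not drawn from the GDPR itself.

```python
# Hypothetical sketch of GDPR-style collection rules; the purposes, field
# names, and retention windows are invented for illustration.
from datetime import datetime, timedelta, timezone

# Purpose limitation: each declared purpose lists the minimum fields it
# needs (data minimisation) and how long the data may be kept (retention).
PURPOSES = {
    "order_fulfilment": {"fields": {"name", "address"},
                         "retention": timedelta(days=90)},
    "newsletter": {"fields": {"email"}, "retention": timedelta(days=365)},
}

def collect(record: dict, purpose: str, consent_given: bool) -> dict:
    """Keep only the data this declared purpose justifies, or refuse."""
    if not consent_given:
        raise PermissionError("No consent: processing is not permitted.")
    spec = PURPOSES[purpose]  # purposes must be specific and pre-declared
    minimised = {k: v for k, v in record.items() if k in spec["fields"]}
    minimised["_delete_after"] = datetime.now(timezone.utc) + spec["retention"]
    return minimised

# The sensitive 'religion' field is never stored for this purpose.
stored = collect(
    {"name": "A. Citizen", "address": "1 Example St",
     "email": "a@example.com", "religion": "..."},
    purpose="order_fulfilment", consent_given=True)
print(stored)
```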

This all means significant restrictions are placed on organisations using European data in order to respect the rights of those whose data may be collected and processed.

In doing so, the GDPR seeks to address the power imbalance between individuals and the dominant Big Data organisations who process their data.[24] More control is given to the individual over their data; more limitations are placed on the processors; and how data is used becomes more transparent.

Orla Lynskey, Associate Professor at the London School of Economics, argues the right to data protection returns a sense of autonomy to the European citizen.[25]

“The additional rights granted to individuals by data protection…allow individuals to better determine how their data is processed, by whom and for what purposes,” she writes. “In other words, they promote informational self-determination.”[26]

Here, European individuals are provided the opportunity to prevent the collection of their data from which inferences can be made and profiles constructed for personalised targeting. Power is returned to the individual.

Or so it seems. Despite the significant achievements of Europe’s right to data protection, new challenges have surfaced, allowing the controversial use of inferential analytics to persist virtually unchecked. Personal autonomy remains under threat.

Challenges to the Right to Data Protection

A key challenge to the European model of data protection is the lack of individual control and oversight over personal inferences.[27] Focus is placed on protection at the stages of collecting and processing data, neglecting controls over how the data are then assessed to make inferences and decisions about the individual.

Sandra Wachter and Brent Mittelstadt, professors at the University of Oxford, have detailed the GDPR’s insufficient protections against inferences.[28] They argue the individual rights around the processing of data are “significantly curtailed when it comes to inferences”.[29]

This is because the individual holds no corresponding rights to know about inferences, rectify inferences, object and delete inferences, be protected against sensitive inferences made about them, or contest decisions based on inferences.[30] In all cases, the interests of the organisations controlling the data, such as their trade secrets or intellectual property, are placed at a higher premium.

Wachter and Mittelstadt argue for a greater balance between the interests of these organisations and the rights and freedoms of individuals, calling for “a right to reasonable inferences”.

This would prohibit inferences that “cause reputational damage or invade one’s privacy” or are “predictive or opinion-based while being used to make important decisions”.[31] Under this model, organisations themselves would have to justify whether an inference is reasonable and whether the methods used to draw the inference are accurate and reliable.[32]
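As a thought experiment, such a duty could be operationalised as metadata that must accompany every inference an organisation draws. The sketch below illustrates only the shape of Wachter and Mittelstadt’s proposal; the schema, field names, and threshold are invented, not part of any existing regulation.

```python
# Hypothetical sketch of an auditable record a 'right to reasonable
# inferences' might require; the schema is invented to illustrate the idea.
from dataclasses import dataclass

@dataclass
class InferenceRecord:
    subject_id: str              # whom the inference is about
    inference: str               # what was inferred
    source_data: list            # which data it was drawn from
    method: str                  # model or procedure used
    validation_accuracy: float   # evidence the method is accurate and reliable
    important_decision: bool     # is it used for an important decision?

    def needs_justification(self) -> bool:
        """Predictive inferences behind important decisions invite scrutiny."""
        return self.important_decision or self.validation_accuracy < 0.9

record = InferenceRecord(
    subject_id="user-42",
    inference="high credit risk",
    source_data=["purchase history", "browsing categories"],
    method="gradient-boosted classifier v3",
    validation_accuracy=0.71,
    important_decision=True,
)
print(record.needs_justification())  # True: the organisation must justify it
```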

Such a right would therefore go a considerable way towards addressing the GDPR’s current lack of individual control over personal inferences. It would, however, still leave a further challenge to data protection: the anonymisation of user data.

None of the existing rights and obligations under the GDPR apply to anonymised data.[33] Rather, the regulation’s sole intention is to protect data that can be linked to an identifiable person.

So, if personal data is collected and then rendered anonymous, effectively de-identifying the individual, data protection no longer applies. Anonymisation is considered sufficient to protect the individual’s privacy and so no further measures are required.

Jane Andrew and Max Baker, professors at the University of Sydney, warn this carries severe consequences. They write, “The GDPR’s focus on individual privacy may be unintentionally providing data controllers with a legal mandate to pursue market surveillance practices… inadvertently crystallising the power of tech elites.”[34]

This is because inferences can still be made on anonymised data.[35]

New technologies don’t need your personal information to infer information and create personalised profiles. They can do so from behavioural information, such as keystroke behaviour and web-surfing habits, identifying you as the same individual over a period of time.[36]

This information can then be connected to other sources such as online-browsing records and retail purchases, allowing some institutions to build comprehensive records of you without ever actually identifying who you are in the traditional sense.[37]
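A toy example shows why this works. Even with every conventional identifier stripped, a stable combination of behavioural and device traits acts as a de facto identifier, letting separate ‘anonymous’ sessions be stitched into a single profile. All traits and sessions below are invented, and the hashing approach is a simplification of real fingerprinting techniques.

```python
# Hypothetical sketch: linking 'anonymous' sessions via stable behavioural
# and device traits. All values are invented; real fingerprinting is richer.
import hashlib
from collections import defaultdict

def fingerprint(session: dict) -> str:
    """Hash stable traits into a pseudo-identifier; no name or email needed."""
    traits = (session["user_agent"], session["screen"],
              session["timezone"], session["typing_ms"])
    return hashlib.sha256(repr(traits).encode()).hexdigest()[:12]

sessions = [  # 'anonymised' logs from different days and sites
    {"user_agent": "UA-1", "screen": "1920x1080", "timezone": "UTC+10",
     "typing_ms": 182, "event": "read pram reviews"},
    {"user_agent": "UA-2", "screen": "1366x768", "timezone": "UTC+1",
     "typing_ms": 95, "event": "searched sports cars"},
    {"user_agent": "UA-1", "screen": "1920x1080", "timezone": "UTC+10",
     "typing_ms": 182, "event": "bought baby formula"},
]

# Sessions with matching traits collapse into one profile, name unknown.
profiles = defaultdict(list)
for s in sessions:
    profiles[fingerprint(s)].append(s["event"])

for pseudo_id, events in profiles.items():
    print(pseudo_id, events)
```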

As University of Pennsylvania professor Joseph Turow points out, quoted in the online business journal Knowledge at Wharton, “If a company knows 100 data points about me in the digital environment, and that affects how that company treats me in the digital world, what’s the difference if they know my name or not?”[38]

Websites can still draw on correlations of information to make inferences about you and treat you accordingly. For example, depending on whether they think you may be a man in a midlife crisis, a young mother, or any of a number of other group categories, they may adjust the look, content, and/or prices on their website to target you more effectively.[39]

Even with anonymised data, then, surveillance practices persist and the manipulation of our behaviour through inferential analytics remains a possibility. Data protection regulations, like the GDPR and Australia’s Privacy Act, must therefore go further, implementing similar rights and obligations around anonymised data as they do for identified data.

It’s now up to Australia to draw useful insights from these challenges when designing its own regulations. It’s only then that Australians’ data will truly be secure.


[1] Australian Government Attorney General’s Department. (2023, February). Privacy Act Review Report 2022. https://www.ag.gov.au/sites/default/files/2023-02/privacy-act-review-report.pdf

[2] Ball, J., Borger, J., & Greenwald, G. (2013, September 6). Revealed: how the US and UK spy agencies defeat internet privacy and security. The Guardian. https://www.theguardian.com/world/2013/sep/05/nsa-gchq-encryption-codes-security

[3] Confessore, N. (2018, April 4). Cambridge Analytica and Facebook: The Scandal and the Fallout So Far. The New York Times. https://www.nytimes.com/2018/04/04/us/politics/cambridge-analytica-scandal-fallout.html

[4] Goggin et al. (2017). Digital Rights in Australia. Sydney: University of Sydney, p. 1. https://ses.library.usyd.edu.au/bitstream/handle/2123/17587/USYDDigitalRightsAustraliareport.pdf?sequence=7&isAllowed=y

[5] European Commission (2016). The EU Data Protection Reform and Big Data, Factsheet. https://op.europa.eu/en/publication-detail/-/publication/51fc3ba6-e601-11e7-9749-01aa75ed71a1

[6] Van Dijck, J. (2014). Datafication, dataism and dataveillance: Big Data between scientific paradigm and ideology. Surveillance & Society, 12(2), 198.

[7] Swan, M. (2013). The Quantified Self: Fundamental Disruption in Big Data Science and Biological Discovery. Big Data, 1(2), 85–99.

[8] Technori (2018, August 8). The Beginner’s Guide to Quantified Self (Plus, a List of the Best Personal Data Tools Out There). https://technori.com/2018/08/4281-the-beginners-guide-to-quantified-self-plus-a-list-of-the-best-personal-data-tools-out-there/markmoschel/

[9] Goggin et al. Digital Rights in Australia, 9.

[10] Vold, K. & Whittlestone, J. (2019). Data Privacy and the Individual: Privacy, Autonomy, and Personalised Targeting. Center for the Governance of Change, 5-6. https://philpapers.org/archive/VOLPAA-2.pdf; Pasquale, F. (2015). ‘The Need to Know’, in The Black Box Society: the secret algorithms that control money and information. Cambridge: Harvard University Press.

[11] Wachter, S. (2019). Data protection in the age of big data. Nature Electronics, 2(1), 6.

[12] Vold & Whittlestone, Data Privacy, 5.

[13]  Zuboff, S. (2020). The age of surveillance capitalism: the fight for a human future at the new frontier of power. New York City: PublicAffairs.

[14] Ibid.; Vold & Whittlestone. Data Privacy; Wachter, S., & Mittelstadt, B. (2019). A Right to Reasonable Inferences: Re-Thinking Data Protection Law in the Age of Big Data and AI. Columbia Business Law Review, 2019(2), 494–620.

[15] Charter of Fundamental Rights of the European Union. (2000). Official Journal C364.

[16] Andrew, J., & Baker, M. (2021). The General Data Protection Regulation in the Age of Surveillance Capitalism. Journal of Business Ethics, 168(3), 565.

[17] Lynskey, O. (2014). Deconstructing Data Protection: The “Added-Value” of a Right to Data Protection in the EU Legal Order. The International and Comparative Law Quarterly, 63(3), 587.

[18] Ibid. 581-586; Bieker, F. (2022). The Right to Data Protection: Individual and Structural Dimensions of Data Protection in EU Law (Vol. 34). T.M.C. Asser Press, 178-180; Schreurs et al. (2008). Cogitas, Ergo Sum. The Role of Data Protection Law and Non-discrimination Law in Group Profiling in the Private Sector. In Profiling the European Citizen (pp. 241–270). Springer Netherlands.

[19] Bieker. The Right to Data Protection, 179.

[20] Lynskey. Deconstructing Data Protection, 584; Schreurs et al. Cogitas, Ergo Sum, 244.

[21] Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation). (2016). Official Journal L119/1.

[22] Ibid.

[23] Bluestone, D. (2017, November 8). Get to Grips with the Basics of GDPR. Cyber-Duck. https://www.cyber-duck.co.uk/insights/introducing-gdpr-the-basics-of-the-new-data-protection-regulation

[24] Andrew & Baker. The General Data Protection Regulation, 570; Lynskey. Deconstructing Data Protection, 588-595.

[25] Lynskey. Deconstructing Data Protection.

[26] Ibid., 590.

[27] Wachter. Data protection in the age of big data; Wachter & Mittelstadt. A Right to Reasonable Inferences.

[28] Wachter & Mittelstadt. A Right to Reasonable Inferences.

[29] Ibid., 500.

[30] Ibid., 542-571.

[31] Ibid., 580.

[32] Ibid., 581.

[33] Andrew & Baker. The General Data Protection Regulation, 571-2; Schreurs et al. Cogitas, Ergo Sum, 242-3; Flew, T. (2021). Regulating Platforms. Cambridge: Polity, 82.

[34] Andrew & Baker. The General Data Protection Regulation, 574.

[35] Ibid., 570; Barocas, S., & Nissenbaum, H. (2014). Privacy, Big Data, and the Public Good. Cambridge University Press, 44-75.

[36] Andrew & Baker, The General Data Protection Regulation, 568; Hildebrandt, M. (2006). Profiling: From data to knowledge: The challenges of a crucial technology. Datenschutz und Datensicherheit, 30(9), 549.

[37] Barocas & Nissenbaum. Privacy, Big Data, 54; Steel, E., & Angwin, J. (2010, August 4). On the Web’s Cutting Edge, Anonymity in Name Only. The Wall Street Journal. https://www.wsj.com/articles/SB10001424052748703294904575385532109190198

[38] Berger, J. & Fader, P. (2011). ‘Drinking from a Fire Hose’: Has Consumer Data Mining Gone Too Far? Knowledge at Wharton. https://knowledge.wharton.upenn.edu/article/drinking-from-a-fire-hose-has-consumer-data-mining-gone-too-far/

[39] Steel & Angwin. On the Web’s Cutting Edge.

Reference List

Andrew, J., & Baker, M. (2021). The General Data Protection Regulation in the Age of Surveillance Capitalism. Journal of Business Ethics, 168(3), 565–578.

Australian Government Attorney General’s Department. (2023, February). Privacy Act Review Report 2022. https://www.ag.gov.au/sites/default/files/2023-02/privacy-act-review-report.pdf

Ball, J., Borger, J., & Greenwald, G. (2013, September 6). Revealed: how the US and UK spy agencies defeat internet privacy and security. The Guardian. https://www.theguardian.com/world/2013/sep/05/nsa-gchq-encryption-codes-security

Barocas, S., & Nissenbaum, H. (2014). Privacy, Big Data, and the Public Good. Cambridge University Press.

Berger, J. & Fader, P. (2011). ‘Drinking from a Fire Hose’: Has Consumer Data Mining Gone Too Far? Knowledge at Wharton. https://knowledge.wharton.upenn.edu/article/drinking-from-a-fire-hose-has-consumer-data-mining-gone-too-far/

Bieker, F. (2022). The Right to Data Protection: Individual and Structural Dimensions of Data Protection in EU Law (Vol. 34). T.M.C. Asser Press.

Bluestone, D. (2017, November 8). Get to Grips with the Basics of GDPR. Cyber-Duck. https://www.cyber-duck.co.uk/insights/introducing-gdpr-the-basics-of-the-new-data-protection-regulation

Charter of Fundamental Rights of the European Union. (2000). Official Journal C364.

Confessore, N. (2018, April 4). Cambridge Analytica and Facebook: The Scandal and the Fallout So Far. The New York Times. https://www.nytimes.com/2018/04/04/us/politics/cambridge-analytica-scandal-fallout.html

European Commission (2016). The EU Data Protection Reform and Big Data, Factsheet. https://op.europa.eu/en/publication-detail/-/publication/51fc3ba6-e601-11e7-9749-01aa75ed71a1

Flew, T. (2021). Regulating Platforms. Cambridge: Polity.

Goggin, G., Vromen, A., Weatherall, K., Martin, F., Webb, A., Sunman, L., & Bailo, F. (2017). Digital Rights in Australia. Sydney: University of Sydney. https://ses.library.usyd.edu.au/bitstream/handle/2123/17587/USYDDigitalRightsAustraliareport.pdf?sequence=7&isAllowed=y

Hildebrandt, M. (2006). Profiling: From data to knowledge: The challenges of a crucial technology. Datenschutz und Datensicherheit, 30(9), 548–552.

Lynskey, O. (2014). Deconstructing Data Protection: The “Added-Value” of a Right to Data Protection in the EU Legal Order. The International and Comparative Law Quarterly, 63(3), 569-597.

Pasquale, F. (2015). ‘The Need to Know’, in The Black Box Society: the secret algorithms that control money and information. Cambridge: Harvard University Press.

Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation). (2016). Official Journal L119/1.

Schreurs, W., Hildebrandt, M., Kindt, E., & Vanfleteren, M. (2008). Cogitas, Ergo Sum. The Role of Data Protection Law and Non-discrimination Law in Group Profiling in the Private Sector. In Profiling the European Citizen (pp. 241–270). Springer Netherlands.

Steel, E., & Angwin, J. (2010, August 4). On the Web’s Cutting Edge, Anonymity in Name Only. The Wall Street Journal. https://www.wsj.com/articles/SB10001424052748703294904575385532109190198

Swan, M. (2013). The Quantified Self: Fundamental Disruption in Big Data Science and Biological Discovery. Big Data, 1(2), 85–99.

Technori (2018, August 8). The Beginner’s Guide to Quantified Self (Plus, a List of the Best Personal Data Tools Out There). https://technori.com/2018/08/4281-the-beginners-guide-to-quantified-self-plus-a-list-of-the-best-personal-data-tools-out-there/markmoschel/

Van Dijck, J. (2014). Datafication, dataism and dataveillance: Big Data between scientific paradigm and ideology. Surveillance & Society, 12(2), 197–208.

Vold, K. & Whittlestone, J. (2019). Data Privacy and the Individual: Privacy, Autonomy, and Personalised Targeting. Center for the Governance of Change. https://philpapers.org/archive/VOLPAA-2.pdf

Wachter, S. (2019). Data protection in the age of big data. Nature Electronics, 2(1), 6–7.

Wachter, S., & Mittelstadt, B. (2019). A Right to Reasonable Inferences: Re-Thinking Data Protection Law in the Age of Big Data and AI. Columbia Business Law Review, 2019(2), 494–620.

Zuboff, S. (2020). The age of surveillance capitalism: the fight for a human future at the new frontier of power. New York City: PublicAffairs.
