Vague Privacy Terms Only Promise Consumers the Ability to Opt Out of Viewing Personalised Ads

Photo by PhotoMIX Company from Pexels

In its Digital Platforms Inquiry final report, the Australian Competition and Consumer Commission (ACCC) found that many digital platforms’ privacy statements are broad and vague about the collection, use and disclosure of user data (ACCC, 2019).

These privacy terms all emphasize that consumers can opt out of seeing “personalised ads”, but this risks leaving many consumers with a misleading impression.

What consumers may not notice is that this does not mean the platform gives them the right to opt out of having their data collected and used for marketing and advertising purposes.

Even if consumers choose not to look at personalised ads, companies may continue to collect their personal information to create “similar audiences” and target others with similar attributes.
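As a rough illustration of how “similar audiences” matching can work, the sketch below compares other users’ attribute vectors against an existing customer’s profile. The attribute values, user names and the 0.9 threshold are all invented for the example; real platforms use far more elaborate proprietary models.

```python
# Hypothetical sketch of "lookalike audience" matching: find users whose
# attribute vectors resemble a seed customer's. All values are invented.
from math import sqrt

def cosine_similarity(a, b):
    """Similarity of two attribute vectors, 1.0 meaning identical direction."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = sqrt(sum(x * x for x in a)) * sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

# Each vector might encode, say, age bucket, interest scores, purchase frequency.
seed_customer = [0.8, 0.9, 0.1, 0.7]
other_users = {
    "user_a": [0.7, 0.8, 0.2, 0.6],   # similar profile
    "user_b": [0.1, 0.0, 0.9, 0.1],   # very different profile
}

lookalikes = [uid for uid, vec in other_users.items()
              if cosine_similarity(seed_customer, vec) > 0.9]
print(lookalikes)  # ['user_a']
```

The point of the sketch is that the seed customer’s data does the targeting work even for people who never interacted with the advertiser, which is why opting out of *seeing* ads does not stop one’s data from being used this way.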

Internet users currently face highly granular data collection on digital platforms, covering their interests, shopping habits, social networks, political orientation and location history.

In this context, emerging forms of advertising on digital platforms, such as targeted or behavioural advertising and personalised search, track the online activity and consumption habits of specific people.

As governments around the world tighten regulations on the access and use of users’ personal information by Internet platforms, there is a growing awareness of the potential risks and negative impacts of large-scale data collection and analysis. It’s no secret that digital media companies like Google and Facebook have poor records when it comes to privacy (Dwyer, 2014).

Targeted personalised advertising and privacy

Advertising is a useful way for businesses and brands to connect with their target audience. For most applications, advertising is an important part of the user experience.

The digital advertising market is growing much faster than other advertising markets. Global spending in the digital advertising market is expected to reach US$679.8 billion by 2023 (Statista, 2022).

Targeted advertising is an outgrowth of behaviour-based advertising. Advertising technology (AdTech) providers track user characteristics and online behaviours, building profiles of consumer demographics and interests to generate more targeted, personalised advertising messages.

This data generally falls into two broad categories: first-party data and third-party data. First-party data is data that platforms collect directly from consumers through their own channels.

Third-party data refers to data collected and packaged by third parties and sold to enterprises. Third-party data is often shared or sold in data markets or exchanges.

AdTech providers rely on data supplied by third parties to deliver their services, including customer management platforms, search engine marketing platforms and advertising forecasting software. These tools can quickly and efficiently produce personalised content, helping advertisers spend less time and money while earning higher revenue.

AdTech providers generally do not operate consumer-facing services, so they cannot collect data directly from consumers.

Postclick, Criteo and VideoAmp are examples of AdTech companies.

The privacy terms of each online marketplace force consumers to “consent” to the marketplace accessing their data from third parties, and claim that third-party data is used for marketing, profiling, advertising and product development.

While advertisers claim that collecting data to learn more about you delivers more relevant ads and cheaper products, many consumers feel violated and uncomfortable given the software’s ability to track their personal characteristics and online behaviours.

According to the Digital Rights in Australia report, research on people’s attitudes to targeted advertising shows that views are nuanced: most express less concern about what information is collected or stored, and more about what platforms, advertisers and others do with that information (Goggin, 2017).

This raises the issue of advertisers using consumer data to offer different prices to different consumers. Some consumers say they are angered by the idea of personalised pricing on the Internet, which they believe encourages discrimination.

An investigation by Consumers International and the Mozilla Foundation found that the dating app Tinder engaged in price discrimination in six countries, including the US, Brazil and New Zealand. After participants entered their name, age, gender and sexual preferences and shared their location data, the investigators found that Tinder Plus charged them different prices for the same service (Euroconsumers, 2022).

As a result, the inequities created by the collection of user data have drawn attention to highly targeted advertising and prompted calls for new legal and ethical frameworks.

How does the platform inform users?

It is standard practice for online sites to provide a “privacy policy” that addresses the requirements of privacy law and consumer law.

These privacy policies are often built into a site’s terms of service, binding users to consent through browsing or clicking. The policies usually contain detailed information about the types of personal information shared and their multiple uses, including third-party uses.

The ACCC’s 2019 review of privacy terms and policies uncovered a series of statements on the collection, use and disclosure of user data that were broad and vague. One key example of vague language is the frequent use of the word “may” in the privacy policies of digital platforms.

The ACCC noted that “may” can mean a variety of things, including uncertainty, permission, possibility, intention or hope. Its use in terms of use or privacy policies gives digital platforms a great deal of discretion (ACCC, 2019).

Twitter’s privacy policy uses the word “may” repeatedly, including in relation to third parties and third-party integrations.

Katharine Kemp reports that online marketplaces’ privacy policies do not emphasize the distinction between consumers’ ability to opt out of seeing personalised ads and their ability to opt out of providing data for marketing purposes (Kemp, 2021).

Dr. Katharine Kemp is a Senior Lecturer at the Faculty of Law & Justice, UNSW Sydney. She focuses on competition law, data privacy regulation and consumer protection, including their application to digital platforms.

The purposes for which data is used, and where it is exchanged, are often described in vague, open-ended and inconsistent terms. Consumers have no way of knowing how their data will be used by platforms internally, or of checking whether they have consented to their data being disclosed to third parties, Dr. Kemp added.

Privacy policies are indeed obscured by vague, complex language that fails to transparently highlight the practices consumers care about, with the result that consumers have no way of knowing what they are consenting to.

If you have a Facebook account, I suggest you review your privacy settings. You can see exactly what Facebook does with your data and check whether any of your settings put you at risk of a data breach.

Privacy regulation in the EU and Australia

In the EU and the UK, digital platforms are subject to two different regimes governing cookies: the General Data Protection Regulation (GDPR) and the ePrivacy Directive.

Under the GDPR, online services must obtain explicit consent from users before processing their data. To be legally obtained, consent must be specific, informed and freely given (Dillet, 2022).

For example, UK authorities have mandated that websites must obtain consent before using analytics cookies. Cookies are small pieces of data the marketing industry uses to deliver low-cost personalised marketing, allowing advertisers to describe their target customers with unprecedented accuracy. In Australia, cookie banners do not appear to be mandatory.
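To make the mechanism concrete, here is a minimal sketch, using Python’s standard library, of how a tracking cookie works at the HTTP level. The `uid` cookie name and the `ads.example.com` domain are hypothetical examples, not any real ad network’s values.

```python
# Minimal sketch of a third-party tracking cookie at the HTTP level.
# The "uid" name and "ads.example.com" domain are invented for illustration.
from http.cookies import SimpleCookie
import uuid

# On a user's first visit, the ad server responds with a Set-Cookie header
# carrying a unique identifier for that browser:
cookie = SimpleCookie()
cookie["uid"] = str(uuid.uuid4())
cookie["uid"]["domain"] = "ads.example.com"    # third-party ad domain
cookie["uid"]["max-age"] = 60 * 60 * 24 * 365  # persists for a year
set_cookie_header = cookie.output(header="Set-Cookie:")

# On every later request to ads.example.com (e.g. whenever an ad loads on
# any site that embeds it), the browser sends the identifier back, letting
# the ad server link those page views into a single browsing profile:
request_header = cookie.output(attrs=[], header="Cookie:")

print(set_cookie_header)
print(request_header)
```

Because the cookie is scoped to the ad server’s own domain rather than the site being visited, the same identifier comes back from every site that embeds that server’s ads, which is what makes cross-site profiling possible.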

Dr. Kemp points out in one of her articles that the EU has more stringent privacy regulations than Australia, and that platforms offer EU users more privacy protections and default settings than Australian users (Kemp, 2023).

In Australia, the collection, use and disclosure of user data and personal information are primarily regulated by the Privacy Act. Privacy protections promote healthy competition, technological innovation, and the empowerment of consumers in the digital marketplace.

However, Australia’s existing regulatory framework does not effectively address data practices shaped by asymmetric information and unequal bargaining power between digital platforms and consumers.

The recently released Privacy Act Review Report proposes an expanded definition of “personal information” that explicitly includes the various technologies and online identifiers used to track and profile consumers.

Dr. Kemp argues that the report’s recommendations on targeted advertising do not fully address the power imbalance between companies and consumers. At present, companies have largely accepted sacrificing consumer privacy to meet the demands of the online targeted advertising business (Kemp, 2023).

Lawmakers have expressed a desire to bring national laws closer to Europe’s GDPR, but this would require significant changes, as the two regimes currently differ in many areas, including user consent.

The conflict between Meta’s behavioural ads and the EU’s GDPR

Earlier this year, Meta was fined $410 million by EU regulators for non-consensual behavioural ad targeting and ordered to resolve the GDPR violations within three months.

The Data Privacy Foundation (DPS), a Dutch privacy advocacy group, has separately argued that the tech giant breached EU data protection rules and engaged in unfair commercial practices by failing to obtain users’ permission to process their data for commercial purposes, including advertising targeting.

This is not the first complaint against Meta’s tracking and targeting business model.

The complaints, which date back to May 2018, concern what critics call “forced consent”: processing users’ personal data to keep tracking and targeting them by building profiles for behavioural ads.

In November 2022, human rights campaigner Tanya O’Carroll filed a lawsuit against Meta in the UK High Court, claiming that Facebook ignored her right to object to the collection of her personal data and its use to profile her and show her targeted ads.

Tanya O’Carroll said Facebook was violating Article 21 of the GDPR, which states that users “have the right to object at any time to processing of personal data for such marketing”.

Through the case, O’Carroll is seeking to force Meta to stop processing her data and analysing her activity for direct marketing purposes.

Luminate, the foundation funding O’Carroll’s case, believes that success could set a precedent for millions of search engine and social media users in the UK, EU and beyond who are forced to accept intrusive surveillance and analytics as part of the online experience (Harries, 2022).

According to noyb, a privacy rights not-for-profit, Facebook users have no option to refuse the processing of their data for ads, even though the GDPR states that where consent is the legal basis for processing personal data, it must be specific, informed and freely given.

It later emerged that, with the implementation of the GDPR, Meta sought to sidestep consent complaints against its surveillance-based business model by shifting from claiming it had obtained users’ consent to process their data to claiming users had contracted with it to receive personalised ads (Lomas, 2022).

The recent ruling means Meta can no longer rely on “contractual necessity” to run behavioural ads; instead, it must seek users’ consent.

The situation needs to change

In recent years, a steady stream of policy reports has reflected a growing pushback against data practices that violate consumer privacy.

As consumers, we are witnessing digital media companies vying for control of our personal information in the new media environment. We want platform developers to ensure that their communication with third-party service providers (such as ad networks and data brokers) is clear, and that both parties understand the reasons and rules for third-party data collection, so they can in turn give us accurate information.

In privacy settings, consumers would prefer simple defaults: no cookies and no tracking without explicit consent.

Platforms also need to offer incentives for opting out of behavioural ads and make clear the full range of privacy options consumers have.


ACCC. (2019). Digital Platforms Inquiry Final Report.

ACCC. (2021). Digital advertising services inquiry Final report.

Dillet, R. (2022, April 21). Google to update cookie consent banner in Europe. TechCrunch.

Dwyer, T. (2014). Evolving concepts of personal privacy: Locative media in online mobile spaces. In Locative media (pp. 121-135). Routledge.

Goggin, G., Vromen, A., Weatherall, K., Martin, F., Webb, A., Sunman, L., & Bailo, F. (2017). Digital Rights in Australia.

Kemp, K. (2021). The Absence of Competition in the Privacy Terms of Online Marketplaces. SSRN Electronic Journal.

Kemp, K. (2023, February 20). Proposed privacy reforms could help Australia play catch-up with other nations. But they fail to tackle targeted ads. The Conversation.

Lomas, N. (2022, November 21). Meta’s surveillance biz model targeted in UK “right to object” GDPR lawsuit. TechCrunch.

New research finds discriminatory personalised pricing is a global problem. (2022, February 07). Euroconsumers.

Statista. (2022). Digital Advertising – Worldwide | Statista Market Forecast.
