‘Going Incognito’ to protect your online privacy? Think again.

Google's recent lawsuit highlights a challenge for true digital privacy.

Image of a person using Google's search engine

“Just Google it.”

It’s become the phrase we hear or say whenever we need to find any kind of information. For many people, Google’s search engine is the gateway to the rest of the World Wide Web. And as digital spaces bleed into the real world, real-world issues such as privacy rights have been bleeding back into digital spaces.

Recently, Google agreed to delete billions of user records containing personal information collected through ‘Incognito Mode’ as part of a lawsuit settlement. The suit, filed in 2020, accused Google of tracking Google Chrome users even after they opened Incognito Mode windows, which lawyers argued “give Google and its employees power to learn intimate details about individuals’ lives, interests, and internet usage.” It also accused the company of creating “an unaccountable trove of information so detailed and expansive that George Orwell could never have dreamed it,” which it called “one stop shopping” for governments or hackers wishing to undermine a person’s individual freedoms (Folk, 2024). Attorney David Boies, representing the users, said in a statement to Ars Technica: “This settlement is an historic step in requiring honesty and accountability from dominant technology companies” (Belanger, 2024).

“This settlement is an historic step in requiring honesty and accountability from dominant technology companies.”

~Attorney David Boies

Google maintains that the private browsing solution was never meant to completely hide your identity while using the internet and there is no evidence that the private data was being sold externally. Rather, the data was only being used internally to help target information and ads back to us, for a better online experience.

So then why is this case a concern for our digital privacy?

Introducing… Incognito Mode

A screenshot of Google's Incognito Mode

Google first introduced Incognito Mode as a form of ‘private browsing’ in 2008, following a similar Private Browsing feature that Apple’s Safari launched in 2005. The trend was a response to growing user awareness of online data tracking and privacy issues in the ever-expanding digital world.

Digital privacy was becoming a public issue.

Large digital platforms are involved in many matters of public concern. Consider how you go to your social media profile to get the latest on social or cultural trends. Or how the top result a search algorithm serves you is generally the first link you click. Or how many of your personal preferences you share with subscription services in the interest of convenience. For a more extreme example, consider how much of what you know about politics or newsworthy stories comes from something you read online.

Some of the largest private digital companies are functioning as parts of our public infrastructure (Flew, 2021).

Digital privacy as an emerging right

There has been an increase in the volume of transactions we make online, from the number of times we like, share or comment on a post, to the times we search for any kind of information on the Internet.

We use the internet freely to access a number of services on the daily, but does a ‘free internet’ exist? Or are we paying for these ‘free’ services with our privacy?

Before we define it, it’s important to remember that privacy is not an inherent human right but rather one grounded in social and legal contexts, and the concept of privacy can differ between cultures. According to Rengel (2013), privacy involves: the right to be left alone, the ability to protect oneself from unwanted access, the right to secrecy, control over personal information, protection of one’s personality, individuality, and dignity, and control over one’s own relationships.

Two categories of personally identifiable information (PII) are considered relevant to the digital world.

The distinction between these two types is still blurry – which usually works in favour of digital platforms – but we can consider them as:

Sensitive PII: Information that can be used to confirm your specific identity. For example, your full name, address, driver’s license number, or credit card details.

Non-sensitive PII: Anything else that doesn’t identify you on its own. For example, your race, gender, or behavioural statistics like how many Taylor Swift videos you watch and like.
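To make the distinction concrete, here is a minimal sketch in Python. The field names and category assignments are illustrative assumptions based on the examples above, not a legal standard:

```python
# Illustrative only: a toy classification of PII fields into the two
# categories described above. Real-world classification is
# context-dependent and, as noted, legally blurry.
SENSITIVE_PII = {"full_name", "address", "drivers_license", "credit_card"}
NON_SENSITIVE_PII = {"race", "gender", "videos_watched"}

def classify_pii(field: str) -> str:
    """Return the category of a PII field name."""
    if field in SENSITIVE_PII:
        return "sensitive"
    if field in NON_SENSITIVE_PII:
        return "non-sensitive"
    return "contested"  # the blurry middle ground the article mentions
```

Notice that the “contested” bucket is where most of the argument happens: platforms and regulators rarely disagree about credit card numbers, but they do about everything in between.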

Privacy as our choice

But the right to privacy can often come at the cost of broader social benefits and expectations. One way personal privacy competes with those benefits is the right of police and security agencies to use personal information to maintain public order. In many countries, personal digital data can be given to police upon request for investigations.

A case of privacy vs community health in Australia, 2020:

In an effort to improve contact tracing during COVID-19, the Australian Government asked citizens to download an app which would track their location and record who they had interacted with. This encroaches on our rights to control our own space and our relationships, but the aim was to protect the greater good. To maintain transparency and build trust in the app as public infrastructure, the government made downloads voluntary and released a Privacy Impact Assessment detailing that data would not be shared until a positive COVID result had been registered.

The Privacy Paradox

Should we give up privacy for the greater good?

What do we get out of it?

Do you actually care?

Here’s where things get interesting—the privacy paradox. Francis and Francis (2017) noted “although people say they care about privacy, they behave as if they did not”.

We’re all concerned about our digital privacy, yet we keep using the Internet without fully understanding what data is being shared or making conscious choices for ourselves.

Now, some people are more concerned with their digital presence than others, and academics have studied these attitudes, grouping them into three main perspectives (Westin, 2013): privacy fundamentalists, who deeply distrust organisations asking for their data and favour privacy protections over consumer benefits; privacy pragmatists, who weigh the value of what’s on offer against the data being requested; and the privacy unconcerned, who are largely untroubled by data collection.

If privacy rights are about the owner having control over their own information, then in the digital world, whether you are bothered or not, you have to admit there is a significant lack of control and transparency over our data. Often, we aren’t making informed decisions about our privacy, and there is an imbalance in control between us and the digital platforms we engage with.

In an age where identity threats come in the form of cyber-attacks, the PII these large digital platforms store (in the interest of showing us more relevant and targeted information and ads) could land in the hands of nefarious cyber criminals to exploit. Choosing to trust a digital platform with our personal information is therefore a decision that affects our real-life identities.

Giving back the choice  

So maybe if digital platforms gave us back the choice to share our private details, and helped us understand what types of PII are being shared, we would take huge steps towards honest digital privacy rights.

We’ve seen platforms start to implement terms of service agreements and privacy policies that are meant to be more transparent about how your information is used, and some apps will even ‘give you control’ through privacy settings that let you turn some things off.

But these aren’t enough.

The Australian Competition and Consumer Commission acknowledged that this false sense of control doesn’t solve the root issues (Suzor, 2019). First, there is an inherent information asymmetry between the providers and you. Second, policies and settings are so confusing and filled with legalese that few of us can actually understand them, so we blindly click ‘agree’ – which means we aren’t actually providing free and informed consent. Finally, there is the blurring of which information is actually critical to protect – there are some examples of PII earlier, but companies could even argue that your last name by itself doesn’t actually expose you to harm.

The issue with “Going Incognito” to protect our privacy

Private browsing seemingly offers control over our digital lives.

Giving users the option to browse the internet ‘privately’ puts the control of privacy back in our hands, and could theoretically help ‘give back’ the right to privacy.

Incognito Mode sounds like the dream.

But a browser’s incognito mode does not make you universally hidden. It only ensures your activity isn’t logged in the browser itself or shared with other user profiles on the same device. The case found that data collected while users were in Incognito Mode was linked to ‘unique identifiers’ and kept internally for Google’s ad targeting – the revenue that funds free services such as Gmail and Google Drive.
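To see why incognito can’t hide you from the sites you visit, consider what any web server records on an ordinary request. The sketch below is a hypothetical access-log formatter (not Google’s code): the connection metadata reaches the server whether or not the browser is in private mode.

```python
def access_log_line(ip: str, method: str, path: str, user_agent: str) -> str:
    """Format a typical web-server access-log entry.

    Hypothetical example: this metadata reaches the server regardless
    of Incognito Mode -- private browsing only stops the browser itself
    from keeping history, cookies, and form data on your device.
    """
    return f'{ip} "{method} {path}" "{user_agent}"'

# A normal visit and an incognito visit produce identical log entries:
entry = access_log_line("203.0.113.7", "GET", "/search?q=privacy", "Chrome/123")
```

The privacy gain is local to your machine; the server side of the conversation is unchanged.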

This comes back to the dilemma of a ‘free internet’.

It’s understandable that Google, as a company, must make money somehow to continue providing free services, and we probably know that this happens through targeted advertising based on our searches in Google’s normal mode. The case didn’t find any evidence that Google was selling the Incognito Mode information externally (a small win); rather, the data was being used internally to serve more useful and relevant ads.

The lawsuit argued that Google did not do enough to help most users understand that this tool was not actually the idyllic form of private browsing.

Therefore, Google was not helping give the right to digital privacy back to users. Whether or not the browser mode tracked data, Google should have been more transparent, letting all users understand the choice they were making to maintain control of their privacy.

Forrester senior analyst Stephanie Liu told Forbes that “the heart of this lawsuit was about Incognito Mode’s claim of ‘now you can browse privately.’” Users were not expecting Google to capture data during what they thought were private browsing sessions. Liu concluded: “The rise of privacy-oriented class action lawsuits and complaints shows consumers are increasingly privacy savvy and taking action. Transparency is critical — companies have to explain how data is shared and used” (Winder, 2024).

“The heart of this lawsuit was about Incognito Mode’s claim of ‘now you can browse privately… The rise of privacy-oriented class action lawsuits and complaints shows consumers are increasingly privacy savvy and taking action. Transparency is critical — companies have to explain how data is shared and used”

~ Forrester senior analyst Stephanie Liu to Forbes

The reality of digital anonymity

So, is Incognito Mode the answer to taking back control of our privacy? Not quite. The real issue? Platforms need to step up and educate us about what their tools can—and can’t—do.

Google’s case raises critical questions about the responsibility to educate users about their products’ privacy capabilities. Mozilla sets an example by prioritizing user privacy and disclosing their data practices and delivering transparent privacy analysis and guidance through the Mozilla Foundation (learn more about the Mozilla Foundation).

As part of the settlement, Google has made some changes in an attempt to build trust and transparency and give control of personal privacy back to its owners, the users: deleting private browsing data, including redacting IP addresses and removing ways of linking users to the data; blocking third-party cookies within Incognito Mode for at least five years; and updating its instructions and guidance.
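“Generalizing” an IP address usually means truncating it so it identifies a network region rather than a single device. Whether Google’s settlement uses exactly this method or these prefix lengths is an assumption; the sketch below shows the common approach of zeroing the low-order bits:

```python
import ipaddress

def generalize_ip(ip: str, v4_prefix: int = 24, v6_prefix: int = 48) -> str:
    """Zero out the low-order bits of an IP address.

    A standard anonymization technique: the result points to a network
    region, not a device. The /24 and /48 prefix lengths are common
    choices, not a claim about Google's actual implementation.
    """
    addr = ipaddress.ip_address(ip)
    prefix = v4_prefix if addr.version == 4 else v6_prefix
    network = ipaddress.ip_network(f"{ip}/{prefix}", strict=False)
    return str(network.network_address)
```

For example, `generalize_ip("203.0.113.54")` yields `"203.0.113.0"` – enough to place a user in a rough network neighbourhood without pinpointing their device.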

Google updated its Incognito Mode guidance as part of the lawsuit settlement.

What can we take away from this case?

Online privacy is complicated, even with alternatives to private browsing.

You may have heard of a Virtual Private Network (“VPN”). A VPN can run interference for your IP address, making it harder for sites to track you as an individual. But VPNs raise their own security and privacy concerns – many require subscriptions and payment, and if you pick a cheap, less reputable provider that hasn’t been carefully vetted, this attempt at protection could end up exposing your data to untrustworthy operators.

The Google Incognito Mode saga is a wake-up call for all of us. We need more accountability, clearer rules, and a little more honesty. Our online privacy shouldn’t just be a nice-to-have that we resign ourselves to giving up; it’s a choice to be made based on transparency, understanding and trust.

And it’s about time that choice was ours again.


Australian Competition and Consumer Commission. (2018). ACCC Digital Platforms Inquiry: Preliminary Report. Canberra: ACCC.

Belanger, A. (2024) Google agrees to delete incognito data despite prior claim that’s ‘impossible’, Ars Technica. Available at: https://arstechnica.com/tech-policy/2024/04/google-agrees-to-delete-private-browsing-data-to-settle-incognito-mode-lawsuit/ (Accessed: 07 April 2024).

Flew, T. (2021) ‘Issues of Concern’, in Regulating Platforms. Cambridge, UK: Polity, pp. 72–79.

Folk, Z. (2024) Google will destroy ‘incognito mode’ browsing data: Here’s what that means for users, Forbes. Available at: https://www.forbes.com/sites/zacharyfolk/2024/04/01/google-will-destroy-incognito-mode-browsing-data-heres-what-that-means-for-users/ (Accessed: 07 April 2024).

Francis, L. P., and Francis, J. G. (2017). Privacy: What Everyone Needs to Know. Oxford: Oxford University Press.

Google (2024) How chrome incognito keeps your browsing private, Google Chrome Help. Available at: https://support.google.com/chrome/answer/9845881#zippy=%2Chow-incognito-mode-works%2Chow-incognito-mode-protects-your-privacy%2Cyoure-in-control (Accessed: 07 April 2024).

Mozilla (no date) Who we are, Mozilla Foundation. Available at: https://foundation.mozilla.org/en/who-we-are/ (Accessed: 14 April 2024).

Rengel, A. (2013). Privacy in the 21st Century. The Hague: Martinus Nijhoff Publishers.

Suzor, N. (2019). Lawless: The Secret Rules That Govern Our Digital Lives. Cambridge: Cambridge University Press.

Westin, A. (2013). Social and Political dimensions of privacy. Journal of Social Issues, 57(2), 431–53.

Winder, D. (2024) Google must delete over 100 billion Chrome Private Browsing Records, Forbes. Available at: https://www.forbes.com/sites/daveywinder/2024/04/02/google-chrome-privacy-over-100-billion-browsing-records-to-be-deleted/ (Accessed: 07 April 2024).
