Data and Algorithmic Differences Between Google and DuckDuckGo
Jinyi (Dorothy) Liu
Google, as a search engine with an enormous user base, unsettles me with its systems for collecting and processing data and with how precisely it targets and personalises user needs and experiences. The emergence of a newer search engine, DuckDuckGo, has eased my anxiety to some extent. Although both are search engines, they differ significantly in their search algorithms, data collection, privacy protection, and advertising strategies. These differences also reflect their different impacts on Internet culture and governance, for example on user privacy, personalised experiences, and information diversity.
I will analyse these differences by comparing the two engines' search algorithms, data collection methods, privacy protection mechanisms, and advertising strategies. The analysis draws on the directions covered in the unit material, offers a critical understanding of their impact on Internet culture and governance, and examines the key governance issues that emerge from them.
Significant Differences in Search Algorithms
The search algorithm used by DuckDuckGo is more concerned with user privacy and information diversity: it does not keep track of users' search histories or personal information, and it does not present users with personalised ads based on such information. (Hollingsworth, 2021) Google's search algorithm, in contrast, places a greater emphasis on personalisation: in order to deliver more relevant search results and advertisements, it keeps track of users' search history, location, language, and interests.
As a fundamental component of algorithmic reality construction, personalisation is, in essence, carried out on the basis of user characteristics, behaviour, and location. It increases individualisation in societies, creating individuals who are both dangerous and endangered: endangered in the sense of being more controlled, with less privacy and freedom; dangerous in the sense of fragmentation, fewer chance encounters and shared experiences, and decreasing social cohesion. At the same time, the technology that creates these problems also provides solutions to them. (Just & Latzer, 2016)
When you use Google's search engine, for example, the results it returns will give preference to Google's own products and to websites that work with Google in some capacity, whether for advertising or some other purpose.
After you have described what you need, the algorithm will help you locate the best match. In my opinion, the algorithm draws on your search preferences and habits, recorded in your search history, to organise the information it predicts you most want to see. Accomplishing this requires collecting and analysing data, topics I cover in the paragraphs that follow. DuckDuckGo differs here: it is probably the more objective and unbiased of the two, because it only delivers the information; how we use it once we receive it is up to us.
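The history-based personalisation described above can be pictured as a simple re-ranking step. The sketch below is a deliberately minimal illustration, not Google's actual system; the function name, data shapes, and scoring rule are all my own assumptions.

```python
from collections import Counter

def personalised_rank(results, history):
    """Re-rank candidate results by overlap with past search terms.

    `results` is a list of (url, text) pairs; `history` is a list of
    the user's past query strings. Both are illustrative stand-ins.
    """
    # Count how often each term appears in the user's past queries.
    term_weights = Counter(t for q in history for t in q.lower().split())

    def score(result):
        _, text = result
        # A result scores higher the more it overlaps with past interests.
        return sum(term_weights[t] for t in text.lower().split())

    return sorted(results, key=score, reverse=True)

results = [("a.com", "best hiking boots review"),
           ("b.com", "python sorting tutorial")]
history = ["python list sort", "python tutorial"]
ranked = personalised_rank(results, history)
```

Given a history of Python-related queries, the Python tutorial is promoted ahead of the unrelated result, which is the basic logic of personalisation: past behaviour silently reorders what you see next.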
Figure 1. Input–throughput–output model of algorithmic selection on the Internet.
Source: Latzer et al. (2014).
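Latzer et al.'s input–throughput–output model can be read as a small pipeline: a query and collected user signals go in, an algorithmic selection stage scores and filters candidates, and an ordered subset comes out. The following toy sketch is my own paraphrase of that model; every name and the scoring rule are illustrative assumptions.

```python
def algorithmic_selection(query, items, user_signals):
    """Toy input-throughput-output pipeline of algorithmic selection.

    Input: a query plus user signals (here just a location);
    throughput: score and filter the candidate items;
    output: the selected items, best match first.
    """
    def score(item):
        # Count query words appearing in the item's text.
        s = sum(w in item["text"].lower() for w in query.lower().split())
        # Boost items matching the user's inferred location.
        if user_signals.get("location") == item.get("region"):
            s += 1
        return s

    scored = [(score(i), i) for i in items]
    # Output stage: keep only matching items, ordered by score.
    return [i for s, i in sorted(scored, key=lambda p: -p[0]) if s > 0]

items = [{"text": "cafe in Sydney", "region": "AU"},
         {"text": "cafe in Berlin", "region": "DE"}]
selected = algorithmic_selection("cafe", items, {"location": "AU"})
```

The point of the model is that the "throughput" stage is invisible to the user: two people issuing the same query receive differently ordered outputs because their input signals differ.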
Data Collection and Privacy Protection
Google's advantage is that, after gathering a large amount of user data, it can analyse it and present content better suited to your usage habits. This is also a disadvantage: it is precisely this data analysis that lets Google determine which advertising to target you with in order to meet its marketing objectives.
Although I occasionally find it convenient, Google's results pages are also more disorganised, owing to the abundance of content and formats displayed, including, but not limited to, advertisements, images, videos, and articles.
By contrast, DuckDuckGo is committed to safeguarding user privacy: it does not collect data on users' search histories or other personally identifiable information, and it does not show users personalised search results or advertisements.
Because it does not have to process personalised search results and ads, its searches can be faster than Google's. The drawback is that its revenue model is one-dimensional.
Former Google CEO Eric Schmidt once said, in response to a question about the company's privacy practices, "Google policy is to get right up to the creepy line and not cross it."
He and other influential figures from Silicon Valley are probably better described as not wanting to cross the creepy line. They will feel emboldened to test out creepier, more intrusive, and even predatory practices so long as secrecy can be used to weaken market competition and law enforcement. (Pasquale, 2015)
DuckDuckGo, which calls itself "the search engine that doesn't track you", handles about 1.5 billion searches each month; Google, by comparison, processes about 3.5 billion searches daily. Despite this lopsided competition, DuckDuckGo is expanding: in 2012 it averaged only 45 million searches per month.
Although DuckDuckGo operates in a parallel universe to Google, the actual differences between the search results you see aren't that great. In many ways, DuckDuckGo is actually superior: its results aren't overrun with boxes and carousels promoting Google's family of apps and trying to entice users to spend more time there. (Nast, 2019)
These two search engines show that people need data and algorithms and must think about how to use them. Some see the appearance of DuckDuckGo as a reminder of how data collection and privacy leaks can affect our search results and how we use them. Google, for its part, is easy to use, has many features, and adapts to how people use it. Both engines, then, are built on algorithms that meet the needs of different types of users. This always makes me think of The Atlas of AI.
When people first started designing algorithms, the goal was to create software that would make life easier by letting computers, rather than people, perform complicated and laborious information processing tasks, saving time and increasing productivity.
Even though the accumulation of ever more complex data creates more opportunities for searching, the real question is whether it has progressed in the way humans anticipated.
Philosophy professor Hubert Dreyfus pushed back, worrying that the engineers in the room "do not even consider the possibility that the brain might process information in a totally different way than a computer." In his later book, What Computers Can't Do, Dreyfus pointed out that human intelligence and expertise depend on many unconscious and subconscious processes, while computers need all processes and data to be explicit and formalised. Computers must therefore simplify, remove, or guess at the less formal aspects of intelligence, which means they cannot process situational information the way humans do. (Crawford, 2021)
I think that advertising is more than just a way to make money; it is also a way to spread information. I would therefore like to use a telling experiment as an example of how this works. On October 21, 2013, the advertising agency Memac Ogilvy & Mather Dubai launched a campaign for the UN called "The Real Google Search." In this experiment, the suggested search terms shown over images of women of colour were full of discriminatory phrases such as:
- “Women cannot: drive; be bishops; be trusted; speak in church.”
- “Women should not: have rights, vote, work, or box.”
- “Women should stay at home, be slaves, be in the kitchen, not speak in church.”
- “Women need to: be put in their places, know their places, be controlled, be disciplined”, and so on.
Even though the campaign used Google search results to show how people feel about women, it may have also shown by accident how powerful search engine results are. This campaign showed that search is a reflection of what people think and that society still has different ideas about women based on their gender. (Noble, 2018)
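The mechanism behind such suggestions can be approximated as frequency-ranked completion over a log of past queries: if the log is skewed, the suggestions inherit that skew. The sketch below is a toy approximation with neutral example data, not Google's actual autocomplete system; the function and data names are my own.

```python
from collections import Counter

def suggest(prefix, query_log, k=3):
    """Return the k most frequent past queries starting with `prefix`.

    Autocomplete trained on raw logs simply mirrors what users typed
    most often -- which is how biased inputs become biased suggestions.
    """
    matches = Counter(q for q in query_log if q.startswith(prefix))
    return [q for q, _ in matches.most_common(k)]

log = ["weather today", "weather tomorrow", "weather today",
       "web design", "weather today"]
suggestions = suggest("weather", log)
```

Here "weather today" ranks first purely because it was typed most often. Nothing in the mechanism checks whether frequent completions are fair or true, which is exactly the problem the UN campaign exposed.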
The content that appeared in this experiment was disappointing, but it also showed that "algorithms are neither intelligent nor capable." (Crawford, 2021) Fortunately, people eventually discovered the problem and found ways to improve it.
At the same time, the video makes it easy to see that although our lives enjoy great convenience built on data and algorithms, we have also, to a certain extent, handed ourselves over to machines for management, surrounded by algorithmic processes of "personalised design", "rational analysis", and "interest maximisation".
Although this solves some problems well, people's perception and judgement of things are also subtly changed under its influence. The advantage, of course, is better collection of people's search preferences, which makes it possible to create more attractive products and earn more money.
In this context, I do not think it is necessary to discuss at length the advertising and profit models employed by companies like Google. It must be acknowledged that, even though its advertising business model may be open to criticism, the company has accomplished a great deal and reached maturity in this area.
DuckDuckGo, too, markets itself through a number of specialised channels in order to find its ideal users. This is because the network culture that genuinely shapes people's actions and thoughts has already become a distinct system, and we are part of it. Rather than calling this internet governance, it may be more accurate to say that people are engaging in self-redemption.
“The Google Search index contains hundreds of billions of webpages and is well over 100,000,000 gigabytes in size,” Google says.
Concerns about the reshaping of human behaviour under the growing influence of algorithms keep mounting. As Flew writes in Regulating Platforms: "Many questions have been raised about the reshaping of human behaviour under the growing influence of the human–algorithmic interface. One example is the debate about filter bubbles. Does the machine learning that results from the use of online sites drive machines towards the kind of people and content that reinforce existing beliefs and away from the kind of people and content that diverge from existing beliefs (Bruns, 2019; Dubois and Blank, 2018; Pariser, 2012)?" (Flew, 2021)
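The filter-bubble dynamic in the debate Flew cites can be caricatured as a feedback loop: clicking items of one kind raises the weight of that kind, which makes similar items more likely to be shown next. The sketch below is a deliberate oversimplification of a recommender, with invented names and data, meant only to make the loop concrete.

```python
from collections import Counter

def recommend(items, clicks):
    """Rank items by how often the user clicked items with the same topic.

    Each recommendation reinforces the topics already clicked --
    a toy version of the filter-bubble feedback loop.
    """
    topic_weight = Counter(click["topic"] for click in clicks)
    return sorted(items, key=lambda i: topic_weight[i["topic"]],
                  reverse=True)

items = [{"id": 1, "topic": "sports"}, {"id": 2, "topic": "politics"}]
clicks = [{"topic": "politics"}, {"topic": "politics"}]
top = recommend(items, clicks)[0]
```

After two politics clicks, politics content leads the ranking; if the user then clicks it, its weight rises again. The loop narrows exposure without any single step looking objectionable, which is why the filter-bubble question is so hard to settle empirically.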
In my opinion, the most significant aspect of internet culture with regard to internet governance is that algorithms determine both our routines and the manner in which we take in information. This should give everyone serious pause.
Humans devise algorithms; computers carry them out. Algorithms can learn and adapt over time, automatically evaluate works of art, music, films, and even laws, manage national security, and do a great deal more.
How can we, as humans, make use of the convenience that data and algorithms provide while ensuring they do not replace us? In the times we live in, I think this is the most important problem we need to solve.
Andrejevic, M. (2020). The bias of automation. In Automated media (pp. 25–43). New York: Routledge. https://doi.org/10.4324/9780429242595-2
Crawford, K. (2021). Introduction. In The atlas of AI: Power, politics, and the planetary costs of artificial intelligence (pp. 1–21). Yale University Press. https://doi.org/10.2307/j.ctv1ghv45t.3
Flew, T. (2021). Issues of concern. In Regulating platforms (pp. 104–110). Cambridge: Polity.
Hollingsworth, S. (2021, May 21). DuckDuckGo vs. Google: An in-depth search engine comparison. Search Engine Journal. https://www.searchenginejournal.com/google-vs-duckduckgo/301997/
Just, N., & Latzer, M. (2016). Governance by algorithms: Reality construction by algorithmic selection on the internet. Media, Culture & Society, 39(2), 238–258. https://doi.org/10.1177/0163443716643157
Nast, C. (2019, November 24). I ditched Google for DuckDuckGo. Here's why you should too. Wired UK. Retrieved April 6, 2023, from https://www.wired.co.uk/article/duckduckgo-google-alternative-search-privacy
Noble, S. U. (2018). Algorithms of oppression: How search engines reinforce racism. New York: New York University Press.
Pasquale, F. (2015). The black box society: The secret algorithms that control money and information (pp. 1–18). Cambridge, MA: Harvard University Press. http://www.jstor.org/stable/j.ctt13x0hch.3
Slavin, K. (2020, December 16). How algorithms shape our world [Video]. Retrieved April 6, 2023, from https://www.youtube.com/watch?v=Ix1wBp8dAnk&t=44s
Steiner, C. (2012, October 31). Algorithms are taking over the world: Christopher Steiner at TEDxOrangeCoast [Video]. Retrieved April 6, 2023, from https://www.youtube.com/watch?v=H_aLU-NOdHM