Have you ever wondered how Netflix can so accurately guess which videos you want to watch? For years, the main goal of Netflix’s personalised recommendation system has been to suggest the right content at the right time for each user. With thousands of films and TV shows across every category page on the Netflix website and hundreds of millions of user accounts, recommending the most appropriate videos to each user is obviously a top priority. But personalised algorithms can do more than that. How do algorithms discover users’ preferences? How does Netflix package an unfamiliar video so that it quickly piques users’ interest? This post takes you on a quick tour of the secrets behind Netflix’s personalisation algorithm and uncovers what exactly the algorithm brings to viewers.
- A Quick Background Overview
Since 2020, the COVID-19 pandemic and the global lockdowns that followed have accelerated the film and television industry’s transition towards streaming services, driving a significant rise in global consumption of online film and television content (The Economist, 2020). As streaming services have become the primary way for users around the world to watch video content, each platform has developed its own way of attracting audiences and generating revenue: YouTube relies on targeted advertising, Prime Video is part of Amazon’s e-commerce entity, and Apple TV directs subscribers into a larger media ecosystem (Khoo, 2023). This post looks at the secrets Netflix uses to capture the market among these main competitors.
Netflix is an American media company founded in California in 1997 by Reed Hastings and Marc Randolph. It offers streaming and subscription video-on-demand services and reported annual revenue of US$31.6 billion in FY2022 (Netflix, 2023). This significant revenue was achieved thanks to Netflix’s enormous paying subscriber base: more than 230 million paid subscribers worldwide as of the fourth quarter of 2022, an increase of approximately eight million from the previous quarter (Netflix, 2023).
So what has helped Netflix retain such a large paying subscriber base? That brings us to Netflix’s recommendation algorithm, a key contributor to its revenue-generating model.
- Netflix Recommender Algorithm (NRA) and Its Operational Logics
NRA is the collective term for a range of proprietary computing tools that Netflix has been developing on an ongoing basis since the early 2000s. It is used to identify and recommend movies and TV shows of potential interest to users. The Netflix recommendation algorithm is essentially a machine learning system that combines user behavioural data with movie metadata to make personalised recommendations (Pajkovic, 2022).
Here are a few breakdowns of the different categories of recommendation algorithms that make up the Netflix recommendation system:
- Personalized Video Ranker (PVR): The PVR algorithm runs on the Netflix homepage and sorts the entire video catalogue in a personalised way for each audience profile (Gomez-Uribe & Hunt, 2016). It can also filter certain categories of titles by a particular type of theme (Spandana, 2020).
- Top-N Video Ranker: The objective of this algorithm is to find the best few personalised recommendations for each user across the entire catalogue (Gomez-Uribe & Hunt, 2016). That means it focuses only on the top of the ranking, rather than ordering the whole Netflix catalogue.
- Trending Now: This algorithm is mainly used to identify trends that are popular over a shorter duration or periodically, such as the rise in romantic video watching during Valentine’s Day.
- Continue Watching Video Ranker (CWR): CWR sorts the catalogue based on a prediction of whether users intend to continue or re-watch a video, based primarily on how long they watched and the point at which they stopped (e.g. the opening, middle or ending) (Spandana, 2020).
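To make the ranking idea concrete, here is a toy sketch of how a Continue Watching ranker might work. The titles and the scoring heuristic are invented for illustration; Netflix’s actual CWR model is proprietary. The sketch simply scores partially watched titles by where the viewer stopped, so titles abandoned mid-story bubble to the top:

```python
# Toy "Continue Watching" scorer: a speculative sketch of the idea
# described above, not Netflix's actual (proprietary) model.

def continue_watching_score(watched_seconds, total_seconds):
    """Score a partially watched title for the Continue Watching row.

    Titles abandoned mid-story score highest; barely started or
    nearly finished titles score low.
    """
    fraction = watched_seconds / total_seconds
    if fraction < 0.05:   # barely opened: viewer likely not invested
        return 0.1
    if fraction > 0.95:   # effectively finished: nothing to continue
        return 0.0
    # Interest peaks when the viewer stopped mid-story (~50% through).
    return 1.0 - abs(fraction - 0.5)

# Invented watch history: (title, seconds watched, total seconds).
watch_history = [
    ("Tenet", 3000, 9000),      # stopped a third of the way in
    ("Stowaway", 6800, 6900),   # nearly finished
    ("Inception", 100, 8880),   # barely started
]
ranked = sorted(watch_history,
                key=lambda t: continue_watching_score(t[1], t[2]),
                reverse=True)
print([title for title, _, _ in ranked])
```

A real system would of course learn these weightings from behavioural data rather than hard-code them; the sketch only shows the shape of the signal (watch position) that the prose above describes.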
Simply put, Netflix first manually tags each movie or show it offers with different labels, including the degree of romance or horror, the director, the cast, the plot, and even the moral status of the characters (Pajkovic, 2022). The algorithm then collects and aggregates tags from content you have previously watched and recommends videos with the same tags or combinations of tags. For example, if you’ve watched Interstellar (2014) and Inception (2010), Netflix will likely recommend Tenet (2020), as all three films were directed by Christopher Nolan. Or it could recommend Stowaway (2021), because they are all tagged ‘Sci-Fi Drama’.
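The tag-matching logic above can be sketched as a simple content-based filter. The titles, tags and Jaccard-overlap scoring below are illustrative assumptions, not Netflix’s real metadata or formula:

```python
# Content-based filtering sketch: recommend titles whose tags overlap
# most with the tags of what the user has already watched.

catalogue = {
    "Interstellar":  {"sci-fi", "drama", "christopher-nolan"},
    "Inception":     {"sci-fi", "thriller", "christopher-nolan"},
    "Tenet":         {"sci-fi", "thriller", "christopher-nolan"},
    "Stowaway":      {"sci-fi", "drama"},
    "Love Actually": {"romance", "comedy"},
}

def recommend(watched, top_n=2):
    """Rank unwatched titles by tag overlap with the user's history."""
    # Aggregate every tag the user has been exposed to.
    profile = set().union(*(catalogue[t] for t in watched))
    scores = {
        title: len(tags & profile) / len(tags | profile)  # Jaccard similarity
        for title, tags in catalogue.items()
        if title not in watched
    }
    return sorted(scores, key=scores.get, reverse=True)[:top_n]

print(recommend({"Interstellar", "Inception"}))
```

With the invented tags above, Tenet outranks Stowaway because it shares three tags with the viewing history rather than two, mirroring the Nolan example in the text.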
The algorithm also draws on a collaborative filtering approach (Pajkovic, 2022), which uses behavioural data from other users with similar tastes to make recommendations. Complementing content-based filtering, this approach compares the watching preferences of multiple users (including watching history, ratings, likes, etc.) and finds other users whose viewing patterns and tastes resemble yours. The system then looks for gaps: it recommends movies and shows that your ‘Netflix soulmate’ liked but you haven’t watched before.
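A minimal sketch of this collaborative idea, with invented users and watch histories: find the most similar other user, then recommend the titles they liked that you haven’t seen.

```python
# Collaborative-filtering sketch: find the user with the most similar
# liked-titles set (the "Netflix soulmate" of the text) and recommend
# what they liked that you haven't watched. All data is invented.

likes = {
    "you":   {"Interstellar", "Inception"},
    "alice": {"Interstellar", "Inception", "Tenet", "Stowaway"},
    "bob":   {"Love Actually", "Notting Hill"},
}

def similarity(a, b):
    """Jaccard similarity between two users' liked sets."""
    return len(likes[a] & likes[b]) / len(likes[a] | likes[b])

def recommend_for(user):
    others = [u for u in likes if u != user]
    # Pick the most similar other user...
    soulmate = max(others, key=lambda u: similarity(user, u))
    # ...and recommend their likes that `user` hasn't watched yet.
    return sorted(likes[soulmate] - likes[user])

print(recommend_for("you"))
```

Production systems use far richer signals (implicit feedback, matrix factorisation, neural models), but the gap-filling step is exactly what this toy version shows.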
The rise of big data enables the analysis of data sets that are large in volume, velocity and variety simultaneously, and big data analytics opens up the ability to predict the future online behaviour of individuals and groups (Flew, 2021). As a core feature of Netflix’s operations and brand, the NRA is critical to its business model: recommendations account for approximately 80% of total hours streamed on Netflix (Gomez-Uribe & Hunt, 2016). This is why many scholars point out that big data has triggered a political-economic shift in which data is created, collected and circulated as capital (Sadowski, 2019).
- NRA’s Business Value
Is Netflix’s recommender algorithm really designed for the good of its audiences, or for the company itself? To answer that, it is necessary to look at the business value the NRA offers. Most obviously, the personalisation algorithm enables a new user to find content they are passionate about within seconds, preventing them from abandoning Netflix for other streaming sites or entertainment platforms (Gomez-Uribe & Hunt, 2016). Without personalisation, all users would get the same video recommendations, whereas personalised algorithms boost overall engagement with the platform, such as streaming time.
In addition, personalised recommendation algorithms enable relatively niche videos to find their corresponding audiences, even when those audiences are too small to sustain significant advertising revenue or to command broadcast or cable channel slots (Gomez-Uribe & Hunt, 2016). This has helped Netflix capture these additional niche audiences and keep them on the site. For example, the Korean drama Squid Game (2021) achieved unexpected success on Netflix, demonstrating how a niche Korean-language show can become a global hit through algorithmic ranking (Khoo, 2023).
Platforms lose customers for many reasons. Some churn comes from failed payments, but more often users browse less, or even cancel their subscription, because they cannot find videos they like on the platform. The combined effect of personalised algorithms therefore not only helps Netflix win new subscriptions but also materially reduces subscriber churn; the company estimates that this saves it more than $1 billion per year (Gomez-Uribe & Hunt, 2016).
- Concerns Around Algorithm Bias
Taken together, the NRA seems to be the perfect tool for capturing audiences’ preferences, providing personalised recommendations, and making a significant business contribution to the company. However, many concerns about bias and discrimination have been raised regarding Netflix’s recommendation system. For example, some Black users have reported seeing posters or cover images that foreground Black actors in marginal roles, while the main characters are actually White (Gaw, 2022). Users have criticised the algorithm for reducing their taste to their racial background and for making certain titles disappear from their recommendations. Explaining this phenomenon requires starting with algorithmic bias.
Artificial intelligence relies on machine learning, in which algorithms are trained on data and learn patterns from it. However, the data used to inform decisions is often skewed in its sampling or reflects existing social biases (e.g. crime data is biased by race) (Flew, 2021). Although the big data revolution enabled by digital media is underpinned by core elements such as datafication and dataism (van Dijck, 2014), data is not neutral. Issues of transparency and fairness in both the acquisition and the use of data in algorithm design are reflected in the final operation of the algorithm, which is what we call algorithmic bias (Flew, 2021).
Returning to the example of personalised artwork, there are many more cases of this phenomenon. One user saw thumbnails for Love Actually (2003) featuring the Black British actor Chiwetel Ejiofor (who plays a marginal role), while other users saw the main characters (Spandana, 2020). Netflix’s personalised thumbnail algorithm was originally built on research finding that thumbnails have a strong visual impact and are the main factor influencing users’ decision to watch a film, accounting for 82% of their attention (Spandana, 2020). Yet the algorithm cannot hide the fact that many films lack diversity in terms of race or sexual orientation; worse, algorithmic bias can amplify this into a racialised controversy that triggers user dissatisfaction. Over-prioritising certain racially or otherwise biased titles may also narrow the user’s scope of watching.
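Public descriptions suggest that artwork selection is treated as a learn-from-clicks problem. A standard textbook approach to such problems is an epsilon-greedy multi-armed bandit, sketched below purely as an assumption about how thumbnail personalisation could work; the thumbnail names and click rates are invented, and Netflix’s actual system is proprietary.

```python
import random

# Epsilon-greedy sketch of per-user thumbnail selection: mostly show
# the thumbnail with the best observed click-through rate, but keep
# exploring alternatives a fraction of the time.

class ThumbnailBandit:
    def __init__(self, thumbnails, epsilon=0.1):
        self.epsilon = epsilon
        self.plays = {t: 0 for t in thumbnails}
        self.clicks = {t: 0 for t in thumbnails}

    def choose(self):
        # Explore occasionally; otherwise exploit the best CTR so far.
        # Unplayed thumbnails get an optimistic 1.0 so each is tried.
        if random.random() < self.epsilon:
            return random.choice(list(self.plays))
        return max(self.plays, key=lambda t:
                   self.clicks[t] / self.plays[t] if self.plays[t] else 1.0)

    def record(self, thumbnail, clicked):
        self.plays[thumbnail] += 1
        self.clicks[thumbnail] += int(clicked)

# Simulate viewers who click "lead-closeup" 30% of the time, others 10%.
random.seed(7)
bandit = ThumbnailBandit(["cast-ensemble", "lead-closeup", "scene-still"])
for _ in range(2000):
    choice = bandit.choose()
    rate = 0.3 if choice == "lead-closeup" else 0.1
    bandit.record(choice, random.random() < rate)
print(max(bandit.plays, key=bandit.plays.get))
```

The bias concern discussed above maps directly onto this sketch: whatever artwork attracts clicks gets reinforced, regardless of whether it misrepresents the film or reduces a viewer to an inferred demographic.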
- Deep Personalisation Makes You Socially Isolated?
Another concern about personalisation algorithms is that continuous, deep personalisation can lead to social isolation. The Internet was once an open, publicly accessible space that allowed any user to explore worlds far beyond their geographic constraints. Streaming and social media were initially established to break down walls of information delivery, allowing people to communicate and share without barriers. Paid content first broke this landscape, followed by personalised recommendations.
The attention economy reflects the fact that competition among major platforms in today’s internet and digital sphere is over users’ attention, that is, the time they spend on information or content (Tanner, 2020). As mentioned above, Netflix has achieved considerable business success, retaining a huge subscriber base through its recommendation algorithms; but personalised algorithms are simultaneously narrowing the choices available to users. Netflix claims to be constantly refining its algorithms in order to provide tailored, even customised programming for each user (Scarlata, 2023), and it has succeeded, though a more critical reading is warranted. The algorithm presents each user with an “optimal” catalogue of content (Brincker, 2021); in other words, the algorithm decides what the user will watch.
Entrepreneur and author Eli Pariser foresaw this possibility in 2011 in his book The Filter Bubble (Brincker, 2021), which warned that personalisation algorithms, using users’ online behaviour as a signal, selectively predict and deliver content likely to interest them, potentially creating a form of online segregation (Pariser, 2011). Netflix has even introduced artwork personalisation, engaging users’ interest by personalising movie posters or TV series thumbnails (e.g. selecting a thumbnail featuring the user’s favourite cast member), all of which drives users to submit to Netflix’s algorithms and comply with the decisions made for them.
This is not only a concern about Netflix’s personalised algorithms, but also an ethical question for all algorithms on the Internet: do personalisation algorithms have the right to influence and determine human behaviour, beyond merely predicting and classifying it?
As the streaming wars become increasingly fierce, personalisation algorithms like the NRA will become a key competitive feature within the industry, and the same issues that come with personalisation will be faced by all streamers and the digital media industry. This means that users as consumers of online movie and TV-on-demand services are faced with a semi-automated, algorithmic technology-led consumption model. There are a number of existing issues and challenges that cannot be ignored, and it will be crucial to continue to interrogate the logic behind the operation of personalised algorithms as well as to analyse what impact they have on users and on the Internet culture and governance.
- References

Brincker, M. (2021). Disoriented and Alone in the “Experience Machine” – On Netflix, Shared World Deceptions and the Consequences of Deepening Algorithmic Personalization. Sats (Aarhus), 22(1), 75–96. https://doi.org/10.1515/sats-2021-0005
Brown, S. (2018). Other Black @netflix users: does your queue do this? [Twitter Post]. Retrieved from: https://twitter.com/slb79/status/1052776984231718912?ref_src=twsrc%5Etfw%7Ctwcamp%5Etweetembed%7Ctwterm%5E1052776984231718912%7Ctwgr%5Ec6091c75a81f32da345473f5db5b351fa1809106%7Ctwcon%5Es1_c10&ref_url=https%3A%2F%2Fwww.fastcompany.com%2F90253578%2Fis-netflix-racially-personalizing-artwork-for-its-titles
Flew, T. (2021). Regulating platforms. pp 79-86. Cambridge: Polity Press.
Gaw, F. (2022). Algorithmic logics and the construction of cultural taste of the Netflix Recommender System. Media, Culture & Society, 44(4), 706–725. https://doi.org/10.1177/01634437211053767
Gomez-Uribe, C., & Hunt, N. (2016). The Netflix Recommender System: Algorithms, Business Value, and Innovation. ACM Transactions on Management Information Systems, 6(4), 1–19. https://doi.org/10.1145/2843948
Khoo, O. (2023). Picturing Diversity: Netflix’s Inclusion Strategy and the Netflix Recommender Algorithm (NRA). Television & New Media, 24(3), 281–297. https://doi.org/10.1177/15274764221102864
Netflix. (2023). Number of Netflix paid subscribers worldwide from 1st quarter 2013 to 4th quarter 2022 (in millions) [Graph]. In Statista. Retrieved from: https://www-statista-com.ezproxy.library.sydney.edu.au/statistics/250934/quarterly-number-of-netflix-streaming-subscribers-worldwide/
Netflix. (2023). Netflix’s annual revenue from 2002 to 2022 (in million U.S. dollars) [Graph]. In Statista. Retrieved from: https://www-statista-com.ezproxy.library.sydney.edu.au/statistics/272545/annual-revenue-of-netflix/
Pajkovic, N. (2022). Algorithms and taste-making: Exposing the Netflix Recommender System’s operational logics. Convergence (London, England), 28(1), 214–235. https://doi.org/10.1177/13548565211014464
Pariser, E. (2011). The filter bubble: What the Internet is hiding from you. London: Viking.
Sadowski, J. (2019). When data is capital: Datafication, accumulation, and extraction. Big Data & Society.
Scarlata, A. (2023). “What are people watching in your area?”: Interrogating the role and reliability of the Netflix top 10 feature. Critical Studies in Television, 18(1), 7–23. https://doi.org/10.1177/17496020221127183
Spandana, S. (2020). “Why Am I Seeing This? How Video and E-Commerce Platforms Use Recommendation Systems to Shape User Experiences.” New America. Retrieved from: https://www.newamerica.org/oti/reports/why-am-i-seeing-this/case-study-netflix/
Tanner, S. (2020). Finding value and impact in an attention economy. In Delivering Impact with Digital Resources. Facet Publishing.
The Economist. (2020). Covid-19 is a short-term boon to streaming services. The Economist. Retrieved from: https://www.economist.com/graphic-detail/2020/03/27/covid-19-is-a-short-term-boon-to-streaming-services
van Dijck, J. (2014). Datafication, dataism and dataveillance: Big data between scientific paradigm and ideology. Surveillance & Society, 12(2), 197–208.