Algorithmic culture: TikTok leads vaping culture among young people

"JUUL Labs Vape/Electronic Cigarette Device" by Vaping360 is licensed under CC BY 2.0.

In March, the Queensland government announced an Australian-first inquiry into vaping, amid reports that primary school children were developing vaping addictions (Utting & Lorio, 2023). Over the past few years, the number of e-cigarette users has grown rapidly under the influence of social media, and vaping has even developed into a culture among teenagers. In this process, what roles do the algorithms of social media platforms play? Why can algorithms so easily steer popular culture? What impact will they have on human culture? And how can they be regulated?

What is an Algorithm?

An algorithm can be understood, at its most basic, as a pre-set procedure used for problem-solving. This article focuses on algorithmic selection as used by social media platforms, which refers to a process of assigning contextualized relevance to information elements in a data collection through "an automated, statistical assessment of decentrally generated data signals" (Just & Latzer, 2019). In short, it is a set of procedures that serve information tailored to a user, based on analysis of that user.

In the field of social media, algorithmic selection covers nearly every aspect of the experience, including search, surveillance, content recommendation and production. It has increasingly infiltrated our daily activities and social conventions, further influencing how we perceive reality online and offline by changing the way we interact with others (Klug et al., 2021).

Therefore, as we become more and more dependent on social media, algorithms gain ever greater control over cultural trends, which has prompted concern and calls for the regulation of algorithms from all walks of life.

Social Media and Algorithmic Culture

How do algorithms shape culture?

First, we need to know how algorithms work on social media platforms.

Have you ever noticed that after you view something once, your social media keeps recommending related content, trying to draw you back in?

This is the mechanism underlying algorithmic selection on social media: the platform infers users' interests from their watch history, then sorts content in a way designed to pique and maintain those interests so that users spend more time on the platform (Morales et al., 2022). Take TikTok for example: its algorithms calculate the content users may like based on analysis of their past likes and views, and curate an ongoing stream of user-generated videos on the "For You" page (MacKinnon et al., 2021).

“TikTok” by Solen Feyissa is licensed under CC BY-SA 2.0.

Once a user likes a recommended video, the algorithm keeps recommending similar content and then surfaces that content to other users with similar interests. Through this cycle, algorithms quietly shape countless virtual communities, each of which intimately connects users who share a common bond, spreading or creating a shared culture within its own boundaries.
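To make this loop concrete, here is a minimal, purely illustrative Python sketch; the tag-overlap scoring, the `recommend` helper and the sample data are my own assumptions, not TikTok's actual system:

```python
from collections import Counter

def recommend(watch_history, candidates, k=3):
    """Rank candidate videos by how many tags they share with the user's history."""
    interests = Counter(tag for video in watch_history for tag in video["tags"])
    def score(video):
        # a video scores higher the more its tags match past engagement
        return sum(interests[tag] for tag in video["tags"])
    return sorted(candidates, key=score, reverse=True)[:k]

history = [{"tags": ["vaping", "tricks"]}, {"tags": ["vaping", "music"]}]
candidates = [
    {"id": 1, "tags": ["cooking"]},
    {"id": 2, "tags": ["vaping", "tricks"]},
    {"id": 3, "tags": ["news"]},
]
feed = recommend(history, candidates)
# the vaping video ranks first because it best matches past engagement
```

Every like feeds back into `watch_history`, so the next ranking skews further toward the same tags, which is exactly how the communities described above form.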

As we can see, algorithms have revolutionized the way we receive information, and even culture itself, which has caused considerable concern among scholars.

Why does algorithmic culture cause concern?

First, I will introduce a concept: the filter bubble, coined by Eli Pariser, which refers to a unique universe of automated, tailored information for a particular user (Just & Latzer, 2019). Compared with traditional mass media, this fundamentally transforms the way we receive information: individuals can bypass authoritative intermediaries and directly encounter the information that algorithmic selection customizes for them (Just & Latzer, 2019).

In this environment, customized information is fed one-to-one into the user's device by the algorithm, making the algorithm the arbiter and gatekeeper of culture (Hallinan & Striphas, 2016): it decides whether a culture is spread or generated, and among which population it spreads.

However, unlike humans, an algorithm's first concern is whether the user will like the content it recommends. This creates a tendency toward individualization, in which users are confined to customized bubbles filled only with information that pleases them. The danger is twofold: it narrows people's perspectives and isolates them from the diversity of online content outside, and, because the information received is pleasurable, people are easily immersed in it while having their own ideas reinforced over and over again (Just & Latzer, 2019).
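A toy simulation illustrates how this narrowing plays out over time. Everything here, the topic list, the proportional feed and the engagement boost, is a deliberately simplified assumption for illustration, not a model of any real platform:

```python
TOPICS = ["vaping", "sport", "news", "music", "science"]

def build_feed(weights, feed_size=10):
    """Fill the feed in proportion to each topic's weight (deterministic toy model)."""
    total = sum(weights.values())
    feed = []
    for topic, w in weights.items():
        feed += [topic] * round(feed_size * w / total)
    return feed[:feed_size]

def simulate_bubble(liked="vaping", rounds=10):
    """The user engages with one topic only; the algorithm boosts what was liked."""
    weights = {t: 1.0 for t in TOPICS}
    diversity = []
    for _ in range(rounds):
        feed = build_feed(weights)
        diversity.append(len(set(feed)))          # distinct topics shown this round
        weights[liked] *= 1 + feed.count(liked)   # boost proportional to engagement
    return diversity

print(simulate_bubble())
# the count of distinct topics falls from 5 to 1 as the bubble closes
```

Within a few rounds the liked topic crowds everything else out of the feed, which is the filter-bubble effect in miniature.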

Therefore, when culture is created and spread under this system, it becomes a tool for pleasing people. The purpose of culture's existence becomes the satisfaction of human pleasure rather than profundity and diversity, and cultural transmission shifts to algorithms delivering a specific culture into a specific individual's filter bubble. This is regrettable: as Hallinan and Striphas (2016) argue, in the commercial loop of satisfying users, culture is gradually reimagined as "a sedentary locus" that conforms to its users, rather than a trajectory of human civilization that confronts its users.

Additionally, because algorithmic culture so easily immerses people, especially teenagers who lack mature critical thinking, some unscrupulous companies may use algorithms to mislead users for their own benefit. They may create fake yet appealing advertisements for targeted distribution, or even manufacture a culture in advance to lure users into participating. Yet even as algorithms become major participants in managing the content of social platforms, they remain highly personalized and hard to control, and governing them presents many difficulties, which demands more attention and effort from platforms and governments.

In the following, we will extend the concepts and ideas presented above in the context of a specific case study about vaping on TikTok.

Case Study: Vaping and TikTok

On March 12, the Queensland government announced an Australian-first parliamentary inquiry into vaping, which has received great attention and support from all sectors of the community. The inquiry will focus on teenagers and children, amid reports of primary school children developing vaping addictions (Utting & Lorio, 2023).

Source: Queensland government to launch parliamentary inquiry into vaping | ABC News – YouTube

With the official intervention of the government inquiry, we are finally confronting how deeply e-cigarettes have harmed young people and children. It was reported that a number of students already show serious symptoms of e-cigarette addiction. In addition, although Australian law allows only adults over 18 with a prescription to buy e-cigarettes, underage people can still obtain them from other retailers, and the flourishing online black market for nicotine vapes has put a major obstacle in the way of tobacco control (Tobin, 2022, as cited in Utting & Lorio, 2023).

Let’s take a closer look at the e-cigarette issue. According to the Australian Bureau of Statistics (2022), in 2020-21, 21.7% of people aged 18-24 had tried an e-cigarette or vaping device at least once, which is more than one in five, and 7.6% of 15 to 17-year-old teenagers had tried vaping at least once. The National Drug Strategy Household Survey (2019) shows the proportion of e-cigarette users increased from 8.8% in 2016 to 11.3% in 2019, with young people making up a significant share.

Australian Bureau of Statistics (2022)

During this process, social media has played a notable role in the spread of e-cigarettes. Following JUUL’s legal fiasco in the U.S. in 2018, e-cigarette companies hitched a ride on the “TikTok Express” a few years later.

Although e-cigarettes, like tobacco, cannot legally be advertised, as soon as you open TikTok you can search numerous relevant hashtags such as #vaping, #vapetrick and #nicotine, of which #vaping alone has 2.8 billion views. E-cigarette companies are using TikTok’s algorithms to promote false advertising and cultivate a positive atmosphere, creating a vaping culture among the youth population.

The word ‘vape’ was coined to disassociate the product from tobacco; it was borrowed from the word ‘vapour’, and the devices were falsely claimed to produce a nicotine-free vapour. Marketers even framed vaping as a culture of spontaneous freedom and rebellion, deliberately appealing to young minds.

“E-Cigarette/Electronic Cigarette/E-Cigs/E-Liquid/Vaping/Cloud Chasing” by Vaping360 is licensed under CC BY 2.0.

According to a study of user-generated content about disposable e-cigarettes on TikTok (Morales et al., 2022), most videos portray e-cigarettes in a positive light: they use jokes and stories to increase user interaction; they use TikTok’s editing features to emphasize the colourful packaging of vape devices, set to pleasant music, in order to attract viewers; and there are many promotional videos from offline distributors and show-off videos from users. It can be argued that existing user-generated content on TikTok has glorified e-cigarettes while ignoring their dangers. Once teenagers enter the filter bubble, the algorithm automatically filters out anti-e-cigarette information, placing them in a culture saturated with pro-tobacco content and reinforcing their positive beliefs about e-cigarettes.

The study also highlights the phenomenon of ‘addiction apathy’ (Morales et al., 2022), visible in show-off videos, demonstrating that the filter bubble makes users overly individualistic and can even foster a narcissistic complex that values individualism over social morality (Just & Latzer, 2019). The harm vaping culture has brought to teenagers and children, both physically and psychologically, has been shown to be extremely serious, yet promotions aimed at underage groups remain common on TikTok. Distributor accounts publicly tell underage users where and how to buy e-cigarettes, and many coded trading hashtags have been created specifically for underage people in order to circumvent regulation. These are strong evidence of how loose the control over algorithms is and how urgently they need to be regulated, and they also reflect the difficulties of governing algorithms.

Main Difficulties of Governing Algorithms


Personalization

Personalization, as a constitutive feature underlying algorithmic selection, diminishes individuals’ ability to receive diverse content online and deepens individualism and personal bias among users, leading to a decrease in social cohesion (Just & Latzer, 2019). Furthermore, it is hard for individuals to escape indulging in a customised information environment. Governments could regulate the extent of algorithmic personalized recommendation on social media, for example by tightening control over platforms’ mining of personal information and by requiring more diverse content in algorithmic systems.
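As a sketch of what such a rule might look like in practice, here is a hypothetical Python helper; the `diversity_ratio` quota and all names are my own illustration, not a real policy or platform API:

```python
def diversified_feed(personalised, general_pool, diversity_ratio=0.3):
    """Reserve a fixed share of each feed for content outside the user's bubble."""
    n_outside = int(len(personalised) * diversity_ratio)
    outside = [item for item in general_pool if item not in personalised]
    # keep the top personalised items, then inject content from outside the bubble
    return personalised[:len(personalised) - n_outside] + outside[:n_outside]

feed = diversified_feed(
    personalised=["vape clip 1", "vape clip 2", "vape clip 3", "vape clip 4"],
    general_pool=["health news", "sport highlights", "vape clip 1"],
)
# feed keeps three personalised items and injects one item from outside the bubble
```

A regulator could audit such a quota directly, since it is a measurable property of the delivered feed rather than of the opaque ranking model itself.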


Opacity

The operation of algorithms is highly opaque to the general public: for example, the extent to which media companies mine users’ personal information and how deeply algorithms are involved in commercialization practices. This requires government intervention, particularly to protect the interests of users and regulate companies’ behaviour. For instance, the Queensland government’s inquiry into vaping aims to make the e-cigarette supply chain transparent.

Decreasing Controllability and Predictability

As algorithms become increasingly active participants in managing content on social platforms, the self-learning systems behind them grow increasingly complex, which decreases the predictability and controllability of the algorithms and produces a growing number of unintended consequences (Just & Latzer, 2019). Meanwhile, social media platforms are feverishly updating their algorithmic systems to attract larger audiences; if this continues, humans will lose control over the algorithms even faster. Therefore, media companies’ exploitation of algorithmic systems also needs oversight and intervention.


From the analysis above, we can see that the algorithmic systems used by social media platforms pose a great challenge to governance because of their personalization, opacity, uncontrollability and unpredictability. The case study of vaping culture on TikTok shows that commercial companies use algorithms to mislead the public for their own benefit, and even exploit algorithms’ culture-shaping function in ways that cause great physical and mental harm to young people. In this situation, the governance of algorithms has become urgent. If governments and all sectors make appropriate policies to manage personalization, opacity, uncontrollability and unpredictability, I believe our future with algorithms can still be positive.

Reference List

ABC News (Australia). (2023, March 13). Queensland government to launch parliamentary inquiry into vaping | ABC News [Video]. YouTube.

Australian Bureau of Statistics. (2022). Smoking, 2020-21. ABS.

Australian Institute of Health and Welfare. (2020). National Drug Strategy Household Survey, 2019 (V7) [Data set]. ADA Dataverse. doi:10.26193/WRHDUL

Feyissa, S. (2020). TikTok. Flickr.

Hallinan, B., & Striphas, T. (2016). Recommended for you: The Netflix Prize and the production of algorithmic culture. New Media & Society, 18(1), 117-137.

Just, N., & Latzer, M. (2019). Governance by algorithms: Reality construction by algorithmic selection on the Internet. Media, Culture & Society, 39(2), 238-258.

Klug, D., Qin, Y., Evans, M., & Kaufman, G. (2021, June). Trick and please. A mixed-method study on user assumptions about the TikTok algorithm. In 13th ACM Web Science Conference 2021 (pp. 84-92).

MacKinnon, K. R., Kia, H., & Lacombe-Duncan, A. (2021). Examining TikTok’s Potential for Community-Engaged Digital Knowledge Mobilization with Equity-Seeking Groups. Journal of Medical Internet Research, 23(12), e30315–e30315.

Morales, M., Fahrion, A., & Watkins, S. L. (2022). #NicotineAddictionCheck: Puff Bar culture, addiction apathy, and promotion of e-cigarettes on TikTok. International Journal of Environmental Research and Public Health, 19(3), 1820.

Utting, A., & Lorio, K. (2023, March 12). Queensland government to launch parliamentary inquiry into vaping, with focus on children and teenagers. ABC News.

Vaping360 (2018). JUUL Labs Vape/Electronic Cigarette Device [Image]. Flickr.
