Is Nothing Safe from Hate? Exploring Online Hate Speech Through YouTuber Controversy.


The Internet. What a wonderful place, is it not? A digital space with near-endless amounts of data and information, all accessible with a simple click or tap. While originally developed as a way for government officials, intergovernmental organizations, and researchers to share information, in this day and age, it has evolved far beyond that. It is a place, not only for knowledge, but for entertainment. For work. And for communication. 

Now, having said that, let’s look a bit more into that last point, shall we? Communication has become one of the, if not the, most important and distinctive features of the Internet and digital spaces, with social media and content-sharing platforms such as Twitter/X and YouTube respectively becoming more popular every day. Platforms such as these are the epitome of new media, ticking off quite a few of what Terry Flew (2014) notes as the key concepts of new media, namely:

  • Networks: Interconnectivity on the internet with emphasis on the interdependent and relational links between people and institutions
  • Participation: The Internet and new media have a greater degree of interactivity with the content compared to older forms of media and platforms
  • Globalization: The current state of the Internet allows for an ever-expanding scale of social interactions

Social media and forum platforms truly embody these notions and concepts of new media, allowing like-minded individuals and groups to come together and share their thoughts on whatever they may desire. And as previously mentioned, this is not only for knowledge and professional inquiries; entertainment, and by extension fandoms and critics, are a strong and thriving force on the internet.

Due to the accessibility and interconnectivity of digital platforms, fans can come together to support and discuss the topics they love. However, human nature being what it is, this is by no means always expressed through positive means, which is where the topic of hate speech and online harm and abuse comes into play.

“Hate speech has been defined as speech that expresses, encourages, stirs up, or incites hatred against a group of individuals distinguished by a particular feature or set of features such as race, ethnicity, gender, religion, nationality, and sexual orientation”

(Parekh, 2012)

The Internet, having said that, is by no means a place void of these types of interactions. Individuals and groups online are given a platform, and by extension an outlet, to voice their frustrations and disapproval with few limits; moderation is usually only applied should matters become extreme or drastically detrimental to an individual or group. In regard to these actions, Flew (2021) states that the reasons for online harassment “[feature] political or religious beliefs, physical appearance, race or ethnicity, gender, and sexual orientation…[potentially evolving] into hate speech on a large scale”. With that being said, for the sake of this article, I would like to shed some light on a case revolving around the content creator JoCat, starting from late 2023, along with some insight into the controversy for those less initiated.

JoCat Content Creation Controversy

Tweet from JoCat’s Twitter/X page on December 19th, 2023

Joseph Catalanello, better known by his online alias JoCat, is an American YouTuber best known for his animations and comedic takes on video games and various other facets of popular culture, with approximately five years on the platform. The case, and controversy, surrounding JoCat started on October 6th, 2023, during a livestream of the video game Baldur’s Gate 3, in which he sang a gender-swapped parody of the song Boys by Lizzo, switching out its allusions to different tastes in men for tastes in women. The parody was later animated and uploaded as its own video on YouTube, receiving a “universally positive response from [his] audience, [his] peers, and [his] partner” (JoCat, 2023).

JoCat’s Parody Video of Lizzo’s Boys titled “I Like Girls – JoCat Animation”

However, despite the initial positive reviews and feedback, as the parody garnered increasing popularity, and by extension visibility on algorithm-driven platforms, the video began to reach beyond JoCat’s initial target audience. He stated that he had “seen and received many assumptions about [his] character, [his] history, [his] beliefs, [his] relationships, and all those of [his] partner, as well as threats of violence to [him] as well as [his] family”, along with attempts at ‘doxxing’ him, that is, revealing aspects of an individual’s private information online. Ultimately, due to the continual negative backlash and hate received, JoCat announced on December 19th, 2023, that he would be going on an indefinite hiatus from content creation.

While JoCat remains active on social media sites such as Twitter/X, the amount and frequency of content, namely videos and livestreams, have decreased drastically compared to before the influx of hate comments and threats.

Should you want to have a look into JoCat’s social media sites or the full letter from which excerpts were quoted, links to both are provided here and here respectively.

Why JoCat’s Case is a Textbook Example of Online Hate Speech

Now, with introductions out of the way, let’s look into why this case study can be seen as a prime example of hate speech on social media platforms. The JoCat controversy strongly draws upon Flew’s key concepts of new media, the three most relevant of which were listed above, and outlines the intricacies of online speech discussed by Flew (2021) and Parekh (2012).

First off, why did the video and JoCat begin to receive hate comments and threats of increasing severity only after a significant amount of time had passed since the initial upload, when the parody initially received overwhelmingly positive feedback? Arguably, the answer comes down to the nature of the platforms involved and the inner workings of their algorithms.

Algorithms are an interesting matter to look into. While the traditional definition of an algorithm is a set of rules or processes established for activities such as calculations, algorithms within the digital world are defined as “processes that assign relevance to information elements of a data set by an automated, statistical assessment of decentrally generated data signals” (Just & Latzer, 2017, in Flew, 2021). Simply put: as users of platforms such as YouTube click on and view a video, in this case JoCat’s parody, the algorithm weighs signals such as viewing frequency, watch duration, the like/dislike ratio, and many other factors to decide how marketable that video is to people who view similar content. Once, and if, that marketability reaches a certain threshold, the algorithm may then recommend the content to people outside the intended target audience in hopes of reaching a wider range of viewers. This is consistent with YouTube’s statement that recommendations are governed by user behavior, in that “[they] track what viewers watch, how long they watch, what they skip over, and more” (Oladipo, 2024).
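The feedback loop described above can be sketched as a toy model. To be clear, the signal names, weights, and threshold below are invented purely for illustration; YouTube’s actual ranking system is far more complex and not public:

```python
# Toy model of recommendation "marketability" — purely illustrative.
# The signals and the threshold are hypothetical, not YouTube's real formula.

def marketability(views, avg_watch_fraction, likes, dislikes):
    """Combine a few engagement signals into a single illustrative score."""
    total_votes = likes + dislikes
    like_ratio = likes / total_votes if total_votes else 0.5
    return views * avg_watch_fraction * like_ratio

# Once the score passes a threshold, the platform "promotes" the video
# beyond the creator's usual audience — the moment visibility outgrows
# the intended target audience.
PROMOTION_THRESHOLD = 50_000

score = marketability(views=120_000, avg_watch_fraction=0.7,
                      likes=9_000, dislikes=300)
promoted = score > PROMOTION_THRESHOLD  # True: recommended outside the niche
```

The point of the sketch is simply that promotion is a function of aggregate engagement, not of who is watching, which is exactly how a video can escape its intended audience.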

In light of this, how does the increased, algorithm-driven visibility of a video lead to instances of hate comments? These dynamics can be explained through a phenomenon called toxic technocultures, defined as “toxic cultures that are enabled by and propagated through sociotechnical networks” (Massanari, 2017), along with the ideologies surrounding toxic masculinity. In regard to toxic technocultures, Adrienne Massanari states:

“Toxic technocultures are unique in their leveraging of sociotechnical platforms as both a channel of coordination and harassment and their seemingly leaderless, amorphous quality. Members of these communities often demonstrate technological prowess in engaging in ethically dubious actions such as aggregating public and private content about the targets of their actions and exploiting platform policies that often value aggregating large audiences while offering little protection from potential harassment victims.”

(Massanari, 2017)

Having read this quote, does it sound familiar to any case you may know of? Rhetorical questions aside, this is arguably a case study to which such definitions can be accurately applied. The nature of social media platforms, specifically Twitter/X, on which the harassment and hate comments were primarily based, allowed a mass of people to voice their distaste for the content JoCat uploaded en masse, that is, through a mob mentality. Some scholars also refer to this phenomenon as ‘herding’, in which a community’s or platform’s power-law dynamics create “effects around particular material, biasing individuals to mirror the voting behavior of others” (Muchnik et al., 2013, in Massanari, 2017), potentially causing users sympathetic to JoCat to fear showing support at the risk of mob retaliation or receiving the same treatment as the content creator. Additionally, as social media sites allow users to forgo their real-life identities, the degree of anonymity and distance this provides allows users to speak their minds with little risk of effective consequences being traced back to them.

The underlying characteristics of toxic masculinity also do well to promote these behaviors, as JoCat’s actions could be seen to subvert the ideals of hegemonic masculinity, a concept discussed by Knudsen and Andersen (2020), under which masculinity was traditionally understood as avoiding feminine characteristics and flamboyant displays of sexuality; this further incentivizes conflict and discourse on social media. And as such controversies gain traction on these platforms, more and more people are able to have a say on the matter, regardless of whether their intentions are positive or negative.
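The herding effect described above can be illustrated with a minimal simulation. Everything here — the mirroring probability, the vote values, the tie-breaking rule — is an invented assumption for illustration, not a model taken from Muchnik et al.:

```python
import random

# Toy "herding" sketch: each new commenter mirrors the visible majority
# with some probability, otherwise votes independently at random.
# All parameters are invented for illustration.

def simulate_herd(initial_votes, n_users, mirror_prob=0.7, seed=42):
    rng = random.Random(seed)
    votes = list(initial_votes)  # +1 = supportive, -1 = hostile
    for _ in range(n_users):
        # Ties read as hostile here, an arbitrary simplification.
        majority = 1 if sum(votes) > 0 else -1
        if rng.random() < mirror_prob:
            votes.append(majority)             # mirror the visible crowd
        else:
            votes.append(rng.choice([1, -1]))  # independent opinion
    return sum(votes)

# A small initial wave of hostility tends to snowball as later users
# mirror it, which is the herding dynamic in miniature.
outcome = simulate_herd(initial_votes=[-1, -1, -1], n_users=200)
```

The takeaway is that the final sentiment is dominated by whoever arrived first, not by the distribution of genuine opinion — which is why sympathetic users may stay silent once a hostile majority is visible.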

Finally, despite the exorbitant number of threats and comments against the content creator, the platform’s governance methods and the nature of its community, its platform politics, meaning “the assemblage of design, policies, and norms that encourage certain kinds of cultures and behaviors to coalesce on platforms while implicitly discouraging others” (Massanari, 2017), allow phenomena such as hate speech and ‘cancel culture’ to dwell and continue to fester on social media. This can be seen explicitly on Twitter/X’s home page, which clearly shows which topics have the most interaction, creating an algorithmic cycle of ‘fueling the fire’. This led to JoCat being overwhelmed by the negativity and hate comments and threats, ultimately resulting in his indefinite hiatus.


It is understandable that hate speech towards online content creators may seem to pale in comparison to other, more prevalent forms of bigotry, such as online acts of racism, but it is important to note that these types of actions still affect an individual’s mental and emotional well-being, as well as their livelihood and the state of the industry as a whole. What is important to understand and take away from this case is that hate speech, online violence, and threats can be more common than one thinks, and can happen to anyone, not just over controversial topics, but over seemingly innocent things such as what was just discussed. While it would be difficult, if not impossible, to outright stop hate online, understanding what is being done, how, and why is key to making sure fewer people have to go through something like this.


Flew, T. (2014). Twenty key new media concepts. In New Media (4th ed.).

Flew, T. (2021). Issues of concern. In Regulating Platforms.

Jenkins, H. (1992). “Get a life!” Fans, poachers, nomads. In Textual Poachers: Television Fans and Participatory Culture (pp. 9–49). London: Routledge.

JoCat [@JoCat105]. (2023, December 19). If this is what it takes to be a content creator online, I don’t think I’m cut out for it. Twitter.

JoCat. (2021, April 3). I Like Girls – JoCat Animation. YouTube.

Massanari, A. (2017). #Gamergate and The Fappening: How Reddit’s algorithm, governance, and culture support toxic technocultures. New Media & Society, 19(3), 329–346.

Oladipo, T. (2024). A 2024 guide to the YouTube algorithm: Everything you need to know to boost your content. Buffer.

Parekh, B. (2012). Is there a case for banning hate speech? In The Content and Context of Hate Speech (pp. 37–56). DOI:10.1017/CBO9781139042871.006

Sandvoss, C., Gray, J., & Harrington, L. (2014). Introduction: Why still study fans? In J. Gray, C. Sandvoss, & L. Harrington (Eds.), Fandom: Identities and Communities in a Mediated World (pp. 1–28).
