Safeguarding the Next Generation: The Impact of Online Hate Speech on Young People and How to Address It

In today’s rapidly evolving digital age, social media and other online platforms have become an integral part of our lives. They have changed the way we communicate, learn and interact, especially during the COVID-19 pandemic, when young people began spending even more time on screens using social networks, short-video apps, chat apps and online games. However, the rise of online hate speech has been an unfortunate consequence of this transformation and is a growing problem. Its proliferation has prompted many to question the role of internet culture and governance in promoting or combating the phenomenon. The effects are particularly far-reaching for young people, who are often the most vulnerable to its harm. This article explores online hate speech from the perspective of internet culture and governance: how it affects young people and what we can do to address its pernicious effects.

What is online hate speech?

Hate speech is commonly understood as indiscriminate verbal attacks on groups of people, carried out online through violent, aggressive or offensive language. A more formal definition is any form of offensive, abusive or derogatory language directed at an individual or group on the basis of race, religion, gender, sexual orientation, disability or any other characteristic (Tilano Vega et al., 2021). Because of the power imbalances created by the Internet and social networks, anyone can be targeted in repeated, systematic and uncontrolled ways through digital media, and such attacks are often ideologically motivated (Tilano Vega et al., 2021).

“Social media provides a global megaphone for hate.” – António Guterres, United Nations Secretary-General, 2021

Online subcultures and hate speech

Internet culture is complex and defies a single explanation (Allebach, 2020). It creates a unique space for individuals to express themselves, communicate with others and form communities based on shared interests or beliefs. However, the anonymity and lack of accountability that come with online communication can also enable the spread of hate speech (Tilano Vega et al., 2021).

Out of this diversity, online subcultures have gradually emerged: groups of people who share similar beliefs, values or interests, and which can contribute to the normalisation of hate speech (Falinge Park High School, 2023). In some cases, these subcultures form around hateful ideologies, such as white supremacy or misogyny, and use language that reinforces those beliefs (Falinge Park High School, 2023). Members may use derogatory terms or slurs to refer to people of different races, genders or sexual orientations, and may spread their messages through memes or remixes.

Memes, images or videos with captions that are widely shared on the Internet (Rogers, 2023), are a typical product of these subcultures; they carry a degree of parody or irony about styles, behaviours or ideas that are popular online (Technologies, 2021). The use of memes also contributes to the normalisation of hate speech (Chen & Pan, 2022). Some memes are created to spread hateful messages or stereotypes, while others are adopted by hate groups to spread their ideology. Sites like Reddit have long attracted a tech-oriented male demographic that likes to post or read entertainment content anonymously (Allebach, 2020). Some of the most egregious verbal attacks on the internet today are misogynistic and anti-feminist: women, including young women, are targeted daily with hateful and threatening speech. Research suggests that such hate speech can, to some extent, be seen as an attempt to reinforce strict gender norms, especially as women move out of traditional roles and into male-dominated spaces (Griffin, 2023).

The normalisation of hate speech can have a significant impact on how it is perceived online. When hateful language is accepted or even celebrated in certain online communities, it can make it easier for individuals to spread their message and for others to accept it. This can lead to the creation of echo chambers where individuals are only exposed to views and ideas that reinforce their existing beliefs, further entrenching their attitudes and behaviours (Harel, 2020).

The impact of online hate speech on young people

Young people are currently at high risk of encountering hate speech online, and the effects can be devastating and long-lasting. While exposure does not automatically constitute harm, it does increase the possibility of personal or social harm, which varies according to the form that harm takes (SELMA Partners, 2019). Research data from the European SELMA project shows that 57% of teenagers experienced hate speech online at least once within a three-month period, while data from October 2018 covering Australia and New Zealand identified religion, political views, race and gender as the factors most frequently behind such experiences (eSafety, 2023).

Psychological effects

Online hate speech can have both direct and indirect effects on young people’s short- and long-term mental health, and the damage caused by psychological victimisation can be far greater than physical harm in the ordinary sense. Exposure to hate speech can lead to anxiety, depression and low self-esteem. It can also cause emotional distress and trauma, with consequences similar in form to those experienced by victims of other traumatic events, leading to long-term mental health problems. Victims may show low self-esteem, sleep disturbances, heightened anxiety, and feelings of fear and insecurity, or describe “feeling excluded, like they have no friends”, as one interviewee from the UK put it (SELMA Partners, 2019). Over time, they may shut themselves off and refuse to engage with the outside world, which can lead to depression and extreme behaviour such as self-harm or suicide.

Social effects

Young people who experience hate speech often feel targeted and isolated, which produces feelings of helplessness and despair. They may withdraw from online platforms and avoid interacting with others in order to escape further abuse, leading to social isolation and a lack of connection, and negatively affecting the well-being and development of society as a whole (SELMA, 2019). When discrimination, intolerance and hateful attitudes and behaviours are normalised, young people may internalise prejudices and stereotypes. They may come to feel that hate speech is simply part of society: swearing and name-calling become normal because they are used all the time, and young people can easily see and learn this kind of language on the internet (SELMA Partners, 2019).

On the other hand, when authorities engage in governance and moderation, freedom of expression is thrust into the spotlight and becomes a “sword” with which opponents attack regulation, righteously claiming that “freedom of expression has been abolished”. Yet hate speech itself also silences people: “If you become a target of hate speech, you will start not expressing your beliefs. A society that does this will not progress without expressing opinions and ideas” (SELMA Partners, 2019).

Moreover, online hatred may increase polarisation and division within society. Young people exposed to hate speech may become more entrenched in their own views and less willing to engage in dialogue and compromise with those who hold different ones. This threatens social cohesion, breaks down communication and understanding between groups, and heightens social tension and conflict (SELMA Partners, 2019). The consequence of such polarisation is that the next generation grows up to repeat the same pattern, and society remains divided into opposing camps, unable to achieve unity and peace.

Now Stop it

The United Nations has identified hate speech as a “precursor to atrocity crimes, including genocide” (United Nations, 2023).

What can we do?

Parents’ approach

Parents need to educate their children and build their awareness, starting with helping them understand the dangers of hate speech and encouraging them to report any incidents of abuse. They should also supervise their children’s online activities appropriately, for example by turning on the parental supervision or under-age browsing modes that many platforms now offer. Finally, it is important for parents to model positive behaviour online and to teach children by example to be friendly and respectful towards others, both online and offline (Jones, 2023).

Regulation and governance

Beyond parents and young people themselves, regulators need to take immediate action against reported hate speech, for example by promptly reviewing and removing malicious content. Warnings or legal sanctions can be imposed in serious cases. Platforms can also offer incentives to encourage users to report hate speech.

eSafety:

The eSafety Commissioner (eSafety) is Australia’s independent regulator for online safety (eSafety, 2023). It is the world’s first government agency dedicated to keeping people safe online, an independent statutory office that promotes online safety for all Australians.

eSafety’s responsibilities related to online safety include (eSafety, 2023):

1. Provide online safety education and resources for children, young people and adults.

2. Investigate and respond to complaints about cyberbullying, image-based abuse and other online safety issues.

3. Provide support and advice to victims of online safety issues, including appropriate support services.

4. Develop policies and guidelines to promote online safety and prevent harm.

5. Work with industry partners to develop and implement best practices for online safety.

6. Advocate for the rights of children and young people online.

The eSafety Commissioner was established in 2015, originally as the Children’s eSafety Commissioner, and now operates under the Online Safety Act, implementing the Australian Government’s online safety policies and initiatives. It is at the forefront of the fight against the risks and harms faced by both adults and children online, and plays an important role in ensuring that all Australians can participate safely and confidently in the online world (eSafety, 2023).

How eSafety can help you deal with cyberbullying

If you are faced with hate speech or cyberbullying, the first step is to report the malicious content on the relevant platform so that the platform can deal with it as a priority. If there is no response after 48 hours and the cyberbullying is serious enough, eSafety can then ask the platform to remove the harmful content (Cyberbullying, 2023).

Data shows that of the 1,542 complaints about serious cyberbullying of Australian children received by eSafety in 2021–22, 217 informal takedown requests were made and 88% of that content was successfully removed (eSafety, 2023). This demonstrates the positive impact the scheme has had on reducing the incidence of cyberbullying: it has raised awareness of the issue and provided support to people who have experienced hate speech or cyberbullying.

Overall, eSafety has made progress in reducing cyberbullying, but the problem still exists and more needs to be done. Continued parental education, awareness-raising and the enforcement of laws and policies related to online safety are all necessary to ensure that the next generation can participate safely and confidently in the online world. People first need to recognise the seriousness of online hate speech, and then address it systematically, even though the process is bound to be long and arduous.

References:

Allebach, N. (2020, July 31). A brief history of internet culture and how everything became absurd. Medium. Retrieved April 10, 2023, from https://medium.com/swlh/a-brief-history-of-internet-culture-and-how-everything-became-absurd-6af862e71c94

Chen, Y., & Pan, F. (2022, September 12). Multimodal detection of hateful memes by applying a vision-language pre-training model. PLoS ONE. Retrieved April 10, 2023, from https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9467312/

Cyberbullying. (2023). Cyberbullying. eSafety Commissioner. Retrieved April 13, 2023, from https://www.esafety.gov.au/key-issues/cyberbullying

eSafety. (2023). Online hate speech. eSafety Commissioner. Retrieved April 12, 2023, from https://www.esafety.gov.au/research/online-hate-speech

eSafety. (2023). Tackling individual harms and systemic reform in 2021-22. eSafety Commissioner. Retrieved April 13, 2023, from https://www.esafety.gov.au/newsroom/blogs/tackling-individual-harms-and-systemic-reform-2021-22

eSafety. (2023). Who we are. eSafety Commissioner. Retrieved April 13, 2023, from https://www.esafety.gov.au/about-us/who-we-are

Falinge Park High School. (2023). What are internet subcultures? How can these be extreme or dangerous? Retrieved April 10, 2023, from https://www.falingepark.com/wp-content/uploads/2020/07/Y8-PSHE-Echo-chambers-and-subcultures.pdf

Griffin, L. (2023, January 31). The gender gap in our hate speech laws. La Trobe University. Retrieved April 10, 2023, from https://www.latrobe.edu.au/news/articles/2018/opinion/the-gender-gap-in-our-hate-speech-laws2

Harel, T. (2020, May 12). The normalization of hatred: Identity, affective polarization, and dehumanization on Facebook in the context of intractable political conflict. Retrieved April 10, 2023, from https://journals.sagepub.com/doi/full/10.1177/2056305120913983

Jones, S. (2023). Impact of online hate on young people. Internet Matters. Retrieved April 12, 2023, from https://www.internetmatters.org/hub/question/what-is-the-real-world-impact-of-online-hate-speech-on-young-people/#comments

Rogers, K. (2023, March 31). Meme. Encyclopedia Britannica. https://www.britannica.com/topic/meme

SELMA. (2019). Hacking online hate: Building an evidence base for educators. www.hackinghate.eu

SELMA Partners. (2019, April 11). The consequences of online hate speech – a teenager’s perspective. SELMA – Hacking Hate. Retrieved April 12, 2023, from https://hackinghate.eu/news/the-consequences-of-online-hate-speech-a-teenager-s-perspective/

Technologies, I. N. (2021, April 1). 10 internet subcultures you need to know. Indus Net Technology. Retrieved April 10, 2023, from https://www.indusnet.co.in/10-internet-subcultures-need-know/

Tilano Vega, L., López, H., Castaño-Pulgarín, S., & Suárez-Betancur, N. (2021, April 6). Internet, social media and online hate speech. Systematic review. Aggression and Violent Behavior. Retrieved April 10, 2023, from https://www.sciencedirect.com/science/article/pii/S1359178921000628

United Nations. (2023). Hate speech and real harm. United Nations. Retrieved April 12, 2023, from https://www.un.org/en/hate-speech/understanding-hate-speech/hate-speech-and-real-harm
