There has been a long-running debate over whether recommendation algorithms drive opinion polarization, the spread of fake news and other concerns in the age of social media. This article closely examines concerns raised on Chinese social media platforms using filter bubbles and echo chambers – two theories that relate to recommendation algorithms – as theoretical frameworks. Through in-depth analysis, it uncovers how filter bubbles and echo chambers stimulate opinion polarization, the spread of fake news and misinformation, and online scams.
This article focuses on two profound concerns on Chinese social media platforms. One relates to radical debate and polarization over whether the Covid-19 vaccine is effective and safe. The other concerns the misinformation, fake news and scams that older people encounter on Douyin and how scammers target them through recommendation algorithms. The investigation then delves into whether platforms have failed to fulfil their duty and what Douyin is doing to prevent scams and the spread of misinformation from recurring. Additionally, this article explores ways to tailor an age-appropriate platform environment and mitigate filter bubble and echo chamber effects from the perspective of both individuals and platforms.
What Are the ‘Filter Bubble’ and ‘Echo Chamber’?
Filter bubbles and echo chambers are terms inspired by the idea that information is selected to meet personal preferences (Kitchens et al., 2020). However, the definitions, causes and effects of the two terms differ in many ways.
The Filter Bubble
The term ‘filter bubble’ was introduced by Internet activist and entrepreneur Eli Pariser around 2010. Mr Pariser argues that the new Internet is becoming more personalised and more about ‘you’. Internet filters, or recommendation algorithms, examine your past behaviour and the preferences of people similar to you, predict your future preferences, and feed you personalised information, content and news. On top of this, the algorithms constantly fine-tune this personalisation as your behaviour changes, however slightly. This process shapes a filter bubble, which Mr Pariser describes as a ‘distinct informational universe for each individual user’. Filter bubbles reshape the way we engage with ideas and information by restricting exposure to viewpoints outside our existing beliefs and preferences (Pariser, 2011).
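The mechanism Mr Pariser describes amounts to a simple collaborative-filtering loop: look at what people similar to you liked, and feed you more of the same. The sketch below is a toy illustration of that idea only – the users, items, and scores are invented, and real platforms use far more sophisticated models:

```python
# Toy user-based collaborative filter, illustrating the idea only.
# All users, items, and scores here are hypothetical.
from math import sqrt

# Each user's past engagement with content items (1 = liked, 0 = skipped).
ratings = {
    "alice": {"politics_a": 1, "politics_b": 1, "cooking": 0},
    "bob":   {"politics_a": 1, "politics_b": 1, "travel": 1},
    "carol": {"cooking": 1, "travel": 1},
}

def similarity(u, v):
    """Cosine similarity over the items both users have rated."""
    shared = set(ratings[u]) & set(ratings[v])
    if not shared:
        return 0.0
    dot = sum(ratings[u][i] * ratings[v][i] for i in shared)
    norm_u = sqrt(sum(s * s for s in ratings[u].values()))
    norm_v = sqrt(sum(s * s for s in ratings[v].values()))
    return dot / (norm_u * norm_v)

def recommend(user):
    """Score unseen items by the preferences of similar users."""
    scores = {}
    for other in ratings:
        if other == user:
            continue
        w = similarity(user, other)
        for item, score in ratings[other].items():
            if item not in ratings[user]:
                scores[item] = scores.get(item, 0.0) + w * score
    return sorted(scores, key=scores.get, reverse=True)

print(recommend("alice"))  # → ['travel']: fed what like-minded bob enjoyed
```

Because alice and bob agree on politics, alice is shown what bob liked; carol, who shares nothing with alice, contributes nothing. Repeated over time, this is exactly the self-narrowing loop that forms the bubble.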
With the explosion of news and information, it is understandable that we need recommendation algorithms to prioritise information and surface what matters most. However, the negative effects that filters and recommendation algorithms have on both society and individuals are significant. To start with, recommendation algorithms may create filter bubbles that cause many issues. First, the filtering process is invisible: few of us realise we are exposed to limited information, which makes it harder to burst the personalised bubble and interact with diverse ideas and information. Second, as people keep absorbing similar information and reinforcing their existing beliefs, they stop learning, which results in mental rigidity and a lack of creativity. Third, filter bubbles may intensify both social and political polarization (Pariser, 2011). Moreover, filter bubbles increase the likelihood that people accept and share fake news, because people tend to believe what they prefer even when it is untrue (Saeed, 2022). Finally, misinformation spreads quickly when polarization is high (Cinelli et al., 2021).
The Echo Chamber
The term echo chamber was coined by Cass Sunstein in 2001, and researchers have since developed their own understandings and definitions of it. Dr Bruns indicates that an echo chamber forms where a group of people connect with each other on the basis of similar preferences and exclude outsiders. The more connections a group creates among its members, the more likely they are to cut off connections with outsiders (Bruns, 2017). An echo chamber can reinforce a prior opinion within a group and push the entire group toward more extreme positions (Cinelli et al., 2021). An echo chamber not only proliferates polarization but also stimulates the spread of rumours on social media platforms (Choi et al., 2020).
Case Studies of Chinese Social Media Platforms
Algorithms have a specific impact on human behaviour (Flew, 2021). Dr Kitchens and his colleagues argue that social media platforms foster an information-limiting environment for individuals in some ways (Kitchens et al., 2020). Recommendation algorithms are one cause of this information-limiting environment.
Discussions over vaccination have continued on Chinese social media platforms, where Chinese citizens have long been concerned and suspicious about whether vaccines are safe and effective. During the pandemic, discussions about the Covid-19 vaccine’s side effects reappeared on social media. Countless online posts on Weibo, Douyin, WeChat and other platforms claimed that the authors or their acquaintances were diagnosed with leukemia, diabetes or other immune system diseases after receiving the Covid-19 vaccination. This online information sparked intense debate and triggered societal panic. In July 2022, the Joint Prevention and Control Mechanism of the State Council responded, announcing at a press conference that Covid-19 vaccines do not lead to leukemia, diabetes, or other forms of immune system disease.
An in-depth search across different Chinese social media platforms shows that many people believe the COVID-19 vaccine can cause immune system diseases, both before and after the press conference. The hashtag # The COVID-19 Vaccine Will Not Cause Leukemia or Diabetes, created by Beijing Headline on Weibo, aimed to spread and amplify the official message. However, a significant number of posts under the hashtag state that the users, or people they know, have experienced the vaccine’s side effects.
The concerns about the COVID-19 vaccine demonstrate a typical echo chamber phenomenon. People are broadly divided into two groups regarding the vaccine’s possible side effects. One firmly believes that there is a problem with the vaccine. The other believes that vaccination is one of the most effective protections against COVID-19 and that getting leukemia after vaccination is mere coincidence, since a certain number of people are diagnosed with leukemia every year regardless. Additionally, each group shuts out the contradictory opinions of the other. Those who believe the Covid-19 vaccine is problematic interact with each other by commenting on, liking and sharing stories of being diagnosed with immune system diseases. The formation of an echo chamber is partly related to selective exposure, meaning that users selectively consume the information they already believe (Fine & Berkowitz, 1987). Still, recommendation algorithms also play a part. The recommender may feed users a significant amount of similar content without their searching for it, making it easier and more efficient to find information on the topic and communicate with people who share the same belief, which reconfirms and reinforces the prior belief in the process.
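The reinforcement dynamic described above can be illustrated with a deliberately minimal averaging model. Everything here is hypothetical – the opinion values and update rule are invented – and the model captures only one facet of the echo chamber: how isolation erases diversity within each camp while leaving the gap between camps unbridged, not the full radicalisation process:

```python
# Minimal opinion-averaging model of selective exposure.
# Opinions range from -1 (vaccine-sceptic) to +1 (pro-vaccine); values invented.
def homogenise(group, rate=0.5, rounds=10):
    """Repeatedly move each member toward their group's average opinion,
    as happens when members only hear voices inside their own chamber."""
    for _ in range(rounds):
        avg = sum(group) / len(group)
        group = [x + rate * (avg - x) for x in group]
    return group

pro = [0.6, 0.8, 1.0]      # pro-vaccine cluster
anti = [-1.0, -0.7, -0.4]  # vaccine-sceptic cluster

# Echo chamber: each group only hears itself, so internal diversity vanishes
# (pro converges to 0.8, anti to -0.7) while the gap between camps persists.
pro_final, anti_final = homogenise(pro), homogenise(anti)

# With cross-group exposure the whole population is averaged together instead,
# converging near the overall mean of 0.05.
mixed_final = homogenise(pro + anti)
print(pro_final, anti_final, mixed_final)
```

The contrast between the two runs mirrors the finding cited above: connections inside the chamber reconfirm the prior belief, while exposure across chambers pulls the groups back toward each other.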
Investigation across different platforms shows that, beyond removing controversial posts and pushing official news, platforms did little to reduce the likelihood of echo chamber phenomena. According to Dr Cinelli, high polarization can quickly proliferate misinformation (Cinelli et al., 2021), and some individuals may spread alarming information that causes societal panic. Dr Dubois and Dr Blank indicate that greater media diversity reduces the possibility of being in an echo chamber in the political domain (Dubois & Blank, 2018). It is therefore recommended that individuals use a variety of media to search for more information.
Issues Faced by Older People
According to the 2021 China Silver Economy Insight Report published by Mob Tech, nearly 80% of smartphone users over 60 years old in China use short-form video platforms such as Douyin, and news is one of their favourite types of content (Mob Tech, n.d.). This article will specifically investigate the consumption habits of older people and the issues they face on Douyin.
To sign up for Douyin, users must register with their mobile number or link their account to another social media platform where they have already provided demographic information such as gender and age. Once Douyin has identified a user’s age and other data, it starts to generate and recommend personalised video content, or content that people like them may enjoy (Pariser, 2011). The recommendation algorithm predicts what older people would like and keeps feeding them similar content. As this recommendation process builds a filter bubble, there is a high possibility that older people will find it hard to learn something new or be exposed to contradictory opinions and other kinds of video, and will instead stick to low-quality videos that match their existing beliefs. This is where fake news, misinformation and scams come into play.
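The ‘keeps feeding them similar content’ loop is essentially a rich-get-richer feedback process: each watch makes the same kind of video more likely to be recommended next. The sketch below is a hypothetical illustration only – the categories, weights, and boost rule are invented, not Douyin’s actual system:

```python
# Toy sketch of a recommendation feedback loop. Categories, weights,
# and the boost rule are invented for illustration.
import random

random.seed(0)  # fixed seed so the run is reproducible
categories = ["news", "health", "drama", "comedy", "education"]

# A new user starts equally open to every category.
weights = {c: 1.0 for c in categories}

def recommend_and_watch(weights, rounds=30, boost=0.5):
    """Each round, pick a video in proportion to current interest weights,
    then boost the watched category, making it likelier next round."""
    history = []
    for _ in range(rounds):
        total = sum(weights.values())
        pick = random.choices(categories,
                              [weights[c] / total for c in categories])[0]
        weights[pick] += boost
        history.append(pick)
    return history

history = recommend_and_watch(weights)
# With rich-get-richer dynamics, the weights tend to concentrate on a few
# categories, so later recommendations are drawn from a narrower menu.
print(sorted(weights.items(), key=lambda kv: -kv[1]))
```

Even this crude rule guarantees concentration: after 30 rounds, at least one category must have been picked six or more times, tripling its weight while unwatched categories stay flat – a minimal picture of how a feed narrows around a user’s early clicks.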
According to 2019 research conducted by Dr Guess, older Americans were more likely to share fake news on social media (Guess et al., 2019). The situation is similar in China. Though no rigorous data or studies confirm this, a significant number of social media users post that their older family members are obsessed with fake news. According to the Report on the Investigation of the Digital Literacy Gap between Elderly Populations in Urban and Rural Areas, nearly 60% of older people in rural areas have encountered rumours while using short-form video platforms. Online fake news, misinformation and scams are concepts unfamiliar to them, and not all older people have a sufficiently high level of digital literacy to identify false information. Being trapped in bubbles and chambers by recommendation algorithms makes matters worse.
Scammers target older people by using recommendation algorithms and taking advantage of the filter bubble effect. According to a news report by Mengma News in March 2023, a 68-year-old woman was cheated out of 30,000 yuan, roughly 6,500 Australian dollars, in an online romance scam involving a fake Jin Dong. It should be noted that Jin Dong is a famous Chinese actor in his 40s. The scammer created videos using footage of Jin Dong and animated effects (Figure 1), dubbed with lines like, “Dear, why don’t you reply to me? I miss you so much” and “Can you subscribe to my account and like my videos?” The fake Jin Dong videos give older audiences the illusion that the actual Jin Dong is interacting directly with them, even though the effects are of low quality. The same scam had already played out in exactly the same way in 2020: according to Hongxing News, Ms Huang, a 61-year-old woman, was trapped in an online romance scam with a fake Jin Dong.
Filter bubbles are not the only factor behind this issue. The lack of attention to older women’s emotional needs in China, as well as older people’s unfamiliarity with new technology, also play a role. However, a filter bubble not only keeps older people in a highly personalised environment; it keeps the younger generation in one too. This means younger users are shielded from those scam videos and excluded from the bubble: they have no idea scammers are targeting older people, nor can they leave comments warning them. To make Douyin a more age-appropriate platform, Douyin conducted a special governance action in 2022 to prevent the inducement of older people’s interaction. The Douyin safety centre blocked a significant number of accounts and used pop-up prompts and similar measures to intervene and keep older people safe. However, the same scam reappeared this year, so there is more Douyin can do. To burst filter bubbles, Douyin could surface videos that older people enjoy in the feeds of younger users and request their assistance through surveys to help review the content (Figure 2). Additionally, Douyin could push scam-prevention videos to older people.
Even though people benefit from algorithms in the age of information explosion, it is still worth considering where recommendation algorithms on social media are taking us. It is inevitable that recommendation algorithms may worsen filter bubble and echo chamber effects in daily practice. Filter bubbles and echo chambers provide fertile ground for social polarization and proliferate the spread of fake news and misinformation. What is more, scammers can even take advantage of recommendation algorithms to target different groups and commit fraud. However, neither the filter bubble nor the echo chamber is unbreakable. The situation can be changed by both individuals and platforms in many ways.
Platforms must develop long-term governance actions that take care of their users, rather than fixing problems only after they arise.