Fall-from-Grace: The ethical issues of virtual idols

With the rapid advancement of virtual reality and AI technology, the boundary between the physical and digital worlds continues to blur. Virtual idols, also known as virtual YouTubers (VTubers), are gaining global popularity as digital creations in the online realm. As an emerging field of subcultural consumption, they are often seen as a bridge between the real and virtual worlds. Virtual idols are AI-driven computer-generated images (CGI) or digital characters, typically designed to combine the expressive appearance and personality of anime characters with human-like behaviors and reactions, which gives them the backstory and potential to appeal to their target audiences. As the technology continues to improve, the movements and fluency of virtual idols have become almost indistinguishable from those of human beings (Yu, Kwong & Bannasilp, 2023); at the same time, however, some less visible ethical issues are gradually being exposed. One of the most controversial is the digital labour exploitation behind virtual idols.

Virtual idols never “collapse”

The concept of the virtual idol originated in Japanese anime and idol culture and has been developing for more than 40 years. In 1984, “Lynn Minmay”, the first virtual idol to achieve major success in the real world, appeared as one of the main characters in the mecha anime series “Super Dimension Fortress Macross” (Kong et al., 2021). Since then, thanks to the maturing of AI technology, a number of virtual idols have appeared in the global entertainment industry and on social media platforms, such as “Hatsune Miku” from Japan, “Luo Tianyi” from China and “Lil Miquela” from the United States. They interact with audiences in real time through online platforms such as YouTube and TikTok, and can even use AI technology to hold concerts and livestreams (Liu, 2023).

Compared with real idols, virtual idols rarely suffer from public scandals caused by romance or infidelity (Yu, Kwong & Bannasilp, 2023), which makes fans feel more comfortable supporting them. In Japan, where the phenomenon originated, virtual idol culture has become a genuine cultural force: it has not only gained mainstream acceptance but has also woven technology more closely into everyday life, at times making virtual idols more popular than human ones.

The Ghost behind AI

In the eyes of the audience, virtual idols are almost entirely AI-generated images. In practice, however, while advanced AI algorithms can produce more realistic and fluid animation, the visual effects still rely on capturing the movements of real people. A virtual idol is therefore closer to a digital avatar controlled by a human operator, who drives the character’s movements, voice, and interactions with the audience. This raises ethical questions about whether the overwork of a virtual idol means that the human operator behind it is not getting enough rest, and whether that operator’s work is undervalued.

Figure 1: An actress weeping while wearing a mask at work (source: Rest of World)

The labor exploitation behind virtual idols has attracted public attention in recent years, especially after Carol, the lead singer of the Chinese virtual girl group A-SOUL, was declared “dormant” in 2022. Although the company repeatedly stressed that this did not mean the member was quitting the group, such announcements are generally understood in the industry as retirement. The decision was questioned, protested, and attacked by fans on social media amid allegations of workplace bullying and labor exploitation suffered by the “people inside” the virtual idols.

Figure 2: A-SOUL

Before the news was released, fans had stumbled across posts by the actress behind Carol on NetEase Cloud Music, an app similar to China’s Spotify. The posts described long-term, intense work and work-related physical harm, none of which had been disclosed on mainstream social media. Although the authenticity of the account could not be verified at the time, much of its content corresponded to A-SOUL’s daily activities, and the apparent unfair treatment of the actress all but shattered fans’ illusions about the group.

Figure 3: A-SOUL’s trending topic on the Internet (source: Weibo)

A screenshot purporting to show an A-SOUL employee’s salary further inflamed public opinion. According to the image, the actress’s base salary is only 11,000 yuan (about $1,500), and the bonus is a commission of 1% of revenue; after the media platforms and the brokerage company deduct their various expenses, the share actually credited to her account is only about 0.6%. By contrast, the five members of A-SOUL have brought great benefits to the company. According to Lehua Entertainment (A-SOUL’s brokerage company), from the group’s founding in 2020 to 2022, the five idols brought the company nearly 16 million yuan (about $2.2 million) in revenue, equivalent to 80% of Lehua’s entire pan-entertainment business revenue. Such a stark disparity infuriated fans: where did all the money go?
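
To put these reported figures in perspective, the following is a minimal back-of-the-envelope sketch in Python. It assumes the screenshots are accurate, that the 16 million yuan is split evenly across the five members, and that the period spans roughly two years; none of these assumptions is confirmed, and the constants are illustrative only.

# Rough, hypothetical estimate of a single actress's earnings, based on
# unverified figures circulating in fan screenshots.
BASE_SALARY_CNY = 11_000         # reported monthly base salary
EFFECTIVE_COMMISSION = 0.006     # ~0.6% actually credited after deductions (reported)
GROUP_REVENUE_CNY = 16_000_000   # revenue attributed to A-SOUL, 2020-2022 (reported)
MEMBERS = 5                      # five members in the group
MONTHS = 24                      # assumption: roughly two years of operation

# Assumption: revenue is split evenly across the five members.
per_member_revenue = GROUP_REVENUE_CNY / MEMBERS
commission_total = per_member_revenue * EFFECTIVE_COMMISSION   # 19,200 CNY over the period
commission_monthly = commission_total / MONTHS                 # 800 CNY per month

print(f"Monthly commission: {commission_monthly:,.0f} CNY")
print(f"Monthly total (base + commission): {BASE_SALARY_CNY + commission_monthly:,.0f} CNY")

On these assumptions, the commission adds only a few hundred yuan a month to the base salary, while the company retains the overwhelming share of the revenue, which is precisely the disparity that angered fans.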

Figure 4: The salary structure of an actress (source: Internet)

A senior Internet industry practitioner pointed out that in virtual idol projects, the “directors” who command and plan behind the scenes earn a large amount of money, roughly three times as much as the actresses or more. In addition, there are rumors online that agents force members to work and bully them during training. To reduce the negative impact on public opinion, the head of the operations team posted an apology and announced the early termination of the contract with Carol’s actress, but this did not succeed in calming fans’ anger.

Digital labor exploitation

In the case of A-SOUL, the actresses are required to perform extensive motion-capture work to make the characters more human-like, which can involve excessive, repetitive physical and mental labor (Wang, 2022). In a Rest of World report (Tobin & Zhou, 2022), Mengyu Peng, director of branding at the virtual avatar service company SuperACG, states that motion-capture actresses cannot decide how many hours they work, because that depends on the operations team behind them. He also mentions that most actors or actresses are expected to work 22 days a month, four to five hours a day, though this is rarely apparent to fans, who care more about spending time with their idols. Anthony Fung, a professor of media and communication at the Chinese University of Hong Kong, adds that the actress’s work remains invisible throughout the process. Even the much-touted advantage that virtual idols never fall in love can contribute to labor exploitation (Wang, 2022), since maintaining it may require surveillance of, and interference in, the actress’s private life.

Colón Vargas (2024) points out that this kind of labor exploitation is closely tied to capitalism and racial oppression. The marginalized groups responsible for operating or developing AI technologies may have to endure low earnings, constant surveillance, and safety risks while supplying the necessary workforce. The author also suggests that this not only harms their economic stability and well-being but can also lead to mental health problems, citing, for example, older women of color who face both economic exploitation and mental health challenges.

While the targets of exploitation in the A-SOUL case are young people, they are treated as a marginalized group in much the same way. Such neglect highlights a capitalist calculus in which advances in AI take precedence over the well-being of the people who make the technology possible: progress and profit margins are secured at a huge human cost. This is echoed by Just and Latzer (2017), who further emphasize that global corporations are oriented towards profit maximization rather than public-interest goals and social responsibility.

Unwitting workforce

Apart from the labor exploitation of the actors, the audience is also, to some extent, among the exploited. If the former suffers from a lack of public attention, the latter suffers from a lack of awareness. Pasquinelli and Joler (2021, as cited in Morreale, Bahmanteymouri, Burmester, Chen & Thorp, 2023) describe how the raw data behind AI is formed through the long-term accumulation of human labor and data. Surprisingly, the people performing these microtasks may be the audience themselves, who are unaware of the data work they carry out online: filling in verification codes, classifying harassing emails as spam, and so on (Morreale et al., 2023). These actions are used to train AI systems’ data acquisition capabilities.
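
To make this mechanism concrete, the following is a minimal, purely hypothetical sketch in Python of how such incidental actions could be recorded as labeled training examples. The event types, field names, and labels are invented for illustration and do not describe any real platform’s pipeline.

# Hypothetical sketch: everyday user actions recorded as labeled training data.
# All event types and fields are invented for illustration.
from dataclasses import dataclass

@dataclass
class UserEvent:
    kind: str     # e.g. "captcha_answer" or "spam_flag"
    content: str  # what the user saw or acted on
    label: str    # the "free" label the user supplied simply by acting

def to_training_example(event: UserEvent) -> tuple[str, str]:
    # Each incidental action doubles as an (input, label) pair for a model.
    return (event.content, event.label)

events = [
    UserEvent("captcha_answer", "street_photo_12.jpg", "traffic light"),
    UserEvent("spam_flag", "Congratulations, you have won a prize!", "spam"),
]

dataset = [to_training_example(e) for e in events]
print(dataset)

The point of the sketch is simply that the user’s click or keystroke, rather than any paid annotator, is what produces the label.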

Morreale et al. (2023) classify this as a special category of labor, because technology companies unilaterally and systematically extract surplus value from individuals. In the case of A-SOUL, while watching the idols perform, users are very likely to report their preferences and demands, and to flag technical anomalies, to media platforms and technology companies through comments or filtering mechanisms. This process is, in effect, a contribution of unrewarded labor. A similar situation can be seen at Google: users unconsciously trade their privacy, personal information, and immaterial labor for the free features and services Google provides, while the platforms and companies benefit enormously from data mining (Noble, 2018).

Governance and challenges

Munn (2024) argues that the workforce behind today’s AI systems faces significant labor problems, including extremely low pay, physical or psychological harm, and global racial inequities. While these structural problems cannot be solved in any single way, the author suggests several possible interventions for the digital workforce. For example, mutual aid forums are a long-standing form of worker solidarity and support: they can be used to exchange information and share experience in order to improve conditions and prospects, and they can also break the isolation and information blockades that characterize platform labor. In the digital age, it is necessary to extend this intervention into the field of AI and re-evaluate the value of personal and intangible labour. In addition, he points out that practice guidelines emphasizing labor privacy and transparency (Allen Institute for Artificial Intelligence, 2019) and ethical principles for mitigating harm and improving conditions (Shmueli et al., 2021) are desirable approaches.

However, governance brings its own challenges. Increasingly fierce market competition drives the profit-oriented strategies of technology companies and platforms. In this environment, even a modest amount of ethical work can provoke investor opposition because of its high cost and low return. While tech companies and platforms may be willing to comply with minimum legal standards, competition over wages and conditions in the global data industry leaves AI workers vulnerable to exploitation and abuse. This is because, on a global scale, perceptions of the workforce needed to develop and operate AI vary widely among countries (Casilli, 2021). Companies developing AI technologies tend to be located in the high-income global North, while low-cost labor is likely to be located in the global South. This means that even where the rights of workers at home are protected, workers on the other side of the world may not be covered. It also shows that there is still no uniform governance approach or widely applicable regulation (Munn, 2024).

Conclusion

Virtual idols are assuming a more varied and significant role in both the entertainment industry and digital society, and the seamless integration of AI technology offers users unparalleled audio-visual experiences that go beyond the realm of reality. Nevertheless, they still cannot eliminate their dependence on humans to supply high-quality data for accurate training and optimization. This process has produced labour exploitation of both actors and audiences: labour value is transferred from vulnerable and marginalized groups to well-capitalized technology companies and media platforms through compulsory work, and audiences are manipulated and exploited without their knowledge.

In these cases, “soft” approaches, such as mutual aid, practice guidelines, and ethical standards, hold some promise of alleviating this exploitation. To manage these issues effectively and achieve lasting solutions, however, it is essential to establish uniform and widely applicable “hard” rules.

References

Casilli, A. A. (2021). Waiting for robots: the ever-elusive myth of automation and the global exploitation of digital labor. Sociologias, 23(57), 112–133. https://doi.org/10.1590/15174522-114092 

Just, N., & Latzer, M. (2017). Governance by algorithms: reality construction by algorithmic selection on the Internet. Media, Culture & Society, 39(2), 238–258. https://doi.org/10.1177/0163443716643157 

Kong, R., Qi, Z., & Zhao, S. (2021, December). Difference between virtual idols and traditional entertainment from technical perspectives. In 2021 3rd International Conference on Economic Management and Cultural Industry (ICEMCI 2021) (pp. 344-349). Atlantis Press.

Liu, J. (2023). Virtual presence, real connections: Exploring the role of parasocial relationships in virtual idol fan community participation. Global Media and China. https://doi.org/10.1177/20594364231222976  

Morreale, F., Bahmanteymouri, E., Burmester, B., Chen, A., & Thorp, M. (2023). The unwitting labourer: extracting humanness in AI training. AI & Society. https://doi.org/10.1007/s00146-023-01692-3 

Munn, L. (2024). Digital Labor, Platforms, and AI. In: Werthner, H., et al. Introduction to Digital Humanism. Springer, Cham. https://doi.org/10.1007/978-3-031-45304-5_35 

Colón Vargas, N. (2024). Exploiting the margin: How capitalism fuels AI at the expense of minoritized groups. arXiv. https://doi.org/10.48550/arxiv.2403.06332

Noble, S. U. (2018). Algorithms of oppression: How search engines reinforce racism. New York University Press.

Pasquinelli, M., & Joler, V. (2021). The Nooscope manifested: AI as instrument of knowledge extractivism. AI & Society, 36(4), 1263–1280. https://doi.org/10.1007/s00146-020-01097-6 

Tasioulas, J. (2019). First steps towards an ethics of robots and artificial intelligence. Journal of Practical Ethics, 7(1).

Tobin, M., & Zhou, V. (2022, July 28). The overworked humans behind China’s virtual influencers. Rest of World. https://restofworld.org/2022/china-virtual-idols-labor/

Wang, Y. Q. (2022). A brief analysis of the development of the Chinese virtual idol industry empowered by 5G + motion capture technology: Taking the virtual idol group A-SOUL as an example. Journal of Physics: Conference Series, 2278(1), 012011. https://doi.org/10.1088/1742-6596/2278/1/012011

Yu, Y., Kwong, S. C., & Bannasilp, A. (2023). Virtual idol marketing: Benefits, risks, and an integrated framework of the emerging marketing field. Heliyon, 9(11), e22164. https://doi.org/10.1016/j.heliyon.2023.e22164 
