Thinking About Algorithm Power in the Media Industry


The convergence of the Internet, artificial intelligence (AI) and big data has overturned the business forms, concepts and rules of the traditional media industry, ushering the news media into a new era of intelligent media in which super Internet platforms have formed and spilled over on a large scale. Meanwhile, ethical issues such as hegemony have emerged and escalated into questions about the relationship between intelligent technology and human beings, making algorithmic ethics an unavoidable part of any discussion of communication ethics. Algorithms wear the face of technology and seem to have nothing to do with ethics; yet the map of human thought is hidden behind their steps. The algorithm empowers users while extending its control over user identity, accelerating changes to the pseudo-environment until users are gradually kidnapped by a pseudo-environment that “understands you”. The ethical issues of AI and big data have thus largely become questions about the generation, use and effectiveness of algorithm power. This blog starts by explaining what AI is, then draws on cases from the media industry to reflect on the concerns algorithm power raises for communication ethics and to distil a scheme for its governance.

What is AI?

A fascinating question is often asked in AI: “Can machines think?” It comes from the famous computer scientist Alan Turing’s test. Suppose a person asks a computer questions without seeing it and cannot tell whether the answers come from a human or a machine; in that case, the computer has passed the test. Such machines are called “artificial intelligence” because one may not be able to tell whether a machine has a mind of its own when it answers questions. For this reason, AI systems have been described as simple but human-like forms of intelligence that can be carefully cultivated like children (Crawford, 2021, p. 5).

However, AI can be called “intelligent” because it can rationally offer people a feasible path in fast-moving situations, allowing them to work more quickly and efficiently. To achieve this effect, machines must usually undergo extensive, computationally intensive training using large data sets or predefined rules and rewards, a process called “machine learning” (Crawford, 2021, p. 8). Such machines, capable of rapidly interpreting large amounts of data, can produce compilations whose reliability and capabilities far exceed what came before, perhaps even at the highest level (Crawford, 2021, p. 7). Nevertheless, where do these figures come from?
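As a rough illustration of what such training looks like, the sketch below shows a perceptron, one of the simplest machine-learning models, nudging its weights whenever it misclassifies a labelled example. The data and the learned rule are invented for illustration only; production systems apply the same idea at vastly larger scale.

```python
# Toy sketch of machine learning: a perceptron adjusts its weights
# whenever it misclassifies a labelled example. The data and rule
# are invented for illustration only.

def train(examples, epochs=20, lr=0.1):
    """Learn weights for a two-input linear classifier."""
    w, b = [0.0, 0.0], 0.0
    for _ in range(epochs):
        for (x1, x2), label in examples:             # label is 0 or 1
            pred = 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0
            err = label - pred                       # -1, 0 or +1
            w[0] += lr * err * x1
            w[1] += lr * err * x2
            b += lr * err
    return w, b

def predict(w, b, x1, x2):
    return 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0

# Learn a simple rule: output 1 only when both inputs are present.
data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w, b = train(data)
print([predict(w, b, x1, x2) for (x1, x2), _ in data])  # [0, 0, 0, 1]
```

The point is not the arithmetic but the pattern: the behaviour is never programmed explicitly; it is induced from labelled data, which is why the provenance of that data matters so much.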

Crawford (2021, p. 7) argues that AI depends on broader political and social structures: the capital needed to build AI at scale, and the ways AI systems are optimised, ultimately serve existing dominant interests. AI is a registry of power. It is driven by data extracted from multiple sources and analysed “without awareness” on the part of the people who provide it, and that data may lack context or the consent of the parties concerned. Looking at the layers of training data that shape and inform AI models and algorithms, collecting and labelling data about the world is a social and political intervention, even when it is disguised as a purely technical one. The myth of data collection as an act of goodwill conceals the operation of its power (Crawford, 2021, p. 121). The rest of this blog takes the media industry as a case study to analyse and reflect on the hidden concerns of algorithm power.

The Concerns of Algorithm Power in the Media Industry

Algorithms, hardware and data are the three pillars of artificial intelligence: through machine learning, algorithms can uncover the interdependencies within data (Megorskaya, 2022). Society is witnessing algorithm power, which has played a subversive role in the production and dissemination of news, affecting both the news environment and the users who participate in it.

Algorithms reshape people’s perception of the world:

Lippmann proposed that people’s attitudes toward the objective world are largely shaped by the pseudo-environment constructed by the media. The algorithms of intelligent media and social platforms can aggregate information and produce it in real time. As the pseudo-environment grows more diverse, the information people receive is increasingly filtered by algorithms, which thus shape people’s contact with and perception of the world (Jacobsen, 2020). In the news industry, for example, algorithms touch almost every kind of copy, from structured reports to quick reviews and in-depth features (Thurman et al., 2019).

For example, the Associated Press (AP) has increased its quarterly output of earnings-report articles from 300 to 3,700 since joining forces with Automated Insights (Peiser, 2019). The partnership frees human journalists to spend more time on substantive work. It exploits the speed and breadth of algorithms, enabling news to be generated in a timely, real-time manner, so that the sources of news releases are no longer limited to humans (Thurman et al., 2017). Algorithmic intervention expands the sources of information in news production, diversifies the raw material of news, and gives people richer stories than before.
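Automated earnings stories of this kind are typically produced by filling a narrative template with fields from a structured data feed. The sketch below is a minimal, invented illustration of the idea, not AP’s or Automated Insights’ actual system.

```python
# Minimal sketch of template-based automated news writing.
# Template wording and data fields are hypothetical, not the
# actual Automated Insights system.

def earnings_story(company, quarter, revenue, prior_revenue):
    """Fill a narrative template from structured earnings data."""
    change = (revenue - prior_revenue) / prior_revenue * 100
    direction = "rose" if change >= 0 else "fell"
    return (
        f"{company} reported revenue of ${revenue:,.0f} for {quarter}, "
        f"which {direction} {abs(change):.1f}% from the prior quarter."
    )

print(earnings_story("Acme Corp", "Q3 2018", 1_250_000, 1_000_000))
# Acme Corp reported revenue of $1,250,000 for Q3 2018, which rose
# 25.0% from the prior quarter.
```

Because the template consumes any well-formed data row, one newsroom can scale from hundreds to thousands of such stories per quarter without additional reporters.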

The kidnapped pseudo-environment:

However, according to Shearer’s (2021) survey of how Americans obtain news, young people aged 18–29 rely heavily on social media for news (Fig. 1). Most of these platforms use algorithms to push content this group of young users likes, so the one-way, “point to point”, “content is king” production mode of traditional media has gradually been deconstructed by algorithms. In this process, the algorithm improves with the passage of time and the growth of data: by absorbing more diverse data, it can find the patterns of each participating user and recommend, or “feed”, news to them with precision. Interestingly, this process is called “personalised information customisation”. Nevertheless, is that the whole story?

(Fig. 1)

In the past two years, TikTok has nearly doubled the share of its users who access news within the platform, in sharp contrast to other social media (Fig. 2; Matsa, 2022). Even if this does not mean social media has become the dominant news source, TikTok has captured a new generation, and the number of users engaging with it will keep growing. The reason is that TikTok’s algorithm generates a mode of self-expression and identity creation that differs from the “networked self” of other social media (Bhandari & Bimo, 2022). Its algorithms interact frequently with users, allowing them to constantly create and accept different facets and potential versions of themselves. Notably, TikTok offers users a new type of social media by refusing established classification schemes: it eschews the forms of interaction generated by content communities, blogs and social networks, presenting a radically different social vision based on repeated engagement with the algorithm (Bhandari & Bimo, 2022). The algorithm constantly reads users’ behaviour, acquiring data about how they use TikTok, such as how long they linger and how they interact with certain content, and combines that data to feed back what they might care about. In the process, users stay on the platform longer and have less incentive to seek news on their own. From this point of view, algorithmic recommendation underlines the dominant position of technology. Personalised recommendation violates the objectivity of the real environment: it seems to customise an “understanding you” pseudo-environment for users, but in fact it is a kidnapping. Human subjectivity is further dissolved, and users’ interactions, the fragmented traces of the “human being”, are refined into algorithms like petroleum.
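The feedback loop described above, in which the platform reads behaviour, updates a profile and ranks content by inferred interest, can be sketched in a few lines. This is a deliberately crude illustration; TikTok’s actual ranking system is proprietary and far more complex.

```python
from collections import Counter

# Crude sketch of an engagement-driven recommender; the real
# TikTok system is proprietary and far more complex.

def update_profile(profile, video_topic, watch_seconds):
    """Weight a user's interest in a topic by how long they watched."""
    profile[video_topic] += watch_seconds
    return profile

def rank_feed(profile, candidates):
    """Order candidate videos by the user's accumulated topic interest."""
    return sorted(candidates, key=lambda v: profile[v["topic"]], reverse=True)

profile = Counter()
update_profile(profile, "news", 45)   # lingered on a news clip
update_profile(profile, "dance", 5)   # skipped a dance clip

feed = rank_feed(profile, [{"id": 1, "topic": "dance"},
                           {"id": 2, "topic": "news"}])
print([v["id"] for v in feed])  # [2, 1]: the news clip ranks first
```

Even at this scale the dynamic is visible: whatever the user already watched longest is what they are shown first, which is exactly how the “understanding you” pseudo-environment closes in.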

(Fig. 2)

Digital hegemony of algorithm power:

Therefore, users are confined to the fields they are already interested in and no longer need to take the initiative to seek news. Content automation runs through cultural production, distribution and consumption, and media platforms shape sociality through their specific architecture and material base (Bucher, 2018, as cited in Andrejevic, 2019). It can be said that algorithms act as gatekeepers, producing news and supplying it to users, and thereby changing the pseudo-environment. In a society where media and code are ubiquitous, power increasingly resides in algorithms.

TikTok’s algorithm and automated recommendation system represent a kind of affective capitalism. The technology allows companies like Twitter and TikTok to collect and sell data about users’ interactions with content they like. Then, through aggregation, abstraction and categorisation, a profile of the user is produced: categories derived from users’ online behaviour are eventually projected back onto them, presenting an identity formation mathematically inferred from the categories of anonymous others (Bhandari & Bimo, 2022). Through such algorithms, affective capitalism has enacted a form of control with far-reaching implications: control over the identity of users. It is a form of digital hegemony, in which control over people’s information has been “ceded” to algorithms. Digital hegemony in this sense is monopolistic and opaque; it single-handedly changes the agenda and creates a mechanical user portrait that violates moral values and undermines user autonomy.
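The aggregation, abstraction and categorisation pipeline can also be sketched: a user’s behaviour is reduced to a vector, matched against predefined category profiles, and the nearest category’s label is projected back onto the user. The categories and numbers below are invented for illustration.

```python
# Toy sketch of profiling by categorisation: a behaviour vector is
# matched to the nearest predefined category, whose label is then
# projected back onto the user. Categories are invented here.

CATEGORIES = {
    "news junkie":    {"news": 0.8, "sports": 0.1, "fashion": 0.1},
    "sports fan":     {"news": 0.1, "sports": 0.8, "fashion": 0.1},
    "trend follower": {"news": 0.1, "sports": 0.1, "fashion": 0.8},
}

def categorise(behaviour):
    """Return the category whose profile is closest to the behaviour."""
    def distance(profile):
        return sum((behaviour.get(k, 0) - v) ** 2 for k, v in profile.items())
    return min(CATEGORIES, key=lambda name: distance(CATEGORIES[name]))

print(categorise({"news": 0.7, "sports": 0.2, "fashion": 0.1}))  # news junkie
```

The label the user receives is not a self-description but a mathematical inference from the behaviour of anonymous others, which is precisely the control over identity described above.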

Reflection on the Governance of Algorithm Power

The algorithm power that new media relies on is crucial because it shapes the construction of the pseudo-environment. As the TikTok example shows, social platforms are active promoters of algorithms in different vertical fields, while companies like Twitter and TikTok are the main actors aggregating, integrating and selling data. With their core position on the Internet and their big-data resources, they have become the holders of algorithm power, building an interdependent relationship among platform owners, users and algorithms (Cotter, 2018). In sum, algorithm power is characterised by capitalisation and platformisation, and it needs more regulation and governance.

The realisation of algorithm power regulation:

The pseudo-environment constructed by algorithm power appears mostly in online virtual society. From the media perspective, algorithmic selection significantly impacts both media production and media consumption: it influences what we think (agenda-setting) and how we think (framing), and therefore how we behave. It shapes the construction of individual reality, namely individual consciousness, and thereby the culture, knowledge, norms and values of society, namely collective consciousness, thus shaping the social order of modern society (Just & Latzer, 2017, p. 246). To a large extent, the basic principles of virtual and real society are similar: the governance concepts, laws and methods accumulated over the long years of human society are the common wealth of both, and cyberspace is not a place beyond the law. Hence, the regulation of algorithm power and its holders should be grounded in the essential norms of real society, that is, the “realisation” of regulation. Rules and systems analogous to actual social governance, such as identity traceability, algorithm disclosure and data archiving, can be studied, helping ensure that the algorithm develops its strengths and avoids its dark side.

Ecological regulation of algorithm power:

Algorithms have become social ecosystems because they are generated by, or related to, the information systems that social users adopt. Given the social impact of AI, we must be aware of the algorithm’s entanglement with its ecology, that is, with the technical and human environment that interprets and executes a particular set of instructions (Shin et al., 2019). The intelligent media system is a new ecosystem. Drawing on the above account of algorithms, cultural awareness and platforms, a rough ecological linkage diagram of new media can be obtained (Fig. 3; Shen & Li, 2022). Core values, representing ideology and government governance, and technology, representing artificial intelligence and big data, are inputs to the system. Platform media, the main controllers of new media, form the core node, with clear input and output channels, and their influence operates mainly through algorithms. It is therefore vital to consider the role algorithms play in determining what we see of the world and how we perceive it. Algorithms should be designed and developed in a human-centred, socially responsible manner that promotes transparent and equitable development and clear accountability (Shin et al., 2019).

(Fig. 3)


AI has a significant impact on the media industry. Driven by algorithms and technology, platform-led intelligent media subvert the communication mode of traditional media and establish a richer, more real-time and more diverse pseudo-environment. However, while algorithms in the media industry reshape users’ perception of the world, they also push carefully tailored content by harvesting users’ personal data, “depriving” users of dominance over their own information; this is the hidden concern of intelligent media. In general, the practical development of algorithms and AI in the media industry runs ahead of theoretical research and governance understanding. This blog has tried to distil two possible ways of regulating algorithms. The first starts from human ontological values and social governance, stripping away the technological cloak of algorithm power and returning technology to its source in the humanistic tradition. The second takes a systemic and ecological view, constructing governance logic along the input and output channels of the media ecology.


Andrejevic, M. (2019). Automated culture. In Automated media (pp. 25–43). Routledge.

Bhandari, A., & Bimo, S. (2022). Why’s everyone on TikTok now? The algorithmized self and the future of self-making on social media. Social Media + Society, 8(1). https://doi.org/10.1177/20563051221086241

Bonfire Media. (2019, December 17). How do social media algorithms work. Bonfire Media Limited. https://www.bonfiremedia.hk/blog/2019/12/8/how-do-social-media-algorithms-work

Cotter, K. (2018). Playing the visibility game: How digital influencers and algorithms negotiate influence on Instagram. New Media & Society, 21(4), 895–913. https://doi.org/10.1177/1461444818815684

Crawford, K. (2021). Introduction. In Atlas of AI (pp. 1–22). Yale University Press.

Jacobsen, B. N. (2020). Algorithms and the narration of past selves. Information, Communication & Society, 25(8), 1082–1097. https://doi.org/10.1080/1369118X.2020.1834603

Just, N., & Latzer, M. (2017). Governance by algorithms: Reality construction by algorithmic selection on the Internet. Media, Culture & Society, 39(2), 238–258. https://doi.org/10.1177/0163443716643157

Matsa, K. E. (2022, October 21). More Americans are getting news on TikTok, bucking the trend on other social media sites. Pew Research Center. https://www.pewresearch.org/fact-tank/2022/10/21/more-americans-are-getting-news-on-tiktok-bucking-the-trend-on-other-social-media-sites/

Megorskaya, O. (2022, June 27). Training data: The overlooked problem of modern AI. Forbes. https://www.forbes.com/sites/forbestechcouncil/2022/06/27/training-data-the-overlooked-problem-of-modern-ai/

Peiser, J. (2019, February 5). The rise of the robot reporter. The New York Times. https://www.nytimes.com/2019/02/05/business/media/artificial-intelligence-journalism-robots.html

Shearer, E. (2021, January 12). More than eight-in-ten Americans get news from digital devices. Pew Research Center. https://www.pewresearch.org/fact-tank/2021/01/12/more-than-eight-in-ten-americans-get-news-from-digital-devices/

Shen, X., & Li, X. (2022). The hidden issues and thinking of algorithm power in the AI era [人工智能时代算法权力的隐忧与反思]. Future Communication Journal [未来传播], 2022(4).

Shin, D., Fotiadis, A., & Yu, H. (2019). Prospectus and limitations of algorithmic governance: An ecological evaluation of algorithmic trends. Digital Policy, Regulation and Governance, 21(4), 369–383. https://doi.org/10.1108/DPRG-03-2019-0017

Thurman, N., Dörr, K., & Kunert, J. (2017). When reporters get hands-on with robo-writing. Digital Journalism, 5(10), 1240–1259. https://doi.org/10.1080/21670811.2017.1289819

Thurman, N., Lewis, S. C., & Kunert, J. (2019). Algorithms, automation, and news. Digital Journalism, 7(8), 980–992. https://doi.org/10.1080/21670811.2019.1685395
