Don’t be the dough in the hands of the platform

The networked information society has pulled the whole globe into the information age. In a world where these platforms carry mainstream influence, an individual who does not use them is effectively cut off from it. People use these platforms and enjoy the unprecedented convenience and wealth that big data brings (Zuboff, 2020). For most people, platforms are a positive force, letting them see a broader world from within a limited space and environment. Beneath this attractive surface, however, there is always a darker underside. The social platform TikTok, for example, not only changes the way we access information but also subtly reshapes social structures and personal behavior (Noble, 2018).

I will take TikTok as a case study: first to discuss how it cleverly hooks users, and then to examine how it inadvertently fuels cyber violence, in order to consider how platforms manipulate and even reshape us.

Photo from Midjourney Bot, “Information world”

The platform’s “mind” control

All mainstream social platforms make wide use of the “information cocoon,” or filter bubble (Pariser, 2011). TikTok, known for its information-filtering algorithm, curates each user’s information stream, and that curation makes the platform extremely sticky. As of 2022, TikTok had more than 1 billion global active users, and it keeps people immersed in an endless scroll of videos, with an average session length of around 11 minutes, higher than many competitors. A 2021 study found that, compared with randomly pushed content, TikTok’s tailor-made “For You” page is 1.5 times more likely to connect users with content matching their interests, and this highly personalized strategy keeps users on the platform longer. TikTok uses its control over information to steer traffic: its algorithms continuously learn user preferences and precisely push content likely to trigger clicks, showing mainly what matches personal taste and so reinforcing clicking and attachment (Bostrom, 2014). The same capacity to collect and share information also lets TikTok filter content. On one hand, this removes some unwanted material and protects users and the platform; on the other hand, it sometimes serves the platform’s own ends: you see only the information the platform wants you to see. While such filtering shields users from information harassment, it also deprives them of exposure to diverse viewpoints. This affects users’ thinking directly and indirectly, can erode their ability to think independently, and thereby reshapes them; it deepens dependence on the platform and serves the platform’s commercial and political objectives. At the same time, it allows some TikTok users to find gaps in social regulation, exploit spaces hidden inside the platform, steer and control traffic for their own purposes, and reshape public consciousness. However much one may dislike this capability, some form of it is arguably necessary on any platform, since it does genuinely protect users at times; but when it is not properly regulated, it easily breeds social problems such as cyber violence.
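To make that feedback loop concrete, here is a minimal, hypothetical sketch of an engagement-driven feed. It is not TikTok’s actual recommender; the topics, weights, and click model are invented purely to illustrate how repeatedly boosting whatever a user engages with can narrow their feed into a cocoon.

```python
# A minimal, hypothetical sketch of an engagement-driven feed loop.
# It is NOT TikTok's actual algorithm; it only illustrates how repeatedly
# boosting whatever a user clicks can narrow their feed into a "cocoon".

import random
from collections import Counter

TOPICS = ["sports", "politics", "cooking", "music", "outrage"]

def recommend(profile: Counter, k: int = 5) -> list[str]:
    """Sample k items, weighting each topic by past engagement (+1 smoothing)."""
    weights = [profile[t] + 1 for t in TOPICS]
    return random.choices(TOPICS, weights=weights, k=k)

def simulate(sessions: int = 50) -> Counter:
    profile = Counter()          # the platform's model of the user
    user_interest = "outrage"    # the topic this user tends to click
    for _ in range(sessions):
        feed = recommend(profile)
        for item in feed:
            # The user mostly clicks content matching their interest; every
            # click feeds back into the profile and skews the next feed.
            if item == user_interest or random.random() < 0.05:
                profile[item] += 1
    return profile

if __name__ == "__main__":
    random.seed(0)
    final_profile = simulate()
    print("Engagement profile after 50 sessions:", dict(final_profile))
    print("Share of clicks on the dominant topic:",
          max(final_profile.values()) / sum(final_profile.values()))
```

Run for even a few dozen simulated sessions, almost all of the user’s engagement ends up concentrated on a single topic, which is exactly the narrowing described above.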

Photo from Midjourney Bot, “The information cocoon”

Guiding traffic, buying traffic, and steering trends is the basic formula of any successful platform operation, and TikTok is no exception. Its sophisticated algorithms and an operating model built to attract and hold attention pull in massive traffic, and the platform relies on that formula to attract still more users. Among internet celebrities, some sacrifice their moral bottom line for views and profit, manufacturing conflict and spreading inflammatory statements. “A keystroke, a fortune.” “After all, there are many good people in the world and many bad people online; good deeds stay in the world, bad deeds go online.” This very low-cost, high-return strategy brings enormous profits; the more cruel, absurd, and unethical the story, the more traffic and money it attracts. What capitalist would not be envious? This fuels uncontrollable online firestorms and widens the reach of cyber violence. Ambiguous statements and speculative descriptions are like horns in the platform’s hands, summoning traffic from all directions. Those who wield them control the battlefield by manipulating public emotion and opinion, like chess players directing the winds of war (Srnicek, 2017). Shearing the same penned sheep over and over would eventually be noticed, but on the internet no one cares about this sheep. Like waves pounding the rocks on the shore, trying to drag them into the sea and sink them to the seabed, the victims gradually break down under the constant battering, from initial hardness to final shattering, until they become a discarded pawn of capital. And the other users? Most people cannot refuse this fragmented entertainment and the adrenaline rush it delivers. Meanwhile the platform’s algorithm begins actively recommending more content with similar attributes, starting a repetitive cycle that slowly bends users’ consciousness and thinking into the shape the platform requires. Users go from being needed by the platform to needing the platform (Twenge, 2017).

So how do these online firestorms come about?

15 minutes for everyone

Andy Warhol once said, “In the future, everyone will be world-famous for 15 minutes.” The second half of the line, as it often circulates, is: “Then you kill a celebrity.” In the United States, about 55% of high school students report having experienced cyber violence; it seems to have become commonplace online, even a form of entertainment for some with distorted thinking. Eighty percent of teenagers think it is fun to expose others online, seizing on trivial matters to steer opinion, run targeted searches, and expose people, chasing the thrill in a display of “online disinhibition.” The internet greatly reduces the risk of retaliation, so condemning others is no longer difficult. With no need for self-imposed moral or psychological restraint, individuals lose their moral and psychological bearings online. Under the sway of anger they give in to rational laziness, seeking self-affirmation and stress relief by harming others. The drift of public opinion makes a dark-forest logic visible in online comments, leading most internet users to assume the victim is at fault. Even if the victim had previously appeared as a figure of justice, like the first black spot on a white sheet, the flaw draws all attention and overshadows their earlier social value and contributions. These “justice heroes” satisfy certain inner needs by continually attacking and tearing down others, seeking affirmation of their own self-worth (Solove, 2008).

“Look at your face; you look like a bad guy.”

Photo from Midjourney Bot, “You are at fault.”

Everyone’s thoughts and viewpoints differ, and so does the depth at which they consider a problem. This is the fundamental attribution error, also called the over-attribution effect: when we speculate about the reasons behind others’ behavior and attitudes from an observer’s perspective, we tend to attribute them to personal, intrinsic qualities while ignoring external situational factors. The “god’s-eye view” makes the victim’s panicked behavior look clownish, leading third-party internet users to conclude that the victim themselves must be the problem, to believe firmly in the victim’s guilt, and to attack the victim again, inflicting secondary harm. Most people, not knowing the truth, become justice heroes on the internet, proclaiming, “Every word you say matters and carries legal responsibility.” Online, that statement is infinitely magnified: every rebuttal by the victim is nitpicked by countless people trying to defeat the victim in the details, so as to confirm that their own views were right all along. The victim, trying to refute the accusations and prove their innocence, cannot overcome a crowd’s few vague, ambiguous words; where there is a will to blame, a charge will always be found. In this era of internet big data we are all naked, and no one can withstand such detailed “flaying.” For internet celebrities a single cyber attack is already devastating, let alone for unknown victims. People see and believe only what they want to see, no matter how absurd the account, even when victims are painted as villains or people without conscience, and so a powerful tide of cyber violence builds. Like an invisible heavy punch, it pounds the victims’ spirits again and again, bringing them undeserved disaster (Schneier, 2015).

The platform’s role in online firestorms

Most online commentators enjoy watching these situations unfold. In fact, however, online platforms play a significant role in every instance of cyber violence. Their main functions are to collect information, share content, and host forums; reviewing and filtering information, and understanding and analyzing users, are core parts of their technology. When platform oversight is inadequate and data-analysis models are flawed, serious negative online consequences follow easily. This is also why many bystanders get drawn, unintentionally, into major online firestorms. Even a user with no interest in such material may accidentally click on an exposé; the platform’s algorithm registers the new click as a signal of interest and, to grow its own traffic, keeps recommending related content, hoping to make the user addicted and thereby generate the revenue that keeps the platform running. Likewise, users who share a similar profile with that user, or who are connected to them, may start receiving the same recommendations because of this single “erroneous click,” since the platform assumes the content interests them too. Platforms are therefore not only the main force filtering hate speech but also key players in directing public opinion (Noble, 2018).
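To illustrate that spill-over, here is a minimal, hypothetical sketch of a naive collaborative-filtering rule. The user names, click counts, and similarity threshold are all invented; this is not any platform’s real system, only a toy version of “users like you clicked this, so you get it too.”

```python
# A minimal, hypothetical sketch of how one accidental click can spill over
# to "similar" users under a naive collaborative-filtering rule. Purely an
# illustration of the mechanism described above, not a real platform system.

from math import sqrt

# Each user's profile: how often they clicked each topic.
profiles = {
    "alice":   {"cooking": 9, "music": 5, "outrage": 0},
    "bob":     {"cooking": 8, "music": 6, "outrage": 0},   # very similar to alice
    "charlie": {"sports": 10, "politics": 4, "outrage": 0},
}

def cosine(a: dict, b: dict) -> float:
    """Cosine similarity between two sparse click-count vectors."""
    keys = set(a) | set(b)
    dot = sum(a.get(k, 0) * b.get(k, 0) for k in keys)
    norm = sqrt(sum(v * v for v in a.values())) * sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def propagate_click(user: str, topic: str, threshold: float = 0.9) -> list[str]:
    """Record a click, then recommend the same topic to sufficiently similar users."""
    profiles[user][topic] = profiles[user].get(topic, 0) + 1
    return [
        other for other in profiles
        if other != user and cosine(profiles[user], profiles[other]) >= threshold
    ]

if __name__ == "__main__":
    # Alice accidentally clicks one piece of "outrage" content...
    also_recommended = propagate_click("alice", "outrage")
    # ...and users with near-identical histories (here: bob) get it pushed too.
    print("Outrage content also recommended to:", also_recommended)
```

In this toy run, Alice’s single accidental click on “outrage” content is enough to push the same content to Bob, whose history is nearly identical — precisely the propagation the paragraph above describes.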

Those who wrong you know better than you how unjust it is. The cunning know better than you how foolish you are.

Photo from “Let the Bullets Fly,” a film by Jiang Wen

In this online storm, every participant got what they wanted, as if they had just left a grand carnival. Only the victim remains, pitifully holding their wounded body and clutching the gut-wrenching evidence, muttering to themselves, trying to use their last bit of strength to shout, “Listen to me, I’m innocent.” “I only ate one bowl of noodles, did you see it?” Liu Zi desperately pleads with everyone: “Look, it was just one bowl, I really only ate one bowl…” The evidence in his hands is bloody, but the audience watches the victim’s struggle indifferently and leaves as if they had never been there. They are unwilling to admit their mistakes or to compromise their newly awakened reason; even though 1 + 1 = 2, at that moment 1 + 1 can never equal 2. “I know you only ate one bowl of noodles, I just mentioned it casually, why take it seriously and blame me?” The fact is that, apart from the victim, no one cares about the truth of the matter; they do not care whether it is right or wrong. In the information age, massive amounts of data flood into people’s view without pause, and no one wants to stop for a few seconds to think, ponder, or reflect. Those who know how to tell stories know all too well how to guide thought, using provocative content to stoke people’s anger and unleash universal defamation. Sadder still, users are not interested in knowing the truth, because the truth takes time, and “tittytainment” has taught most people to relish, and become addicted to, mindless happiness (Bagdikian, 2004).

What can be done?

To avoid becoming dough in the hands of capitalists, governments should step in and enact stricter regulations for internet governance, using legal measures to strengthen data-privacy protection and penalize inappropriate online behavior, thereby compelling users and platforms to uphold social baselines and basic moral standards. The legal framework for social platforms must also be improved, to shrink and ultimately eliminate the spaces that bad actors can exploit, so that platforms do not become tools for harming society and individuals.

Platforms should strengthen quality control and content supervision, expand their intelligent systems, continuously upgrade their automated algorithms, incorporate AI technologies, and combine them with manual intervention and review, so that moderation becomes more humane and better at telling right from wrong. Platforms must hold to their own ethical standards and avoid spreading harmful or misleading information. They should strengthen protection of users’ personal information so that its exposure does not damage users’ lives. Moreover, platforms should cultivate users’ sense of responsibility on social networks, set clearer content standards, and spell out the consequences of violations. By handling violations transparently and publicly, platforms can make users realize that the internet is not a lawless zone, encourage them to face and take responsibility for their own actions, and improve their ability to judge information (Foucault, 1995).
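As a concrete illustration of the “automated algorithm plus manual review” idea, here is a minimal, hypothetical sketch of how posts might be routed. The lexicon, scores, and thresholds are invented for illustration; real moderation systems are far more elaborate.

```python
# A minimal, hypothetical sketch of a hybrid "algorithm + human review"
# moderation flow. The lexicon, scoring rule, and thresholds are invented
# for illustration; real platforms use far more sophisticated systems.

from dataclasses import dataclass

@dataclass
class Post:
    post_id: int
    text: str

# Toy "model": score a post by counting phrases from a small abuse lexicon.
ABUSIVE_TERMS = {"doxx", "expose her address", "go attack", "deserves to die"}

def risk_score(post: Post) -> float:
    text = post.text.lower()
    hits = sum(term in text for term in ABUSIVE_TERMS)
    return min(1.0, hits / 2)  # crude normalisation to [0, 1]

def route(post: Post, auto_block: float = 0.9, human_review: float = 0.4) -> str:
    """Automatically remove clear violations, queue borderline cases for humans."""
    score = risk_score(post)
    if score >= auto_block:
        return "removed automatically"
    if score >= human_review:
        return "queued for human review"
    return "published"

if __name__ == "__main__":
    posts = [
        Post(1, "Lovely noodles at the corner shop today"),
        Post(2, "Someone should expose her address so we can go attack her"),
        Post(3, "He deserves to die for this"),
    ]
    for p in posts:
        print(p.post_id, "->", route(p))
```

Clear violations are removed automatically, borderline cases go to human reviewers, and everything else is published: the division of labour between machine filtering and manual review that the paragraph above suggests.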

Photo from Midjourney Bot, “Black sheep effect”

For users, each individual is a creator and disseminator of information, and we must be responsible for the information we share, recognizing that each piece of disseminated information could lead to unforeseen consequences. This responsibility is not only towards our actions but also serves as self-protection. Furthermore, we should try to detach ourselves from the mainstream and avoid being influenced by platform-driven information, adhere to our moral standards and independent thinking, avoid the “black sheep effect,” and refrain from becoming hypocritical arbiters of justice. We should adopt a gentle approach to the world and strive not to be influenced by what we wish to believe. We must consider not only whether things are correct but also whether they truly are as they seem.

Conclusion

We live in an era of information explosion, where we all play different roles on this digitally orchestrated stage. While online platforms have brought us tremendous convenience, it is time to wake up. As users, we need to proactively improve our ability to discern information, enhance our information literacy, critically receive and process the information we encounter, maintain our capacity for independent thought, and take responsibility for our actions on the internet. As a society, we need to work together to protect ourselves and others. At the same time, governments and platform operators must collaborate to forge a new order in the digital world, fostering a healthier, safer, and more beneficial online environment. Only then can the public avoid being swallowed by the net’s undercurrents and becoming dough in the hands of capital, and let the internet become a force for social progress (McChesney, 2013).

Photo from Midjourney Bot, “The dough in the hands of the information platform”

References

Zuboff, S. (2020). The age of surveillance capitalism: The fight for a human future at the new frontier of power (First trade paperback ed.). PublicAffairs.

Pariser, E. (2011). The filter bubble: What the Internet is hiding from you. Viking.

Srnicek, N., & De Sutter, L. (2017). Platform capitalism. Polity.

Noble, S. U. (2018). Algorithms of oppression: How search engines reinforce racism. New York University Press.

Solove, D. J. (2008). Understanding privacy. Harvard University Press.

Schneier, B. (2015). Data and Goliath: The hidden battles to collect your data and control your world (1st ed.). W. W. Norton & Company.

Twenge, J. M. (2017). iGen: Why today’s super-connected kids are growing up less rebellious, more tolerant, less happy, and completely unprepared for adulthood (and what this means for the rest of us) (First Atria Books hardcover ed.). Atria Books.

Bostrom, N. (2014). Superintelligence: Paths, dangers, strategies. Oxford University Press.

Bagdikian, B. H. (2004). The new media monopoly (Rev. and updated ed.). Beacon Press.

Foucault, M. (1995). Discipline and punish: The birth of the prison (A. Sheridan, Trans.; 2nd Vintage Books ed.). Vintage Books.

McChesney, R. W. (2013). Digital disconnect: How capitalism is turning the internet against democracy. The New Press.
