
“How platforms are governed matters. Platforms mediate the way people communicate, and the decisions they make have a real effect on public culture and the social and political lives of their users. (Suzor, 2019, p. 10).”
Introduction
This blog post centers on concepts and ideas from the chapter “Who Makes the Rules?” in the book Lawless: The Secret Rules That Govern Our Lives by Nicolas P. Suzor (2019), a required reading for Week 4 of this unit: Issues of Concern: Privacy, Security and Digital Rights. On the whole, Week 4’s material sheds light on the concept of ‘privacy’, the complexity of its meaning, and the different ways in which it can be perceived. To enrich the scope of this blog post, however, I focus especially on the aforementioned reading, and on that basis I argue that most social media corporations continuously contribute to the mythification of privacy by strategically tilting their ‘Terms of Use’ in their own favor. In addition, I assert that the users of these social media platforms, consciously or not, remain largely oblivious to this mythification of their privacy.
I will present a case study as my contribution to the contemporary scholarship investigating one subset of the myth of privacy: the myth of net neutrality. Through this case study, I will tie the concept of net neutrality to the concept of algorithmic bias. In doing so, I will draw on a study by Are (2021) that explores gender-based discrimination in the display of pole dancing content on Instagram. I selected this study because I believe the phenomenon it investigates, and the analyses it develops, are not unique to the issue it addresses but can be mapped onto the larger social media moderation research space.
Understanding Concepts Under Investigation
As covered in the lecture, privacy is a highly complex term whose meaning differs depending on various factors such as history, culture and philosophical influences. For the purpose of this blog post, however, let us understand privacy as one of the crucial parts of an individual that is meant to be protected at all costs. In this sense, “an invasion of privacy is a violation of or intrusion into something valuable that should be protected” (Becker, 2019, p. 307). In a digital space like social media, whose affordances are carefully configured to make it easy for users to present and share parts of themselves, the protection of privacy becomes all the more important.
Social media corporations pretend to understand this importance, and the product of their pretension is called the ‘Terms of Service’ or, synonymously, the ‘Terms of Use’. Terms of service can be defined as “contractual documents that set up a simple consumer transaction: in exchange for access to the platform, users agree to be bound by the terms and conditions set out” (Suzor, 2019, p. 10). Throughout the reading there are instances of recognition and acknowledgement of the mismatch between social values and the legal realities that put those social values at stake (Suzor, 2019). Terms of service are one such legal reality, or rather an amalgamation of several.
One of these ‘mismatches’ between social values and legal realities is the notion of net neutrality. Net neutrality has been described as a ‘complex sociocultural phenomenon’ by scholars in the field of internet cultures and governance (e.g. Yamagata-Lynch et al., 2017). Simply put, however, net neutrality is “the principle that Internet service providers and governments should treat all Internet traffic the same. This means that ISPs should not block or slow down traffic on their local broadband networks based on individual users or the type of traffic those users are accessing or by the type of service that is sending the content” (Madhvapaty & Goyal, 2014).
The Myth of Net Neutrality: The Case of Shadowbanning Pole Dancing on Instagram
“There is no such thing as a “neutral” platform; all platforms make decisions, in their rules and in their technical design, that shape the kinds of content that people can post and the kinds of content made visible” (Suzor, 2019, p. 12).
Are (2021) introduces a novel, interesting and valuable concept in her study: the ‘shadowban’, which she defines as “a form of light and secret censorship targeting what Instagram defines as borderline content, particularly affecting posts depicting women’s bodies, nudity and sexuality” (Are, 2021, p. 2002). ‘Shadowban’ is a user-generated term describing Instagram’s ‘vaguely inappropriate content’ policy, which dramatically reduces the visibility of posts identified as ‘vaguely inappropriate’ (Are, 2021). Instagram is a Facebook-owned social media platform and, just like Facebook, it has been criticized by various artists, performers, activists and celebrities for banning pictures of female bodies but not male bodies. Suzor (2019) also provides various examples of this situation.
In this study, Are systematically and consistently records her own experiences of uploading pole dance and related content to her Instagram handle, @bloggeronpole. She expresses what pole dancing means to her: it is clearly more than just a dance form, as it holds a special place in her life and eased her exit from an abusive relationship. Although she initially took up pole dancing as a recreational activity, she improved over time and raised the quality of the content she posted on Instagram by investing in better recording equipment.
Instagram’s censorship rules made her fear losing the content she shared on her handle. She categorizes her experience into pre- and post-shadowbanning phases and also shares the responses of Instagram’s press team on the matter. Are also engaged in activism, encouraging fellow pole dancers to sign a petition that pressed Instagram into an official apology to this community; 20,000 pole dancers signed it. Her experience is exemplary of the emotional toll that Instagram’s shadowbanning can take on users, and it prompts users to question the platform’s censorship from an intersectional feminist perspective.
Are acknowledges the downside of an autoethnographic study, noting that “critics of autoethnography claim examining one’s own experience results in researchers being overly immersed in—and not impartial about—their own research” (Are, 2021, p. 2011). However, I fully agree with her claim that the benefits of analyzing the shadowban from a user perspective outweigh the disadvantages (Are, 2021). After all, only users can express the intensity with which the censorship principles of big social media corporations such as Instagram affect them and prompt them to take ameliorative action. In addition, Are’s (2021) acknowledgement that ‘risk’ is subjective, and that some people have a greater capacity to define risk than others, points towards the possibility that some people may also have a greater tolerance for risk than others. Defined in terms of privacy as introduced above (see the ‘Understanding Concepts Under Investigation’ section), a risk is anything that poses a threat to privacy. The whole idea of standardizing the protection of privacy through a single terms-of-use document therefore becomes a fallacy.
Conclusion
It is important to understand that “the legal relationship of providers to users is one of firm to consumer, not sovereign to citizen” (Suzor, 2019, pp. 10-11). Yet even once this understanding has been developed and concretized in users’ minds, it is very challenging to keep it actively in mind, because the affordances of social media platforms are holistically designed to make them feel like ‘private’ or ‘personal’ spaces in which users easily express themselves and display the parts of themselves they feel are worth sharing. In a space bound to be deemed ‘private’ or ‘personal’, the fine line between the firm (the social media corporation) and the consumer (the social media user) is also bound to blur, and this is unfair.
Power within terms of service is innately lopsided: it is concentrated in the hands of the operators (Suzor, 2019). These documents are strategically drafted to safeguard the commercial interests of the corporate platforms (Suzor, 2019). Although it may seem logical for social media corporations to serve themselves through terms of service, since users are not required or forced to use the platforms, it is also logical at the very least to consider renaming the term ‘terms of service’, as it can easily be misconstrued as protecting the interests of social media users.
It is worth reflecting that all it takes for social media users to be legally bound by social media corporations is the fraction of a second needed to click the ‘accept’ button below the terms of service: a huge repercussion from such a small action, with no way out. If it is true that “the legal reality is that social media platforms belong to the companies that create them, and they have almost absolute power over how they are run” (Suzor, 2019, p. 11), then what we, as users of these platforms, are asking for is government regulation of these platforms, which runs against the principle of laissez-faire, a prevalent economic system in which private businesses are free from government intervention (Carver, 2014). What is striking, by contrast, is that democratic values such as freedom of speech do not apply within these social media platforms, even though they operate in democratic nations. One example of how deeply social media has penetrated our lives is its use to plan, initiate and spread political action (Suzor, 2019).
It is fascinating to see how the fact that using social media is a choice is bent in favor of the corporate platforms while simultaneously serving as the reason for users’ helplessness (Suzor, 2019). What is problematic is that the only way users can really help themselves is by quitting the platform altogether. What is more problematic is how difficult that decision is, given the deep penetration of social media into our everyday lives. Is this a systematically constructed conspiracy to extract users’ personal information? If so, to what end?
References
Are, C. (2021). The Shadowban Cycle: an autoethnography of pole dancing, nudity and censorship on Instagram. Feminist Media Studies, 1–18. https://doi.org/10.1080/14680777.2021.1928259
Becker, M. A. (2019). Privacy in the digital age: comparing and contrasting individual versus social approaches towards privacy. Ethics and Information Technology, 21(4), 307–317. https://doi.org/10.1007/s10676-019-09508-z
Carver, T. F. (2014). The Encyclopedia of Political Thought. Wiley eBooks. https://doi.org/10.1002/9781118474396
Madhvapaty, H., & Goyal, S. (2014). Net Neutrality – A Look at the Future of Internet. IOSR Journal of Computer Engineering, 16(4), 71–77. https://doi.org/10.9790/0661-16427177
Suzor, N. P. (2019). Lawless: The secret rules that govern our lives. Cambridge University Press. https://doi.org/10.1017/9781108666428
Yamagata-Lynch, L. C., Despande, D. R., Do, J., Mastrogiovanni, J. M., Teague, S. J., & Garty, E. (2017). Net neutrality and its implications to online learning. International Review of Research in Open and Distributed Learning, 18(6), 1–11. https://files.eric.ed.gov/fulltext/EJ1155805.pdf