Privacy and Digital Rights Issues in the Age of Social Media

Introduction

Do you know how many people use social media in this era of rapidly expanding digital information? As of early 2024, there were more than 5.04 billion social media users worldwide, accounting for 62.3% of the global population (Kemp, 2024).

By any measure, that is a considerable number. However, the growth of social media consumption and related technologies has fueled privacy and digital rights concerns, with personal information ending up on the web or the dark web (Ross, 2017). Persistent breaches of online users’ privacy and digital rights have led to demands for more accountability from the many platforms where personal information is easily accessible to unauthorized individuals.

The Importance of Privacy and Digital Rights

Imagine that we all live in a fully transparent glass room, where people outside can watch our every move anytime and anywhere. This is our digital life now! With the rapid development of artificial intelligence and of data collection and analysis, our information and even our private lives can be analyzed and viewed, and security loopholes emerge one after another (Kerry, 2020). The importance of protecting online users’ privacy and digital rights has grown with the proliferation of smartphones and other devices that transmit and store data in larger volumes than ever. According to Kerry (2020), in a world where quintillions of bytes of data are generated every day, there is a dire need to make privacy and digital rights a global public policy issue. The sheer volume of data makes it increasingly difficult for governments and organizations to implement effective measures against privacy breaches.

Privacy settings in social media [Photograph]. (2023, April 10). Data Privacy Manager.

Not only are private information and privacy under threat; people whose opinions differ from those of the majority or the powerful in society have also found their freedom of expression denied. According to Taisonthi et al. (2024), spyware such as Pegasus has emerged as a new threat to privacy and digital rights because it can access sensitive personal information, such as locations and bank accounts, without authorization. The never-ending cycle of technological advancement means that new invasive spyware capable of infiltrating, stealing from, and controlling devices is being used against political activists and opponents (Taisonthi et al., 2024). Protecting your privacy and digital rights is therefore not only a matter of personal safety but also of preserving freedom and the foundations of a democratic society.

Privacy Challenges Brought By Digitalization

The development of digitalization and social media has enriched our social lives and become part of our daily routine. Digitalization also makes shopping, banking, education, and other tasks more convenient. Yet behind this convenience lies a significant threat to personal privacy. According to Taisonthi et al. (2024), digitization has created new ways for privacy to be compromised, such as when we share information about ourselves, or others share it on our behalf (pictures, for example), which can lead to reputational damage, harassment, or identity theft. Have you ever found an attractive picture of a person on Facebook and decided to download it? That is one of the privacy challenges of digitization: the personal information of billions of people is easily accessible on most popular social media platforms.

iStockphoto. (n.d.). Data privacy [Photograph].

Do you remember giving a social media application access to your mobile phone data? Almost no one who downloads an application and is prompted to read the “terms and conditions” actually does so. Most people click “accept” without reading anything about privacy and digital rights, unaware that they have granted the software the right to collect their personal information. The scale of the problem is striking: based on an estimate popularized by Alexis Madrigal, Americans collectively would require more than 53.8 billion hours per year to read the privacy policy of every website they visit (Koopman, 2019). So although such a small action allows the platform to collect a great deal of information about you, there is little you can do about it as an individual.
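A quick back-of-the-envelope calculation shows why that 53.8 billion figure is a collective, not per-person, number. This is only a sketch: the 244 hours-per-user-per-year figure is an often-cited academic estimate of the time needed to read every privacy policy one encounters, and the U.S. internet-user count is an approximation, not a number from the cited sources.

```python
# Rough check of the ~53.8 billion hour figure (illustrative assumptions).
HOURS_PER_USER_PER_YEAR = 244    # often-cited estimate of annual reading time
US_INTERNET_USERS = 220_500_000  # approximate U.S. internet users

total_hours = HOURS_PER_USER_PER_YEAR * US_INTERNET_USERS
print(f"~{total_hours / 1e9:.1f} billion hours per year")
# prints: ~53.8 billion hours per year
```

In other words, the figure only makes sense summed across all U.S. internet users; per person it is still hundreds of hours a year, which is why almost nobody reads the policies.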

Additionally, advances in artificial intelligence have intensified the analysis and prediction of personal information. This brings new privacy risks: facial recognition technology, for example, can enable algorithmic discrimination based on gender and race, undermining social fairness (Hardesty, 2018). The privacy and digital rights challenges brought by digitization are immense, and increasingly sophisticated technologies like AI make it even harder to protect citizens from violations of their privacy rights.

Faced with these challenges, it is essential to strengthen laws and policies for personal privacy protection, and at the same time to raise individual awareness of information protection. Algorithmic discrimination in AI has already harmed marginalized groups such as people of color and LGBTQ+ communities (Kerry, 2020).

Data Pop Alliance. (2021, March 5). LWL #25: Discrimination in Data and Artificial Intelligence. Data Pop Alliance.

Privacy and digital rights violations are thus embedded in technological tools and platforms, and addressing them requires a joint effort as a society. In one recent move, for example, 26 civil rights and consumer organizations signed a joint letter calling for a prohibition on the use of personal information for discrimination (Kerry, 2020). The move shows that personal privacy can be effectively protected only through the joint efforts of the whole society.

Complexity of Social Media Rule-making

In the sea of social media, making rules is like sailing through a storm: the conditions are complex and perilous. As the book “The Secret Rules of Our Digital Life” argues, it is a hopeless dream to try to satisfy users from all over the world and from every background. Beyond rule-making itself, the current rules meant to help online users protect their personal information are either ignored or, for many, never considered until after a breach. A survey by Joseph Turow found that many internet users do not understand the scope of the data collected from them, and many fail to read privacy policies because they are too technical and full of jargon (Smith, 2014). Making rules, then, is only part of the problem: rules must fit all users’ needs, yet most consumers do not even know they exist.


The complexity of content review lies in the fact that platforms cannot formulate rules that satisfy everyone, and their review standards are neither open nor transparent. A telling example: Republicans and conservatives accuse social platforms of favoring Democrats, while liberals accuse the same platforms of favoring Republican candidates (Samples, 2019). Rule formulation remains contentious because globally used platforms such as Facebook cannot satisfy everyone with a single privacy policy. Moreover, despite internal standards, the review process is highly subjective because reviewers come from different backgrounds, so users often never learn why their posts were deleted.

The Congressional Research Service points out that although users feel they are in their own private space on social media, these platforms are owned by their operators, so users’ rights are heavily restricted by platform policies (Brannon, 2019). Adherence to privacy and digital rights is so complex that finding a legal solution will take time, and rule-making must account for both cultural differences and legal constraints. Continually adjusting rules based on user feedback and improving the transparency of rule-making and review processes would help create a fairer, more open environment on social platforms.

Real Case Analysis

Content review on social media platforms faces many problems and challenges, as the following cases illustrate.

Imagine posting a photo to share your views on menstruation with society, only to have your post deleted.

Kaur, R. (2015, May 3). Instagram Deletes Artist’s Period Photos: Her Account Is Back But Here’s Why It Matters. HuffPost.

Rupi Kaur experienced exactly this. She uploaded a series of photos with menstruation as their theme (Rao, 2015). One of them shows her lying in bed with blood stains on her clothes and sheets. Because she challenged the platform to confront the social taboo of menstruation, Instagram deleted her photos twice for supposedly violating community rules (Rao, 2015). The privacy policies, rules, and regulations of many social platforms have proven to be distinctly Westernized, with many users outside Europe and the U.S. facing increased surveillance or even account deactivation for posting content that represents their culture. Kaur’s menstruation photos aroused global concern and discussion, especially public doubts about the treatment of artistic expression and social activism. In the end, Instagram apologized to Kaur and restored the photos. The incident highlights the opacity and inconsistent enforcement of the platform’s review process.

Celeste Liddle’s photos of Australian Aboriginal Women

In her speech on International Women’s Day, Celeste Liddle showed images of elderly Arrernte women of Central Australia and shared photos of their activities (Graham, 2016).

Graham, C. (2016, March 13). Prominent Black Feminist Writer Petitions Facebook as More Users Suspended Over Offensive Images. New Matilda.

Although the photos were meant to celebrate Indigenous culture and women’s power, Facebook’s automated content review system flagged them as violating content and suspended her account several times. Facebook’s action highlights the difficulty of content review, especially for content relating to marginalized communities, and reveals the cultural ignorance and arbitrary handling in the platform’s treatment of Aboriginal culture and images of women.

Origin analysis of COVID-19

Early in the pandemic, when some suggested that posts about COVID-19’s origins might be false, Facebook immediately restricted users from discussing the possibility that COVID-19 had leaked from a laboratory (Hern, 2021).

Ortutay, B. (2021, October 25). Facebook papers reveal deep conflict within over misinformation, hate speech. AP News.

Later, as more people called for further investigation into the virus’s origin, Facebook adjusted its policy and allowed the discussion (Hern, 2021). This shows that balancing the flexibility of scientific debate against the spread of misleading information is a core challenge of content review. Filtering for relevant, accurate content is difficult when dealing with privacy and digital rights, because people on social media share so much information every second that determining whether something is truth or propaganda takes a long time.

These three cases reveal the complexity of social media review decision-making and illustrate the main problems and challenges of content review:

Double standards in content review: platforms apply different standards to different types of content, leading users to question their fairness.
Poor discernment of art and culture: when reviewing cultural, artistic, and social content, platforms’ standards lack nuance.
Opaque censorship rules: platforms disclose only review outcomes, not the standards behind them, and similar content is often treated inconsistently, leaving creators unable to understand the rules.
Imbalance between platform order and user experience: overemphasis on maintaining order ignores users’ feelings and experiences, leaving platform governance unbalanced.

Conclusion

As internet and social media use continues to surge, privacy and digital rights violations will continue to be reported. Although every website presents some privacy policy at registration or during browsing, evidence shows that the mining of data, including personal information, is ongoing, and governments have done little to stop it. Are online users left at the mercy of social media and other online developers? That question can only be answered if all stakeholders, including consumers, push data-collecting organizations to adhere to the existing rules and regulations on privacy and digital rights protection.

References

Brannon, V. C. (2019). Free Speech and the Regulation of Social Media Content. Congressional Research Service. https://crsreports.congress.gov/product/pdf/R/R45650

Graham, C. (2016, March 13). Facebook re-re-Suspends Black feminist, writer for ‘Offensive’ images of Aboriginal ceremony. New Matilda. https://newmatilda.com/2016/03/13/prominent-black-feminist-writer-petitions-facebook-as-more-users-suspended-over-offensive-images/

Hardesty, L. (2018). Study finds gender and skin-type bias in commercial artificial-intelligence systems. MIT News | Massachusetts Institute of Technology. https://news.mit.edu/2018/study-finds-gender-skin-type-bias-artificial-intelligence-systems-0212

Hern, A. (2021, May 27). Facebook lifts ban on posts claiming COVID-19 was man-made. The Guardian. https://www.theguardian.com/technology/2021/may/27/facebook-lifts-ban-on-posts-claiming-covid-19-was-man-made

Kemp, S. (2024, February 20). 5 billion social media users. DataReportal – Global Digital Insights. https://datareportal.com/reports/digital-2024-deep-dive-5-billion-social-media-users

Kerry, C. F. (2020, June 27). Protecting privacy in an AI-driven world. Brookings. https://www.brookings.edu/articles/protecting-privacy-in-an-ai-driven-world/

Koopman, C. (2019, July 28). No one reads online privacy policies. The CGO. https://www.thecgo.org/benchmark/no-one-reads-online-privacy-policies/

Rao, M. (2015, December 7). How cultural bias and sexism catapulted the period photo that broke the internet. HuffPost. https://www.huffpost.com/entry/rupi-kaur-instagram-period-photo-series_n_7213662

Ross, R. (2017, March 9). Why security and privacy matter in a digital world. NIST. https://www.nist.gov/blogs/taking-measure/why-security-and-privacy-matter-digital-world

Samples, J. (2019, June 18). Why the government should not regulate content moderation of social media. Cato Institute. https://www.cato.org/policy-analysis/why-government-should-not-regulate-content-moderation-social-media

Smith, A. (2014, December 4). Half of online Americans don’t know what a privacy policy is. Pew Research Center. https://www.pewresearch.org/short-reads/2014/12/04/half-of-americans-dont-know-what-a-privacy-policy-is/

Taisonthi, C., Nanthaseree, A., & Donkervoort, E. (2024). Importance of Digital Privacy in Light of Emerging Technology. American Bar Association. https://www.americanbar.org/advocacy/rule_of_law/blog/importance-digital-privacy-light-emerging-technology/
