DIGITAL GENERATION INVOLVES ETHICS AND REFLECTION ON PERSONAL PRIVACY

The camera was among the first technologies to force a public debate about personal privacy. Today we have entered the era of the digital economy, in which the Internet connects everything. Like all technologies, the Internet is a double-edged sword: whether its effects are good or bad depends on how we use it and how we manage it.

(Source: lifeweek)

In ancient times, humans developed the axe and the hammer. These technologies changed the way people lived, but did they change our humanity? All technologies are tools, and the Internet of the digital economy era is no different from the axe: it is a tool, and the role a tool plays depends on how it is guided and managed. In the age of the digital economy there are many means of obtaining information, and everyone is exposed to information gathering. In particular, the analysis made possible by the digital generation can predict the behavioral tendencies of almost anyone.

Given the collection and storage capabilities of digital generation networks, in the future no one will be able to hide anything from persistent searches. It is fair to say that the information revolution of the digital economy, driven by mobile intelligence, has changed our world in just a few decades. It has also raised questions of privacy and encouraged the emergence of new social norms. Suzor (2019) argues that social media platforms will eventually develop complex rules to help their moderators determine when content is unacceptable; typically, they strive to create regulations that meet user expectations. Today, the application of the digital generation is no longer limited to a single field. Some organizations use digital generation knowledge to mine valuable connections from all kinds of data. In particular, the digital generation can be used to reduce the information asymmetry between those who hold data and the people the data comes from: digital generation researchers can obtain information from individuals and other agents.

ETHICAL ISSUES

1. Technical roots: large-scale data sharing and analysis. Sharing is inherent to the digital generation. One of the fundamental requirements and characteristics of the digital age is the widespread sharing of data, and this is also where the erosion of privacy begins. The goal of data mining is to find hidden value, which means that important connections previously missed can be found by analyzing data that at first glance seems inconsequential (a short illustrative sketch follows this list). According to Handajani (2018), the primary distinction between the "digital generation" and the conventional notion of "data" is that the hidden value also lies in predicting action.


2. Social consequences: As a result of the digitalization of subject identity, people are now portrayed in society as collections of disparate data, since their actions and even their personal identities appear as data on data platforms. According to Marwick and boyd (2019), a person's approach to privacy can also differ significantly depending on social dynamics and circumstances. For instance, when people fall ill and face the possibility of losing their health insurance, their perspective on and assessment of health data differ sharply from those of healthy individuals. The ease with which this datafication improves daily life and access to healthcare is a positive; the potential for privacy problems arises from the subject's identifying information being highly visible, available in real time, and transparent. Under such datafication, very little privacy remains. Individuals become data representations; this is not the person the subject wishes to be, but objective data that claims to reveal the subject's true nature. When you get to know someone, for instance, you often learn about them through digital generation information on their physical characteristics, health, credit history, and so on, while largely disregarding their character, interests, and personality. This produces a form of technological alienation in which people's subjective status is lost and their core qualities are severely diminished. It is one cause of a broader problem: as people use the digital generation to alter the natural world, the technological advances and their outcomes turn into a power that oppresses and dominates people.

3. Practical reasons: Data sharing and mining are the primary causes of privacy problems in complex technical systems because the parties on a data sharing platform pursue divergent objectives. Ethical privacy concerns arise mainly from how each actor in this new system behaves. Flew (2021) notes that algorithmic recommendation, content distribution, and content removal will all continue to be decided by data sharing platforms; the new regulations lend them more credibility, which might solidify their influence over public discourse. Organizations and users in this system frequently act in their own self-interest and seek to maximize their gains, which can harm the interests of other stakeholders and affect privacy. Closer investigation suggests that the fundamental ethical source of privacy concerns in the digital age is the ethical imbalance brought about by the disparate interests of the various stakeholders.
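To make the "hidden value" point in item 1 concrete, here is a minimal, purely hypothetical Python sketch: the toy datasets, field names, and the inferred pattern are invented for illustration and are not drawn from any real platform or study. It only shows how two individually innocuous datasets, once joined, can support a sensitive inference that neither reveals on its own.

```python
# Hypothetical illustration: two "harmless" datasets, once joined,
# reveal a pattern that neither dataset exposes on its own.
import pandas as pd

# Innocuous-looking data: when users are active in a fitness app.
activity = pd.DataFrame({
    "user_id": [1, 2, 3],
    "late_night_sessions_per_week": [0, 5, 6],
})

# Separate, also innocuous-looking data: pharmacy loyalty-card categories.
purchases = pd.DataFrame({
    "user_id": [1, 2, 3],
    "purchase_category": ["vitamins", "sleep aids", "sleep aids"],
})

# Joining the two links behaviour across contexts and supports a health
# inference (possible insomnia) that the person never chose to disclose.
joined = activity.merge(purchases, on="user_id")
flagged = joined[
    (joined["late_night_sessions_per_week"] > 3)
    & (joined["purchase_category"] == "sleep aids")
]
print(flagged)
```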

(Source: baidu)

In 2016, reports emerged that the Baidu App had leaked user privacy. According to the report, sensitive information such as users' search history and geographical location was uploaded to Baidu's servers and stored in clear text in its database, allowing hackers to steal users' personal data with ease. The incident raised public concern about the protection of personal privacy and prompted regulators to scrutinize how Internet companies handle user data. Cases like this highlight the importance of personal privacy protection in the digital age and the need for Internet companies to be more careful and transparent when handling user data.
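As an illustration of the technical difference the report turns on, storing a sensitive field in clear text versus encrypting it before it is written, here is a minimal Python sketch using the cryptography library's Fernet interface. This is not Baidu's code or its actual fix; the field and values are hypothetical.

```python
# Hypothetical sketch of encryption at rest; not Baidu's actual implementation.
# Requires: pip install cryptography
from cryptography.fernet import Fernet

# In practice the key would live in a key-management service, never in code.
key = Fernet.generate_key()
cipher = Fernet(key)

search_history = "weather in Beijing; nearest clinic"  # toy sensitive value

# Clear-text storage (the pattern criticized in the report): anyone who can
# read the database sees the value directly.
stored_plaintext = search_history

# Encrypted storage: a database leak alone no longer exposes the value.
stored_ciphertext = cipher.encrypt(search_history.encode("utf-8"))

# Only a holder of the key can recover the original text.
assert cipher.decrypt(stored_ciphertext).decode("utf-8") == search_history
```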

REFLECTION AND RESPONSE

From an individual perspective, privacy is the ability to maintain or ensure information asymmetry; in other words, individuals use privacy rights to keep others from snooping on their information. The original intention of the digital generation and privacy are therefore potentially, even fundamentally, opposed, and how to handle the relationship between them has become a very important issue. Generally speaking, as the digital generation increases the volume and scope of data, privacy gradually decreases. Although privacy has appropriate boundaries, people generally believe that they no longer have much privacy in a digital generation environment.

1. Improve value transparency in data use

Although there is an ongoing debate about whether technology is neutral or value-laden, and technology optimists and pessimists each have their own perspectives, digital generation technology carries implicit values in both the design process and the way products are used, and users are often unaware of them. Turilli and Floridi (2009) state that because customers are commonly unaware that their personal information is being gathered and used for commercial purposes, businesses must make their use of data more transparent: acknowledge and accommodate people's fear of the unknown while giving users clear information about the types of data being gathered and processed, their possible uses, the benefits of using the data, and the risks involved. This is in line with the principles of autonomy and informed consent in ethical decision-making. Flew (2021) also notes the phenomenon of excessive deletion, in which platforms impose more restrictions than necessary in order to shield themselves from accountability and from the extra costs of sophisticated content moderation procedures. Giving people back their freedom of choice can lower the risk of harmful outcomes when the digital generation is used. Certain operations involving personal data can be communicated to individuals through emails or announcements in order to increase feasibility and control costs; this communication must respect the individual's right to object to the use, preserve their right to anonymity, and maintain the integrity of the context in which the data are used.
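One possible way to make such disclosure concrete is to keep each data-use notice as a small machine-readable record that is shown to users and logged together with their response. The sketch below is an assumption-laden illustration: the field names and categories are invented here, not an existing standard or any platform's actual notice format.

```python
# Minimal sketch of a machine-readable data-use notice; the fields and
# categories are illustrative assumptions, not an existing standard.
from dataclasses import dataclass
from typing import List

@dataclass
class DataUseNotice:
    data_collected: List[str]   # what is gathered
    purposes: List[str]         # how it may be used
    benefits: str               # what the user gains
    risks: str                  # what could go wrong
    can_object: bool = True     # right to refuse this use
    anonymized: bool = True     # identity removed before analysis

notice = DataUseNotice(
    data_collected=["search history", "approximate location"],
    purposes=["service improvement", "aggregate usage statistics"],
    benefits="More relevant results and faster pages.",
    risks="Location patterns could reveal routines if leaked.",
)

# The notice is shown (e.g. by email or in-app announcement) and the user's
# answer is recorded before the described processing begins.
user_consents = False  # toy value standing in for the user's actual choice
if not user_consents:
    print("Do not process: respect the objection and keep the data out of this use.")
```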

2. Adjust personal privacy views

The theory of privacy as contextual integrity develops the definition and meaning of informational privacy from the perspective of different social domains (Nissenbaum, 2018). Since individuals' views on privacy are one of the most important sources of privacy problems in the era of the digital generation, raising society's awareness of privacy and adapting privacy concepts is the only way to address them. Enhanced privacy awareness helps individuals develop a view of privacy that suits them, align their privacy behaviors with their beliefs, and reduce conflicts. It also helps them attend to privacy protection when using digital generation products: use software sparingly, avoid revealing private information in messages and images, and know their legal rights when a company infringes on their privacy. Of course, the individual's power to choose is still limited at the moment. This calls on people to adjust their conceptions of privacy in the era of digital technology, keep up with contemporary thinking, and continuously look for new and better ways to safeguard their own privacy.

3. Build a common value platform

Sharing values between organizations and individuals can help reduce the conflicts over privacy that arise from divergent interests. In the development of digital generation products and services, personal values and organizational values should be combined so that all stakeholders can reach an initial consensus on privacy issues. Suzor (2019) finds that the specific rules of social platforms tend to be subjective value judgments that limit some people's ability to express themselves and perpetuate harmful biases. Considering user acceptance when developing products, and producing products consistent with shared values, reduces the conflicts that arise over privacy.

4. Seek reasonable ethical decision-making points

Ethical decision points are especially crucial when designing digital generation products because they shape how extensively data will be used. For both organizations and individuals, a workable approach is to identify moral decision points that balance competing interests and to reach a shared understanding of concepts through investigation and discussion. Zuboff (2020) observes that because organizations and individuals, as decision-making subjects, often start from their own interests, it is difficult for them to decide objectively; an outside organization can therefore be brought in to examine and collaboratively identify ethical decision points impartially. First, carry out a comprehensive ethical study and open an ethical dialogue using consent forms and questionnaires to find out what users believe. Second, analyze, process, and evaluate the survey data to assess whether the product being designed is consistent with the identified values and falls within the range users find acceptable. Third, integrate both sides' needs to arrive at a reasonable value. Lastly, report the outcomes of the decision-making process and how the data will be distributed and used. As third-party businesses continue to grow, they can become personal data agencies, that is, organizations that people authorize to handle their personal data on their behalf.
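As a minimal sketch of the second step, suppose the questionnaire asks users to rate a proposed data use on a 1-to-5 acceptability scale and that the team has agreed on an acceptance threshold in advance; both the scale and the threshold are assumptions for illustration, not a method prescribed by the sources cited above.

```python
# Hypothetical aggregation of questionnaire answers about one proposed data use.
# The rating scale, threshold, and responses are illustrative assumptions.
responses = [5, 4, 2, 5, 3, 4, 1, 4, 5, 3]  # toy 1-5 acceptability ratings

ACCEPTABLE_RATING = 4   # ratings at or above this count as acceptance
REQUIRED_SHARE = 0.6    # share of users that must find the use acceptable

share_accepting = sum(r >= ACCEPTABLE_RATING for r in responses) / len(responses)

if share_accepting >= REQUIRED_SHARE:
    print(f"{share_accepting:.0%} accept: the use falls within the agreed range.")
else:
    print(f"Only {share_accepting:.0%} accept: revisit the design or the data use.")
```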

Of course, solving the ethical issues around protecting the privacy of the digital generation cannot be achieved overnight; it also requires sound policies, laws and regulations, education, and the cooperation of other stakeholders. In summary, we must keep pace with the changes brought by the digital generation and avoid the ethical issues surrounding its privacy so that it can develop healthily. Respecting the value of the individual and improving the coherence between values and actions are issues that practitioners need to attend to together.

Finally, this article calls on the global community to pay attention to privacy issues in the digital age, to advocate and jointly promote international cooperation, and to build a digital society that is both vibrant and fully protective of personal privacy. It is hoped that, through an in-depth analysis of this pressing issue, the public will gain a clearer understanding of the privacy dilemma of the digital society and contribute thoughts and suggestions toward building a harmonious society in the digital age.

Flew, T. (2021). Regulating platforms (pp. 72-79). Cambridge: Polity.

Handajani, S. (2018). [Book Review] 21 Lessons for the 21st Century. Humaniora, 30(3), 342-344. https://doi.org/10.22146/jh.v30i3.39310

Knockel, J., McKune, S., & Senft, A. (2016, February). Baidu's and don'ts: Privacy and security issues in Baidu Browser. The Citizen Lab. https://citizenlab.ca/2016/02/privacy-security-issues-baidu-browser/

Marwick, A., & boyd, d. (2019). Understanding privacy at the margins: Introduction. International Journal of Communication, pp. 1157-1165.

Nissenbaum, H. (2018). Respecting context to protect privacy: Why meaning matters. Science and Engineering Ethics, 24(3), 831-852.

Privacy: Satirical illustrations. (2017, April 8). Pinterest. https://www.pinterest.com.au/pin/406942516318015910/

Suzor, N. P. (2019). Who makes the rules? In Lawless: The secret rules that govern our lives (pp. 10-24). Cambridge, UK: Cambridge University Press.

Turilli, M., & Floridi, L. (2009). The ethics of information transparency. Ethics and Information Technology, 11(2), 105-112. https://doi.org/10.1007/s10676-009-9187-9

Zuboff, S. (2020). The age of surveillance capitalism: The fight for a human future at the new frontier of power (First trade paperback ed.). PublicAffairs.

百度被曝泄漏用户信息 百度官方回应遭质疑 [Baidu exposed for leaking user information; Baidu's official response draws skepticism]. (2020, October 11). Radio Free Asia. https://www.rfa.org/mandarin/yataibaodao/meiti/yf2-02262016101032.html

中读. (n.d.). 被材料改变的人类 [Humans changed by materials]. https://www.lifeweek.com.cn/h5/article/detail.do?artId=124785
