User Privacy and Security Issues Brought by Precision Marketing Digital Content

This is about each of us!

Social networks have remarkably met people's practical needs to express their personalities, conduct social activities, and communicate with each other. At the same time, consumers are becoming more self-aware, emphasizing their individuality, and their needs are becoming more diverse. Traditional marketing methods make it difficult for companies to capture and meet the needs of dispersed customers. In the Internet era, therefore, data has become a new factor of production and a resource that industry scrambles for. With the help of big data, enterprises can break the shackles of traditional marketing through precision marketing, bringing users more comprehensive services and personalized consumer experiences. Along with the convenience, however, come user privacy and security issues that deserve our attention and careful consideration. In this article, the author will focus on how digital content for precision marketing affects user privacy and security. Whether digital platforms can use new technologies to protect users’ privacy and security, and how to balance the two, is a hot topic that every user should be concerned about.

What is user privacy and security, and why does it matter now?

According to a survey by Goggin et al. (2017), most respondents feel they have no control over their online privacy. More than half (57%) are concerned about their privacy being invaded by companies, and the vast majority (78%) want to know how social media companies handle their data (Goggin et al., 2017). People are increasingly concerned about privacy, security, and digital rights. The rapid growth of the Internet has generated massive amounts of online data, such as text messages, images, videos, and audio. This content holds significant commercial value and rich knowledge, and individuals and organizations are developing social network analytics tools and methods to tap into it and capture that value. Digital platforms can use this data to infer users’ private information, such as their gender and age, social attributes, consumption habits, preferences, and lifestyle habits. Because such analysis of online information can threaten users’ privacy, for instance when the data is made publicly available for research, it has given rise to the distinct concept of “digital privacy.” If this information is made public or misused, the damage to the user is incalculable. For example, collecting geolocation data from apps, smartphones, or WiFi raises concerns about security and potential harassment (Goggin et al., 2017).

Figure 1: Bruce Mendelsohn, Online Privacy, 2021

What is precision marketing?

Precision marketing uses modern information network technology, relying on powerful data resources and data analysis capabilities, to accurately measure and analyze user behavior. Through data analysis, customer selection, and targeted information delivery, it aims to meet the individual needs of consumers. New behavioral technologies built on big data have made this precision possible, maximizing advertising effectiveness. Sentiment analysis, opinion mining, and predictive data analysis are built on a new emotional economy that can regulate and influence desires and emotions in a way that manipulates the preferences of individual viewers and consumers (Andrejevic, 2013, p. 59). Internet users’ browsing habits and history, the videos, pictures, and statements they publish in cyberspace, and news reports that point to individuals can all serve as objects of precision marketing analysis. In precision marketing, in other words, digital platforms analyze behavioral information that is deeply private and use it for advertising.
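To make the idea concrete, here is a minimal, hypothetical sketch of how browsing history might be turned into an interest profile for targeting. The categories, keywords, and history below are invented for illustration; real platforms use far richer signals and machine-learned models.

```python
# Hypothetical sketch: turning browsing history into marketable interest labels.
# CATEGORY_KEYWORDS and the sample history are invented for illustration only.
from collections import Counter

CATEGORY_KEYWORDS = {
    "travel":  {"flight", "hotel", "itinerary"},
    "fitness": {"workout", "protein", "running"},
}

def infer_interests(page_titles):
    """Count keyword hits per category across a user's browsing history,
    returning categories ordered by how strongly they match."""
    scores = Counter()
    for title in page_titles:
        words = set(title.lower().split())
        for category, keywords in CATEGORY_KEYWORDS.items():
            scores[category] += len(words & keywords)
    return [category for category, hits in scores.most_common() if hits > 0]

history = ["Cheap flight deals", "Best hotel in Rome", "5k running plan"]
print(infer_interests(history))  # ['travel', 'fitness']
```

Even this toy version shows why browsing history is so commercially valuable: a handful of page titles is enough to label a user with interest categories that advertisers will pay to reach.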

Why does precision marketing raise privacy and security issues?

1. Users’ privacy is the price of enjoying the platform’s services

According to Suzor (2019), there is a severe disconnect between social values and harsh legal realities. In legal terms, terms of service are contractual documents that establish simple consumer transactions. In exchange for access to the platform, users agree to be bound by the terms and conditions set forth. Thinking back to the contractual rights we agreed to when signing up for a new service: how many of us have carefully read those lengthy privacy collection notices?

Figure 2: Allen Mireles, Social Media Terms of Use, 2012

It is a shame, but one must admit that almost no one reads them, because they are often written in dense legal jargon and there is no opportunity to negotiate the terms (Flew, 2021). In general, terms of service documents give operators a great deal of power. Especially for large enterprise platforms, these ToS are written in a way designed to protect their commercial interests (Suzor, 2019, p. 11): they grant platform operators absolute discretion to set and enforce the rules as they see fit. Even when users are aware of how private their data is, they have to accept concessions and compromises of privacy to access the platform. As Flew (2021, p. 73) suggests, most users are willing to sacrifice personal privacy for personalized services. To some extent, almost every digital platform user exchanges their data for convenient content and services.

In addition to accessing the platform, another critical reason users are willing to give up their privacy is to improve search efficiency. Confronted with an enormous amount of media content, users want to find what interests them most as efficiently as possible, and they may trade some of their privacy for the related services: based on algorithms, big data will recommend the content and advertisements users are most interested in, sparing them fruitless searches across digital platforms. For example, Gillies (2022) discusses TikTok’s addictive and unethical algorithm, which applies personalization to recommend similar content for viewing based on the user’s race, age, and occupation. Therefore, while users enjoy TikTok’s personalized #foryou recommendations, they should be alert that their privacy is under threat.

Figure 3: TikTok, Discover more of what you love on TikTok, 2021

Therefore, to access the platform and enjoy its personalized services, users use their privacy as a bargaining chip. The legal relationship between provider and user is that of a company to its consumers, not a sovereign to its citizens (Suzor, 2019, p. 12).

2. Unethical use of data exists on digital platforms

The exponential growth of cybercrime has made online user information more vulnerable than ever before. In the face of these risks, platforms and organizations must commit to the ethical use of data to better protect their information systems. Because users’ private information promises attractive profits, some platforms may do things that violate ethics and infringe on the rights of others. According to test results reported by Xu (2021), 56.3% of active apps on major application platforms in China carry a risk of violation, with mobile game apps the most at risk (69.8%). Among active apps with violation risks, the top four risks were “collecting personal information without the user’s consent,” “collecting personal information unrelated to the services provided, in violation of the principle of necessity,” “not stating the purpose, manner, and scope of collecting and using personal information,” and “providing personal information to others without consent.”

A common practice of unethical digital platforms is to listen in on users and covertly read their stored information. The Chinese social chat software WeChat has been accused of monitoring users’ text chats and repeatedly reading users’ photo albums in the background for precisely targeted advertising, which seriously violates and threatens users’ privacy and security and has a harmful social impact.

Figure 4: Ricky Spears, Is WeChat Safe To Use In 2023?, 2023

In addition, Flew (2021) argues that a new model of surveillance capitalism is at play: the human experience presented through digital platforms becomes raw material processed by automated machines that “not only know our behavior but can also shape it on a massive scale.” As Fiske (2022) describes, Twitter privately employs control and experimental groups to intervene in the flow of information to users. In more detail, when Twitter launched personalized feeds, it kept 1% of accounts as an untreated control group and then compared that group to a random sample of 4% of everyone else. The most unethical part is that when users sign up for the platform, they are, in principle, consenting by default to be manipulated and curated (Fiske, 2022). Despite the responses and improvements the major platforms have made following these ethical controversies, users’ concerns have not been eliminated, and these apps continue to be questioned for “spying on users’ privacy.” Privacy and security issues have forced users to worry about whether their data will be sold or monetized, which is why many advocate “InPrivate” browsing (surfing the web without leaving any traces of private information). Such measures are all designed to address the privacy and security concerns that come with big data.
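The 1% holdout that Fiske describes is a standard A/B-testing mechanism. As a hypothetical sketch (the salt, percentage, and bucketing scheme are invented for illustration), deterministic hashing of a user ID can silently assign roughly 1% of accounts to an untreated control group, without users ever knowing which version of the feed they see:

```python
# Hypothetical sketch of holdout assignment for a feed experiment.
# The salt, bucket count, and 1% control fraction are illustrative assumptions.
import hashlib

def assign_group(user_id: str, control_pct: float = 1.0, salt: str = "feed-exp") -> str:
    """Deterministically map a user ID to 'control' or 'personalized'.
    The same ID always lands in the same group."""
    digest = hashlib.sha256(f"{salt}:{user_id}".encode()).hexdigest()
    bucket = int(digest[:8], 16) % 10000 / 100.0  # uniform value in 0.00-99.99
    return "control" if bucket < control_pct else "personalized"

print(assign_group("user-42"))  # always the same group for the same ID
```

Because assignment is a pure function of the user ID, no opt-in, notification, or stored flag is needed, which is precisely what makes this kind of experimentation invisible to the people being experimented on.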

How to protect users’ privacy and security?

Everyone talks about the importance of privacy, yet everyone likes personalized services. What is the premise of personalized services? The concession of users’ privacy, which must rest on a balanced relationship with the platform, because in the online media environment users often participate in the marketing process, intentionally or unintentionally, and become information sharers. Privacy is relative: fundamentally, it balances the information users are willing to share against the services they wish to enjoy. In this “exchange” model, user data held by Internet platforms is based on the principle of “mutual consent,” provided that the platform ensures the data is used appropriately and does not threaten users’ privacy. Thus, moderation is a “necessary, defining, and constitutive” aspect of what platforms do, and they cannot survive without it (Flew, 2021, p. 74).

Fletcher (2022) suggests that experts agree on four ethical principles to guide data-driven decision-making: empathy, data control, transparency, and accountability. Empathy means making more ethical decisions when platforms apply user data. Data control means that organizations should prioritize users’ ownership of and control over their digital data: users can decide what they like, and organizations should support this. Transparency concerns whether users have actual knowledge of, and consent to, data collection. This actual knowledge is not merely the user’s consent but genuine awareness of what information is collected, what it does, and why it is collected; in particular, the privacy-related terms users consent to should be written in easy-to-understand language. Accountability means that the organization is responsible for maintaining the security of the information it collects. When data-driven algorithms significantly impact the individual rights of data subjects, we, as data subjects, should have the right to complain and defend our rights.
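A hypothetical sketch of what “data control” and “transparency” could look like in code: collection is gated on per-category consent, and every access, allowed or not, is recorded with its stated purpose. The class, category names, and audit format are all invented for illustration.

```python
# Hypothetical sketch: consent-gated data collection with an audit trail.
# Category names ("location") and the audit record shape are illustrative.
class ConsentRegistry:
    def __init__(self):
        self._consents = {}   # (user_id, category) -> bool
        self.audit_log = []   # one record per collection attempt

    def grant(self, user_id, category):
        self._consents[(user_id, category)] = True

    def revoke(self, user_id, category):
        self._consents[(user_id, category)] = False

    def collect(self, user_id, category, purpose):
        """Attempt to collect data; refuse unless consent was explicitly granted."""
        allowed = self._consents.get((user_id, category), False)  # default: no consent
        self.audit_log.append({"user": user_id, "category": category,
                               "purpose": purpose, "allowed": allowed})
        if not allowed:
            raise PermissionError(f"no consent for {category}")
        return f"collected {category} for {purpose}"

registry = ConsentRegistry()
registry.grant("u1", "location")
print(registry.collect("u1", "location", "nearby recommendations"))
# prints "collected location for nearby recommendations"
```

Logging denied attempts as well as allowed ones is what makes accountability possible: an auditor can see not only what was collected, but what the platform tried to collect and for what stated purpose.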

From the user’s point of view, we should strengthen our awareness of privacy protection. It is encouraging that more and more users are gradually increasing their privacy awareness and paying attention to protecting personal information. If users’ awareness of privacy and security lags behind rapidly developing big data technology, it is not enough to rely on platforms accepting the constraints of the law; improving users’ own security awareness is essential. Protecting one’s privacy does not only mean protecting names, home addresses, and other obviously sensitive information; more importantly, remember that browsing records and activity traces also belong to the privacy category.

In conclusion, this paper explores the user privacy and security issues brought about by digital content for precision marketing in the era of big data. The digital content of precision marketing is essential for providing personalized services to users. However, while digital platforms bring convenience to users’ lives, they also bring privacy and security issues. To summarize users’ characteristics and attributes, digital media need to obtain private information for user profiling; only then can users receive more accurate content and service recommendations. It is a two-way process. From the platform’s perspective, they should focus on how personal data is collected, used, stored, and shared in order to manage and protect user privacy and security more effectively. From the perspective of individuals, we should be alert to digital platforms’ violations of privacy and data rights, and when faced with unfair privacy violations, we should actively defend our rights. Overall, platforms and users should balance their relationship to build a safe and orderly online media ecosystem.

References

Andrejevic, M. (2013). Infoglut. Routledge. https://doi.org/10.4324/9780203075319

Fiske, S. T. (2022). Twitter manipulates your feed: Ethical considerations. Proceedings of the National Academy of Sciences, 119(1). https://doi.org/10.1073/pnas.2119924119

Fletcher, C. (2022, February 26). Why the ethical use of data and user privacy concerns matter. VentureBeat. https://venturebeat.com/datadecisionmakers/why-the-ethical-use-of-data-and-user-privacy-concerns-matter/

Flew, T. (2021). Regulating Platforms. Cambridge: Polity.

Gillies, S. (2022, March 10). TikTok’s Addictive and Unethical Algorithm. SI 410: Ethics and Information Technology. https://medium.com/si-410-ethics-and-information-technology/tiktoks-addictive-and-unethical-algorithm-3f44f41f1f3c

Goggin, G., Vromen, A., Weatherall, K., Martin, F., Webb, A., Sunman, L., & Bailo, F. (2017). Digital Rights in Australia. https://ses.library.usyd.edu.au/handle/2123/17587.

Mendelsohn, B. (2021, January 8). 21 easy steps to protect your digital privacy & security in 2021. LinkedIn. https://www.linkedin.com/pulse/21-easy-steps-protect-your-digital-privacy-security-bruce/?trk=pulse-article_more-articles_related-content-card

Mireles, A. (2012, March 2). WARNING: Social Media Terms of Use. What You Don’t Know CAN Hurt You! Allen Mireles Consulting. https://allenmireles.com/warning-social-media-terms-of-use-dont-know-can-hurt/

Spears, R. (2022, December 27). Is WeChat safe to use in 2023? [Security guide]. RickySpears.com. https://www.rickyspears.com/how-to/is-wechat-safe/

Suzor, N. P. (2019). Who makes the rules? In Lawless: The secret rules that govern our digital lives. Cambridge University Press.

TikTok. (2019, August 16). Discover more of what you love on TikTok. Newsroom | TikTok. https://newsroom.tiktok.com/en-us/discover

Xu, Y. (2021). Nearly 60% of apps are violating user privacy. COVERSTORY.
