
Last week, a friend of mine was planning a trip. When we sat down together to look up airfares, we ran into a confusing situation: for the same date and destination, the ticket price displayed on my phone was cheaper than the one on his. To confirm it, we opened another ticketing app and found the same thing. Since I rarely use these apps, while he loves to travel and buys tickets frequently, I realized that the “big data backstabbing” incidents I had heard about had also happened to me and my friends.
“Big data swindling,” also known as “big data backstabbing,” is a type of algorithmic discrimination in which the online prices of identical goods and services are raised for existing customers (Stella Chen, 2021). In recent years, online shopping platforms, food delivery services, taxi and ride-hailing services, flight and hotel booking platforms, and others have made wide use of this kind of data-enabled price discrimination. With the adoption of the People’s Republic of China’s Personal Information Protection Law in August 2021, the practice came under additional official prohibitions.
This phenomenon is not uncommon in China

According to its questionnaire and experience survey on big data “discrimination against acquaintances” in internet consumption, the Beijing Consumers Association reached the following conclusions: more than 70% of respondents believe the phenomenon still exists, and more than 60% said they had been “discriminated against” by big data. Specifically, 76.77% of respondents believe the phenomenon exists, a drop of 22.97 percentage points from last year’s 99.74%; 64.33% said they had experienced it themselves, a drop of 22.58 percentage points from last year’s 86.91%. In other words, although the proportions have fallen significantly, most respondents still believe the phenomenon exists and report having experienced it.
The negative role of algorithms in this phenomenon
The Concise Oxford Dictionary defines an algorithm as a “process or rules for (esp. machine) calculation”. The execution of an algorithm must not involve any subjective decisions, nor does it call for intuition or creativity. When we talk about algorithms we are mostly talking about computers, although more general systematic methods of problem solving may also be included (Gilles Brassard and Paul Bratley, 2017).
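To make the definition concrete, a classic procedure such as Euclid’s method for the greatest common divisor consists entirely of mechanical steps that leave no room for judgement. The short Python sketch below is written for this article (it is not taken from the cited text) and simply illustrates that character:

```python
def gcd(a: int, b: int) -> int:
    """Euclid's algorithm: a fixed sequence of rules for calculation.

    Every step is completely determined by the inputs; no judgement,
    intuition, or creativity enters at any point.
    """
    a, b = abs(a), abs(b)
    while b != 0:
        a, b = b, a % b   # replace (a, b) with (b, a mod b) until b reaches 0
    return a


print(gcd(1071, 462))  # -> 21, the same answer on every run and on every machine
```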
From around 1970, with the development and popularization of computers, algorithms came into wide use and evolved from a technology into products and tools. However, a thorough review of how algorithms are applied shows that improper use can cause serious problems such as data abuse, algorithmic discrimination, algorithmic black boxes, and algorithmic manipulation.
Algorithmic Collusion in Market Competition
In the field of market competition, algorithms can effectively improve the personalization and accuracy of services and, by predicting user behavior and market trends, help reduce decision-making and operating costs. At the same time, problems such as self-preferencing, algorithmic collusion, and transaction restrictions in the platform economy have seriously damaged the order of market competition. Platform operators collect “label” information such as user portraits, payment capacity, and willingness to pay, and use algorithms to price discriminatorily: for the same product, high-frequency users are quoted higher prices than low-frequency and new users, so that different consumers face different prices from the same merchant for the same item. This creates the abnormal situation of “one price per person” and “a thousand prices for a thousand people”, which violates consumers’ right to know and right to fair trade. In recent years, big data “discrimination against acquaintances” has become a hot social issue, and the improper application of algorithms has fueled the flames.
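To make this mechanism concrete, here is a minimal, purely hypothetical sketch of label-based discriminatory pricing of the kind described above. The field names (order_frequency, estimated_willingness_to_pay, is_new_user) and the markup rules are illustrative assumptions, not the pricing logic of any real platform:

```python
from dataclasses import dataclass


@dataclass
class UserLabel:
    """A simplified 'user portrait' a platform might attach to an account (assumed fields)."""
    order_frequency: int                  # orders placed in the last 90 days
    estimated_willingness_to_pay: float   # 0.0-1.0 score inferred from past behaviour
    is_new_user: bool


def quoted_price(base_price: float, user: UserLabel) -> float:
    """Return the price shown to one specific user.

    Hypothetical rule: loyal, price-insensitive users see a markup,
    while new users see a discount to attract them.
    """
    price = base_price
    if user.is_new_user:
        return round(price * 0.90, 2)   # new-user discount
    if user.order_frequency > 20:
        price *= 1.10                   # frequent buyers are assumed to tolerate more
    price *= 1.0 + 0.15 * user.estimated_willingness_to_pay
    return round(price, 2)


# The same flight, two different people, two different prices.
frequent_flyer = UserLabel(order_frequency=35, estimated_willingness_to_pay=0.8, is_new_user=False)
newcomer = UserLabel(order_frequency=0, estimated_willingness_to_pay=0.2, is_new_user=True)
print(quoted_price(1200.0, frequent_flyer))  # higher quote
print(quoted_price(1200.0, newcomer))        # lower quote
```

Real systems are of course far more elaborate, but the essential asymmetry (the quoted price depends on who is asking) is exactly what the surveys and anecdotes in this article describe.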
On October 7, 2018, the well-known writer Wang Xiaoshan posted two entries on Weibo accusing Alibaba’s Fliggy travel app of using big data to “discriminate against acquaintances”. As a long-time user of the platform, he found not only that a ticket on Fliggy was priced more than 1,000 yuan higher than on other platforms, but also that the price shown on his phone for the same flight was more than 700 yuan higher than the price shown to new users. Fliggy later officially denied the claim to the media, saying that “big data has never done, and will never do, anything that harms the interests of consumers.”

Every click records our usage habits, every purchase shapes our preferences, and our location is collected wherever we go. A user’s gender, age, occupation, browsing history, comments, and every other trace left on the internet are used to draw an accurate user portrait.
The core of big data is prediction. Big data creates an unprecedented, quantifiable dimension for human life; it has become the source of new inventions and new services, and more changes are on the way (Viktor Mayer-Schönberger, 2013). However, once this change loses its grounding in ethics and morality, the result becomes counterproductive: big data marketing turns into algorithmic hegemony, which directly damages the public interest and causes a crisis of public trust in the companies involved.
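As a rough illustration of how scattered traces become the “user portrait” that feeds such predictions, the following sketch aggregates a handful of behavioural events into features. The event schema and the feature names are invented for this example and do not correspond to any real platform’s data model:

```python
from collections import Counter
from typing import Iterable


def build_user_portrait(events: Iterable[dict]) -> dict:
    """Aggregate raw behavioural traces into a crude 'user portrait'.

    Each event is assumed to look like {"type": "click" / "purchase" / "location", "value": ...},
    a made-up schema used only to show how scattered traces become predictive features.
    """
    events = list(events)
    counts = Counter(e["type"] for e in events)
    purchases = [e["value"] for e in events if e["type"] == "purchase"]
    locations = {e["value"] for e in events if e["type"] == "location"}
    return {
        "clicks": counts["click"],
        "purchase_count": len(purchases),
        "avg_basket": sum(purchases) / len(purchases) if purchases else 0.0,
        "places_seen": sorted(locations),
    }


trace = [
    {"type": "click", "value": "flight_BJS_SHA"},
    {"type": "click", "value": "hotel_bund"},
    {"type": "purchase", "value": 1450.0},
    {"type": "location", "value": "Beijing"},
    {"type": "location", "value": "Shanghai"},
]
print(build_user_portrait(trace))
```

Such a portrait is the raw material for prediction, and, absent ethical constraints, for the discriminatory pricing discussed above.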
Data Misuse in Algorithm Training

With the all-round penetration of algorithms into production and daily life, the collection of personal and even private data has increased significantly. In order to obtain more data and create greater profits, some online platforms show a general tendency to over-collect personal information. According to survey results, mobile apps request many types of permissions, the most prominent being access to location information and access to contacts; moreover, privacy permissions are often obtained even when the app’s own functions do not require them, which increases the risk of personal information leakage. For example, the mobile teaching app “You Academy” accessed the phone’s photo albums and files nearly 25,000 times within ten minutes, and the office app “Team” tried to launch itself nearly 7,000 times within an hour while continually reading the address book. Such data are often used, or even sold, illegally, without the knowledge or permission of the user or data subject, and sometimes despite their express refusal, to the detriment of the legitimate rights of other platforms and users. For example, in a case involving the largest online recruitment platform, “Wisdom Recruitment”, employees of the company were found to have resold personal information; the case involved more than 160,000 pieces of citizens’ personal information, priced at about 5 yuan per resume.
The internet is no longer a bright force carrying knowledge, education, and participatory democracy; it is becoming a shadowy force that is highly commercialised, distracting, and invasive of people’s privacy (Robert W. McChesney, 2013). Our search results match our preferences, while at the same time platforms use algorithmic rules to place the products we are willing to pay the most for on the front page, all based on analysis of our data: a hidden “calculation” performed on us by means of huge data platforms and advanced technology. The “calculation” by which user data is packaged and sold to third parties, resulting in privacy leaks, is likewise a violation of the public’s rights.
Thoughts on Algorithmic Power Regulation
Algorithmic governance has become an important issue of common concern around the world. More than 50 countries, international intergovernmental organisations, non-profit organisations and industry associations have published principles and conventions related to AI and algorithmic governance.
Top-level design: bringing algorithms within the bounds of the law
This means introducing algorithm-related ethical norms and special provisions, and developing targeted rules for the key aspects of applying algorithmic technology, especially the painful and difficult issues involving data security, the protection of individual rights, and the order of market competition. The revision, interpretation, and introduction of relevant laws and regulations should be accelerated, so as to build a rule system that connects law with policy, legislation with ethics, and domestic legislation with international rules.
The individual is at once the source of this power and the one who bears its consequences: the game between platform and user looks equal, but in fact the winner has long been predetermined by the value system built into the algorithm. Legislation is the most direct and effective way to regulate this power, to bring the power of algorithms under control, and to prevent consumers and users from becoming victims of manipulation. For instance, in February 2022 the US Senate introduced the Algorithmic Accountability Act of 2022, which re-evaluates the depth and breadth of algorithms’ control over social life, provides more innovative and refined provisions on the entities it applies to, and establishes new regulatory mechanisms.
While traditional theories of mediation usually see mediators as go-betweens linking two pre-existing things, Don Ihde and Verbeek take a post-phenomenological view in which things are constituted by the mediating relations that lie between them: “The mediator thus becomes the origin of things, rather than being in the ‘middle’ of things” (Verbeek P.). Algorithms are a kind of mediating intermediary: the algorithmic technology inside a platform has, in itself, no standpoint; the people behind the technology are the true bearers of its values and ethics. Human subjectivity and the objectivity of the world are not so much given in advance as they are the result of a mediating relationship. The user information collected by apps often exceeds what is necessary to complete an order, amounting to the collection of non-essential information, and the collected information can be freely shared by the app with other platforms, leaving users’ data effectively exposed across the internet. The development of big data and algorithmic technology should observe a moral bottom line and continually build an ethics of algorithms, so as to keep algorithms within the cage of morality.
International coordination for shared governance
The common nature of information technology determines the common nature of governance rules. At the international level, openness, transparency, fairness, and equity have become shared goals of algorithmic governance in all countries. Intergovernmental international organisations can conduct international dialogue, coordination, and cooperation to accelerate the development and application of algorithms. On the premise of respecting each country’s principles for governing data, algorithms, and AI, countries should strengthen exchange and cooperation on the transparency, explainability, and accountability of algorithms, promote the formation of a global consensus on algorithmic governance, and thereby promote the healthy and orderly development of the AI industry worldwide.
Of course, we must not forget that neither a highly developed biometric system nor a network of identity data linking all people and living beings can really be separated from the huge algorithmic and control systems behind it. Most people will not upgrade, and so will become a new lower class, dominated both by computer algorithms and by the control of the emerging superhumans (Yuval Noah Harari, 2015). In the final analysis, it is a “game between people”.
References
- Jack M. Balkin, 2017, “2016 Sidley Austin Distinguished Lecture on Big Data Law and Policy: The Three Laws of Robotics in the Age of Big Data”, Ohio State Law Journal, 78: 1217, at 1237–1239.
- Robert W. McChesney, 2013, Digital Disconnect: How Capitalism Is Turning the Internet Against Democracy, The New Press, New York, p. 46.
- Stella Chen, 2021, “Big Data Swindling”, The CMP Dictionary, https://chinamediaproject.org/the_ccp_dictionary/big-data-swindling/
- Beijing Consumers Association survey data: www.bj315.org/xfdc/202209/t20220909_35058.shtml
- Gilles Brassard and Paul Bratley, 2017, Algorithmics: Theory and Practice, p. 1.
- Peter-Paul Verbeek, 2012, “Expanding Mediation Theory”, Foundations of Science, 17: 391–395. DOI: 10.1007/s10699-011-9253-8
- Viktor Mayer-Schönberger and Kenneth Cukier, 2013, Big Data: A Revolution That Will Transform How We Live, Work, and Think, Eamon Dolan/Houghton Mifflin Harcourt, ISBN 9780544002692.
- Yuval Noah Harari, 2015, Homo Deus: A Brief History of Tomorrow, p. 331.