
Figure 1. Drawing that represents the Internet of Things (Wilgengebroed, 2012).
The Internet of Things (IoT) has gradually entered public awareness and become one of the defining terms of the digital age. As a concept, the IoT refers to connecting everyday objects, devices, and sensors to the Internet, endowing them with computing and communication capabilities (Rose, Eldridge, & Chapin, 2015). From smart homes and wearable devices to self-driving cars, the IoT has spread into nearly every sphere of human life, gradually changing how people live and work. A crisis follows, however: by connecting everyday devices, the IoT enables continuous monitoring of people’s daily lives and behavioral habits, posing serious risks to personal privacy. Drawing on the theory of the Panopticon that Michel Foucault developed in Discipline and Punish: The Birth of the Prison (Foucault, 1977), this paper applies that theory to the privacy dilemmas and digital human rights of the IoT ecosystem, proposing the metaphor of the visible prisoner and the invisible jailer to explore how individuals are controlled by unseen surveillance in the digital age. Users, as visible prisoners, are exposed to IoT surveillance, while invisible jailers, such as large technology and Internet companies, steer individual behavior and choices through the collection and algorithmic analysis of user data. Through this lens, the paper analyzes the data surveillance and privacy dilemma posed by the IoT, examines how digital human rights are threatened within this ecosystem, and uses case studies to illustrate the practical implications and challenges of the issue.
Foucault’s Panopticon and Digital Surveillance

Figure 2. Plan of the Panopticon by Jeremy Bentham (1843), from The Works of Jeremy Bentham, Vol. IV, pp. 172–173.
Foucault expanded the theory of the Panopticon from Bentham’s (1843) design for a model prison. Bentham’s design featured a central tower from which an inspector could oversee the prisoners’ activities in their cells (Galič et al., 2016), creating for the inmates an illusion of constant surveillance. The inspector, meanwhile, is an invisible omnipresence, “a point of total darkness” in the otherwise transparent space of the prison (Božovič, 2018). Foucault took this idea beyond prisons to society as a whole, including schools, factories, and hospitals, where individual behavior is constantly observed, recorded, and categorized, leading people to regulate themselves under an invisible gaze.
Foucault’s theory is especially evident in the age of the Internet of Things. Surveillance today is no longer carried out only by cameras and police, but by smart devices, the Internet, big data, and other technologies. Compared with traditional surveillance, digital surveillance is more invisible and ubiquitous, and because the surveilling subject is decentralized, users find it difficult to circumvent or resist.
Visible Prisoner’s Privacy Dilemma
In the IoT ecosystem, the individual’s situation closely resembles that of a prisoner in the Panopticon. For the most part, technology companies’ products offer users the option to voluntarily hand over data in exchange for ease of use, rather than forcing them to passively accept surveillance. Surveillance and convenience coexist: users invariably cede some privacy while accessing personalized services. Whether smartphones, connected cars, or smartwatches, almost every connected everyday device falls within the scope of data collection and monitoring. People’s health, emotions, speech, and conversations are captured while they cannot perceive or determine when, by whom, and in what way they are being monitored. This constant surveillance not only leaves users perpetually exposed but also subconsciously shapes their behavior. Amazon’s voice assistant, Alexa, is an always-on voice device. It not only saves information about users’ purchases on Amazon.com and their geolocation, but also analyzes their shopping habits and collects information about family members. When there are children in the home, consent must be given for the device to collect a child’s photo, voice conversation history, date of birth, and more. Children, whose judgment is still immature, may come to see Alexa as a smart friend they can talk to, actively sharing more personal information and details about their lives without realizing the unsettling fact that the technology company on the other side is listening to what they say. In this digital panopticon, users are both beneficiaries and prisoners of technology. Constantly ceding privacy to invisible surveillance and data collection, they lose true control of their digital lives.
Surveillance and Privacy Invasion by Invisible Jailers
The jailers who hold the power of data collection and analysis are technology companies and government agencies. Unlike traditional surveillance systems, the IoT’s surveillance mechanisms are more complex and covert.
Technology companies drive their business models by analyzing users’ private data. In the process, they not only collect information but also invisibly shape users’ digital environments. As a result, tech companies have unprecedented influence over how we share information, whom we communicate with, and what news we see. Social media platforms such as YouTube and Facebook use users’ browsing habits and records of likes and favorites to build accurate user profiles, and they enhance user stickiness by personalizing recommended content through algorithms. YouTube’s algorithms, for example, recommend videos that hold users’ attention based on their viewing history, which has led many users to be pushed toward extreme political or conspiracy-theory videos. This recommendation mechanism builds a cocoon of homogeneous content and information, further exacerbating social polarization and prejudice (Piao et al., 2023), and manipulates users’ emotions so that they are influenced without realizing it.
Data collection and management is a hot topic not only in business and society, but also in government (Chen et al., 2012). IoT technology is widely used by governments for social governance and national security, but in some cases it can be misused. Predictive policing in the United States, for example, uses data to analyze crime trends, yet studies have shown that such predictions can exacerbate racial discrimination and bias. In addition, some countries use the Internet of Things and big data to conduct online censorship, monitor speech and images deemed inappropriate, and even manipulate public opinion through the media. Left unchecked and allowed to be abused, such technologies not only threaten individual privacy and freedom but may also deepen social inequity. Tech companies and governments, acting as invisible jailers, implement surveillance and control in seemingly unobtrusive ways that affect everyone.
The Erosion of Digital Human Rights

Figure 3. Image from “Human rights in the digital age” (OECD, n.d.).
Imagine you are at home casually venting to a friend about how stressful studying is, and a few minutes later you open the Xiaohongshu (Little Red Book) app on your phone to find posts and advertisements about relieving study stress on your screen. This kind of precise, targeted push is not some magical feat of telepathy; it suggests that our everyday conversations are being listened to by our jailers. What digital human rights do we have when our personal, autonomous emotions are monitored and controlled? Facebook’s emotion manipulation experiment showed that emotions spread through an online social network can infect our own real emotions (Kramer et al., 2014): by tweaking the posts pushed to users, a platform can make them happier or more anxious without their knowledge. This is not science fiction; it actually happened. Your data is not just information; it is also a tool for the platform jailers to exploit. The problem, however, is that a huge amount of our data is collected by default, without our having any real choice. Apple’s Siri has been known to “inadvertently” record users’ private content, and even when a user wants to delete it, doing so is very difficult, if not impossible.
Even more disturbing is the fact that companies also turn this data into high profits. Your health data may affect your insurance premiums, and your search history determines the prices advertisers push to you (Pasquale, 2015). Even the model of phone you use can determine the prices you see in a mobile shopping app. In China, some platforms automatically treat an Apple phone as a marker of wealth, and membership prices on websites and some shopping apps are set according to phone model: on the same platform, users of Chinese-made phones such as Vivo and OPPO may be shown different membership prices than users of American-made Apple phones. In other words, your data is not your own; it is a commodity bought and sold by companies that profit handsomely from your unpaid contribution.
Case Studies and Current Situation Analysis

Figure 4. Image from “Kroger’s electronic shelf labels draw scrutiny from senators amid inflation concerns” (Grocery Dive, n.d.).
In recent years, some shopping malls and retail stores have begun adopting smart electronic price tags. The move was supposed to improve the efficiency of price updates and reduce paper waste, but the technology has raised widespread concern. By pairing electronic price tags with an Internet of Things (IoT) system, retailers can dynamically adjust prices in real time in response to market fluctuations, which should allow better management of inventory and promotions. Some supermarkets, however, such as Kroger in the U.S., have adopted Electronic Shelf Labels that display prices dynamically, allowing stores to charge different prices at different times of day (Bigora, 2024). Such dynamic pricing can push prices up during high-demand periods, for example when office workers leave work or when people rush to buy supplies during a pandemic, and may amount to price discrimination. If electronic price tags change without informing consumers, shoppers will unknowingly pay higher prices, which is unfair to their digital human rights. In addition, Kroger plans to cooperate with Microsoft to equip electronic shelf labels with cameras that monitor shoppers and use facial recognition to collect customers’ gender, age, and other biometric information in order to push personalized advertisements and potentially apply differentiated pricing. These actions not only threaten consumer privacy but also challenge the right to fair trade, potentially widening the digital divide and undermining digital human rights.
Conclusion
As we continue to embrace the connected, convenient world of the Internet of Things and enjoy a wide range of smart devices, the line between personal freedom and digital surveillance is blurring. The invisible jailer is no longer just a nebulous philosophical concept but an emerging reality of how we interact with the IoT, as the collection and processing of our information by tech companies and governments becomes ever more pervasive. In this world, invisible jailers silently guide our decisions, from our lifestyles to the goods we buy.
While these personalized experiences and convenient services are appealing, we must ask ourselves: is the price of this convenience too high? Are we eroding our digital sovereignty and privacy for the sake of customized services? Are we aware that our behavior is unknowingly being manipulated by invisible hands?
Ultimately, the real challenge lies in finding a balance between the two, so that we can enjoy convenience while protecting our personal data. Convenience should not be a shackle, and technology should not be a prison. In the era of the Internet of Things, we cannot merely be prisoners of data; we must instead become guardians of the power of data.
References
Bigora, P. (2024, August 12). Kroger comes under fire for use of electronic shelf labels. Grocery Dive. https://www.grocerydive.com/news/kroger-electronic-shelf-labels-instore-technology-senators-inflation/723939/
Bentham, J. (1843). Plan of the Panopticon [Illustration]. In The works of Jeremy Bentham (Vol. 4, pp. 172–173).
Bentham, J. (1843). The works of Jeremy Bentham (J. Bowring, Ed.). William Tait.
Božovič, M. (2018). Seeing it all: Bentham’s panopticon and the dark spots of enlightenment. In Transparency, society and subjectivity (pp. 133–154). Springer. https://doi.org/10.1007/978-3-319-77161-8_7
Chen, H., Chiang, R. H. L., & Storey, V. C. (2012). Business intelligence and analytics: From big data to big impact. MIS Quarterly, 36(4), 1165–1188. https://doi.org/10.2307/41703503
Foucault, M. (1977). Discipline and punish: The birth of the prison (A. Sheridan, Trans.). Pantheon Books.
Galič, M., Timan, T., & Koops, B.-J. (2016). Bentham, Deleuze and beyond: An overview of surveillance theories from the panopticon to participation. Philosophy & Technology, 30(1), 9–37. https://doi.org/10.1007/s13347-016-0219-1
Grocery Dive. (n.d.). Kroger’s electronic shelf labels draw scrutiny from senators amid inflation concerns [Image]. Grocery Dive. https://www.grocerydive.com/news/kroger-electronic-shelf-labels-instore-technology-senators-inflation/723939/
Kramer, A. D. I., Guillory, J. E., & Hancock, J. T. (2014). Experimental evidence of massive-scale emotional contagion through social networks. Proceedings of the National Academy of Sciences, 111(24), 8788–8790. https://doi.org/10.1073/pnas.1320040111
OECD. (n.d.). Human rights in the digital age [Image]. OECD. https://www.oecd.org/en/topics/human-rights-in-the-digital-age.html
Pasquale, F. (2015). The black box society: The secret algorithms that control money and information. Harvard University Press. https://doi.org/10.4159/harvard.9780674736061
Piao, J., Liu, J., Zhang, F., Su, J., & Li, Y. (2023). Human–AI adaptive dynamics drives the emergence of information cocoons. Nature Machine Intelligence, 5(11), 1214–1224. https://doi.org/10.1038/s42256-023-00731-4
Rose, K., Eldridge, S., & Chapin, L. (2015). The Internet of Things: An overview. The Internet Society. https://www.internetsociety.org/resources/doc/2015/iot-overview
Wilgengebroed. (2012, December 6). Draw that represents the Internet of Things, signed by the author [Illustration]. Flickr. https://www.flickr.com/photos/wilgengebroed/8249565455/