Information Privacy and Security in the Age of Connected Cars

Introduction

In February 2024, Toyota, one of the world’s largest automotive firms, was accused by a customer of collecting and sharing extensive user data with third-party companies such as debt-collection agencies and insurance firms. The shocker is that Toyota is not the only company collecting and sharing personal data. The Mozilla Foundation examined the privacy standards of 25 major automotive brands in 2023, and every brand failed to meet its standard for consumer privacy. In another incident, an external intrusion into Mazda’s internal servers potentially compromised approximately 104,732 records about the company’s employees and customers. Moreover, between 2019 and 2022, Tesla employees were found to have privately shared videos and photos recorded by customers’ car cameras.

As part of the Internet of Things (IoT) trend, manufacturers are adding Internet connectivity to their products to improve the quality of service, and connected cars are rapidly gaining popularity. While connected car services can enhance the overall driving experience, they rely heavily on the collection and use of customer data, which carries significant privacy risks (Koester et al., 2021). This blog analyzes user privacy issues in connected cars and suggests how platforms should handle user information in the age of the Internet.

Privacy and Service

Privacy has been defined in many ways, but its core meaning encompasses “access to information”, “solitude”, “being undisturbed”, and “controlled space” (Wolfe & Golan, 1976). Privacy provides people with a secure space for their information, free from disturbance and intrusion. Although the right to privacy is limited in practice by other competing rights, obligations, and norms, it remains an inherent human right (Flew, 2021).

Privacy is particularly vulnerable in the Internet era, where the public is often forced to trade it for access to services. Vehicles are now typically “connected” by default: the car has evolved from a closed mechanical system into a digital communication platform (Jakobi et al., 2021). Connected car services provide drivers with smart parking assistance, global positioning, real-time driving feedback, and more. There is no doubt that these connected services greatly enhance the driving experience.

However, as the price of this convenience, consumers must accept that their personal data is collected and shared by car companies. Opting out of connected services often disables other functions such as Bluetooth and GPS. Most consumers are aware that vehicles collect personal data, but there is little awareness of exactly what data cars collect (Bella et al., 2021). In other words, because uniform standards and relevant laws and regulations have not kept pace with new products, companies collect user data without clear explanations of what is gathered or how it will be used. Moreover, when customers become aware of privacy risks and try to understand what data is collected and used, they discover how vague the privacy policies are. In Toyota’s case, the company was found to have twelve near-incomprehensible privacy policy documents. Such documents allocate significant powers to platforms and give them absolute discretion to make and enforce rules as they see fit (Suzor, 2019). Furthermore, the service agreements give customers little choice but to surrender their information for use by third parties. All in all, privacy policies of this kind grant car companies permission rather than safeguard the consumer.

Perceived Privacy Risks

Data

According to Toyota’s privacy policy, the Connected Services feature collects a wide range of records, such as vehicle location, driving data, and personal information including phone numbers and email addresses. In fact, car companies obtain far more information than that.

DATA and the CONNECTED CAR (Future of Privacy Forum, 2017)

Modern automobiles are equipped with various types of connections, both internal, such as on-board computers and bus systems connecting sensors, and external, such as communication protocols between moving cars (Coppola & Morisio, 2016). The data these sensors collect about users’ in-car behavior is enough for a company to build a personal profile: car companies can infer information about a user’s income, ethnicity, genetics, and occupation. Route history, for example, reveals users’ home addresses, workplaces, interests, and more. With the user’s permission, companies can even access photos, calendars, and to-do lists. Data are presented as “raw material” that can be translated into predictive algorithms about human behavior (Flew, 2021). The collection, storage, and processing of such information has implications for individual privacy, because it supports all kinds of predictions about consumers. These predictions let companies target products and services at specific consumers, and the data is also at risk of being sold to third parties. For example, automotive companies can use the acquired user information for security research, product development, and data analytics.
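To make this concrete, the toy sketch below shows how even a coarse route history can reveal a likely home address: the most frequent overnight location is a strong candidate. Everything here (the data points, the grid rounding, and the likely_home helper) is hypothetical and purely illustrative, not any manufacturer’s actual pipeline.

```python
# Toy illustration: inferring a probable "home" from route history.
# All coordinates and names are invented for demonstration only.
from collections import Counter

# (latitude, longitude, hour_of_day) fixes, rounded to ~100 m grid cells
route_history = [
    (48.137, 11.575, 23), (48.137, 11.575, 2), (48.137, 11.575, 5),   # overnight
    (48.150, 11.560, 9),  (48.150, 11.560, 13), (48.150, 11.560, 17),  # daytime
]

def likely_home(fixes):
    """Most frequent overnight (22:00-06:00) grid cell: a crude 'home' guess."""
    overnight = [(lat, lon) for lat, lon, hour in fixes if hour >= 22 or hour < 6]
    return Counter(overnight).most_common(1)[0][0] if overnight else None

print(likely_home(route_history))  # -> (48.137, 11.575)
```

Even this crude heuristic, run over months of real telemetry, would pinpoint where a driver sleeps; adding daytime fixes would reveal the workplace as well.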

User Perception

Users have expressed concerns about the collection and handling of their personal information by car companies. Research by Jakobi et al. (2021) shows that the risks users perceive in connected cars’ collection of personal data can be categorized into unlawful and lawful uses, discussed below.

Connected cars: a favourite target of hackers (Peshkov, 2023)

On the one hand, unlawful use refers to the ease with which hackers and other criminals can break into the system and capture personal driving data for theft, tracking, and other crimes. Consumers are particularly concerned about the negative impacts on their physical safety. In 2023, the number of large-scale cyber incidents that could impact tens of millions of mobile assets increased 2.5-fold from 2022 (Upstream, 2024). If a car is on autopilot, a data breach can even endanger the user’s life. The number and scale of cyber incidents have grown dramatically, threatening passenger and vehicle safety and disrupting operations.

On the other hand is lawful use. Users fear losing a great deal of privacy when car companies share vehicle data with third parties, and they also worry about secondary uses of the data, such as revealing improper vehicle operation or traffic offenses. This concern generally stems from vague privacy policies, which do not spell out how the data will be used. There remain major concerns about the huge information asymmetry between service providers and consumers (Flew, 2021).

Privacy paradox

Consumers want personalized products and services but are reluctant to provide too much personal information, owing to concerns about commercial privacy abuse (Sheng et al., 2008). Users want to trade minimal personal information for personalized services, yet companies offering such services need a great deal of user data, creating a paradox between personalization and privacy.

Personalization vs. Privacy: Unleashing the Privacy Paradox (John, 2018)

There is no doubt that users have a positive attitude toward personalized services, which enhance the user experience by meeting the need to access services anytime, anywhere, and through any medium. For example, connected cars can offer users more personalized travel services, such as route recommendations, customized music playback, and voice control, to meet diverse needs and enhance the experience. Previous research has confirmed that personalization can drive positive user attitudes and behaviors (Gutierrez et al., 2019). This means users may give up a certain level of privacy in exchange for personalized services. Conversely, when users perceive the risk of privacy exposure to be greater than expected, they resist the related products or services. For example, when car companies share personal information with third parties, users are prone to feeling monitored and intruded upon, believing their private information is not effectively protected. In that case, they will forgo the corresponding personalized benefits and boycott the services offered by connected cars.
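The trade-off described above can be sketched as a simple cost-benefit comparison, in the spirit of privacy calculus theory. The toy model below is an illustrative assumption, not a formula from the cited studies; the weights and values are invented for demonstration.

```python
# Toy privacy-calculus model: a user adopts a service when the perceived
# benefit of personalization outweighs the perceived privacy risk.
# Weights and thresholds are hypothetical, chosen only to illustrate.
def adopts_service(perceived_benefit: float, perceived_risk: float,
                   risk_sensitivity: float = 1.0) -> bool:
    """Return True if the net utility of sharing data is positive."""
    return perceived_benefit - risk_sensitivity * perceived_risk > 0

# A user who values route recommendations (benefit 0.7) but then learns
# their data is shared with third parties (risk rises from 0.3 to 0.9):
print(adopts_service(0.7, 0.3))  # True: personalization wins
print(adopts_service(0.7, 0.9))  # False: the user boycotts the service
```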

Privacy needs to be protected

In the digital age, open social networking platforms are still novel, and overly restrictive privacy-protecting interfaces may block potentially valuable future applications. However, the data collected and processed in the course of Internet use may exceed the scope of the services provided. How to balance the use of services against privacy is an important question for both users and platforms. David Vaile, President of the Australian Privacy Foundation, has said that the collection and “exploitation” of customer data by multinational corporations is becoming commonplace and that Australia needs to provide stronger privacy protections for consumers.

First, in the face of users’ privacy concerns, platforms are obliged to provide clearer and more explicit policies informing users of the means, types, and purposes of data collection. Second, consumer permission is necessary before companies share sensitive information or use it for marketing purposes. Some findings suggest that informing consumers about privacy regulations, and about the absence of particular privacy risks, can help alleviate their concerns (Jakobi et al., 2021). Data minimization is also required: the data platforms collect from customers must be used only to perform legitimate operations, and any redundant information must be discarded. This demands that platforms increase the transparency of information processing so that users fully understand how their personal information will be handled. Last but not least, under the GDPR, data processors are required to carry out risk assessments and adopt appropriate technical measures and robust privacy protections for consumers (Krzysztofek, 2018). Platforms must keep users’ private data secure against external threats, especially hacker attacks.
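As an illustration of the data-minimization principle just described, the sketch below keeps only the fields needed for a declared processing purpose before telemetry leaves the vehicle. The field names and the purpose map are hypothetical, not drawn from any real manufacturer’s API.

```python
# Minimal data-minimization sketch: only fields required for the stated
# purpose are transmitted; everything else is discarded at the source.
# ALLOWED_FIELDS and the field names are illustrative assumptions.
ALLOWED_FIELDS = {
    "navigation":  {"vehicle_id", "location", "timestamp"},
    "diagnostics": {"vehicle_id", "engine_status", "timestamp"},
}

def minimize(telemetry: dict, purpose: str) -> dict:
    """Drop every field not required for the declared processing purpose."""
    allowed = ALLOWED_FIELDS.get(purpose, set())
    return {k: v for k, v in telemetry.items() if k in allowed}

raw = {"vehicle_id": "V123", "location": (48.1, 11.6),
       "timestamp": 1700000000, "contacts": ["..."], "calendar": ["..."]}
print(minimize(raw, "navigation"))
# contacts and calendar never leave the car
```

Pairing a whitelist like this with a published purpose map is one way a platform could make its processing transparent as well as minimal.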

Social Media Privacy Settings: Keep Your Data Away from Prying Eyes (Adgully, 2021)

Users should value their personal information in order to protect their privacy and minimize potential risks. By their nature and design, social media platforms encourage users to disclose a great deal of personal information, such as their real name, date of birth, and sexual orientation, which is used to actively construct their identity. Research has shown an increase in users’ adoption of protective measures on such platforms (Debatin et al., 2009). Protecting personal privacy on social media and other online platforms requires users to review and manage privacy settings, for example by limiting the public visibility of personal data and regularly purging unwanted posts and comments. Users should also be careful about adding online friends, clicking links, and downloading files, and should protect their login credentials and limit application permissions. At the same time, they should keep learning about cybersecurity and actively spread privacy awareness and knowledge to promote broader privacy practices.

All of the above is ideal, but given the high profits platforms make by selling users’ personal information, few platforms actually meet users’ needs for privacy protection. In practice, most platforms ignore users’ privacy claims and dismiss their grievances. Users remain in a passive position when it comes to protecting their personal data: self-protection alone is not enough if the platform does not take responsibility for protecting user privacy.

Conclusion

With the rapid development of the Internet, social networking platforms and the Internet of Things have become integrated into our lives. Platforms and connected products provide personalized services by collecting and storing users’ daily online activities and movements. While data sharing has brought users convenience and a better experience, it has also raised public concern about the adverse effects of privacy violations. How to balance personal privacy and service quality has become a topic of discussion for both the public and platforms. The right to privacy protects individuals from intrusion into the private sphere by governments, corporations, and fellow citizens, centering on the right to deny them access to personal data, bodies, or homes. Yet there is a contradiction between platforms and users: users want to exchange minimal personal information for high-quality services, while platforms collect large amounts of user information through their services and sell it to third parties for huge profits.

Moreover, the threat to user privacy lies not only in the sharing of data within the platform but also in the risk of external attacks on it. Platforms should therefore take effective measures, such as establishing effective firewalls, to protect users’ privacy from theft. Users should also be aware of how much information they disclose on the Web and actively use privacy features to assert their rights.

The discussion of privacy never stops, and the development of the Internet has created new challenges for protecting it. But whatever the context, the right to privacy is a human right and should not be ignored.

References

Bella, G., Biondi, P., & Tudisco, G. (2021). Car Drivers’ Privacy Concerns and Trust Perceptions. Lecture Notes in Computer Science (Including Subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), 12927, 143–154. https://doi.org/10.1007/978-3-030-86586-3_10

Coppola, R., & Morisio, M. (2016). Connected Car: Technologies, Issues, Future Trends. ACM Computing Surveys, 49(3), 1–36. https://doi.org/10.1145/2971482

Debatin, B., Lovejoy, J. P., Horn, A.-K., & Hughes, B. N. (2009). Facebook and Online Privacy: Attitudes, Behaviors, and Unintended Consequences. Journal of Computer-Mediated Communication, 15(1), 83–108. https://doi.org/10.1111/j.1083-6101.2009.01494.x

Flew, T. (2021). Regulating platforms. Polity Press.

Gutierrez, A., O’Leary, S., Rana, N. P., Dwivedi, Y. K., & Calle, T. (2019). Using privacy calculus theory to explore entrepreneurial directions in mobile location-based advertising: Identifying intrusiveness as the critical risk factor. Computers in Human Behavior, 95, 295–306. https://doi.org/10.1016/j.chb.2018.09.015

Jakobi, T., Alizadeh, F., Marburger, M., & Stevens, G. (2021). A Consumer Perspective on Privacy Risk Awareness of Connected Car Data Use. ACM International Conference Proceeding Series, 294–302. https://doi.org/10.1145/3473856.3473891

Krzysztofek, M. (2018). GDPR: General Data Protection Regulation (EU) 2016/679: Post-reform personal data protection in the European Union. Kluwer Law International.

Koester, N., Cichy, P., Antons, D., & Salge, T. O. (2021). Privacy Risk Perceptions in the Connected Car Context. Proceedings of the Annual Hawaii International Conference on System Sciences. https://doi.org/10.24251/hicss.2021.536

Sheng, H., Nah, F., & Siau, K. (2008). An Experimental Study on Ubiquitous commerce Adoption: Impact of Personalization and Privacy Concerns. Journal of the Association for Information Systems, 9(6), 344–376. https://doi.org/10.17705/1jais.00161

Suzor, N. P. (2019). Lawless: The secret rules that govern our digital lives. Cambridge University Press.

Upstream Security. (2024). Upstream’s 2023 global automotive cybersecurity report. https://upstream.auto/reports/global-automotive-cybersecurity-report/

Wolfe, M., & Golan, M. B. (1976). Privacy and institutionalization. Paper presented at the meeting of the Environmental Design Research Association, Vancouver, BC.
