Ubiquitous Computing: Driving in the Intelligent Environment
Abstract
1. Introduction
2. Literature Review
2.1. Connected Cars and Smart Transport Infrastructure
2.2. Wearables and the Intelligent World Environment
2.3. User Experience and Car Multimodal Interfaces
3. Proposed Solution
3.1. Harvested Categories of Data
3.1.1. The Internet Cloud (the World)
3.1.2. The Smart Devices (the Car)
3.1.3. The Ubiquitous Interface (the IVIS)
3.1.4. The Monitored User (the Driver)
3.2. Processed Categories of Information
3.2.1. Car, Road and Driving Task Related Information
3.2.2. Personal Health and Wellbeing Related Information
3.2.3. Social and Media Updates Related Information
3.3. Natural Multimodal Interaction
3.3.1. Visual Interaction
3.3.2. Audio Interaction
3.3.3. Gesture Interaction
3.4. Autonomy and Proactive Interaction
3.4.1. Active Monitoring
3.4.2. Proactive Action
4. Heuristic Evaluation of the IVIS
5. Conclusions
Author Contributions
Funding
Acknowledgments
Conflicts of Interest
References
- Arena, F.; Pau, G.; Severino, A. An overview on the current status and future perspectives of smart cars. Infrastructures 2020, 5, 53. [Google Scholar] [CrossRef]
- Trubia, S.; Severino, A.; Curto, S.; Arena, F.; Pau, G. Smart Roads: An Overview of What Future Mobility Will Look Like. Infrastructures 2020, 5, 107. [Google Scholar] [CrossRef]
- Seiberth, G.; Gründinger, W. Data-driven business models in connected cars, mobility services & beyond. BVDW Res. 2018, 1, 18. [Google Scholar]
- Rijcken, C. Rainforests of wearables and insideables. In Pharmaceutical Care in Digital Revolution; Academic Press: Cambridge, MA, USA, 2019; pp. 107–117. [Google Scholar]
- Gheran, B.F.; Vatavu, R.D. From controls on the steering wheel to controls on the finger: Using smart rings for in-vehicle interactions. In Companion Publication of the 2020 ACM Designing Interactive Systems Conference; ACM: New York, NY, USA, 2020; pp. 299–304. [Google Scholar]
- Pelliccione, P.; Knauss, E.; Ågren, S.M.; Heldal, R.; Bergenhem, C.; Vinel, A.; Brunnegård, O. Beyond connected cars: A systems of systems perspective. Sci. Comput. Program. 2020, 191, 102414. [Google Scholar] [CrossRef]
- Telang, S.; Chel, A.; Nemade, A.; Kaushik, G. Intelligent Transport System for a Smart City. In Security and Privacy Applications for Smart City Development; Springer: Cham, Switzerland, 2021; pp. 171–187. [Google Scholar]
- Eiza, M.H.; Cao, Y.; Xu, L. Toward Sustainable and Economic Smart Mobility: Shaping the Future of Smart Cities; WSPC: Casper, WY, USA, 2020. [Google Scholar]
- Ahram, T.; Karwowski, W.; Vergnano, A.; Leali, F.; Taiar, R. Intelligent Human Systems Integration. In Proceedings of the 3rd International Conference on Intelligent Human Systems Integration (IHSI 2020): Integrating People and Intelligent Systems, Modena, Italy, 19–21 February 2020; Springer Nature: Cham, Switzerland, 2020. [Google Scholar]
- Nischak, F.; Hanelt, A. Ecosystem Change in the Era of Digital Innovation–A Longitudinal Analysis and Visualization of the Automotive Ecosystem. In Proceedings of the ICIS 2019 Proceedings, Munich, Germany, 15–18 December 2019; ISBN 978-0-9966831-9-7. [Google Scholar]
- Uddin, H.; Gibson, M.; Safdar, G.A.; Kalsoom, T.; Ramzan, N.; Ur-Rehman, M.; Imran, M.A. IoT for 5G/B5G applications in smart homes, smart cities, wearables and connected cars. In Proceedings of the 2019 IEEE 24th International Workshop on Computer Aided Modeling and Design of Communication Links and Networks (CAMAD), Limassol, Cyprus, 23 June 2019; pp. 1–5. [Google Scholar]
- Kuoch, S.K.; Nowakowski, C.; Hottelart, K.; Reilhac, P.; Escrieut, P. Designing an Intuitive Driving Experience in a Digital World. Automot. Eng. 2018. preprint. [Google Scholar] [CrossRef]
- Budaker, B.; Geiger, M.; Fernandes, K. Development of smart interior systems for connected cars. In Internationales Stuttgarter Symposium; Springer Vieweg: Wiesbaden, Germany, 2018; pp. 1265–1276. [Google Scholar]
- Perelló, J.R.; García, A. A case study of cooperative design on integrated smart-car systems: Assessing drivers’ experience. In International Conference on Cooperative Design, Visualization and Engineering; Springer: Cham, Switzerland, 2017; pp. 202–206. [Google Scholar]
- Broström, R.; Engström, J.; Agnvall, A.; Markkula, G. Towards the next generation intelligent driver information system (IDIS): The Volvo car interaction manager concept. In Proceedings of the 2006 ITS World Congress, London, UK, 8–12 October 2006; Volume 32. [Google Scholar]
- Han, J.; Kim, H.; Heo, S.; Lee, N.; Kang, D.; Oh, B.; Kim, K.; Yoon, W.; Byun, J.; Kim, D. GS1 Connected Car: An Integrated Vehicle Information Platform and Its Ecosystem for Connected Car Services based on GS1 Standards. In Proceedings of the 2018 IEEE Intelligent Vehicles Symposium (IV), Changshu, China, 26–30 June 2018; pp. 367–374. [Google Scholar]
- Lipson, H.; Kurman, M. Driverless: Intelligent Cars and the Road ahead; Mit Press: Cambridge, MA, USA, 2016. [Google Scholar]
- Großwindhager, B.; Rupp, A.; Tappler, M.; Tranninger, M.; Weiser, S.; Aichernig, B.K.; Boano, C.A.; Horn, M.; Kubin, G.; Mangard, S.; et al. Dependable internet of things for networked cars. Int. J. Comput. 2017, 16, 226–237. [Google Scholar] [CrossRef]
- Vörös, F.; Tompos, Z.; Kovács, B. Examination of car navigation systems and UX designs–suggestion for a new interface. Proc. Int. Cartogr. Assoc 2019, 2, 139. [Google Scholar] [CrossRef]
- Kazmi, S.A.; Dang, T.N.; Yaqoob, I.; Ndikumana, A.; Ahmed, E.; Hussain, R.; Hong, C.S. Infotainment enabled smart cars: A joint communication, caching, and computation approach. IEEE Trans. Veh. Technol. 2019, 68, 8408–8420. [Google Scholar] [CrossRef]
- Pandit, S.; Fitzek, F.H.; Redana, S. Demonstration of 5G connected cars. In Proceedings of the 2017 14th IEEE Annual Consumer Communications & Networking Conference (CCNC), Las Vegas, NV, USA, 8–11 January 2017; pp. 605–606. [Google Scholar]
- Giust, F.; Sciancalepore, V.; Sabella, D.; Filippou, M.C.; Mangiante, S.; Featherstone, W.; Munaretto, D. Multi-access edge computing: The driver behind the wheel of 5G-connected cars. IEEE Commun. Stand. Mag. 2018, 2, 66–73. [Google Scholar] [CrossRef]
- Uhlir, D.; Sedlacek, P.; Hosek, J. Practical overview of commercial connected cars systems in Europe. In Proceedings of the 2017 9th International Congress on Ultra Modern Telecommunications and Control Systems and Workshops (ICUMT), Munich, Germany, 6–8 November 2017; pp. 436–444. [Google Scholar]
- Marosi, A.C.; Lovas, R.; Kisari, Á.; Simonyi, E. A novel IoT platform for the era of connected cars. In Proceedings of the 2018 IEEE International Conference on Future IoT Technologies (Future IoT), Eger, Hungary, 18–19 January 2018; pp. 1–11. [Google Scholar]
- Jiang, T.; Fang, H.; Wang, H. Blockchain-Based Internet of Vehicles: Distributed Network Architecture and Performance Analysis. IEEE Internet Things J. 2018, 6, 4640–4649. [Google Scholar] [CrossRef]
- Xu, W.; Zhou, H.; Cheng, N.; Lyu, F.; Shi, W.; Chen, J.; Shen, X. Internet of vehicles in big data era. IEEE J. Autom. Sin. 2017, 5, 19–35. [Google Scholar] [CrossRef]
- Zhdanenko, O.; Liu, J.; Torre, R.; Mudriievskiy, S.; Salah, H.; Nguyen, G.T.; Fitzek, H.F. Demonstration of mobile edge cloud for 5G connected cars. In Proceedings of the 2019 16th IEEE Annual Consumer Communications & Networking Conference (CCNC), Las Vegas, NV, USA, 11–14 January 2019; pp. 1–2. [Google Scholar]
- Ma, Z.; Zhang, J.; Guo, Y.; Liu, Y.; Liu, X.; He, W. An Efficient Decentralized Key Management Mechanism for VANET With Blockchain. IEEE Trans. Veh. Technol. 2020, 69, 5836–5849. [Google Scholar] [CrossRef]
- Ayaz, F.; Sheng, Z.; Tian, D.; Leung, V.C. Blockchain-enabled security and privacy for Internet-of-Vehicles. In Internet of Vehicles and its Applications in Autonomous Driving; Springer: Cham, Switzerland, 2021; pp. 123–148. [Google Scholar]
- Hernandez-Oregon, G.; Rivero-Angeles, M.E.; Chimal-Eguía, J.C.; Campos-Fentanes, A.; Jimenez-Gallardo, J.G.; Estevez-Alva, U.O.; Juarez-Gonzalez, O.; Rosas-Calderon, P.O.; Sandoval-Reyes, S.; Menchaca-Mendez, R. Performance analysis of V2V and V2I LiFi communication systems in traffic lights. Wirel. Commun. Mobile Comput. 2019, 2019, 4279683. [Google Scholar] [CrossRef] [Green Version]
- Spahiu, C.S.; Stanescu, L.; Brezovan, M.; Petcusin, F. LiFi Technology Feasibility Study for Car-2-Car Communication. In Proceedings of the 21st International Carpathian Control Conference (ICCC), High Tatras, Slovakia, 27–29 October 2020; pp. 1–5. [Google Scholar]
- Burkacky, O.; Deichmann, J.; Doll, G.; Knochenhauer, C. Rethinking Car Software and Electronics Architecture; McKinsey & Co.: New York, NY, USA, 2019. [Google Scholar]
- Yang, G.; Ahmed, M.; Gaweesh, S.; Adomah, E. Connected vehicle real-time traveler information messages for freeway speed harmonization under adverse weather conditions: Trajectory level analysis using driving simulator. Accid. Anal. Prev. 2020, 146, 105707. [Google Scholar] [CrossRef]
- Jagielski, M.; Jones, N.; Lin, C.W.; Nita-Rotaru, C.; Shiraishi, S. Threat detection for collaborative adaptive cruise control in connected cars. In Proceedings of the 11th ACM Conference on Security & Privacy in Wireless and Mobile Networks, Stockholm, Sweden, 18–20 June 2018; pp. 184–189. [Google Scholar]
- Lee, E.-K.; Gerla, M.; Pau, G.; Lee, U.; Lim, J.-H. Internet of Vehicles: From intelligent grid to autonomous cars and vehicular fogs. Int. J. Distrib. Sens. Netw. 2016, 12, 1550147716665500. [Google Scholar] [CrossRef]
- Gerla, M.; Lee, E.-K.; Pau, G.; Lee, U. Internet of vehicles: From intelligent grid to autonomous cars and vehicular clouds. In IEEE World Forum on Internet of Things (WF-IoT); IEEE: Piscataway, NJ, USA, 2014; pp. 241–246. [Google Scholar] [CrossRef]
- Bosler, M.; Jud, C.; Herzwurm, G. Platforms and Ecosystems for Connected Car Services. In Proceedings of the 9th International Workshop Software Ecosystem IWSECO, Espoo, Finland, 29 November 2017; pp. 16–27. [Google Scholar]
- Zhou, H.; Xu, W.; Chen, J.; Wang, W. Evolutionary V2X Technologies toward the Internet of Vehicles: Challenges and Opportunities. Proc. IEEE 2020, 108, 308–323. [Google Scholar] [CrossRef]
- Mirnig, N.; Perterer, N.; Stollnberger, G.; Tscheligi, M. Three strategies for autonomous car-to-pedestrian communication: A survival guide. In Proceedings of the Companion of the 2017 ACM/IEEE International Conference on Human-Robot Interaction, Vienna, Austria, 6–9 March 2017; pp. 209–210. [Google Scholar]
- Bai, H.; Shen, J.; Wei, L.; Feng, Z. Accelerated Lane-Changing Trajectory Planning of Automated Vehicles with Vehicle-to-Vehicle Collaboration. J. Adv. Transp. 2017, 2017, 8132769. [Google Scholar] [CrossRef] [Green Version]
- Hock, P.; Benedikter, S.; Gugenheimer, J.; Rukzio, E. Carvr: Enabling in-car virtual reality entertainment. In Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems, Denver, CO, USA, 6–11 May 2017; pp. 4034–4044. [Google Scholar]
- Malinverno, M.; Mangues-Bafalluy, J.; Casetti, C.E.; Chiasserini, C.F.; Requena-Esteso, M.; Baranda, J. An Edge-Based Framework for Enhanced Road Safety of Connected Cars. IEEE Access 2020, 8, 58018–58031. [Google Scholar] [CrossRef]
- Bierzynski, K.; Escobar, A.; Eberl, M. Cloud, fog and edge: Cooperation for the future? In Proceedings of the 2017 Second International Conference on Fog and Mobile Edge Computing (FMEC), Valencia, Spain, 8–11 May 2017; pp. 62–67. [Google Scholar]
- Ghosh, S.; Mukherjee, A.; Ghosh, S.K.; Buyya, R. Mobi-iost: Mobility-aware cloud-fog-edge-iot collaborative framework for time-critical applications. IEEE Trans. Netw. Sci. Eng. 2019, 7, 2271–2285. [Google Scholar] [CrossRef] [Green Version]
- Vallati, M. Centralised Versus Decentralised Traffic Optimisation of Urban Road Networks: A Simulation Study. In Proceedings of the 2020 IEEE 5th International Conference on Intelligent Transportation Engineering (ICITE), Beijing, China, 11–13 September 2020; pp. 319–325. [Google Scholar] [CrossRef]
- Duan, L.; Wei, Y.; Zhang, J.; Xia, Y. Centralized and decentralized autonomous dispatching strategy for dynamic autonomous taxi operation in hybrid request mode. Transp. Res. Part C Emerg. Technol. 2020, 111, 397–420. [Google Scholar] [CrossRef]
- Olaverri-Monreal, C.; Lehsing, C.; Trubswetter, N.; Schepp, C.A.; Bengler, K. In-vehicle displays: Driving information prioritization and visualization. In Proceedings of the 2013 IEEE Intelligent Vehicles Symposium (IV), Gold Coast, Australia, 23–26 June 2013; pp. 660–665. [Google Scholar] [CrossRef]
- Siems-Anderson, A.R.; Walker, C.L.; Wiener, G.; Mahoney, W.P., III; Haupt, S.E. An adaptive big data weather system for surface transportation. Transp. Res. Interdiscip. Perspect. 2019, 3, 100071. [Google Scholar] [CrossRef]
- Kamoun, F.; Chaabani, H.; Outay, F.; Yasar, A.-U. A Survey of Approaches for Estimating Meteorological Visibility Distance under Foggy Weather Conditions. IGI Glob. 2020, 65–92. [Google Scholar] [CrossRef] [Green Version]
- Wang, K.; Zhang, W.; Feng, Z.; Yu, H.; Wang, C. Reasonable driving speed limits based on recognition time in a dynamic low-visibility environment related to fog—A driving simulator study. Accid. Anal. Prev. 2021, 154, 106060. [Google Scholar] [CrossRef]
- Hold-Geoffroy, Y.; Sunkavalli, K.; Hadap, S.; Gambaretto, E.; Lalonde, J.F. Deep outdoor illumination estimation. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Honolulu, HI, USA, 21–26 July 2017; pp. 7312–7321. [Google Scholar]
- Wood, J.M. Nighttime driving: Visual, lighting and visibility challenges. Ophthalmic Physiol. Opt. 2020, 40, 187–201. [Google Scholar] [CrossRef] [Green Version]
- Pegin, P.; Sitnichuk, E. The Effect of Sun Glare: Concept, Characteristics, Classification. Transp. Res. Procedia 2017, 20, 474–479. [Google Scholar] [CrossRef]
- Li, X.; Cai, B.Y.; Qiu, W.; Zhao, J.; Ratti, C. A novel method for predicting and mapping the occurrence of sun glare using Google Street View. Transp. Res. Part C Emerg. Technol. 2019, 106, 132–144. [Google Scholar] [CrossRef]
- Glaser, S.; Mammar, S.; Dakhlallah, D. Lateral wind force and torque estimation for a driving assistance. IFAC Proc. Vol. 2008, 41, 5688–5693. [Google Scholar] [CrossRef] [Green Version]
- Li, Y.; Xing, L.; Wang, W.; Wang, H.; Dong, C.; Liu, S. Evaluating impacts of different longitudinal driver assistance systems on reducing multi-vehicle rear-end crashes during small-scale inclement weather. Accid. Anal. Prev. 2017, 107, 63–76. [Google Scholar] [CrossRef]
- Darwish, T.S.; Bakar, K.A. Fog based intelligent transportation big data analytics in the internet of vehicles environment: Motivations, architecture, challenges, and critical issues. IEEE Access 2018, 6, 15679–15701. [Google Scholar] [CrossRef]
- Hirz, M.; Walzel, B. Sensor and object recognition technologies for self-driving cars. Comput. Des. Appl. 2018, 15, 501–508. [Google Scholar] [CrossRef] [Green Version]
- Krumm, J. Ubiquitous Computing Fundamentals; CRC Press: Boca Raton, FL, USA, 2018. [Google Scholar]
- Brush, A.B. Ubiquitous Computing Field Studies. In Ubiquitous Computing Fundamentals; Chapman and Hall/CRC: London, UK, 2018; pp. 175–216. [Google Scholar]
- Saganowski, S.; Kazienko, P.; Dzieżyc, M.; Jakimów, P.; Komoszyńska, J.; Michalska, W.; Dutkowiak, A.; Polak, A.; Dziadek, A.; Ujma, M. Consumer wearables and affective computing for wellbeing support. arXiv 2020, preprint. arXiv:2005.00093. [Google Scholar]
- El-Gayar, O.F.; Ambati, L.S.; Nawar, N. Wearables, artificial intelligence, and the future of healthcare. In AI and Big Data’s Potential for Disruptive Innovation; IGI Global: Hershey, PA, USA, 2020; pp. 104–129. [Google Scholar]
- Hicks, J.L.; Althoff, T.; Sosic, R.; Kuhar, P.; Bostjancic, B.; King, A.C.; Leskovec, J.; Delp, S.L. Best practices for analyzing large-scale health data from wearables and smartphone apps. NPJ Digit. Med. 2019, 2, 45. [Google Scholar] [CrossRef] [PubMed]
- Lou, M.; Abdalla, I.; Zhu, M.; Wei, X.; Yu, J.; Li, Z.; Ding, B. Highly Wearable, Breathable, and Washable Sensing Textile for Human Motion and Pulse Monitoring. ACS Appl. Mater. Interfaces 2020, 12, 19965–19973. [Google Scholar] [CrossRef] [PubMed]
- Liang, R.H.; Yang, S.Y.; Chen, B.Y. Indexmo: Exploring finger-worn RFID motion tracking for activity recognition on tagged objects. In Proceedings of the 23rd International Symposium on Wearable Computers, London, UK, 9–13 September 2019; pp. 129–134. [Google Scholar]
- Bandodkar, A.J.; Jia, W.; Yardımcı, C.; Wang, X.; Ramirez, J.; Wang, J. Tattoo-based noninvasive glucose monitoring: A proof-of-concept study. Anal. Chem. 2015, 87, 394–398. [Google Scholar] [CrossRef]
- Andrew, T.L. The Future of Smart Textiles: User Interfaces and Health Monitors. Matter 2020, 2, 794–795. [Google Scholar] [CrossRef]
- Kurasawa, S.; Ishizawa, H.; Fujimoto, K.; Chino, S.; Koyama, S. Development of Smart Textiles for Self-Monitoring Blood Glucose by Using Optical Fiber Sensor. J. Fiber Sci. Technol. 2020, 76, 104–112. [Google Scholar] [CrossRef]
- Zhou, Z.; Padgett, S.; Cai, Z.; Conta, G.; Wu, Y.; He, Q.; Zhang, S.; Sun, C.; Liu, J.; Fan, E.; et al. Single-layered ultra-soft washable smart textiles for all-around ballistocardiograph, respiration, and posture monitoring during sleep. Biosens. Bioelectron. 2020, 155, 112064. [Google Scholar] [CrossRef]
- Koyama, S.; Sakaguchi, A.; Ishizawa, H.; Yasue, K.; Oshiro, H.; Kimura, H. Vital Sign Measurement Using Covered FBG Sensor Embedded into Knitted Fabric for Smart Textile. J. Fiber Sci. Technol. 2017, 73, 300–308. [Google Scholar] [CrossRef] [Green Version]
- Sinnapolu, G.; Alawneh, S. Integrating wearables with cloud-based communication for health monitoring and emergency assistance. Internet Things 2018, 1–2, 40–54. [Google Scholar] [CrossRef]
- Betancourt Diaz, N.R. Wearables, Big Data and Design Thinking: Perspectives from the Wellbeing Industry. Available online: https://www.politesi.polimi.it/handle/10589/139431 (accessed on 1 July 2021).
- Lin, F.-R.; Windasari, N.A. Continued use of wearables for wellbeing with a cultural probe. Serv. Ind. J. 2018, 39, 1140–1166. [Google Scholar] [CrossRef]
- Persson, N.-K.; Martinez, J.G.; Zhong, Y.; Maziz, A.; Jager, E.W.H. Actuating Textiles: Next Generation of Smart Textiles. Adv. Mater. Technol. 2018, 3, 1700397. [Google Scholar] [CrossRef]
- Kongahage, D.; Foroughi, J. Actuator Materials: Review on Recent Advances and Future Outlook for Smart Textiles. Fibers 2019, 7, 21. [Google Scholar] [CrossRef] [Green Version]
- Rayes, A.; Salam, S. The things in iot: Sensors and actuators. In Internet of Things from Hype to Reality; Springer: Cham, Switzerland, 2017; pp. 57–77. [Google Scholar]
- Kazeem, O.O.; Akintade, O.O.; Kehinde, L.O. Comparative study of communication interfaces for sensors and actuators in the cloud of internet of things. Int. J. Internet Things 2020, 6, 9–13. [Google Scholar]
- Pawlowski, E.; Pawlowski, K.; Trzcielinska, J.; Trzcielinski, S. Designing and management of intelligent, autonomous environment (IAE): The research framework. In Proceedings of the International Conference on Human Systems Engineering and Design: Future Trends and Applications, Pula, Croatia, 22–24 September 2020; Springer: Cham, Switzerland, 2020; pp. 381–386. [Google Scholar]
- Takayama, L. The motivations of ubiquitous computing: Revisiting the ideas behind and beyond the prototypes. Pers. Ubiquitous Comput. 2017, 21, 557–569. [Google Scholar] [CrossRef]
- Ravenswaaij-Arts, C.M.; Kollee, L.A.; Hopman, J.C.; Stoelinga, G.B.; van Geijn, H.P. Heart rate variability. Ann. Intern. Med. 1993, 118, 436–447. [Google Scholar] [CrossRef]
- Ranjan, Y.; Rashid, Z.; Stewart, C.; Conde, P.; Begale, M.; Verbeeck, D.; Boettcher, S.; Dobson, R.; Folarin, A.; Hyve, T.; et al. RADAR-Base: Open Source Mobile Health Platform for Collecting, Monitoring, and Analyzing Data Using Sensors, Wearables, and Mobile Devices. JMIR mHealth uHealth 2019, 7, e11734. [Google Scholar] [CrossRef] [Green Version]
- Fakhrhosseini, S.M.; Jeon, M. How do angry drivers respond to emotional music? A comprehensive perspective on assessing emotion. J. Multimodal User Interfaces 2019, 13, 137–150. [Google Scholar] [CrossRef] [Green Version]
- Abdi, L.; Ben Abdallah, F.; Meddeb, A. In-Vehicle Augmented Reality Traffic Information System: A New Type of Communication Between Driver and Vehicle. Procedia Comput. Sci. 2015, 73, 242–249. [Google Scholar] [CrossRef] [Green Version]
- Abdi, L.; Meddeb, A. Driver information system: A combination of augmented reality, deep learning and vehicular Ad-hoc networks. Multimed. Tools Appl. 2017, 77, 14673–14703. [Google Scholar] [CrossRef]
- Schipor, O.A.; Vatavu, R.D. Towards Interactions with Augmented Reality Systems in Hyper-Connected Cars. EICS Workshops 2019, 2503, 76–82. [Google Scholar]
- Vögel, H.J.; Süß, C.; Hubregtsen, T.; André, E.; Schuller, B.; Härri, J.; Conradt, J.; Adi, A.; Zadorojniy, A.; Terken, J.; et al. Emotion-awareness for intelligent vehicle assistants: A research agenda. In Proceedings of the 2018 IEEE/ACM 1st International Workshop on Software Engineering for AI in Autonomous Systems (SEFAIAS), Gothenburg, Sweden, 28 May 2018; pp. 11–15. [Google Scholar]
- Birek, L.; Grzywaczewski, A.; Iqbal, R.; Doctor, F.; Chang, V. A novel Big Data analytics and intelligent technique to predict driver’s intent. Comput. Ind. 2018, 99, 226–240. [Google Scholar] [CrossRef]
- Michalke, T.; Gepperth, A.; Schneider, M.; Fritsch, J.; Goerick, C. Towards a Human-like Vision System for Resource-Constrained Intelligent Cars. In Proceedings of the International Conference on Computer Vision Systems, Berlin, Germany, 11–14 March 2007. [Google Scholar] [CrossRef]
- Davidsson, S.; Alm, H. Context adaptable driver information–Or, what do whom need and want when? Appl. Ergon. 2014, 45, 994–1002. [Google Scholar] [CrossRef]
- McStay, A. Emotional AI: The Rise of Empathic Media; Sage: Thousand Oaks, CA, USA, 2018. [Google Scholar] [CrossRef]
- McStay, A. Emotional AI, soft biometrics and the surveillance of emotional life: An unusual consensus on privacy. Big Data Soc. 2020, 7, 2053951720904386. [Google Scholar] [CrossRef] [Green Version]
- Braun, M.; Schubert, J.; Pfleging, B.; Alt, F. Improving Driver Emotions with Affective Strategies. Multimodal Technol. Interact. 2019, 3, 21. [Google Scholar] [CrossRef] [Green Version]
- Oehl, M.; Ihme, K.; Pape, A.-A.; Vukelić, M.; Braun, M. Affective Use Cases for Empathic Vehicles in Highly Automated Driving: Results of an Expert Workshop. In International Conference on Human-Computer Interaction; Springer: Cham, Switzerland, 2020; pp. 89–100. [Google Scholar] [CrossRef]
- Frison, A.K.; Wintersberger, P.; Riener, A.; Schartmüller, C. Driving Hotzenplotz: A hybrid interface for vehicle control aiming to maximize pleasure in highway driving. In Proceedings of the 9th International Conference on Automotive User Interfaces and Interactive Vehicular Applications, Oldenburg, Germany, 24–27 September 2017; pp. 236–244. [Google Scholar]
- Caon, M.; Demierre, M.; Abou Khaled, O.; Mugellini, E.; Delaigue, P. Enriching the user experience of a connected car with quantified self. In International Conference on Intelligent Human Systems Integration; Springer: Cham, Switzerland, 2020; pp. 66–72. [Google Scholar]
- Giraldi, L. The Future of User Experience Design in the Interior of Autonomous Car Driven by AI. In International Conference on Intelligent Human Systems Integration; Springer: Cham, Switzerland, 2020; pp. 46–51. [Google Scholar]
- Lindgren, T.; Fors, V.; Pink, S.; Bergquist, M.; Berg, M. On the way to anticipated car UX. In Proceedings of the 10th Nordic Conference on Human-Computer Interaction, Oslo, Norway, 29 September–3 October 2018; pp. 494–504. [Google Scholar]
- Basu, C.; Yang, Q.; Hungerman, D.; Sinahal, M.; Draqan, A.D. Do you want your autonomous car to drive like you? In Proceedings of the 2017 12th ACM/IEEE International Conference on Human-Robot Interaction, Vienna, Austria, 6–9 March 2017; pp. 417–425. [Google Scholar]
- Paredes, P.E.; Balters, S.; Qian, K.; Murnane, E.L.; Ordóñez, F.; Ju, W.; Landay, J.A. Driving with the fishes: Towards calming and mindful virtual reality experiences for the car. Proc. ACM Interact. Mob. Wearable Ubiquitous Technol. 2018, 2, 1–21. [Google Scholar] [CrossRef]
- Silva, F.; Analide, C. Ubiquitous driving and community knowledge. J. Ambient. Intell. Humaniz. Comput. 2017, 8, 157–166. [Google Scholar] [CrossRef]
- Oehl, M.; Ihme, K.; Bosch, E.; Pape, A.A.; Vukelić, M.; Braun, M. Emotions in the age of automated driving-developing use cases for empathic cars. In Mensch und Computer 2019-Workshopband; Gesellschaft für Informatik: Bonn, Germany, 2019. [Google Scholar]
- Fonsalas, F. Holistic HMI Architecture for Adaptive and Predictive Car Interiors. In Electronic Components and Systems for Automotive Applications; Springer: Cham, Switzerland, 2019; pp. 217–227. [Google Scholar]
- Neuhaus, R.; Laschke, M.; Theofanou-Fülbier, D.; Hassenzahl, M.; Sadeghian, S. Exploring the impact of transparency on the interaction with an in-car digital AI assistant. In Proceedings of the 11th International Conference on Automotive User Interfaces and Interactive Vehicular Applications: Adjunct Proceedings, Utrecht, The Netherlands, 21–25 September 2019; pp. 450–455. [Google Scholar]
- Korthauer, A.; Guenther, C.; Hinrichs, A.; Ren, W.; Yang, Y. Watch Your Vehicle Driving at the City: Interior HMI with Augmented Reality for Automated Driving. In Proceedings of the 22nd International Conference on Human-Computer Interaction with Mobile Devices and Services, Oldenburg, Germany, 5–8 October 2020; pp. 1–5. [Google Scholar]
- Liu, H.; Taniguchi, T.; Tanaka, Y.; Takenaka, K.; Bando, T. Visualization of Driving Behavior Based on Hidden Feature Extraction by Using Deep Learning. IEEE Trans. Intell. Transp. Syst. 2017, 18, 2477–2489. [Google Scholar] [CrossRef]
- Dahl, D.A. Multimodal Interaction with W3C Standards; Springer International Publishing: Cham, Switzerland, 2017. [Google Scholar]
- Pesek, M.; Strle, G.; Kavčič, A.; Marolt, M. The Moodo dataset: Integrating user context with emotional and color perception of music for affective music information retrieval. J. New Music. Res. 2017, 46, 246–260. [Google Scholar] [CrossRef]
- Zhang, J.; Wang, B.; Zhang, C.; Xiao, Y.; Wang, M.Y. An EEG/EMG/EOG-Based Multimodal Human-Machine Interface to Real-Time Control of a Soft Robot Hand. Front. Neurorobotics 2019, 13, 7. [Google Scholar] [CrossRef] [Green Version]
- Djamal, E.C.; Fadhilah, H.; Najmurrokhman, A.; Wulandari, A.; Renaldi, F. Emotion brain-computer interface using wavelet and recurrent neural networks. Int. J. Adv. Intell. Inform. 2020, 6, 1–12. [Google Scholar] [CrossRef]
- Nam, C.S.; Nijholt, A.; Lotte, F. Brain–Computer Interfaces Handbook: Technological and Theoretical Advances; CRC Press: Boca Raton, FL, USA, 2018. [Google Scholar]
- Ceccacci, S.; Mengoni, M.; Andrea, G.; Giraldi, L.; Carbonara, G.; Castellano, A.; Montanari, R. A Preliminary Investigation Towards the Application of Facial Expression Analysis to Enable an Emotion-Aware Car Interface. In International Conference on Human-Computer Interaction; Springer: Cham, Switzerland, 2020; pp. 504–517. [Google Scholar]
- Delbouys, R.; Hennequin, R.; Piccoli, F.; Royo-Letelier, J.; Moussallam, M. Music mood detection based on audio and lyrics with deep neural net. arXiv 2018, preprint. arXiv:1809.07276. [Google Scholar]
- Ünal, A.B.; de Waard, D.; Epstude, K.; Steg, L. Driving with music: Effects on arousal and performance. Transp. Res. Part F Traffic Psychol. Behav. 2013, 21, 52–65. [Google Scholar] [CrossRef]
- Amini, R.; Willemsen, M.C.; Graus, M.P. Affective Music Recommender System (MRS): Investigating the Effectiveness and User Satisfaction of different Mood Inducement Strategies. 2019. Available online: https://pure.tue.nl/ws/portalfiles/portal/131839906/Affective_MRS_R._Amini_v1.1.pdf (accessed on 30 June 2021).
- Park, M.; Thom, J.; Mennicken, S.; Cramer, H.; Macy, M. Global music streaming data reveal diurnal and seasonal patterns of affective preference. Nat. Hum. Behav. 2019, 3, 230–236. [Google Scholar] [CrossRef] [PubMed]
- Febriandirza, A.; Chaozhong, W.; Zhong, M.; Hu, Z.; Zhang, H. The Effect of Natural Sounds and Music on Driving Performance and Physiological. Eng. Lett. 2017, 25, 455–463. [Google Scholar]
- Navarro, J.; Osiurak, F.; Gaujoux, V.; Ouimet, M.C.; Reynaud, E. Driving under the influence: How music listening affects driving behaviors. J. Vis. Exp. 2019, 145, e58342. [Google Scholar] [CrossRef]
- Green, P. Crashes Induced by Driver Information Systems and What Can Be Done to Reduce Them (No. 2000-01-C008). Available online: https://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.589.3723&rep=rep1&type=pdf (accessed on 30 June 2021).
- Li, B.; Sano, A. Extraction and Interpretation of Deep Autoencoder-based Temporal Features from Wearables for Forecasting Personalized Mood, Health, and Stress. Proc. ACM Interact. Mob. Wearable Ubiquitous Technol. 2020, 4, 1–26. [Google Scholar] [CrossRef]
- Mikoski, P.; Zlupko, G.; Owens, D.A. Drivers’ assessments of the risks of distraction, poor visibility at night, and safety-related behaviors of themselves and other drivers. Transp. Res. Part F Traffic Psychol. Behav. 2019, 62, 416–434. [Google Scholar] [CrossRef]
- Bran, E.; Bautu, E.; Popovici, D.M. Open Affordable Mixed Reality: A Manifesto. In Proceedings of the 2020 International Conference on Development and Application Systems (DAS), Suceava, Romania, 21–23 May 2020; pp. 177–184. [Google Scholar]
- Augusto, J.C.; Callaghan, V.; Cook, D.; Kameas, A.; Satoh, I. Intelligent environments: A manifesto. Hum. Comput. Inf. Sci. 2013, 3, 1–18. [Google Scholar] [CrossRef] [Green Version]
- Schipor, O.-A.; Vatavu, R.-D.; Vanderdonckt, J. Euphoria: A Scalable, event-driven architecture for designing interactions across heterogeneous devices in smart environments. Inf. Softw. Technol. 2019, 109, 43–59. [Google Scholar] [CrossRef]
- Schipor, O.A.; Vatavu, R.-D. Empirical Results for High-definition Video and Augmented Reality Content De-livery in Hyper-connected Cars. Interact. Comput. 2021, 33, 3–16. [Google Scholar] [CrossRef]
- Bran, E.; Sburlan, D.F.; Popovici, D.M.; Puchianu, C.M.; Băutu, E. In-vehicle Visualization of Data by means of Augmented Reality. Procedia Comput. Sci. 2020, 176, 1487–1496. [Google Scholar] [CrossRef]
- Sburlan, D.F.; Bautu, E.; Puchianu, C.M.; Popovici, D.M. Adaptive Interactive Displaying System for In-Vehicle Use. Procedia Comput. Sci. 2020, 176, 195–204. [Google Scholar] [CrossRef]
- Bran, E.; Bautu, E.; Popovici, D.M.; Braga, V.; Cojuhari, I. Cultural Heritage Interactive Dissemination through Natural Interaction. In Proceedings of the International Conference on Human-Computer Interaction RoCHI, Bucharest, Romania, 17–18 October 2019; pp. 156–161. [Google Scholar]
- Zaiţi, I.-A.; Pentiuc, G.; Vatavu, R.-D. On free-hand TV control: Experimental results on user-elicited gestures with Leap Motion. Pers. Ubiquitous Comput. 2015, 19, 821–838. [Google Scholar] [CrossRef]
- Shao, L. Hand Movement and Gesture Recognition using Leap Motion Controller; Virtual Reality, Course Report. Available online: https://stanford.edu/class/ee267/Spring2016/report_lin.pdf (accessed on 30 June 2021).
- Bautu, E.; Tudose, C.I.; Puchianu, C.M. In-Vehicle System for Adaptive Filtering of Notifications. In Proceedings of the International Conference on Human-Computer Interaction RoCHI, Bucharest, Romania, 17–18 October 2019; pp. 145–151. [Google Scholar]
- Bautu, E.; Puchianu, C.M.; Bran, E.; Sburlan, D.F.; Popovici, D.M. In-Vehicle Software System for Fostering Driver’s Attentiveness. In Proceedings of the 2020 International Conference on Development and Application Systems (DAS), Suceava, Romania, 21–23 May 2020; pp. 151–156. [Google Scholar]
- Naujoks, F.; Wiedemann, K.; Schömig, N.; Hergeth, S.; Keinath, A. Towards guidelines and verification methods for automated vehicle HMIs. Transp. Res. Part F Traffic Psychol. Behav. 2019, 60, 121–136. [Google Scholar] [CrossRef]
- Tan, W.-S.; Liu, D.; Bishu, R. Web evaluation: Heuristic evaluation vs. user testing. Int. J. Ind. Ergon. 2009, 39, 621–627. [Google Scholar] [CrossRef]
Human Machine Interface Evaluation Guidelines [132] | Compliance | Commentary |
---|---|---|
(2) “The system mode should be displayed continuously” | The system displays the inferred state of the driver instead of the state of the system. If notifications are postponed, a pause sign is shown. | When the system changes state, the interface simply switches to the Driving or Wellness panel from which the alert originates. |
(3) “System state changes should be effectively communicated” | The connection state is always on display. Tiredness alerts are actively communicated, as are car and road alerts. | Stress alerts are silent. Most state changes take place silently so as not to distract the driver. |
(5) “HMI elements should be grouped together according to their function to support the perception of mode indicators” | The system groups information into three panels: car, road and driving task related information; personal health and wellbeing related information; and social and media updates related information. | - |
(7) “The visual interface should have a sufficient contrast in luminance and/or colour between foreground and background” | We chose white and fully saturated colors at 50% lightness on a dark background. | - |
(8) “Texts (e.g., font types and size of characters) and symbols should be easily readable from the permitted seating position” | - | The text of notifications on the Social panel is not very short, nor is the font especially large. The system displays notifications only when it considers it relatively safe for the driver to check them. |
(9) “Commonly accepted or standardized symbols should be used to communicate the automation mode. Use of non-standard symbols should be supplemented by additional text explanations or vocal phrase/s” | We chose colored icons to express the state of the driver, the car, and the road. For the driver’s state, red means stress and blue means tiredness. For information gathered from the car via the OBD-II interface, we communicate warnings. Weather is shown with conventional icons, while illumination uses color codes: blue shades for the sky and orange for sun-glare hazards. | - |
(10) “The semantic of a message should be in accordance with its urgency” | Message semantics are communicated through text and color. | - |
(12) “Text messages should be as short as possible” | - | The text body of smartphone notifications is displayed in full. |
(13) “Not more than five colours should be consistently used to code system states (excluding white and black)” | The connection state between the system, the server, and other components is coded green for connected and red for disconnected. | We express pulse, tiredness/stress, and music states through the full spectrum of colors between red and blue, inclusive. |
(14) “The colours used to communicate system states should be in accordance with common conventions and stereotypes” | Red stands for stress, high pulse, energetic music, and hazards; green for an optimal physiological state; blue for tiredness, low pulse, and calming music. | We use all colors between red and blue because the spectrum is mapped to every possible value/meaning: pulse is mapped over continuous intervals, and the alertness state fades back to green after a red or blue alert. |
(16) “Auditory output should raise the attention of the driver without startling her/him or causing pain” | We treat the pleasantness of the system’s multimodal outputs as a high priority. | - |
(17) “Auditory and vibrotactile output should be adapted to the urgency of the message” | The system raises alerts only for vehicle-related problems, environmental hazards, detected driver tiredness, and low environmental stimulation that may lead to sleep. These alerts are actively communicated. | Stress alerts would also rank high in priority, but we chose not to distract an already stressed driver further: the system simply shows the Wellness panel with a red facial icon and silently recommends calming music. The driver may smile to agree. |
(18) “High-priority messages should be multimodal” | The system uses both audio output and vibration output. | - |
(19) “Warning messages should orient the user towards the source of danger” | Every time an alert is shown, the system switches to the specific information panel. | - |
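The color scheme described in rows (7), (13) and (14) above, fully saturated hues at 50% lightness with blue coding low values and red coding high ones, can be sketched as a linear hue mapping. The function name and the bpm thresholds below are illustrative assumptions, not values taken from the paper:

```python
import colorsys

def state_to_rgb(value, vmin=50.0, vmax=120.0):
    """Map a continuous reading (e.g., pulse in bpm; the 50-120 range is
    an assumed example) to a color on the red-blue spectrum: low values
    map to blue (hue 240 degrees), high values to red (hue 0 degrees),
    with the midpoint landing on green, per common color stereotypes.
    Colors are fully saturated at 50% lightness, as used against a dark
    background in the IVIS panels."""
    # Normalize and clamp the reading to [0, 1].
    t = min(max((value - vmin) / (vmax - vmin), 0.0), 1.0)
    # Interpolate hue from 240 degrees (blue) down to 0 degrees (red).
    hue = (1.0 - t) * (240.0 / 360.0)
    # colorsys takes (hue, lightness, saturation), each in [0, 1].
    r, g, b = colorsys.hls_to_rgb(hue, 0.5, 1.0)
    return tuple(round(c * 255) for c in (r, g, b))
```

With these assumed bounds, a pulse of 50 bpm yields pure blue, 120 bpm pure red, and the midpoint pure green, so every intermediate value receives its own hue, which is consistent with the commentary that the full red-to-blue spectrum is used.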
[Table: SWOT analysis of the IVIS, with Strengths, Weaknesses, Opportunities, and Threats quadrants; the cell contents were not preserved in extraction.]
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
© 2021 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Bran, E.; Bautu, E.; Sburlan, D.F.; Puchianu, C.M.; Popovici, D.M. Ubiquitous Computing: Driving in the Intelligent Environment. Mathematics 2021, 9, 2649. https://doi.org/10.3390/math9212649