Proceeding Paper

Beyond the Physical Environment: Integrating Individual Perception for Context-Related Adaptation †

1 Department of Communication Engineering and Technologies, Technical University of Varna, 9010 Varna, Bulgaria
2 Department of Software and Internet Technologies, Technical University of Varna, 9010 Varna, Bulgaria
* Author to whom correspondence should be addressed.
Presented at the International Conference on Electronics, Engineering Physics and Earth Science (EEPES’24), Kavala, Greece, 19–21 June 2024.
Eng. Proc. 2024, 70(1), 29; https://doi.org/10.3390/engproc2024070029
Published: 7 August 2024

Abstract

Current approaches to context-related studies primarily emphasize monitoring temporal changes in the physical environment to achieve context awareness. However, these approaches often overlook the individual’s subjective perception of these contextual changes. Even psychological studies tend to attribute contextual influences solely to external parameters, neglecting the nuanced human perception of such changes. In this paper, we propose a novel concept for context-related adaptation rooted in the individual’s perception of contextual changes. To evaluate this concept, we conducted an experiment involving 18 volunteers to assess the contextual influence of controlled stimuli on each participant individually. Through the collection of both objective data and self-reported subjective assessments, we present initial results indicating the potential for integrating individual perception into context-related adaptation. These findings strongly suggest that contextual changes affect each person in accordance with their personal traits, underscoring the relevance of the proposed concept and the need for future research in this direction. Several challenges remain, particularly in developing a reliable model for assessing human subjective perception; these include obtaining sufficient data from multiple modalities and implementing real domain-specific context scenarios.

1. Introduction

With the development of AI technologies, the field of human–machine collaboration (HMC) is also changing rapidly. The wider acceptance of such systems in industry implementation necessitates a thorough analysis of the cause-and-effect relationships between the participating agents. Since most modern human–machine interaction (HMI) and HMC systems incorporate a human-centered approach [1,2,3], the machine side of the system adapts its behavior correspondingly to the behavior of the human. Some of these systems also consider the dynamics of the surrounding environment (the context), whose variability changes the parameters of the system and often leads to decreased performance or other undesired consequences if this impact cannot be controlled or offset.
Most recent studies underline the importance of context-awareness in systems, which includes the development of various context-aware tools and modalities for context recognition, interpretation, and assessment. Ref. [4] proposes an architecture for a Human–Machine Interface Controller (HMIC) based on bidirectional, multimodal communication (sensors/actuators) between humans and machines, relying on client–server technology and the implementation of wearable devices.
The inclusion of a large number of modalities and monitoring tools raises the question of their processing and the subsequent use of their data for context-related adaptation. The state-of-the-art approach used in the most recent studies [5,6,7] incorporates the use of ontologies for the formal representation of knowledge, enhancing the decision-making process and subsequent context-related adaptation.
Context-related studies commonly treat the context as a temporal state of the physical environment and base their context awareness on monitoring changes in that state. Even purely psychological studies [8] attribute context influence only to external system parameters, such as the presence or absence of auditory feedback. To the authors’ knowledge, none of this research considers the individual human perception of these changes. The current study therefore focuses on the following research question (RQ): “Is an individualized assessment/adaptation, based on person-specific perception of context changes, possible?”.

2. Materials and Methods

2.1. Conceptual Architecture

In response to this question, we propose a concept of personalized context-related adaptation (Figure 1). This concept enhances the HMC system’s context awareness by incorporating both environmental monitoring and personal context impact assessment. It considers individual differences in human perception of identical contextual changes.
The concept builds on the previously proposed intelligent Human–Machine Interface framework (iHMIfr) [9] as well as the iHMIfr workflow [10]. The novelty in the current paper lies in the first three modules of the context-related loop, depicted in green in Figure 1. The remaining modules are consistent with those described in [9] and are explained in detail there.

2.1.1. Context State Recognition

Changes in the context affect both the human and machine sides of the collaboration system. In this regard, the Context State Recognition module encompasses the machine-side perception of the current context state. It relies on a multimodal set of inputs such as cameras, temperature sensors, microphones, and other relevant sensors, depending on the specific environment. This module has two main functionalities: the detection of context changes (e.g., changes in the auditory environment) and the interpretation of these changes (e.g., type of noise, volume, etc.).
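As a rough illustration of the module's two functionalities, the sketch below separates the detection of a context change from its interpretation. All names, thresholds, and classification rules here are hypothetical assumptions, not from the paper:

```python
from dataclasses import dataclass

@dataclass
class ContextEvent:
    """An interpreted context change (hypothetical structure)."""
    modality: str     # e.g. "audio" or "light"
    kind: str         # e.g. "speech_noise", "ambient"
    magnitude: float  # normalized to 0..1

def detect_change(prev_level: float, curr_level: float, threshold: float = 0.2) -> bool:
    """Detection: flag a context change when a sensor level shifts beyond a threshold."""
    return abs(curr_level - prev_level) > threshold

def interpret_audio(level: float) -> ContextEvent:
    """Interpretation: classify the detected change (illustrative rule only)."""
    kind = "speech_noise" if level > 0.5 else "ambient"
    return ContextEvent(modality="audio", kind=kind, magnitude=min(level, 1.0))

# A quiet baseline followed by loud speech-like noise triggers an event:
if detect_change(prev_level=0.1, curr_level=0.7):
    event = interpret_audio(0.7)
```

In a real deployment, each modality (camera, microphone, temperature sensor) would feed its own detector and interpreter into a shared event stream.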

2.1.2. Context Impact Assessment

The primary functionality of this module is to capture and interpret the individual perception of the human participant in the collaboration regarding changes in the context. This functionality adds value to the decision-making process. Humans exhibit considerable variation in their responses to environmental stimuli; for instance, one individual may be highly distracted by surrounding noise, while another may not find noise distracting but may lose focus in unfavorable lighting conditions. To effectively acquire and utilize such knowledge for each human participant, the module requires multimodal inputs to assess the human state of mind, along with reliable technologies for signal and data processing and interpretation. These technologies include, but are not limited to, computer vision (e.g., cameras and image recognition algorithms for assessing cognitive and emotional states) and physiological signals collected through wearable devices, along with corresponding algorithms for interpreting the current human state.

2.1.3. Context Monitoring and Profiling

The context monitoring and profiling module relies on a sophisticated ontology that integrates two essential types of information: machine-interpreted context states and the human participant’s interpreted perception of context changes. In cases of discrepancies between these sources, the system prioritizes the individual’s perception to ensure an accurate reflection of the human experience in the system’s adaptive responses.
To achieve a nuanced understanding of the collaborative task environment, the module defines a set of context profiles tailored to the specificities of both the task and the surrounding conditions. Each context profile captures the unique characteristics of various scenarios, enabling the system to implement corresponding adaptation strategies effectively. These strategies support the decision-making process, ensuring contextually appropriate system responses that enhance overall performance.
The ontology-based approach aims to ensure a comprehensive and dynamic understanding of the context, facilitating the seamless integration of human perceptual data with machine interpretations. This dual consideration not only enhances system adaptability but also significantly improves user experience by aligning the adaptive system responses more closely with human expectations and perceptions.
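The prioritization rule at the heart of the module can be sketched as follows; the profile names, states, and adaptation strategies are illustrative assumptions, and only the human-over-machine precedence comes from the text:

```python
from typing import Optional

# Hypothetical context profiles mapping an interpreted context state to an
# adaptation strategy; these names are illustrative, not from the paper.
PROFILES = {
    "noisy":  "increase_visual_feedback",
    "glare":  "raise_display_contrast",
    "normal": "no_adaptation",
}

def select_profile(machine_state: str, human_state: Optional[str]) -> str:
    """Resolve discrepancies by prioritizing the individual's perceived state
    over the machine interpretation, falling back to the machine-interpreted
    state when no human assessment is available."""
    state = human_state if human_state is not None else machine_state
    return PROFILES.get(state, "no_adaptation")
```

For example, if the sensors report a normal environment but the person is assessed as distracted by noise, the human-perceived state wins and the noise-related strategy is applied.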

2.2. Experimental Setup

The experimental setup builds upon the configuration used to validate the task-related adaptation [11] and the concept of human-related adaptation based on the emotional intelligence of the machine [12]. The objective of the current experiment is to validate the architecture proposed in Section 2 as a response to the research question posed in Section 1.
The synthesized workflow for adaptation based on iHMIfr is illustrated in Figure 2.
In the preceding experiments related to the iHMIfr conceptual framework, dedicated software for data acquisition during collaborative tasks was developed. The task was based on the T-Load_D-back_Indv cognitive test. Initially, the application and decision managers, along with opportunities for setting different adaptive strategies, were developed. Task-related adaptation was then realized and validated based on these components.
Subsequently, the user model was validated through image recognition and interpretation of the current emotional and cognitive states, level of attention, etc., using the MorphCast platform [13]. A dataset consisting of 21 volunteers, along with their states during a 15 min test, as well as their current performance, reaction time, etc., was compiled and published. Extensive analysis of the data validated the concept for the implementation of emotional intelligence through the usage of the iHMIfr [12].
For the current experiment, the existing experimental setup was expanded to include controlled changes in the environment by introducing unfavorable light conditions and noise as external distractors for the collaborative system (context-related distractors). Eighteen volunteers participated in this experiment, and a new dataset was collected. Customized questionnaires were prepared concurrently with the experiment to collect self-reports regarding the influence of the distractors on each participant.
The controlled environment was implemented using an Arduino controller, an adjustable desk lamp positioned above the monitor to simulate unfavorable light conditions, and background noise played through two speakers located on either side of the participant. The noise was of the “neighbour speaking” type and was prepared in advance as an .mp3 file.
Three scenarios of context-related changes were applied: LIGHT, NOISE, and BOTH.
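A minimal sketch of how such a session could be scripted is shown below. The interval boundaries are assumed for illustration, as the paper does not specify the exact timing of the three distractor phases within the 15 min test:

```python
from typing import Optional

# Assumed timing of the three distractor intervals within the 15-minute
# session; the actual boundaries are not specified in the paper.
SCENARIOS = [
    ("LIGHT", 3 * 60, 6 * 60),    # unfavorable light only
    ("NOISE", 7 * 60, 10 * 60),   # "neighbour speaking" noise only
    ("BOTH", 11 * 60, 14 * 60),   # both distractors simultaneously
]

def active_distractor(t_seconds: int) -> Optional[str]:
    """Return the scenario active at time t, or None during baseline phases."""
    for name, start, end in SCENARIOS:
        if start <= t_seconds < end:
            return name
    return None
```

A controller loop would poll this schedule and switch the lamp and speakers accordingly, keeping baseline (distractor-free) segments between the intervals for comparison.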

3. Results

The results presented in Figure 3 illustrate the chronological changes in human-related parameters (arousal, valence, and attention) used for emotional and cognitive assessment, collected from three different users during their 15 min participation. As anticipated, the data differ significantly between users, but the impacts of the three time intervals of distraction (LIGHT, NOISE, and BOTH) on the monitored parameters are clearly evident and vary individually.
Similar to the data collected through the software application, the responses provided in the self-reporting questionnaires (Figure 4) exhibit varying perceptions of the three intervals as reported by the users. The rankings range from 0 to 5 points for each distractor’s impact.
Both the experimental results and the self-assessment reports confirm that the perception of the dynamics of context-related parameters strongly depends on personality. While physical measurement and monitoring of these parameters provide basic knowledge about the context, a reliable assessment of the person-specific impact experienced by the human side of the HMC system would be far more beneficial for applying context-adaptation strategies.
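One simple way to combine the two sources would be a weighted blend of a normalized objective change with the 0–5 self-report score; the linear form and the weight are assumptions for illustration, not the authors' method:

```python
def perceived_impact(objective_delta: float, self_report: int, w: float = 0.5) -> float:
    """Blend a normalized objective change (0..1, e.g. the shift in measured
    attention during a distractor interval) with a 0-5 self-report score.
    Both the weight w and the linear form are illustrative assumptions."""
    if not 0 <= self_report <= 5:
        raise ValueError("self-report score must be in 0..5")
    return w * objective_delta + (1 - w) * (self_report / 5.0)
```

A per-person impact score of this kind could then feed the context impact assessment module, replacing a one-size-fits-all environmental threshold.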
Future work extending these findings will focus on modeling the recognition and reliable interpretation of context-related perception. Upcoming experiments will validate the concept using multimodal input signals (e.g., physiological).

4. Conclusions

In this paper, we propose a concept for context-related adaptation based on individual perceptions of context changes, in response to the research question posed in Section 1. An experiment involving 18 volunteers was conducted to individually assess the contextual influence of controlled distractors. The acquired data from each participant, along with their self-reported subjective assessments, yielded the initial results presented above. Overall, these results demonstrate the potential of integrating the individual perception of context changes for context-related adaptation. However, challenges remain in creating a model for reliable assessment of this perception, primarily related to acquiring sufficient data from multiple modalities and implementing real domain-specific context scenarios. These challenges will be addressed in forthcoming experiments.

Author Contributions

Conceptualization, Y.K. and M.M.; methodology Y.K. and M.M.; software, Y.K.; validation, M.M. and Y.K.; formal analysis, M.M.; investigation, M.M. and Y.K.; resources, M.M. and Y.K.; data curation, Y.K. and M.M.; writing—original draft preparation, Y.K. and M.M.; writing—review and editing, M.M.; visualization, Y.K. and M.M.; supervision, M.M.; project administration, M.M.; funding acquisition, M.M. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by THE BULGARIAN NATIONAL SCIENCE FUND, grant number FNI KP-06-N37/18, entitled “Investigation on intelligent human-machine interaction interfaces, capable of recognizing high-risk emotional and cognitive conditions”.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Data Availability Statement

Data are presented in the article.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. He, H.; Gray, J.; Cangelosi, A.; Meng, Q.; McGinnity, T.M.; Mehnen, J. The Challenges and Opportunities of Human-Centered AI for Trustworthy Robots and Autonomous Systems. IEEE Trans. Cogn. Dev. Syst. 2022, 14, 1398–1412. [Google Scholar] [CrossRef]
  2. Rodriguez-Conde, I.; Campos, C. Towards Customer-Centric Additive Manufacturing: Making Human-Centered 3D Design Tools through a Handheld-Based Multi-Touch User Interface. Sensors 2020, 20, 4255. [Google Scholar] [CrossRef] [PubMed]
  3. Horberry, T.; Mulvihill, C.; Fitzharris, M.; Lawrence, B.; Lenne, M.; Kuo, J.; Wood, D. Human-Centered Design for an In-Vehicle Truck Driver Fatigue and Distraction Warning System. IEEE Trans. Intell. Transport. Syst. 2022, 23, 5350–5359. [Google Scholar] [CrossRef]
  4. Kardos, C.; Kemény, Z.; Kovács, A.; Pataki, B.E.; Váncza, J. Context-dependent multimodal communication in human-robot collaboration. Procedia CIRP 2018, 72, 15–20. [Google Scholar] [CrossRef]
  5. Cabrera, O.; Franch, X.; Marco, J. Ontology-based context modeling in service-oriented computing: A systematic mapping. Data Knowl. Eng. 2017, 110, 24–53. [Google Scholar] [CrossRef]
  6. Cabrera, O.; Oriol, M.; Franch, X.; Marco, J. A context-aware monitoring architecture for supporting system adaptation and reconfiguration. Computing 2021, 103, 1621–1655. [Google Scholar] [CrossRef]
  7. Caro, M.F.; Cox, M.T.; Toscano-Miranda, R.E. A Validated Ontology for Metareasoning in Intelligent Systems. J. Intell. 2022, 10, 113. [Google Scholar] [CrossRef] [PubMed]
  8. Neszmélyi, B.; Horváth, J. The role of auditory context in action-effect-related motor adaptation. Hum. Mov. Sci. 2019, 67, 102503. [Google Scholar] [CrossRef] [PubMed]
  9. Markov, M.; Ganchev, T. Intelligent human-machine interface framework. Int. J. Adv. Electron. Comput. Sci. 2022, 9, 41–46. [Google Scholar]
  10. Feradov, F.; Markova, V.; Ganchev, T. Automated Detection of Improper Sitting Postures in Computer Users Based on Motion Capture Sensors. Computers 2022, 11, 116. [Google Scholar] [CrossRef]
  11. Markov, M.; Kalinin, Y.; Ganchev, T. A Task-related Adaptation in Intelligent Human-Machine Interfaces. In Proceedings of the International Conference on Communications, Information, Electronic and Energy Systems (CIEES), Veliko Tarnovo, Bulgaria, 24–26 November 2022; IEEE: Piscataway, NJ, USA, 2022; pp. 1–4. [Google Scholar] [CrossRef]
  12. Markov, M.; Kalinin, Y.; Markova, V.; Ganchev, T. Towards Implementation of Emotional Intelligence in Human–Machine Collaborative Systems. Electronics 2023, 12, 3852. [Google Scholar] [CrossRef]
  13. MorphCast: Emotion AI Provider, Facial Emotion Recognition. 2023. Available online: https://www.morphcast.com (accessed on 22 April 2024).
Figure 1. Conceptual architecture of an individualized context-related adaptation for collaborative human–machine interaction systems.
Figure 2. Workflow of decisions for adaptation based on the iHMIfr. The experimental validation focus is depicted in green.
Figure 3. Personal reactions to the context changes from different participants. The sequence of the distractors is as follows: LIGHT; NOISE; and BOTH.
Figure 4. Subjective assessment of the context state impact reported by the participants.

Citation: Kalinin, Y.; Markov, M. Beyond the Physical Environment: Integrating Individual Perception for Context-Related Adaptation. Eng. Proc. 2024, 70, 29. https://doi.org/10.3390/engproc2024070029