Editorial

Assistive Robots for Healthcare and Human–Robot Interaction

by Grazia D’Onofrio 1,* and Daniele Sancarlo 2
1 Clinical Psychology Service, Health Department, Fondazione IRCCS Casa Sollievo della Sofferenza, San Giovanni Rotondo, 71013 Foggia, Italy
2 Complex Unit of Geriatrics, Department of Medical Sciences, Fondazione IRCCS Casa Sollievo della Sofferenza, San Giovanni Rotondo, 71013 Foggia, Italy
* Author to whom correspondence should be addressed.
Sensors 2023, 23(4), 1883; https://doi.org/10.3390/s23041883
Submission received: 27 December 2022 / Revised: 27 January 2023 / Accepted: 29 January 2023 / Published: 8 February 2023
(This article belongs to the Special Issue Assistive Robots for Healthcare and Human-Robot Interaction)
Assistive robots are still mostly prototypes that only remotely recall human interactive dynamics. Researchers are working to create, in record time, machines that can assist us, encourage us, teach us, and support precision and heavy work activities, becoming, depending on their purpose of use, a nearly perfect interactive partner. To do this, however, it will be necessary to go far beyond the already advanced robotic implementations that make non-verbal communication the pivot on which human–robot interaction (HRI) is based.
The study of HRI is a relatively new research area. The field of investigation is broad and diverse, and both hardware and software design processes face interesting open challenges. The scientific community is currently examining various fields of application and uses of assistive robots. For these reasons, the development of HRI draws on the contributions of various disciplines, ranging from those with a more mathematical-engineering imprint to the more humanistic sciences. The theme also opens up numerous ethical and legal issues, which are still being explored. Thanks to the arrival on the market of relatively affordable models, assistive robots are gradually spreading in society and can provide initial support to human activities, albeit still in an experimental form and in structured environments.
By definition, interaction implies communication. In light of this assumption, research in the HRI field is increasingly focused on the development of robots equipped with intelligent communicative abilities, in particular speech-based natural language conversational abilities. These efforts relate directly to the research area of computational linguistics, generally defined as “the subfield of computer science concerned with using computational techniques to learn, understand, and produce human language content”. The advances and results in computational linguistics provide a foundational background for the development of so-called Spoken Dialogue Systems, i.e., computer systems designed to interact with humans using spoken natural language. The ability to communicate using natural language is a fundamental requirement for a robot that interacts with a human being; spoken dialogue is therefore generally considered the most natural modality for social human–robot interaction.
The main aim of this Special Issue has been to advance the novel technologies applied in healthcare processes that have shown exceptional promise in HRI models. The first important question concerns the modalities a robot needs in order to sense a person’s emotional state. The second is the problem of modeling the interaction between human and robot, not only on a haptic level but also on an emotional level.
The Special Issue is a collection of papers targeting an audience of practicing researchers, academics, and other scientists from Canada, France, Greece, Italy, Japan, Korea, Poland, Saudi Arabia, Spain, Taiwan, and the USA. Its contents were written by multiple authors and edited by experts in clinical and research fields. The contributions increase the knowledge on assistive robots and HRI as enablers of the caregiving process, potentially enhancing patients’ well-being and decreasing caregiver workload.
In the first study, Kim and colleagues [1] discussed studies on care robots and the human-centered artificial intelligence framework, presented an ethical design for the sensing services of care robots, and reported the development of a care robot for frail older adult users.
In the second study, Aygun et al. [2] analyzed and modeled data from a multi-modal simulated driving study specifically designed to evaluate different levels of cognitive workload induced by various secondary tasks, such as dialogue interactions and braking events, in addition to the primary driving task. They performed statistical analyses of various physiological signals, including eye gaze, electroencephalography, and arterial blood pressure, from healthy volunteers and applied several machine learning methods, including k-nearest neighbor, naive Bayes, random forest, support-vector machine, and neural network-based models, to infer human cognitive workload levels.
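A minimal sketch of the kind of classifier comparison described above is given below; it does not reproduce the authors’ pipeline, and the feature matrix, labels, and hyperparameters are placeholders chosen only to make the example runnable.

```python
# Illustrative sketch only: cross-validated comparison of the classifier
# families mentioned above on a hypothetical workload-classification task.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.neighbors import KNeighborsClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.ensemble import RandomForestClassifier
from sklearn.svm import SVC
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)
X = rng.normal(size=(120, 12))       # placeholder gaze/EEG/blood-pressure features
y = rng.integers(0, 3, size=120)     # placeholder workload levels (low/medium/high)

models = {
    "k-NN": KNeighborsClassifier(n_neighbors=5),
    "Naive Bayes": GaussianNB(),
    "Random forest": RandomForestClassifier(n_estimators=200, random_state=0),
    "SVM": SVC(kernel="rbf"),
    "Neural network": MLPClassifier(hidden_layer_sizes=(32,), max_iter=1000, random_state=0),
}

for name, model in models.items():
    # Standardize features before each classifier and score with 5-fold CV.
    pipe = make_pipeline(StandardScaler(), model)
    scores = cross_val_score(pipe, X, y, cv=5)
    print(f"{name}: mean accuracy = {scores.mean():.2f}")
```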
In the third study, Michel et al. [3] presented a prototype robot capable of supporting a surgeon during otological surgery. Based on the observation that patients may move or wake up during an operation and that the surgeon must regularly clean the endoscope optics, a new robot architecture was presented.
In the fourth study, Blanco-Angulo and colleagues [4] discussed the feasibility and validation of a microwave antenna-based imaging system for intra-operative surgical navigation. The authors reported that the experimental assessment of the proposed system showed accuracies and errors consistent with approaches based on other technologies found in the literature, thus motivating further studies.
In the fifth study, Grazia D’Onofrio et al. [5] assessed whether traditional machine learning algorithms could be used to recognize each user’s emotions separately, compared emotion recognition across two robotic modalities (a static versus a moving robot), and evaluated the acceptability and usability of an assistive robot from the end-user point of view. The authors reported that the random forest algorithm outperformed the k-nearest neighbor algorithm in terms of accuracy and execution time, and that the robot was not a disturbing factor in the arousal of emotions.
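The accuracy-versus-execution-time comparison mentioned above can be illustrated with the small sketch below; the emotion features, class labels, and model settings are assumptions for demonstration only, not the EMOTIVE project’s data or code.

```python
# Illustrative sketch only: contrast random forest and k-NN on a hypothetical
# emotion-feature matrix, reporting cross-validated accuracy and wall-clock time.
import time
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.ensemble import RandomForestClassifier
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(1)
X = rng.normal(size=(200, 20))       # placeholder per-user emotion features
y = rng.integers(0, 6, size=200)     # placeholder emotion classes

for name, model in {
    "Random forest": RandomForestClassifier(n_estimators=100, random_state=0),
    "k-NN": KNeighborsClassifier(n_neighbors=7),
}.items():
    start = time.perf_counter()
    scores = cross_val_score(model, X, y, cv=5)
    elapsed = time.perf_counter() - start
    print(f"{name}: accuracy {scores.mean():.2f}, time {elapsed:.2f} s")
```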
In the sixth study, Slawomir Tobis et al. [6] focused on technology acceptance. They asked whether the possibility of interacting with the technology had an impact on the scores awarded by the respondents in the various domains of needs and requirements for social robots to be deployed in the care of older adults. The authors concluded that pre-implementation studies and assessments should include the possibility of interacting with the robot to provide its future users with a clear idea of the technology and to facilitate the necessary customizations of the machine.
In the seventh study, Kazuyuki Matsumoto and colleagues [7] focused on topic break detection in interview dialogue systems, proposing a method based on a multi-task learning neural network that uses sentence embeddings to capture the context of the text and utilizes the intention of an utterance as a feature.
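To make the multi-task idea concrete, a minimal sketch of a shared encoder with two heads (utterance intention and topic-break detection) is shown below; the dimensions, the joint loss, and the overall layout are assumptions for illustration and are not the architecture reported in [7].

```python
# Illustrative sketch only: multi-task network over sentence embeddings with
# an intention head and a topic-break head trained with a summed loss.
import torch
import torch.nn as nn

class MultiTaskDialogueModel(nn.Module):
    def __init__(self, embed_dim=384, hidden_dim=128, n_intents=10):
        super().__init__()
        self.shared = nn.Sequential(nn.Linear(embed_dim, hidden_dim), nn.ReLU())
        self.intent_head = nn.Linear(hidden_dim, n_intents)   # utterance intention
        self.break_head = nn.Linear(hidden_dim, 2)            # topic break: yes/no

    def forward(self, sentence_embedding):
        h = self.shared(sentence_embedding)
        return self.intent_head(h), self.break_head(h)

model = MultiTaskDialogueModel()
emb = torch.randn(8, 384)                                     # placeholder sentence embeddings
intent_logits, break_logits = model(emb)
loss = nn.functional.cross_entropy(intent_logits, torch.randint(0, 10, (8,))) \
     + nn.functional.cross_entropy(break_logits, torch.randint(0, 2, (8,)))
loss.backward()
```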
In the following manuscript, Chris Lytridis et al. [8] presented novel tools for the analysis of human behavior data in robot-assisted special education for children with autism spectrum disorder (ASD). The proposed tools support an understanding of human behavior in response to an array of robot actions and an improved intervention design based on suitable mathematical instruments.
Grazia D’Onofrio and colleagues [9] determined the needs and preferences of older people and their caregivers for improving healthy and active aging and for guiding the development of a technological system. The authors also highlighted the importance of pre-implementation studies for improving the acceptance of technological systems by end-users.
Afterwards, Hsiao-Kuan Wu et al. [10] showed that an accompanying robot can follow the user at a designated position while the user moves forward, backward, and laterally, turns, and walks along a curve.
Nan Liang and Goldie Nejat [11] presented the first comprehensive investigation and meta-analysis of two types of robotic presence (remote and in-person) to determine how they influence HRI outcomes and impact user tasks.
In the final manuscript, Amal Alabdulkareem and colleagues [12] presented a systematic review and asserted that robot-assisted therapy is a promising field of application for intelligent social robots, especially for supporting children with ASD in achieving their therapeutic and educational objectives (social and emotional development, communication and interaction development, cognitive development, motor development, sensory development, and areas other than developmental ones).
In light of this Special Issue, as the area of social robotics and HRI grows, public demonstrations have the potential to provide insights into robot and system effectiveness in public settings and into people’s reactions. One remaining challenge is that, although modeling the dynamics of expressions and emotions has been extensively studied in the literature, how to model personality in a time-continuous manner remains an open problem.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Kim, J.-W.; Choi, Y.-L.; Jeong, S.-H.; Han, J. A Care Robot with Ethical Sensing System for Older Adults at Home. Sensors 2022, 22, 7515.
  2. Aygun, A.; Nguyen, T.; Haga, Z.; Aeron, S.; Scheutz, M. Investigating Methods for Cognitive Workload Estimation for Assistive Robots. Sensors 2022, 22, 6834.
  3. Michel, G.; Bordure, P.; Chablat, D. A New Robotic Endoscope Holder for Ear and Sinus Surgery with an Integrated Safety Device. Sensors 2022, 22, 5175.
  4. Blanco-Angulo, C.; Martínez-Lozano, A.; Juan, C.G.; Gutiérrez-Mazón, R.; Arias-Rodríguez, J.; Ávila-Navarro, E.; Sabater-Navarro, J.M. Validation of an RF Image System for Real-Time Tracking Neurosurgical Tools. Sensors 2022, 22, 3845.
  5. D’Onofrio, G.; Fiorini, L.; Sorrentino, A.; Russo, S.; Ciccone, F.; Giuliani, F.; Sancarlo, D.; Cavallo, F. Emotion Recognizing by a Robotic Solution Initiative (EMOTIVE Project). Sensors 2022, 22, 2861.
  6. Tobis, S.; Piasek, J.; Cylkowska-Nowak, M.; Suwalska, A. Robots in Eldercare: How Does a Real-World Interaction with the Machine Influence the Perceptions of Older People? Sensors 2022, 22, 1717.
  7. Matsumoto, K.; Sasayama, M.; Kirihara, T. Topic Break Detection in Interview Dialogues Using Sentence Embedding of Utterance and Speech Intention Based on Multitask Neural Networks. Sensors 2022, 22, 694.
  8. Lytridis, C.; Kaburlasos, V.G.; Bazinas, C.; Papakostas, G.A.; Sidiropoulos, G.; Nikopoulou, V.-A.; Holeva, V.; Papadopoulou, M.; Evangeliou, A. Behavioral Data Analysis of Robot-Assisted Autism Spectrum Disorder (ASD) Interventions Based on Lattice Computing Techniques. Sensors 2022, 22, 621.
  9. D’Onofrio, G.; Fiorini, L.; Toccafondi, L.; Rovini, E.; Russo, S.; Ciccone, F.; Giuliani, F.; Sancarlo, D.; Cavallo, F. Pilots for Healthy and Active Ageing (PHArA-ON) Project: Definition of New Technological Solutions for Older People in Italian Pilot Sites Based on Elicited User Needs. Sensors 2022, 22, 163.
  10. Wu, H.-K.; Chen, P.-Y.; Wu, H.-Y.; Yu, C.-H. User Local Coordinate-Based Accompanying Robot for Human Natural Movement of Daily Life. Sensors 2021, 21, 3889.
  11. Liang, N.; Nejat, G. A Meta-Analysis on Remote HRI and In-Person HRI: What Is a Socially Assistive Robot to Do? Sensors 2022, 22, 7155.
  12. Alabdulkareem, A.; Alhakbani, N.; Al-Nafjan, A. A Systematic Review of Research on Robot-Assisted Therapy for Children with Autism. Sensors 2022, 22, 944.