
Multisensory AI for Human–Robot Interaction

A special issue of Sensors (ISSN 1424-8220). This special issue belongs to the section "Intelligent Sensors".

Deadline for manuscript submissions: 20 January 2025

Special Issue Editors


Dr. Diego R. Faria
Guest Editor
Robotics & Intelligent Adaptive Systems, University of Hertfordshire, Hatfield AL10 9AB, UK
Interests: human–robot interaction; grasping and dexterous manipulation; artificial perception systems/autonomous systems; pattern recognition

Dr. Frank Förster
Guest Editor
Adaptive Systems Research Group, School of Computer Science, University of Hertfordshire, Hatfield AL10 9AB, UK
Interests: intersection of developmental/cognitive robotics; human–robot interaction

Special Issue Information

Dear Colleagues,

Sensors invites submissions for a Special Issue on cutting-edge advancements in multisensory artificial intelligence (AI), with a particular focus on their applications in human–robot interaction (HRI). This Special Issue aims to explore the latest research and development in this rapidly evolving field, emphasizing the integration of diverse sensory modalities, affective computing, and neuro-inspired approaches to create more intuitive, empathetic, and effective interactions between humans and robots.

Topics of Interest:
We welcome original research articles, reviews, and perspectives on a wide range of topics, including but not limited to the following:

  • Multimodal sensor fusion for HRI: novel methods for integrating data from various sensors (e.g., vision, audio, touch, physiological signals) to enhance robot perception and understanding of human behavior, emotions, and intentions.
  • Neuro-affective computing in HRI: development and evaluation of computational models that can recognize, interpret, and respond to human affective states, including emotions, moods, and cognitive load.
  • Large language models (LLMs) for HRI: exploration of the potential of LLMs in facilitating natural language communication, understanding complex social cues, and generating contextually appropriate responses in HRI scenarios.
  • Adaptive and personalized HRI: design and implementation of intelligent systems that can adapt their behavior and communication strategies based on individual user preferences, emotional states, and learning styles.
  • Ethical considerations in neuro-affective HRI: investigation of the ethical implications of using neuro-affective data and AI in HRI, including issues related to privacy, autonomy, and potential biases.
  • Applications of multisensory AI in HRI: case studies and real-world applications of multisensory AI and neuro-affective computing in various domains, such as healthcare, education, therapy, and social robotics.
  • Empathetic robots: development of robots capable of understanding and responding appropriately to human emotions, fostering trust and rapport in HRI.
  • Socially assistive robotics: design and evaluation of robots that can provide social and emotional support to individuals with special needs, such as children with autism or the elderly.
  • Deep reinforcement learning for HRI: application of deep RL techniques to enable robots to learn complex social behaviors and adapt to dynamic environments through interaction with humans.
  • Robot behavior adaptation based on human stimuli: development of algorithms and models that allow robots to adjust their behavior in real time based on human verbal and non-verbal cues.
  • Child–robot interaction: investigation of the unique challenges and opportunities in designing robots that can interact effectively with children, considering their developmental needs and cognitive abilities.
  • Companion robots for the elderly: exploration of the potential of robots to provide companionship, assistance, and emotional support to older adults, addressing issues of social isolation and loneliness.
  • Speech emotion recognition for HRI communication: development of robust speech emotion recognition systems that can accurately interpret human emotions from vocal cues, enabling more natural and empathetic communication with robots.
  • Natural language processing for HRI: advancements in NLP techniques for understanding and generating human-like language in HRI, including dialogue systems, sentiment analysis, and intent recognition.

Submission Guidelines:
Manuscripts should be prepared according to the journal's guidelines and submitted through the online submission system. Please indicate in your cover letter that your submission is intended for the Special Issue on Multisensory AI for Human–Robot Interaction.

Dr. Diego R. Faria
Dr. Frank Förster
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, click here to go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles as well as short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Sensors is an international peer-reviewed open access semimonthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • human–robot interaction
  • multisensory AI
  • sensors
  • socially assistive robotics
  • affective robotics
  • artificial perception for HRI
  • deep reinforcement learning for HRI
  • LLMs for HRI
  • adaptation for HRI

Published Papers

This special issue is now open for submission.