Human-Robot Interaction and Applications: Challenges and Future Perspectives

A special issue of Electronics (ISSN 2079-9292). This special issue belongs to the section "Artificial Intelligence".

Deadline for manuscript submissions: 31 December 2024

Special Issue Editors


Prof. Dr. Bipin Indurkhya
Guest Editor
Cognitive Science Department, Jagiellonian University, 31-007 Krakow, Poland
Interests: cognitive science; cognitive robotics; intelligent systems for multimodal human–computer interaction

Dr. Eleuda Nuñez
Guest Editor
Center for Cybernics Research, University of Tsukuba, Ibaraki 305-8577, Japan
Interests: human-centered computing; haptic devices; computational models of human behavior; interaction design studies

Prof. Dr. Marie-Monique Schaper
Guest Editor
Department for Computer Engineering and Digital Design, Universitat de Lleida, 25006 Lleida, Spain
Interests: educational robotics; embodied interaction; computing education; design research; participatory design

Special Issue Information

Dear Colleagues,

The emergence of new technologies and application areas in human–robot interaction is having a significant impact on how people experience and interact with the world. This Special Issue sheds light on challenges in the design and use of robots and intelligent systems in everyday life. Furthermore, it invites scholars to critically reflect on the future perspectives of this research field.

We encourage the following types of submissions (among others):

  • Design studies or evaluative research that highlight how the features of robots and intelligent systems are intended to support people’s engagement, learning, and behaviors in everyday life.
  • Critical, sociological, and/or methodological articles on the opportunities and challenges of designing and using intelligent technology in this field.
  • Towards the development of empathic machines: understanding and modeling human behavior to create machines that can understand and respond to humans at an emotional level.
  • Affective haptics: sensors and/or actuators designed to support human–robot interaction through touch.
  • Ethnographic and cultural topics related to human–robot interaction.
  • Calm-technology approach to human–robot interaction.
  • In-the-wild and field studies on human–robot interaction.
  • Child–robot interaction.
  • Infant–robot interaction.

Prof. Dr. Bipin Indurkhya
Dr. Eleuda Nuñez
Prof. Dr. Marie-Monique Schaper
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the Special Issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Electronics is an international peer-reviewed open access semimonthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2400 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • affective haptics
  • calm technology
  • child–robot interaction
  • co-design
  • empathetic HRI
  • ethnographic studies on HRI
  • infant–robot interaction
  • multimodal HRI
  • participatory design

Published Papers (1 paper)


Research

20 pages, 629 KiB  
Article
Lessons in Developing a Behavioral Coding Protocol to Analyze In-the-Wild Child–Robot Interaction Events and Experiments
by Xela Indurkhya and Gentiane Venture
Electronics 2024, 13(7), 1175; https://doi.org/10.3390/electronics13071175 - 22 Mar 2024
Abstract
Behavioral analyses of in-the-wild HRI studies generally rely on interviews or visual information from videos. This can be very limiting in settings where video recordings are not allowed or limited. We designed and tested a vocalization-based protocol to analyze in-the-wild child–robot interactions based upon a behavioral coding scheme utilized in wildlife biology, specifically in studies of wild dolphin populations. The audio of a video or audio recording is converted into a transcript, which is then analyzed using a behavioral coding protocol consisting of 5–6 categories (one indicating non-robot-related behavior, and 4–5 categories of robot-related behavior). Refining the code categories and training coders resulted in increased agreement between coders, but only to a level of moderate reliability, leading to our recommendation that it be used with three coders to assess where there is majority consensus, and thereby correct for subjectivity. We discuss lessons learned in the design and implementation of this protocol and the potential for future child–robot experiments analyzed through vocalization behavior. We also perform a few observational behavior analyses from vocalizations alone to demonstrate the potential of this field.
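
The consensus step described in this abstract is straightforward to illustrate. Below is a minimal sketch (not the authors' code) of majority voting among three coders over per-utterance codes; the category labels are hypothetical stand-ins for the protocol's one non-robot-related and four robot-related categories, which are defined in the paper itself:

    # Minimal sketch of three-coder majority consensus over per-utterance codes.
    # Category names are hypothetical stand-ins: "NR" = non-robot-related
    # behavior; "R1".."R4" = robot-related behavior categories.
    from collections import Counter

    CATEGORIES = {"NR", "R1", "R2", "R3", "R4"}

    def consensus(labels: list[str]) -> str | None:
        """Return the majority code among three coders, or None if all disagree."""
        assert len(labels) == 3 and all(l in CATEGORIES for l in labels)
        code, count = Counter(labels).most_common(1)[0]
        return code if count >= 2 else None

    # Example: codes assigned by three coders to four transcript utterances.
    coded = [
        ("utt-01", ["R1", "R1", "R2"]),  # majority -> R1
        ("utt-02", ["NR", "NR", "NR"]),  # unanimous -> NR
        ("utt-03", ["R2", "R3", "R4"]),  # no majority -> flag for discussion
        ("utt-04", ["R1", "NR", "R1"]),  # majority -> R1
    ]
    for utt_id, labels in coded:
        print(utt_id, consensus(labels) or "NO CONSENSUS")

With three coders, a majority of two or more matching codes exists unless all three disagree, which is why adding a third coder lets most disagreements resolve without adjudication and corrects for individual subjectivity.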