Article

IMPAct: A Holistic Framework for Mixed Reality Robotic User Interface Classification and Design

1 Human-Computer Interaction (HCI), Department of Informatics, University of Hamburg, 22527 Hamburg, Germany
2 Technical Aspects of Multimodal Systems (TAMS), Department of Informatics, University of Hamburg, 22527 Hamburg, Germany
* Author to whom correspondence should be addressed.
Multimodal Technol. Interact. 2019, 3(2), 25; https://doi.org/10.3390/mti3020025
Submission received: 28 February 2019 / Revised: 22 March 2019 / Accepted: 5 April 2019 / Published: 11 April 2019
(This article belongs to the Special Issue Mixed Reality Interfaces)

Abstract

The number of scientific publications combining robotic user interfaces and mixed reality has increased considerably during the 21st century. Counting the publications added each year on Google Scholar that contain the keywords “mixed reality” and “robot” indicates exponential growth. The interdisciplinary nature of mixed reality robotic user interfaces (MRRUIs) makes them interesting and powerful, but also challenging to design and analyze. Many individual aspects have already been given theoretical structure, but to the best of our knowledge, no contribution combines them into a single MRRUI taxonomy. In this article, we present the results of an extensive investigation of relevant aspects drawn from prominent classifications and taxonomies in the scientific literature. In a card-sorting experiment with professionals from the field of human–computer interaction, these aspects were clustered into named groups to provide a new structure. A further categorization of these groups into four categories emerged naturally and yielded a memorable structure. This article thus provides a framework of objective, technical factors for the precise description of MRRUIs. An example demonstrates the effective use of the proposed framework for system description, contributing to a better understanding, design, and comparison of MRRUIs in this growing field of research.
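The exponential-growth observation above can be checked with a simple log-linear least-squares fit: if yearly counts follow c(t) = a·exp(b·t), then ln c is linear in t. The sketch below uses illustrative counts only, not the article's actual Google Scholar data:

```python
import math

# Hypothetical yearly publication counts (illustrative only -- NOT the
# actual Google Scholar data reported in the article).
years = [2010, 2012, 2014, 2016, 2018]
counts = [40, 90, 210, 480, 1100]

# Exponential growth c(t) = a * exp(b * t) becomes linear after taking
# logs: ln c = ln a + b * t.  Fit b and ln a by ordinary least squares.
n = len(years)
xs = [y - years[0] for y in years]          # offset years for stability
ys = [math.log(c) for c in counts]
x_mean = sum(xs) / n
y_mean = sum(ys) / n
b = sum((x - x_mean) * (y - y_mean) for x, y in zip(xs, ys)) / \
    sum((x - x_mean) ** 2 for x in xs)
a = math.exp(y_mean - b * x_mean)

doubling_time = math.log(2) / b             # years per doubling
print(f"growth rate b = {b:.3f}/yr, doubling every {doubling_time:.2f} yr")
```

A growth rate b > 0 with a short, stable doubling time is what "exponential growth" amounts to operationally; with real Scholar counts, the quality of the log-linear fit (e.g., its residuals) would indicate how well the exponential model actually holds.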
Keywords: mixed reality; robotic user interface taxonomy; human–robot interaction; robotic user interface; holistic user interface design; user interface analysis
Graphical Abstract

Share and Cite

MDPI and ACS Style

Krupke, D.; Zhang, J.; Steinicke, F. IMPAct: A Holistic Framework for Mixed Reality Robotic User Interface Classification and Design. Multimodal Technol. Interact. 2019, 3, 25. https://doi.org/10.3390/mti3020025

AMA Style

Krupke D, Zhang J, Steinicke F. IMPAct: A Holistic Framework for Mixed Reality Robotic User Interface Classification and Design. Multimodal Technologies and Interaction. 2019; 3(2):25. https://doi.org/10.3390/mti3020025

Chicago/Turabian Style

Krupke, Dennis, Jianwei Zhang, and Frank Steinicke. 2019. "IMPAct: A Holistic Framework for Mixed Reality Robotic User Interface Classification and Design" Multimodal Technologies and Interaction 3, no. 2: 25. https://doi.org/10.3390/mti3020025

APA Style

Krupke, D., Zhang, J., & Steinicke, F. (2019). IMPAct: A Holistic Framework for Mixed Reality Robotic User Interface Classification and Design. Multimodal Technologies and Interaction, 3(2), 25. https://doi.org/10.3390/mti3020025
