User Interface Pattern for AR in Industrial Applications
Abstract
1. Introduction
2. State of the Art
- About 10% dealt with the industrial environment;
- About 35% dealt with HMDs, with a decreasing trend since 2010;
- About 24% obtained their results in field and pilot tests.
3. Materials and Methods
3.1. Persona
- The basis for the design effort is formed by personal goals and tasks.
- Personas provide a basis for design decisions and help to ensure that the user is in focus at every step of development.
- Personas make it possible to form a language and thus a common understanding.
- Design decisions can be measured by a persona as well as by real subjects.
- Multiple business units within a company can make use of personas.
- Description: Name and picture.
- Characteristics: Age, education level, lifestyle, role/professional position.
- Knowledge: Basic attitude, technology knowledge, and technology attitude.
- Concerns and goals: Expectations, qualifications, and goals.
- Activities: Tasks and activities (see the data sketch after this list).
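To make the template concrete, the attributes above can be collected in a small data structure. The following Python sketch is illustrative only; the class name, field names, and the example persona are assumptions and do not come from the study.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Persona:
    """Persona template: description, characteristics, knowledge, concerns/goals, activities."""
    # Description
    name: str
    picture: str                      # path or URL to the portrait image
    # Characteristics
    age: int
    education_level: str
    lifestyle: str
    role: str                         # role/professional position
    # Knowledge
    basic_attitude: str
    technology_knowledge: str
    technology_attitude: str
    # Concerns and goals
    expectations: List[str] = field(default_factory=list)
    qualifications: List[str] = field(default_factory=list)
    goals: List[str] = field(default_factory=list)
    # Activities
    tasks: List[str] = field(default_factory=list)

# Illustrative (fictional) persona for an industrial AR context:
fitter = Persona(
    name="Alex", picture="alex.png", age=42,
    education_level="Vocational training", lifestyle="Pragmatic",
    role="Assembly fitter", basic_attitude="Open but cautious",
    technology_knowledge="Basic", technology_attitude="Skeptical of new tools",
    expectations=["Hands-free work"], qualifications=["10 years of assembly experience"],
    goals=["Complete work orders without errors"],
    tasks=["Assembly", "Quality inspection"],
)
```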
3.2. The Background of Patterns
3.3. The Creation of Patterns
- Which core work tasks are performed in the production environment with AR glasses?
- Which core structures do the UIs of existing layout solutions on the market have?
- Selecting from the menu: The complex industrial content is prepared in a work-situated manner via entry points.
- Navigating documents: Manufacturing documents or assembly instructions are generally long documents, with an average length of about 20–30 DIN A4 pages.
- Deepening object information: Additional information is offered for the work objects in the real world.
- Selecting from the toolbar: Frequently used basic functions, such as minimize, maximize, back, close, help, and save, are arranged here.
- Position: The cursor is moved to a control, such as a toolbar button.
- Select: The desired button is selected, and the system indicates the selection with an appropriate marker.
- Confirm: The user confirms the selection with a dedicated input (see the sketch after this list).
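The three-step sequence above (position, select, confirm) can be illustrated with a minimal state machine. The sketch below is only an illustration under our own assumptions, not the implementation used in the study; control identifiers such as `toolbar.save` are made up.

```python
from enum import Enum, auto
from typing import Optional

class SelectionPhase(Enum):
    POSITION = auto()   # the cursor is being moved towards a control
    SELECT = auto()     # a control is marked as the current selection candidate
    # Confirm is modeled as the transition triggered by on_confirm_input().

class CursorSelection:
    """Minimal position -> select -> confirm sequence for an AR cursor."""

    def __init__(self) -> None:
        self.phase = SelectionPhase.POSITION
        self.target: Optional[str] = None

    def on_cursor_over(self, control_id: str) -> None:
        # Position + Select: hovering a control marks it with a selection indicator.
        self.target = control_id
        self.phase = SelectionPhase.SELECT

    def on_cursor_leave(self) -> None:
        self.target = None
        self.phase = SelectionPhase.POSITION

    def on_confirm_input(self) -> Optional[str]:
        # Confirm: only a marked control can be activated; returns the activated control.
        if self.phase is SelectionPhase.SELECT and self.target is not None:
            activated = self.target
            self.on_cursor_leave()
            return activated
        return None

# Usage: hover a (hypothetical) toolbar button, then confirm with any input modality.
cursor = CursorSelection()
cursor.on_cursor_over("toolbar.save")
assert cursor.on_confirm_input() == "toolbar.save"
assert cursor.on_confirm_input() is None   # nothing is selected anymore
```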
- Effectiveness.
- Efficiency.
- Satisfaction.
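These three criteria correspond to the usability definition of ISO 9241-11 and are reported as mean scores on a 5-point scale in the evaluation tables below. As a minimal illustration of how such means are obtained, the following sketch uses invented ratings; it is not the analysis script of the study.

```python
from statistics import mean

def usability_means(ratings: dict) -> dict:
    """Mean score per usability criterion on a 5-point scale, as reported in the tables."""
    return {criterion: round(mean(values), 2) for criterion, values in ratings.items()}

# Invented ratings of five (hypothetical) test persons for one pattern variant:
example_ratings = {
    "effectiveness": [4, 5, 4, 3, 4],
    "efficiency":    [3, 3, 4, 2, 3],
    "satisfaction":  [4, 4, 5, 3, 4],
}
print(usability_means(example_ratings))
# -> {'effectiveness': 4.0, 'efficiency': 3.0, 'satisfaction': 4.0}
```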
4. Evaluation of the Pattern Catalog
5. Final Catalog with Pattern
6. Discussion
7. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
References
- Plutz, M.; Große Böckmann, M.; Siebenkotten, P.; Schmitt, R. Smart Glasses in der Produktion: Studienbericht des Fraunhofer-Instituts für Produktionstechnologie IPT; Fraunhofer: Munich, Germany, 2016; pp. 16–21. [Google Scholar]
- Dey, A.; Billinghurst, M.; Lindeman, R.W.; Swan, J.E., II. A Systematic Review of Usability Studies in Augmented Reality between 2005 and 2014. In Proceedings of the 2016 IEEE International Symposium on Mixed and Augmented Reality, Merida, Yucatan, Mexico, 19–23 September 2016; Veas, E., Langlotz, T., Martinez-Carranza, J., Grasset, R., Sugimoto, M., Eds.; IEEE: Piscataway, NJ, USA, 2016; pp. 49–50. [Google Scholar] [CrossRef]
- Koreng, R. AR in production: Development of UI patterns. In Proceedings of the Mensch und Computer 2019 on—MuC’19, Mensch und Computer 2019, Hamburg, Germany, 8–11 September 2019; Alt, F., Bulling, A., Döring, T., Eds.; ACM Press: New York, NY, USA, 2019; pp. 658–659. [Google Scholar] [CrossRef]
- Ehrlenspiel, K.; Meerkamm, H. Integrierte Produktentwicklung. Denkabläufe, Methodeneinsatz, Zusammenarbeit, 5th ed.; Hanser: München, Germany, 2013; pp. 312–317. [Google Scholar]
- Karl, D.E.; Soderquist, K.A.; Farhi, M.; Grant, A.H.; Pekarek Krohn, D.R.; Murphy, B.; Schneiderman, J.; Straughan, B. 2018 Augmented and Virtual Reality Survey Report: Industry Insight into the Future of AR/VR; Perkins Coie LLP: Seattle, WA, USA, 2018. [Google Scholar]
- Takatsu, Y.; Saito, Y.; Shiori, Y.; Nishimura, T.; Matsushita, H. Key Points for Utilizing Digital Technologies at Manufacturing and Maintenance Sites. Fujitsu Sci. Tech. J. 2018, 54, 9–15. [Google Scholar]
- Billinghurst, M.; Clark, A.; Lee, G. A Survey of Augmented Reality. In Foundations and Trends® in Human–Computer Interaction; Alet Heezemans: Delft, The Netherlands, 2015; Volume 8, pp. 73–272. [Google Scholar] [CrossRef]
- Nilsson, E.G. Design patterns for user interface for mobile applications. Adv. Eng. Softw. 2009, 40, 1318–1328. [Google Scholar] [CrossRef] [Green Version]
- Danielsson, O.; Holm, M.; Syberfeldt, A. Augmented Reality Smart Glasses for Industrial Assembly Operators: A Meta-Analysis and Categorization. Adv. Transdiscipl. Eng. 2019, 9, 173–179. [Google Scholar]
- Bowman, D.A.; Kruijff, E.; LaViola, J.J., Jr.; Poupyrev, I. 3D User Interfaces. Theory and Practice; Addison-Wesley: Boston, MA, USA, 2005; pp. 6–8, 139–181, 255–285, 313–347. [Google Scholar]
- LaViola, J.J., Jr.; Kruijff, E.; McMahan, R.P.; Bowman, D.A.; Poupyrev, I. 3D User Interfaces. Theory and Practice; Addison-Wesley: Boston, MA, USA, 2017; pp. 6–8, 18–19, 144–146, 258, 255–311, 379–413, 421–451. [Google Scholar]
- DIN Deutsches Institut für Normung e.V. Ergonomie der Mensch-System-Interaktion—Teil 110: Grundsätze der Dialoggestaltung; ICS 13.180; 35.080; 35.240.20, DIN EN ISO 9241-110; Beuth Verlag GmbH: Berlin, Germany, 2008. [Google Scholar]
- Jerald, J. The VR Book. Human-Centered Design for Virtual Reality; ACM: New York, NY, USA; San Rafael, CA, USA, 2016; pp. 441–442. [Google Scholar]
- Preim, B.; Dachselt, R. Interaktive Systeme. Band 2: User Interface Engineering, 3D-Interaktion, Natural User Interfaces, 2nd ed.; Springer: Berlin/Heidelberg, Germany, 2015; pp. 45–108. [Google Scholar]
- Cohé, A.; Dècle, F.; Hachet, M. tBox: A 3D Transformation Widget designed for Touch-screens: CHI’11. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, Vancouver, BC, Canada, 7–12 May 2011; pp. 3005–3008. [Google Scholar]
- DIN Deutsches Institut für Normung e.V. Ergonomie der Mensch-System-Interaktion—Teil 400: Grundsätze und Anforderungen für Physikalische Eingabegeräte; DIN EN ISO 9241-400; Beuth Verlag GmbH: Berlin, Germany, 2007. [Google Scholar]
- Dörner, R.; Broll, W.; Grimm, P.; Jung, B. (Eds.) Virtual und Augmented Reality (VR/AR). Grundlagen und Methoden der Virtuellen und Augmentierten Realität; Springer Vieweg: Berlin, Germany, 2013. [Google Scholar]
- Endsley, M.R.; Jones, D.G. Designing for Situation Awareness. An Approach to User-Centered Design, 2nd ed.; CRC Press: Boca Raton, FL, USA, 2012. [Google Scholar]
- Friedrich, W. ARVIKA-Augmented Reality for Development, Production and Service. Available online: http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.194.6733&rep=rep1&type=pdf (accessed on 31 May 2017).
- Gabbard, J.L. Researching Usability Design and Evaluation Guidelines for Augmented Reality (AR) Systems. Available online: http://www.rkriz.net/sv/classes/ESM4714/Student_Proj/class00/gabbard/index.html (accessed on 31 May 2017).
- Kratz, S.; Rohs, M.; Guse, D.; Müller, J.; Bailly, G.; Nischt, M. PalmSpace: Continuous Around-Device Gestures vs. Multitouch for 3D Rotation Tasks on Mobile Devices: AVI 12. In Proceedings of the International Working Conference on Advanced Visual Interfaces, Capri Island (Naples), Italy, 22–25 May 2012; pp. 181–188. [Google Scholar]
- Reisman, J.L.; Davidson, P.L.; Han, J.Y. A Screen-Space Formulation for 2D and 3D Direct Manipulation: UIST’09. In Proceedings of the 22nd Annual ACM Symposium on User Interface Software and Technology, Victoria, BC, Canada, 4–7 October 2009; pp. 69–78. [Google Scholar]
- Theis, S.; Pfendler, C.; Alexander, T.; Mertens, A.; Brandl, C.; Schlick, C.M. Head-Mounted Displays—Bedingungen des Sicheren und Beanspruchungsoptimalen Einsatzes: Physische Beanspruchung beim Einsatz von HMDs; Bundesanstalt für Arbeitsschutz und Arbeitsmedizin: Dortmund, Germany, 2016; ISBN 978-3-88261-162-5. [Google Scholar]
- Aragon, C.R.; Hearst, M.A. Improving Aviation Safety with Information Visualization: A Flight Simulation Study. In Proceedings of the CHI’05: SIGCHI Conference on Human Factors in Computing Systems, Portland, OR, USA, 2–7 April 2005; ACM: New York, NY, USA, 2005. [Google Scholar]
- DIN Deutsches Institut für Normung e.V. Ergonomische Anforderungen für Bürotätigkeiten mit Bildschirmgeräten—Teil 13: Benutzerführung; DIN EN ISO 9241-13; Beuth Verlag GmbH: Berlin/Heidelberg, Germany, 2000. [Google Scholar]
- DIN Deutsches Institut für Normung e.V. Ergonomie der Mensch-System-Interaktion—Teil 154: Sprachdialogsysteme; DIN EN ISO 9241-154; Beuth Verlag GmbH: Berlin/Heidelberg, Germany, 2013. [Google Scholar]
- Ong, S.K.; Nee, A.Y.C. Virtual and Augmented Reality Applications in Manufacturing; Springer: London, UK, 2004; pp. 129–183. [Google Scholar]
- Schinke, T.; Henze, N.; Boll, S. Visualization of off-screen objects in mobile augmented reality. In Proceedings of the 12th International Conference on Human Computer Interaction with Mobile Devices and Services, Lisbon, Portugal, 7–10 September 2010; ACM: New York, NY, USA, 2010; pp. 313–316. [Google Scholar]
- Schmitt, M.; Zühlke, D. Smartphones und Tablets in der industriellen Produktion: Nutzerfreundliche Bedienung von Feldgeräten. Autom. Prax. 2013, 55, 58. [Google Scholar] [CrossRef]
- Uratani, K.; Machida, T.; Kiyokawa, K.; Takemura, H. A study of depth visualization techniques for virtual annotations in augmented reality. In Proceedings of the IEEE Virtual Reality 2005, Bonn, Germany, 12–16 March 2005; Fröhlich, B., Ed.; IEEE Service Center: Piscataway, NJ, USA, 2005; pp. 295–296. [Google Scholar] [CrossRef]
- Diepstraten, J.; Weiskopf, D.; Ertl, T. Interactive Cutaway Illustrations. Comput. Graph. Forum 2003, 22, 523–532. [Google Scholar] [CrossRef]
- DIN Deutsches Institut für Normung e.V. Ergonomische Anforderungen für Bürotätigkeiten mit Bildschirmgeräten—Teil 16: Dialogführung Mittels Direkter Manipulation; DIN EN ISO 9241-16; Beuth Verlag GmbH: Berlin/Heidelberg, Germany, 2000. [Google Scholar]
- Hashemian, A.M.; Riecke, B.E. Leaning-Based 360° Interfaces: Investigating Virtual Reality Navigation Interfaces with Leaning-Based-Translation and Full-Rotation. In Virtual, Augmented and Mixed Reality, Proceedings of the 9th International Conference, VAMR 2017, Held as Part of HCI International 2017, Vancouver, BC, Canada, 9–14 July 2017; Lackey, S., Chen, J., Eds.; Springer International Publishing: Cham, Switzerland, 2017; pp. 15–32. [Google Scholar]
- Preim, B.; Dachselt, R. Interaktive Systeme. Band 1: Grundlagen, Graphical User Interfaces, Informationsvisualisierung, 2nd ed.; Springer: Berlin/Heidelberg, Germany, 2010; pp. 245–281. [Google Scholar]
- Pohl, C.; Waßmann, H. Wahrnehmungsgerechte Präsentation von Designentwürfen mit Hilfe von Augmented Reality; ViProSim-Paper; OWL ViProSim e.V.: Paderborn, Germany, 2009; pp. 405–419. [Google Scholar]
- Cooper, A.; Reimann, R.; Cronin, D.; Engel, R. About face. Interface und Interaction Design; Mitp-Verlag: Heidelberg/Hamburg, Germany, 2010; pp. 77–88, 98. [Google Scholar]
- Pruitt, J.S.; Adlin, T. The Persona Lifecycle. Keeping People in Mind Throughout Product Design; Elsevier: Amsterdam, The Netherlands; Boston, MA, USA, 2006; pp. 11, 230–232. [Google Scholar]
- Miaskiewicz, T.; Kozar, K.A. Personas and user-centered design: How can personas benefit product design processes? Des. Stud. 2011, 32, 417–430. [Google Scholar] [CrossRef]
- Mayas, C.; Hörold, S.; Krömker, H. (Eds.) Personas for Requirements Engineering. Opportunities and Challenges; Springer: Berlin/Heidelberg, Germany, 2016. [Google Scholar]
- Steinert, T.; Koreng, R.; Mayas, C.; Cherednychek, N.; Dohmen, C.; Hörold, S.; Krempels, K.-H.; Kehren, P. Offene Mobilitätsplattform (OMP): Teil 1: Rollenmodell & Typische Kooperationsszenarien. VDV-Schrift 436-1; August 2019; Available online: http://docplayer.org/178259330-Vdv-schrift-2019-offene-mobilitaetsplattform-omp-teil-1-rollenmodell-typische-kooperationsszenarien.html (accessed on 10 June 2021).
- Koreng, R. Entwicklung Eines Patternkatalogs für Augmented Reality Interfaces in der Industrie; Universitätsverlag Ilmenau: Ilmenau, Germany, 2021. [Google Scholar]
- Alexander, C.; Ishikawa, S.; Silverstein, M.; Jacobson, M. A Pattern Language. Towns, Buildings, Construction, 41st ed.; Oxford University Press: New York, NY, USA, 1977; pp. ix–xvii. [Google Scholar]
- Mahemoff, M.J.; Johnston, L.J. Principles for a usability-oriented pattern language. In Proceedings of the 1998 Australasian Computer Human Interaction Conference, OzCHI’98, Adelaide, SA, Australia, 30 November–4 December 1998. [Google Scholar] [CrossRef]
- van Duyne, D.K.; Landay, J.A.; Hong, J.I. The Design of Sites. Patterns, Principles, and Processes for Crafting a Customer-Centered Web Experience, 6th ed.; Addison-Wesley: Boston, MA, USA, 2005; pp. 18–30. [Google Scholar]
- Kunert, T. User-Centered Interaction Design Patterns for Interactive Digital Television Applications; Springer: London, UK, 2009; pp. 58–60, 142–144. [Google Scholar]
- Lindemann, U. (Ed.) Handbuch Produktentwicklung; Hanser: München, Germany, 2016. [Google Scholar]
- Koreng, R.; Krömker, H. Augmented Reality Interface: Guidelines for the Design of Contrast Ratios. In Proceedings of the ASME 2019 International Design Engineering Technical Conferences and Computers and Information in Engineering Conference, Anaheim, CA, USA, 18–21 August 2019; American Society of Mechanical Engineers: New York, NY, USA, 2019. [Google Scholar] [CrossRef]
- DIN Deutsches Institut für Normung e.V. DIN EN ISO 9241-11:2018-11, Ergonomie der Mensch-System-Interaktion—Teil 11: Gebrauchstauglichkeit: Begriffe und Konzepte (ISO 9241-11:2018); Deutsche Fassung EN ISO 9241-11:2018; Beuth Verlag GmbH: Berlin, Germany, 2018. [Google Scholar]
2005: 3D User-Interface [10] | 2017: 3D User-Interface [11] |
---|---|
Selection and Manipulation | |
Use existing manipulation techniques unless a large amount of benefit might be derived from designing a new, application-specific technique. | |
Use task analysis when choosing a 3D manipulation technique. | |
Match the interaction technique to the device. | |
Use techniques that can help to reduce clutching. | |
Nonisomorphic (“magic”) techniques are useful and intuitive. | |
Use pointing techniques for selection and virtual hand techniques for manipulation. | Use pointing techniques for selection and grasping techniques for manipulation. |
Use grasp-sensitive object selection. | Consider the use of grasp-sensitive object selection. |
Reduce degrees of freedom when possible. | |
Consider the trade-off between technique design and environment design. | |
There is no single best manipulation technique. | |
System Control | |
Avoid disturbing the flow of action of an interaction task. | |
Prevent unnecessary changes of the focus of attention. | Prevent unnecessary focus switching and context switching. |
Design for discoverability. | |
Avoid mode errors. | |
Use an appropriate spatial reference frame. | |
Structure the functions in an application. | Structure the functions in an application and guide the user. |
Consider using multimodal input. | |
3D is not always the best solution—consider hybrid interfaces. | |
User Comfort and Safety | |
Move wires and cables out of the way or use wireless solutions when possible; reduce the weight of the equipment. | |
Provide physical and virtual barriers to keep the user and the equipment safe. | |
Limit interaction in free space; provide a device resting place. | |
Design public systems to be sanitary. | |
Design for relatively short sessions and encourage breaks. | |
Design for comfortable poses. | |
Ensure temporal and spatial compliance between feedback dimensions. | |
Use constraints. | |
Consider using props and passive feedback, particularly in highly specialized tasks. | |
Use Guiard’s principles in designing a two-handed interface. | |
Consider real-world tools and practices as a source of inspiration for 3D UI design. | |
Consider designing 3D techniques using principles from 2D interaction. | |
Use and invent magical techniques. | |
Consider alternatives to photorealistic aesthetics. |
Category | Subcategory
---|---|
Task appropriateness | 
Focus on content [10,13,14], interview with experts | Preparation of content: (…)
AR input and output [10,12,14,15,16,17,18,19,20,21,22,23], interview with experts | Automatically start the AR program when the glasses are put on.
| Precisely match input to task: (…)
| Integration of signals: (…)
Self-descriptiveness | 
Status information [10,13,14,17,18,23,24,25,26,27,28,29,30], interview with experts | Use of permanent displays at: (…)
| Use of situational displays at: (…)
| Use of feedback for: (…)
| Use of general design: (…)
Basic interaction [10,13,14,17,19,20,24,28,30,31,32,33], interview with experts | Display of all possible interactions.
| Designing sequences of interactions in a process- and interface-oriented manner.
| Design and coding of manipulation techniques, such as the reset function.
| Use hidden information ("ghost views") to reduce information.
Conformity to expectations | 
Description of the product as a model [13,33], interview with experts | Represent product with reduced background or context geometries; use realistic and familiar object geometries.
| Focus on the product.
AR system [10,14,16,34], interview with experts | Field of view of AR glasses should largely correspond to the natural human field of view (horizontal ~180°, vertical ~120°).
| Interaction with the AR system should be consistent, self-explanatory, and oriented to what has already been learned.
Learning conciseness | 
Context-specific information [13,19,25,29], interview with experts | Hints facilitate operation/interaction.
| More detailed information should be displayed if required.
| Specific product criteria are displayed as action hints for employees.
Controllability | 
Software supports the various tasks [13], interview with experts | AR system contains hints for physical tools.
| AR system uses the pointing capabilities of humans with their hands for interaction.
User is guided [13,25,33], interview with experts | Situational information display depending on work activity.
Depth cues [13,14,19,20,24,28,30,33,35], interview with experts | Giving depth cues in a task-oriented manner.
| Supporting the perception of depth cues with tools.
Controls [10,14,16,17], interview with experts | Designing gesture and tap sensors to be clearly controllable.
Tolerance for errors | 
Input/Output [10,13,14,19,26,27], interview with experts | Software recognizes incomplete terms in voice commands.
| In the case of incorrect input and output, reliability is supported by auto-completion or queries.
Individualizable | 
Modes of presentation and interaction [13,14,19,25], interview with experts | Individualizable display of information about the product or interaction.
| Display variants: (…)
| Offer different modes of interaction, such as speech and gesture.
Source | Advantages
---|---|
Cooper (1999) [38] | (…)
Cooper and Reimann (2002) [39] | (…)
Grudin and Pruitt (2002) [38] | (…)
Long (2009) [38] | (…)
Ma and LeRouge (2007) [38] | (…)
Mayas, Hörold and Krömker (2016) [39] | (…)
Pruitt and Adlin (2006) [38] | (…)
Gender | N = 40 | Age | N = 40
---|---|---|---
Male | 85% | <25 years | 12%
Female | 15% | 25–34 years | 35%
Diverse | 0% | 35–44 years | 20%
| | 45–54 years | 25%
| | >54 years | 8%
Occupational specialties | N = 40
---|---
Development | 7% |
Management | 15% |
Engineer | 23% |
Marketing/Sales | 7% |
Planning | 10% |
Quality Control | 7% |
Engineering | 28% |
Training | 3% |
Category | Category of the pattern |
Name | Name of the pattern |
Problem | Description of the representation problem |
Solution | Description of the alternative solution |
Evidence | Evidence by a usability test |
Potential | Potential of the pattern |
Related patterns | Similar patterns |
Representation | Graphical representation of the pattern |
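Each catalog entry follows the template above. A minimal sketch of how such an entry could be represented as a data record is shown below; the class, the field names, and the heavily shortened example content are illustrative assumptions, not part of the catalog itself.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class PatternEntry:
    """One entry of the pattern catalog, mirroring the eight-part template above."""
    category: str                              # generic task, e.g. "Select from main menu"
    name: str                                  # layout and/or interaction variant
    problem: str                               # representation problem addressed by the pattern
    solution: str                              # description of the alternative solution
    evidence: str                              # evidence from the usability test
    potential: List[str] = field(default_factory=list)
    related_patterns: List[str] = field(default_factory=list)
    representation: str = ""                   # reference to the graphical representation

# Abridged example entry (wording shortened from the catalog tables below):
main_menu_as_list = PatternEntry(
    category="Generic task: Select from main menu",
    name="Layout variant: Main menu as list",
    problem="The user needs an overview of the available applications and contents.",
    solution="Display the main menu as a list when the AR device is started.",
    evidence="Usability test: prototypical evaluation with 50 test persons.",
    potential=["Extension of the list in length", "Sorting of contents according to relevance"],
    related_patterns=["Main menu as tile", "Main menu as circle"],
)
```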
Generic Task | Alternative Representations | |
---|---|---|
Select from menu | Shape of the main menu: Tile | |
Input: Gestures | Input: Focus | |
Shape of the main menu: List | ||
Input: Gestures | Input: Focus | |
Shape of the main menu: Circle | ||
Input: Gestures | Input: Focus | |
Deepen object information | Position of information: Near object | |
Input: Gestures | Input: Focus | |
Position of information: Far from object | ||
Input: Gestures | Input: Focus | |
Selecting from the toolbar | Position of the toolbar: Top | |
Input: Gestures | Input: Focus | |
Position of the toolbar: Bottom | ||
Input: Gestures | Input: Focus | |
Position of the toolbar: Right | ||
Input: Gestures | Input: Focus | |
Position of the toolbar: Left | ||
Input: Gestures | Input: Focus | |
Navigation in documents: Split screen/full screen | Type of navigation: Browse | |
Input: Gestures and focus | ||
Type of navigation: Scroll | ||
Input: Gestures and focus | ||
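The table above spans a small design space: for each generic task, every layout variant is combined with the two input variants (gestures, focus), while document navigation uses one combined input variant per navigation type. A minimal sketch of enumerating these combinations follows; the names are taken from the table, the data structure is an assumption.

```python
from itertools import product

# Layout variants per generic task, as listed in the table above.
layout_variants = {
    "Select from menu": ["Tile", "List", "Circle"],
    "Deepen object information": ["Near object", "Far from object"],
    "Selecting from the toolbar": ["Top", "Bottom", "Right", "Left"],
}
input_variants = ["Gestures", "Focus"]

alternatives = [
    (task, layout, input_variant)
    for task, layouts in layout_variants.items()
    for layout, input_variant in product(layouts, input_variants)
]
# Document navigation combines both inputs into a single variant per navigation type.
alternatives += [
    ("Navigation in documents", navigation, "Gestures and focus")
    for navigation in ("Browse", "Scroll")
]

print(len(alternatives))   # (3 + 2 + 4) * 2 + 2 = 20 evaluated alternatives
```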
Gender | N = 50 | Age | N = 50
---|---|---|---
Male | 78% | <25 years | 10%
Female | 22% | 25–34 years | 36%
Diverse | 0% | 35–44 years | 28%
| | 45–54 years | 20%
| | >54 years | 6%
Occupational specialties | N = 50
---|---
Development | 6% |
Management | 14% |
Engineer | 24% |
Marketing/Sales | 8% |
Planning | 8% |
Quality Control | 8% |
Engineering | 30% |
Training | 2% |
Generic Task | Usefulness (M) | Effectiveness (M) | Efficiency (M) | Satisfaction (M) |
---|---|---|---|---|
Select from menu | ||||
Tile | 3.90 | 4.06 | 3.14 | 3.86 |
Deepen object information | ||||
Information about the object | 2.96 | 3.22 | 2.44 | 3.04 |
Selecting from the toolbar | ||||
Function bar on the right | 3.48 | 3.90 | 2.66 | 3.50 |
Navigation in documents | ||||
Scroll reading style | 2.56 | 2.78 | 2.44 | 2.44 |
Select from menu: Tile | ||||
Interaction through gestures | 3.91 | 4.09 | 2.95 | 3.86 |
Deepen object information: Information about the object | ||||
Interaction through focus | 3.48 | 3.76 | 2.86 | 3.57 |
Selecting from the toolbar: Function bar on the right | ||||
Interaction through gestures | 3.75 | 4.04 | 3.02 | 3.71 |
Category | Generic Task: Select from Main Menu | |
---|---|---|
Name | Layout variant: Main menu as list | |
Problem | The user has several applications and contents at his disposal. For an overview of the different contents the user needs a main menu. | |
Solution | The main menu is displayed when the AR device is started. The information is available to the user situationally and must be called up specifically. A list-like display has a high recognition value, as it is already frequently used in industrial applications. Users are thus familiar with the design. The menu in the form of a list allows the user to view the complete contents of the main menu immediately. It can be expanded in list elements as well as in depth, but care should be taken to keep the main menu as such and to form suitable subgroups. |
Evidence | Usability test: Prototypical evaluation with 50 test persons | |
Evaluation of usefulness | M: 3.40 out of 5.00 points |
Rating effectiveness | M: 3.88 out of 5.00 points |
Rating efficiency | M: 3.14 out of 5.00 points |
Evaluation satisfaction | M: 3.34 out of 5.00 points |
Potential | Extension of the list in length; Sorting of contents according to relevance; Reduction of content to icons |
Related patterns | Main menu as tile; Main menu as circle; Main menu as tile (interaction by gestures/focusing); Main menu as list (interaction by gestures/focusing); Main menu as circle (interaction by gestures/focusing) |
Representation |
Category | Generic Task: Select from Main Menu | Category | Generic Task: Select from Main Menu |
---|---|---|---|
Name | Layout variant: Main menu as tile | Name | Layout variant: Main menu as circle |
Problem | The user has several applications and contents at his disposal. For an overview of the different contents the user needs a main menu. | Problem | The user has several applications and contents at his disposal. For an overview of the different contents the user needs a main menu. |
Solution | The main menu is displayed when the AR terminal is started. The information is available to the user situationally and must be called up specifically. A tile-shaped display has a recognition value since it is used on current Windows PCs. Users are thus familiar with the design. The main menu in the form of a tile enables the user to view the complete contents of the main menu briefly. The tiles are arranged as a matrix and can go into any depth. It should be noted that the matrix does not become too detailed and thus lose the character of the main menu. | Solution | The main menu is displayed when the AR terminal is started. The information is available to the user situationally and must be called up specifically. A circular display has a recognition value from the gaming field, and it has similarities with the tile display. The main menu in the form of a circle allows the user to see the complete contents of the main menu briefly. The circle consists of elements arranged in a round layout and has the main category in the center. The circular representation allows a limited number of extensions per level but can be extended in depth.
Evidence | Usability test: Prototypical evaluation with 50 test persons Evaluation of usefulness M: 3.90 out of 5.00 points Rating effectiveness M: 4.06 out of 5.00 points Rating efficiency M: 3.14 out of 5.00 points Evaluation satisfaction M: 3.86 out of 5.00 points | Evidence | Usability test: Prototypical evaluation with 50 test persons Evaluation of usefulness M: 3.52 out of 5.00 points Rating effectiveness M: 3.74 out of 5.00 points Rating efficiency M: 3.20 out of 5.00 points Evaluation satisfaction M: 3.60 out of 5.00 points |
(…) | (…) | (…) | (…) |
Representation | Representation | ||
Category | Generic task: Deepen object information | Category | Generic task: Selecting from the toolbar |
Name | Layout variant: Position of information: near object | Name | Layout variant: Function bar top |
Problem | The user should be shown additional information about a specific object or product. The user should be able to quickly grasp and retrieve this information. | Problem | While the user is in an application, a toolbar is needed so that the current position can be left. |
Solution | For 3D models and objects, additional information can be provided to the user. This can be called up situationally using icons. To place the information at the appropriate place and in the field of vision of the user, it is recommended to display the content directly on the object. In this way, an icon symbolizes to the user that information is available. By selecting the icon, the user retrieves this additional information. If this information is no longer needed, it can be closed again. Due to the different placement of the contents, there is no overlapping of information. Placing the information directly at the object offers the advantage that an exact assignment can be made. | Solution | The function bar has the task in the AR system that the user can navigate in the current view. As soon as an activity is selected from the main menu, the function bar is permanently available to the user. The function bar at the top is perceived directly by the user. The toolbar can be placed over textual content as well as over graphical elements. It is the permanent constant in the application and ensures that the user can always return to the main menu or save content. Most applications on the PC have their function bar in the upper area, which leads to a high recognition value among users.
Evidence | Usability test: Prototypical evaluation with 50 test persons Evaluation of usefulness M: 4.56 out of 5.00 points Rating effectiveness M: 4.56 out of 5.00 points Rating efficiency M: 3.68 out of 5.00 points Evaluation satisfaction M: 4.56 out of 5.00 points | Evidence | Usability test: Prototypical evaluation with 50 test persons Evaluation of usefulness M: 3.86 out of 5.00 points Rating effectiveness M: 4.10 out of 5.00 points Rating efficiency M: 3.10 out of 5.00 points Evaluation satisfaction M: 3.70 out of 5.00 points |
(…) | (…) | (…) | (…) |
Representation | Representation | ||
Category | Generic task: Selecting from the toolbar | Category | Generic task: Navigation in documents—split screen |
Name | Layout variant: Function bar bottom | Name | Layout variant: Scrolling reading style |
Problem | While the user is in an application, a toolbar is needed so that the current position can be left. | Problem | The user is provided with short texts in a split screen while working with the AR system. To avoid texts that are too long, sensible divisions should be made. |
Solution | The function bar has the task in the AR system that the user can navigate in the current view. As soon as an activity is selected from the main menu, the function bar is permanently available to the user. The function bar at the bottom is perceived directly by the user. The toolbar can be placed under textual content as well as under graphical elements. It is the permanent constant in the application and ensures that the user can always return to the main menu or save content. The current PCs have a function bar at the bottom of the screen in their basic settings, so recognition can be generated among users. | Solution | Textual information can be provided to the user in the AR system. The text is called up specifically and must be quickly grasped by the user. Scrolling is suitable for navigating through single-column texts on a split screen in an AR system. The content sections of the individual pages enable the user to quickly grasp information. Furthermore, scrolling allows the user to review sections that have already been read. Scrolling is done by arrows above the text and indicates the number of pages at the bottom of the screen. The advantage of scrolling is that it can be designed like reading in a book; by labeling the page number, the user gets an overview of the scope. |
Evidence | Usability test: Prototypical evaluation with 50 test persons Evaluation of usefulness M: 3.44 out of 5.00 points Rating effectiveness M: 4.98 out of 5.00 points Rating efficiency M: 3.16 out of 5.00 points Evaluation satisfaction M: 3.30 out of 5.00 points | Evidence | Usability test: Prototypical evaluation with 50 test persons Evaluation of usefulness M: 4.34 out of 5.00 points Rating effectiveness M: 4.28 out of 5.00 points Rating efficiency M: 3.68 out of 5.00 points Evaluation satisfaction M: 4.16 out of 5.00 points |
(…) | (…) | (…) | (…) |
Representation | Representation |
Category | Generic Task: Select from Main Menu | Category | Generic Task: Select from Main Menu |
---|---|---|---|
Name | Layout variant: Main menu as tile Interaction variant: Interaction through gestures | Name | Layout variant: Main menu as tile Interaction variant: Interaction through focusing |
Problem | A range of applications and content is available to the user. For an overview of the different contents, the user needs a main menu. Therefore, the interaction with the main menu is relevant for the user. | Problem | A range of applications and content is available to the user. For an overview of the different contents, the user needs a main menu. Therefore, the interaction with the main menu is relevant for the user. |
Solution | The main menu is displayed when the AR terminal is started. The information is available to the user situationally and must be called up specifically. The main menu in the form of a tile allows the user to view the complete contents of the main menu briefly. With this form of presentation, interaction via gesture control is suitable. The action is deliberately and specifically triggered by the user through a hand movement. Attention: One hand must always trigger the action, which does not allow complete hands-free work. | Solution | The main menu is displayed when the AR terminal is started. The information is available to the user situationally and must be called up specifically. The main menu in the form of a tile allows the user to view the complete contents of the main menu briefly. In this form of presentation, the interaction is suitable by focusing with the eye. The action is only triggered after a set period and enables complete, hands-free interaction. The time span until the interaction is triggered should only last a few seconds in order not to influence the daily work routine; however, unwanted actions can also be triggered in this way. |
Evidence | Usability test: Prototypical evaluation with 50 test persons Evaluation of usefulness M: 3.91 out of 5.00 points Rating effectiveness M: 4.09 out of 5.00 points Rating efficiency M: 2.95 out of 5.00 points Evaluation satisfaction M: 3.86 out of 5.00 points | Evidence | Usability test: Prototypical evaluation with 50 test persons Evaluation of usefulness M: 3.84 out of 5.00 points Rating effectiveness M: 4.15 out of 5.00 points Rating efficiency M: 3.21 out of 5.00 points Evaluation satisfaction M: 3.90 out of 5.00 points
(…) | (…) | (…) | (…) |
Representation | Representation | ||
Category | Generic task: Select from main menu | Category | Generic task: Select from main menu |
Name | Layout variant: Main menu as list Interaction variant: Interaction through gestures | Name | Layout variant: Main menu as list Interaction variant: Interaction through focusing |
Problem | A range of applications and content is available to the user. For an overview of the different contents, the user needs a main menu. Therefore, the interaction with the main menu is relevant for the user. | Problem | A range of applications and content is available to the user. For an overview of the different contents, the user needs a main menu. Therefore, the interaction with the main menu is relevant for the user. |
Solution | The main menu is displayed when the AR terminal is started. The information is available to the user situationally and must be called up specifically. The main menu in the form of a list allows the user to view the complete contents of the main menu immediately. With this form of presentation, interaction via gesture control is suitable. The action is deliberately and specifically triggered by the user through a hand movement. Attention: One hand must always trigger the action, which does not allow complete hands-free work. | Solution | The main menu is displayed when the AR terminal is started. The information is available to the user situationally and must be called up specifically. The main menu in the form of a list allows the user to view the complete contents of the main menu immediately. With this form of presentation, the interaction is suitable by focusing with the eye. The action is only triggered after a set period and enables complete, hands-free interaction. The time span until the interaction is triggered should only last a few seconds in order not to influence the daily work routine; however, unwanted actions can also be triggered in this way.
Evidence | Usability test: Prototypical evaluation with 50 test persons Evaluation of usefulness M: 3.66 out of 5.00 points Rating effectiveness M: 4.00 out of 5.00 points Rating efficiency M: 2.95 out of 5.00 points Evaluation satisfaction M: 3.60 out of 5.00 points | Evidence | Usability test: Prototypical evaluation with 50 test persons Evaluation of usefulness M: 3.59 out of 5.00 points Rating effectiveness M: 4.06 out of 5.00 points Rating efficiency M: 3.21 out of 5.00 points Evaluation satisfaction M: 3.64 out of 5.00 points |
(…) | (…) | (…) | (…) |
Representation | Representation | ||
Category | Generic task: Select from main menu | Category | Generic task: Select from main menu |
Name | Layout variant: Main menu as circle Interaction variant: Interaction through gestures | Name | Layout variant: Main menu as circle Interaction variant: Interaction through focusing |
Problem | A range of applications and content is available to the user. For an overview of the different contents, the user needs a main menu. Therefore, the interaction with the main menu is relevant for the user. | Problem | A range of applications and content is available to the user. For an overview of the different contents, the user needs a main menu. Therefore, the interaction with the main menu is relevant for the user. |
Solution | The main menu is displayed when the AR terminal is started. The information is available to the user situationally and must be called up specifically. The main menu in the form of a circle allows the user to see the complete contents of the main menu briefly. With this form of display, interaction via gesture control is suitable. The action is consciously and purposefully triggered by the user through a hand movement. Attention: One hand must always trigger the action, which does not allow complete hands-free work. | Solution | The main menu is displayed when the AR terminal is started. The information is available to the user situationally and must be called up specifically. The main menu in the form of a circle allows the user to see the complete contents of the main menu briefly. With this form of presentation, the interaction is suitable by focusing with the eye. The action is only triggered after a set period and enables complete, hands-free interaction. The time span until the interaction is triggered should only last a few seconds in order not to influence the daily work routine; however, unwanted actions can also be triggered in this way.
Evidence | Usability test: Prototypical evaluation with 50 test persons Evaluation of usefulness M: 3.72 out of 5.00 points Rating effectiveness M: 3.93 out of 5.00 points Rating efficiency M: 2.98 out of 5.00 points Evaluation satisfaction M: 3.73 out of 5.00 points | Evidence | Usability test: Prototypical evaluation with 50 test persons Evaluation of usefulness M: 3.65 out of 5.00 points Rating effectiveness M: 3.99 out of 5.00 points Rating efficiency M: 3.24 out of 5.00 points Evaluation satisfaction M: 3.77 out of 5.00 points |
(…) | (…) | (…) | (…) |
Representation | Representation | ||
Category | Generic task: Deepen object information | Category | Generic task: Deepen object information |
Name | Layout variant: Information on the object Interaction variant: Interaction through gestures | Name | Layout variant: Information on the object Interaction variant: Interaction through focusing |
Problem | The user should be shown additional information about a specific object or product. The user should be able to quickly grasp and retrieve this information. The interaction with the main menu is relevant for the user. | Problem | The user should be shown additional information about a specific object or product. The user should be able to quickly grasp and retrieve this information. The interaction with the main menu is relevant for the user. |
Solution | For 3D models and objects, additional information can be provided to the user. This can be called up situationally using icons. To place the information in the appropriate place and in the user’s field of vision, it is advisable to display the content directly on the object. In this form of presentation, interaction via gesture control is suitable. The user consciously and specifically triggers the action with a hand movement. However, one hand always must trigger the action, which does not allow complete hands-free work. | Solution | For 3D models and objects, additional information can be provided to the user. This can be called up situationally using icons. To place the information in the appropriate place and in the user’s field of vision, it is advisable to display the content directly on the object. In this form of presentation, interaction by focusing with the eye is suitable. The action is triggered only after a specified period and enables complete, hands-free interaction. The time span until the interaction is triggered should only last a few seconds in order not to influence the daily work routine; however, unwanted actions can also be triggered in this way. |
Evidence | Usability test: Prototypical evaluation with 50 test persons Evaluation of usefulness M: 4.25 out of 5.00 points Rating effectiveness M: 4.31 out of 5.00 points Rating efficiency M: 3.22 out of 5.00 points Evaluation satisfaction M: 4.28 out of 5.00 points | Evidence | Usability test: Prototypical evaluation with 50 test persons Evaluation of usefulness M: 4.28 out of 5.00 points Rating effectiveness M: 4.43 out of 5.00 points Rating efficiency M: 3.48 out of 5.00 points Evaluation satisfaction M: 4.33 out of 5.00 points |
(…) | (…) | (…) | (…) |
Representation | Representation | ||
Category | Generic task: Deepen object information | Category | Generic task: Deepen object information |
Name | Layout variant: Information far from object Interaction variant: Interaction through gestures | Name | Layout variant: Information far from object Interaction variant: Interaction through focusing |
Problem | The user should be shown additional information about a specific object or product. The user should be able to quickly grasp and retrieve this information. The interaction with the main menu is relevant for the user. | Problem | The user should be shown additional information about a specific object or product. The user should be able to quickly grasp and retrieve this information. The interaction with the main menu is relevant for the user. |
Solution | For 3D models and objects, additional information can be provided to the user. This can be called up situationally using icons. The information can be bundled in a central location and placed in the user’s field of vision; for this purpose, it is advisable to display the content above the object. With this form of presentation, interaction via gesture control is suitable. The action is consciously and purposefully triggered by the user through a hand movement. Attention: One hand must always trigger the action, which does not allow complete hands-free work. | Solution | For 3D models and objects, additional information can be provided to the user. This can be called up situationally using icons. The information can be bundled in a central location and placed in the user’s field of vision; for this purpose, it is advisable to display the content above the object. In this form of presentation, the interaction is suitable by focusing with the eye. In this case, the action is only triggered after a set period and enables complete, hands-free interaction. Attention: The time span until the interaction is triggered should only last a few seconds in order not to influence the daily work routine; however, unwanted actions can also be triggered in this way. |
Evidence | Usability test: Prototypical evaluation with 50 test persons Evaluation of usefulness M: 3.45 out of 5.00 points Rating effectiveness M: 3.64 out of 5.00 points Rating efficiency M: 2.60 out of 5.00 points Evaluation satisfaction M: 3.52 out of 5.00 points | Evidence | Usability test: Prototypical evaluation with 50 test persons Evaluation of usefulness M: 3.48 out of 5.00 points Rating effectiveness M: 3.76 out of 5.00 points Rating efficiency M: 2.86 out of 5.00 points Evaluation satisfaction M: 3.57 out of 5.00 points |
(…) | (…) | (…) | (…) |
Representation | Representation | ||
Category | Generic task: Selecting from the toolbar | Category | Generic task: Selecting from the toolbar |
Name | Layout variant: Function bar top Interaction variant: Interaction through gestures | Name | Layout variant: Function bar top Interaction variant: Interaction through focusing |
Problem | While the user is in an application, a toolbar is needed so that the current position can be left. | Problem | While the user is in an application, a toolbar is needed so that the current position can be left. |
Solution | The function bar has the task in the AR system that the user can navigate in the current view. As soon as an activity is selected from the main menu, the function bar is permanently available to the user. The function bar at the top is perceived directly by the user. The function bar can be placed above textual content as well as above graphical elements. In this form of presentation, interaction via gesture control is suitable. The user consciously and specifically triggers the action with a hand movement. However, one hand always must trigger the action, which does not allow complete hands-free work. | Solution | The function bar has the task in the AR system that the user can navigate in the current view. As soon as an activity is selected from the main menu, the function bar is permanently available to the user. The function bar at the top is perceived directly by the user. The function bar can be placed above textual content as well as above graphical elements. With this form of presentation, the interaction is suitable by focusing with the eye. In this case, the action is triggered only after a specified period and enables complete, hands-free interaction. The time span until the interaction is triggered should only last a few seconds in order not to influence the daily work routine; however, unwanted actions can also be triggered in this way. |
Evidence | Usability test: Prototypical evaluation with 50 test persons Evaluation of usefulness M: 3.94 out of 5.00 points Rating effectiveness M: 4.14 out of 5.00 points Rating efficiency M: 3.24 out of 5.00 points Evaluation satisfaction M: 3.81 out of 5.00 points | Evidence | Usability test: Prototypical evaluation with 50 test persons Evaluation of usefulness M: 3.83 out of 5.00 points Rating effectiveness M: 4.18 out of 5.00 points Rating efficiency M: 3.44 out of 5.00 points Evaluation satisfaction M: 3.88 out of 5.00 points |
(…) | (…) | (…) | (…) |
Representation | Representation | ||
Category | Generic task: Selecting from the toolbar | Category | Generic task: Selecting from the toolbar |
Name | Layout variant: Function bar down Interaction variant: Interaction through gestures | Name | Layout variant: Function bar down Interaction variant: Interaction through focusing |
Problem | While the user is in an application, a toolbar is needed so that the current position can be left. | Problem | While the user is in an application, a toolbar is needed so that the current position can be left. |
Solution | The function bar has the task in the AR system that the user can navigate in the current view. As soon as an activity is selected from the main menu, the function bar is permanently available to the user. The function bar at the bottom is directly perceived by the user. The function bar can be placed under textual content as well as under graphical elements. With this form of presentation, interaction via gesture control is suitable. The user consciously and purposefully triggers the action by moving his or her hand. However, one hand always must trigger the action, which does not allow complete hands-free working. | Solution | The function bar has the task in the AR system that the user can navigate in the current view. As soon as an activity is selected from the main menu, the function bar is permanently available to the user. The function bar at the bottom is directly perceived by the user. The function bar can be placed under textual content as well as under graphical elements. In this form of presentation, the interaction is suitable by focusing with the eye. The action is only triggered after a set period and enables complete hands-free interaction. The time span until the interaction is triggered should only last a few seconds in order not to influence the daily work routine; however, unwanted actions can also be triggered in this way. |
Evidence | Usability test: Prototypical evaluation with 50 test persons Evaluation of usefulness M: 3.73 out of 5.00 points Rating effectiveness M: 4.08 out of 5.00 points Rating efficiency M: 3.27 out of 5.00 points Evaluation satisfaction M: 3.61 out of 5.00 points | Evidence | Usability test: Prototypical evaluation with 50 test persons Evaluation of usefulness M: 3.62 out of 5.00 points Rating effectiveness M: 4.12 out of 5.00 points Rating efficiency M: 3.47 out of 5.00 points Evaluation satisfaction M: 3.68 out of 5.00 points |
(…) | (…) | (…) | (…) |
Representation | Representation | ||
Category | Generic task: Selecting from the toolbar | Category | Generic task: Selecting from the toolbar |
Name | Layout variant: Function bar right Interaction variant: Interaction through gestures | Name | Layout variant: Function bar right Interaction variant: Interaction through focusing |
Problem | While the user is in an application, a toolbar is needed so that the current position can be left. | Problem | While the user is in an application, a toolbar is needed so that the current position can be left. |
Solution | The function bar has the task in the AR system that the user can navigate in the current view. As soon as an activity is selected from the main menu, the function bar is permanently available to the user. The function bar on the right-hand side is perceived directly by the user. The toolbar can be placed to the right of textual content as well as to the right of graphical elements. In this form of presentation, interaction via gesture control is suitable. The user consciously and specifically triggers the action with a hand movement. However, one hand always has to trigger the action, which does not allow complete hands-free work. | Solution | The function bar has the task in the AR system that the user can navigate in the current view. As soon as an activity is selected from the main menu, the function bar is permanently available to the user. The function bar on the right-hand side is perceived directly by the user. The toolbar can be placed to the right of textual content as well as to the right of graphical elements. With this form of presentation, the interaction is suitable by focusing with the eye. In this case, the action is triggered only after a specified period and enables complete hands-free interaction. The time span until the interaction is triggered should only last a few seconds in order not to influence the daily work routine; however, unintentional actions can also be triggered in this way. |
Evidence | Usability test: Prototypical evaluation with 50 test persons Evaluation of usefulness M: 3.75 out of 5.00 points Rating effectiveness M: 4.04 out of 5.00 points Rating efficiency M: 3.02 out of 5.00 points Evaluation satisfaction M: 3.71 out of 5.00 points | Evidence | Usability test: Prototypical evaluation with 50 test persons Evaluation of usefulness M: 3.52 out of 5.00 points Rating effectiveness M: 4.08 out of 5.00 points Rating efficiency M: 3.22 out of 5.00 points Evaluation satisfaction M: 3.78 out of 5.00 points |
(…) | (…) | (…) | (…) |
Representation | Representation | ||
Category | Generic task: Selecting from the toolbar | Category | Generic task: Selecting from the toolbar |
Name | Layout variant: Function bar left Interaction variant: Interaction through gestures | Name | Layout variant: Function bar left Interaction variant: Interaction through focusing |
Problem | While the user is in an application, a toolbar is needed so that the current position can be left. | Problem | While the user is in an application, a toolbar is needed so that the current position can be left. |
Solution | The function bar has the task in the AR system that the user can navigate in the current view. As soon as an activity is selected from the main menu, the function bar is permanently available to the user. The function bar in the left area is perceived directly by the user. The toolbar can be placed to the left of textual content as well as to the left of graphical elements. With this form of presentation, interaction via gesture control is suitable. The user consciously and purposefully triggers the action by moving his or her hand. However, the action must always be triggered by one hand, which does not allow complete hands-free working. | Solution | The function bar has the task in the AR system that the user can navigate in the current view. As soon as an activity is selected from the main menu, the function bar is permanently available to the user. The function bar in the left area is perceived directly by the user. The toolbar can be placed to the left of textual content as well as to the left of graphical elements. In this form of presentation, the interaction is suitable by focusing with the eye. The action is only triggered after a set period and enables complete hands-free interaction. The time span until the interaction is triggered should only last a few seconds in order not to influence the daily work routine; however, unwanted actions can also be triggered in this way. |
Evidence | Usability test: Prototypical evaluation with 50 test persons Evaluation of usefulness M: 3.63 out of 5.00 points Rating effectiveness M: 3.97 out of 5.00 points Rating efficiency M: 3.11 out of 5.00 points Evaluation satisfaction M: 3.44 out of 5.00 points | Evidence | Usability test: Prototypical evaluation with 50 test persons Evaluation of usefulness M: 3.52 out of 5.00 points Rating effectiveness M: 4.01 out of 5.00 points Rating efficiency M: 3.31 out of 5.00 points Evaluation satisfaction M: 3.51 out of 5.00 points |
(…) | (…) | (…) | (…) |
Representation | Representation |
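Across the interaction variants above, "interaction through focusing" is described as a dwell-based trigger: the focused element fires only after a set period of a few seconds, which enables hands-free work but can also cause unwanted activations. The following sketch illustrates such a dwell trigger under assumptions of our own; the 2-second threshold and the element names are not from the paper.

```python
import time
from typing import Optional

class DwellTrigger:
    """Fires an element only after it has been focused continuously for a dwell time."""

    def __init__(self, dwell_seconds: float = 2.0):
        self.dwell_seconds = dwell_seconds        # assumed value; the catalog only says "a few seconds"
        self._focused: Optional[str] = None
        self._focus_start: float = 0.0

    def update(self, focused: Optional[str], now: Optional[float] = None) -> Optional[str]:
        """Call once per frame with the element currently hit by the gaze (or None)."""
        now = time.monotonic() if now is None else now
        if focused != self._focused:
            # Focus changed: restart the dwell timer; this also discards brief, unintended glances.
            self._focused = focused
            self._focus_start = now
            return None
        if focused is not None and now - self._focus_start >= self.dwell_seconds:
            self._focus_start = now               # rearm so the element does not retrigger every frame
            return focused                        # hands-free activation of the focused element
        return None

# Simulated frames: the (hypothetical) "toolbar.save" button fires after 2 s of continuous focus.
trigger = DwellTrigger(dwell_seconds=2.0)
assert trigger.update("toolbar.save", now=0.0) is None
assert trigger.update("toolbar.save", now=1.0) is None
assert trigger.update("toolbar.save", now=2.1) == "toolbar.save"
```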
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations. |
© 2021 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).