Creating Collaborative Augmented Reality Experiences for Industry 4.0 Training and Assistance Applications: Performance Evaluation in the Shipyard of the Future
Abstract
1. Introduction
- The potential of IAR to facilitate, support, and optimize production and assembly tasks through training and assistance applications is analyzed.
- Thorough details are given on the analysis, design, and implementation of an IAR application based on Microsoft HoloLens smart glasses [21], so as to ease the work of future developers of similar IAR applications.
- A novel collaborative IAR framework is proposed to enable the easy creation of shared IAR experiences. The performance of this framework is evaluated in terms of packet communication delay, the influence of interference, and anchor transmission latency.
- The validation of the proposed system by Navantia’s operators is presented, thus providing useful insights and guidelines for future developers.
2. State-of-the-Art: IAR Applications for Training and Assistance
2.1. IAR Training Systems
2.2. IAR Assistance Systems
2.3. Developing IAR Training and Assistance Systems
2.4. Shipbuilding IAR Systems
2.5. Analysis of the State-of-the-Art
- First of all, there is a lack of solutions devoted specifically to solving the main issues related to AR training and assistance in shipbuilding.
- Second, there are not many solutions designed for Microsoft HoloLens, currently one of the most sophisticated HMD devices.
- Third, there is no significant recent literature proposing a collaborative framework for IAR; only some early works exist, which were systematically reviewed in [57].
- Finally, the fourth shortcoming is that, although different experiments have been proposed, mainly regarding virtual fixtures and the usability of IAR applications, there are no significant solution validations that include performance considerations for deployment under production conditions in a real-world industrial scenario.
3. Analysis and Design of the Proposed System
3.1. Main Goals of the System
- It should allow for scaling and moving the displayed 3D models in order to facilitate their visualization, enabling users to place them wherever they consider appropriate. In addition, when an object is moved, the environment is automatically scanned to prevent the model from being placed so that it intersects real objects, which would be visually uncomfortable.
- The assembly sequence should be visualized step-by-step through animations and contextual text instructions that provide the necessary technical details.
- It should be possible to visualize the documentation associated with every relevant part, especially its blueprints and physical measures.
- The developed application should implement a shared experience system that allows multiple AR devices to interact with the same virtual part at the same time.
- Ease of use: The application interface should be as simple and intuitive as possible to avoid misunderstandings during the training process. Animations and interactions should also be reinforced through the reproduction of sounds that guide each action (e.g., start and end of the animations, selections in the menus).
3.2. Design Requirements
- User interaction with the system through the gestures detected by the HoloLens smart glasses (gaze and tap).
- The cursor that indicates the position the user is looking at (gaze) at any given moment.
- The panel through which the user can interact with the part to scale it, move it, start animations, or read additional information about it.
- Panel tracking of the user, so that when users change their location or rotate their head, the panel is repositioned in front of them. A button should also be included on the panel to enable or disable this functionality.
- A modular internationalization system for real-time translation of the application, allowing more languages to be easily added.
- Sounds and sound effects that help the user recognize and reaffirm different actions, such as a click or the start and end of the animations.
- Creation of animations that illustrate the assembly steps contained in the part assembly manual.
- Creation of an assembly-disassembly animation that makes all the parts of the clutch visible.
- Scaling of the 3D model so that it can be made bigger and smaller.
- Movement of the 3D model. During this process, the surrounding surfaces are mapped by the HoloLens to avoid possible collisions and intersections of the part with real elements.
- The parts that have associated documentation should be highlighted to facilitate their detection and interaction by the user.
- There should be an independent panel that shows the documentation available for each part.
- There should be an independent panel that shows the manufacturing order and the hydraulic clutch assembly manual.
- The AR environment should be synchronized among multiple HoloLens devices.
- Sharing of the hydraulic clutch, taking into account its location, rotation, and real-time status, so that every user's interactions with the part (animations, scaling, movement) are shared with all the other HoloLens devices in use (a minimal sketch of a possible shared-state message is shown after this list).
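Although the paper does not publish source code, the last requirement can be illustrated with a minimal sketch of a possible shared-state message, written here in Python for brevity. All field names (position, rotation, scale, animation_step, playing) are hypothetical and merely mirror the interactions listed above; they are not the authors' actual wire format.

```python
import json
from dataclasses import dataclass, asdict

@dataclass
class ClutchState:
    """Hypothetical shared state of the hydraulic clutch hologram."""
    position: tuple       # (x, y, z), relative to the shared spatial anchor
    rotation: tuple       # quaternion (x, y, z, w)
    scale: float          # uniform scale factor applied by the user
    animation_step: int   # current step of the assembly sequence
    playing: bool         # whether the step animation is running

def encode_state(state: ClutchState) -> bytes:
    """Serialize the state so it can be sent to the other HoloLens devices."""
    return json.dumps(asdict(state)).encode("utf-8")

def decode_state(payload: bytes) -> ClutchState:
    """Rebuild the state received from another device."""
    return ClutchState(**json.loads(payload.decode("utf-8")))
```

A state this compact fits in a single datagram, which keeps the per-update overhead of the synchronization low.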
3.3. Communications Architecture
4. Implementation
4.1. Hardware and Software
4.2. Collaborative Framework
4.2.1. Master’s Discovery Process
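The framework designates one device as master, which the remaining devices must locate on the local network. As an illustration only, the following Python sketch shows one common way to implement such a discovery step with a UDP broadcast handshake; the port number and the message strings (`WHO_IS_MASTER`, `I_AM_MASTER`) are invented for the example and do not reflect the paper's actual protocol.

```python
import socket
from typing import Optional

DISCOVERY_PORT = 45678  # hypothetical port for the discovery handshake

def find_master(timeout: float = 2.0) -> Optional[str]:
    """Client side: broadcast a request and wait for the master's reply."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
    sock.settimeout(timeout)
    try:
        sock.sendto(b"WHO_IS_MASTER", ("255.255.255.255", DISCOVERY_PORT))
        data, (master_ip, _) = sock.recvfrom(1024)
        return master_ip if data == b"I_AM_MASTER" else None
    except socket.timeout:
        return None  # no master answered within the timeout
    finally:
        sock.close()

def serve_discovery() -> None:
    """Master side: answer every discovery request."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("", DISCOVERY_PORT))
    while True:
        data, addr = sock.recvfrom(1024)
        if data == b"WHO_IS_MASTER":
            sock.sendto(b"I_AM_MASTER", addr)
```

In a scheme like this, a device whose `find_master()` call times out could assume the master role itself and start answering discovery requests.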
4.2.2. Anchor Synchronization
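Serialized spatial anchors can be large binary blobs (the paper itself reports problems related to their size in Section 5.4), so transferring them reliably calls for a stream protocol with explicit framing, letting the receiver know when the blob is complete. The sketch below is a generic length-prefixed TCP transfer in Python, given as an assumption-laden illustration rather than the paper's exact protocol.

```python
import socket
import struct

def send_anchor(sock: socket.socket, anchor_blob: bytes) -> None:
    """Send a serialized anchor preceded by its 4-byte big-endian length."""
    sock.sendall(struct.pack(">I", len(anchor_blob)) + anchor_blob)

def recv_anchor(sock: socket.socket) -> bytes:
    """Read the length prefix, then exactly that many payload bytes."""
    (length,) = struct.unpack(">I", _recv_exactly(sock, 4))
    return _recv_exactly(sock, length)

def _recv_exactly(sock: socket.socket, n: int) -> bytes:
    buf = bytearray()
    while len(buf) < n:
        chunk = sock.recv(min(65536, n - len(buf)))
        if not chunk:
            raise ConnectionError("peer closed during anchor transfer")
        buf.extend(chunk)
    return bytes(buf)
```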
4.2.3. AR User Event Synchronization
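When the master holds a connection to every client, user events (scaling, moving, or starting an animation) can be kept consistent across devices by relaying each incoming event to all the other participants. The sketch below shows only this relay logic and is again illustrative rather than the authors' implementation; framing is omitted (a real system would length-prefix events as in the anchor sketch above).

```python
import threading

class EventRelay:
    """Master-side relay: every event received from one device is
    forwarded to all the other connected devices."""

    def __init__(self) -> None:
        self.clients: list = []       # connected TCP sockets
        self.lock = threading.Lock()

    def add_client(self, sock) -> None:
        with self.lock:
            self.clients.append(sock)

    def relay(self, event: bytes, sender) -> None:
        with self.lock:
            receivers = [c for c in self.clients if c is not sender]
        for sock in receivers:
            try:
                sock.sendall(event)
            except OSError:
                self.remove_client(sock)  # drop unreachable devices

    def remove_client(self, sock) -> None:
        with self.lock:
            if sock in self.clients:
                self.clients.remove(sock)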
4.3. HoloLens Application
4.3.1. Design
- Camera: The Unity camera used to render the objects in the scene. At the beginning of the execution, it must be located at coordinates (0, 0, 0) with no rotation on any of the axes.
- Directional lights: The lights that illuminate the scene in a homogeneous way.
- Manager: The empty GameObject that contains the general logic of the scene: GazeManager, GazeGestureManager (responsible for capturing gestures and communicating with the selected objects by sending an “OnSelect” message), and LocalizationManager (the component that handles the movement of virtual elements).
- Cursor: The object located at the collision point between the gaze and the rest of the scene. The cursor is placed on the surfaces that can be interacted with to make it clear to the user that they can be selected.
- Panel/panels: Each panel contains the buttons needed to interact with the hydraulic clutch; each button sends a message to the clutch so that the corresponding action is performed.
- Clutch/turbine: The 3D object of the clutch that is composed of all its parts.
4.3.2. Implementation
5. Experiments
5.1. Collaborative Framework Performance Tests
5.1.1. Experimental Setup
- One pair of Microsoft HoloLens smart glasses.
- A Linksys Wireless-G router (2.4 GHz).
- A TP-Link Archer C3200 router (5 GHz).
- A desktop computer.
5.1.2. Regular Packet Communication Delay
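Delay figures of the kind measured in this experiment can be obtained with a simple echo test: the sender timestamps each UDP packet, the peer echoes it back, and the one-way delay is estimated as half the round-trip time. The following is a minimal sketch, assuming the peer runs a matching echo responder; port, sample count, and packet size are arbitrary choices for the example.

```python
import socket
import time

def measure_delay(peer: tuple, samples: int = 100) -> float:
    """Estimate the one-way delay (in ms) as half the mean UDP round-trip
    time, assuming the peer echoes every datagram back unchanged."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.settimeout(1.0)
    rtts = []
    for i in range(samples):
        payload = i.to_bytes(4, "big")
        start = time.perf_counter()
        sock.sendto(payload, peer)
        try:
            data, _ = sock.recvfrom(1024)
            if data == payload:
                rtts.append(time.perf_counter() - start)
        except socket.timeout:
            pass  # lost packets are excluded from the average
    sock.close()
    return 1000 * sum(rtts) / (2 * len(rtts)) if rtts else float("nan")
```

For example, `measure_delay(("192.168.1.10", 45679))` would return the estimated one-way delay to a hypothetical echo responder at that address.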
5.1.3. Interference Influence
5.1.4. Anchor Transmission Latency
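Anchor transmission latency can be measured by timing the full length-prefixed transfer from the Section 4.2.2 sketch and waiting for an application-level acknowledgment, since the return of a socket send only means the data left the local buffer, not that it arrived. The one-byte ACK below is an assumption of this example, as is the availability of `send_anchor()` from the earlier sketch.

```python
import time

def time_anchor_transfer(sock, anchor_blob: bytes) -> float:
    """Seconds from the first byte sent until the receiver acknowledges the
    complete anchor. Assumes send_anchor() from the Section 4.2.2 sketch is
    in scope and that the receiver replies with a one-byte ACK."""
    start = time.perf_counter()
    send_anchor(sock, anchor_blob)
    if sock.recv(1) != b"\x01":  # receiver confirms full reception
        raise ConnectionError("anchor transfer not acknowledged")
    return time.perf_counter() - start
```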
5.2. Validation Tests
5.3. Operator Feedback
5.4. Key Findings
5.5. Next Challenges
- At the time of writing, Microsoft was working on its own official framework for implementing shared experiences. This could be an important step forward, since such an official implementation is expected to solve the compatibility problems detected during the development of the solution presented in this paper.
- Another important aspect that can be improved is the visualization of traditional documentation (or digital PDF files) through panels, which are shown as mere 2D objects that do not exploit the full potential of the AR devices. In addition, documentation panels make navigating the information harder than on a computer or tablet. To tackle this issue, the documentation could be shown in a contextual way, associated with the parts being visualized in 3D and, whenever possible, using elements like arrows, gauges, tools (e.g., screwdrivers, drills), or animations to enhance the information provided by the documentation.
- During the tests at Navantia’s Turbine workshop, it was observed that the small details in the assembly steps were the most useful to the operators, since the possibility of watching the parts from different angles highlights specific details more effectively than a printed document. Therefore, future developers should consider adding more detail to each step of the assembly sequence, together with the possibility of moving back and forth in the animations.
- Due to the problems discussed in Section 5.4 regarding the size of the anchors, it would be interesting to implement an error detection system during synchronization that allows the synchronization to be restarted, thus enabling automatic recovery from this type of malfunction.
- The option of adding speech recognition could be considered to facilitate the interaction with the application without the need for hand gestures, which would offer greater freedom when working. Nonetheless, developers should consider that speech recognition may be difficult or even impossible in noisy industrial environments.
6. Conclusions
Author Contributions
Funding
Conflicts of Interest
Abbreviations
AR | Augmented Reality |
CAD | Computer-Aided Design |
ERP | Enterprise Resource Planning |
HMD | Head-Mounted Display |
IAR | Industrial Augmented Reality |
IoT | Internet of Things |
IIoT | Industrial Internet of Things |
MES | Manufacturing Execution System |
MR | Mixed Reality |
PLM | Product Life-cycle Management |
UWP | Universal Windows Platform |
VR | Virtual Reality |
References
- Announcement of the Industrie 4.0 Project in the 2011 Hannover Fair. April 2011. Available online: https://www.vdi-nachrichten.com/Technik-Gesellschaft/Industrie-40-Mit-Internet-Dinge-Weg-4-industriellen-Revolution (accessed on 5 December 2020).
- Industrie 4.0. February 2019. Available online: https://www.bmbf.de/de/zukunftsprojekt-industrie-4-0-848.html (accessed on 5 December 2020).
- Qi, Q.; Tao, F. Digital Twin and Big Data Towards Smart Manufacturing and Industry 4.0: 360 Degree Comparison. IEEE Access 2018, 6, 3585–3593.
- Rasheed, A.; San, O.; Kvamsdal, T. Digital Twin: Values, Challenges and Enablers From a Modeling Perspective. IEEE Access 2020, 8, 21980–22012.
- Wu, Y.; Dai, H.-N.; Wang, H. Convergence of Blockchain and Edge Computing for Secure and Scalable IIoT Critical Infrastructures in Industry 4.0. IEEE Internet Things J. 2020.
- Fernández-Caramés, T.M.; Fraga-Lamas, P. A Review on the Application of Blockchain to the Next Generation of Cybersecure Industry 4.0 Smart Factories. IEEE Access 2019, 7, 45201–45218.
- Lee, J.; Davari, H.; Singh, J.; Pandhare, V. Industrial Artificial Intelligence for industry 4.0-based manufacturing systems. Manuf. Lett. 2018, 18, 20–23.
- Compare, M.; Baraldi, P.; Zio, E. Challenges to IoT-Enabled Predictive Maintenance for Industry 4.0. IEEE Internet Things J. 2020, 7, 4585–4597.
- Wan, J.; Tang, S.; Hua, Q.; Li, D.; Liu, C.; Lloret, J. Context-Aware Cloud Robotics for Material Handling in Cognitive Industrial Internet of Things. IEEE Internet Things J. 2018, 5, 2272–2281.
- Gattullo, M.; Scurati, G.W.; Fiorentino, M.; Uva, A.E.; Ferrise, F.; Bordegoni, M. Towards augmented reality manuals for industry 4.0: A methodology. Robot. Comput.-Integr. Manuf. 2019, 56, 276–286.
- Fraga-Lamas, P.; Fernández-Caramés, T.M.; Blanco-Novoa, Ó.; Vilar-Montesinos, M.A. A review on industrial augmented reality systems for the industry 4.0 shipyard. IEEE Access 2018, 6, 13358–13375.
- Sutherland, I.E. The ultimate display. In Proceedings of the IFIP, New York, NY, USA, 24–29 May 1965; pp. 506–508.
- Picallo, I.; Vidal-Balea, A.; Lopez-Iturri, P.; Fraga-Lamas, P.; Klaina, H.; Fernández-Caramés, T.M.; Falcone, F. Wireless Channel Assessment of Auditoriums for the Deployment of Augmented Reality Systems for Enhanced Show Experience of Impaired Persons. Multidiscip. Digit. Publ. Inst. Proc. 2020, 42, 30.
- Noreikis, M.; Savela, N.; Kaakinen, M.; Xiao, Y.; Oksanen, A. Effects of Gamified Augmented Reality in Public Spaces. IEEE Access 2019, 7, 148108–148118.
- Kim, M.; Park, K.B.; Choi, S.H.; Lee, J.Y.; Kim, D.Y. AR/VR-Based Live Manual for User-Centric Smart Factory Services. In Proceedings of the IFIP International Conference on Advances in Production Management Systems, Seoul, Korea, 26–30 August 2018; Springer: Cham, Switzerland, 2018; pp. 417–421.
- Daling, L.; Abdelrazeq, A.; Sauerborn, C.; Hees, F. A Comparative Study of Augmented Reality Assistant Tools in Assembly. In Proceedings of the International Conference on Applied Human Factors and Ergonomics, Washington, DC, USA, 24–28 July 2019; Springer: Cham, Switzerland, 2020; pp. 755–767.
- Smparounis, K.; Mavrikios, D.; Pappas, M.; Xanthakis, V.; Viganò, G.P.; Pentenrieder, K. A virtual and augmented reality approach to collaborative product design and demonstration. In Proceedings of the 2008 IEEE International Technology Management Conference (ICE), Lisbon, Portugal, 23–28 June 2008; pp. 1–8.
- Henderson, S.; Feiner, S. Exploring the benefits of augmented reality documentation for maintenance and repair. IEEE Trans. Vis. Comput. Graph. 2010, 17, 1355–1368.
- Werrlich, S.; Daniel, A.; Ginger, A.; Nguyen, P.A.; Notni, G. Comparing HMD-based and Paper-based Training. In Proceedings of the 2018 IEEE International Symposium on Mixed and Augmented Reality (ISMAR), Munich, Germany, 16–20 October 2018; pp. 134–142.
- Vidal-Balea, A.; Blanco-Novoa, O.; Fraga-Lamas, P.; Vilar-Montesinos, M.; Fernández-Caramés, T.M. A Collaborative Augmented Reality Application for Training and Assistance during Shipbuilding Assembly Processes. Multidiscip. Digit. Publ. Inst. Proc. 2020, 54, 4.
- Microsoft HoloLens Official Web Page. Available online: https://www.microsoft.com/en-us/hololens (accessed on 4 November 2020).
- Hořejší, P. Augmented reality system for virtual training of parts assembly. Procedia Eng. 2015, 100, 699–706.
- Tang, A.; Owen, C.; Biocca, F.; Mou, W. Comparative effectiveness of augmented reality in object assembly. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, Fort Lauderdale, FL, USA, 5 April 2003; pp. 73–80.
- Alesky, M.; Vartiainen, E.; Domova, V.; Naedele, M. Augmented reality for improved service delivery. In Proceedings of the IEEE 28th International Conference on Advanced Information Networking and Applications, Victoria, BC, Canada, 13–16 May 2014; pp. 382–389.
- Mosiello, G.; Kiselev, A.; Loutfi, A. Using augmented reality to improve usability of the user interface for driving a telepresence robot. Paladyn J. Behav. Robot. 2013, 4, 174–181.
- Herrera, K.A.; Rocha, J.A.; Silva, F.M.; Andaluz, V.H. Training Systems for Control of Mobile Manipulator Robots in Augmented Reality. In Proceedings of the 2020 15th Iberian Conference on Information Systems and Technologies (CISTI), Sevilla, Spain, 24–27 June 2020; pp. 1–7.
- Lapointe, J.F.; Molyneaux, H.; Allili, M. A Literature Review of AR-Based Remote Guidance Tasks with User Studies; Springer: London, UK, 2020.
- Nee, A.Y.; Ong, S.K. Virtual and augmented reality applications in manufacturing. IFAC Proc. Vol. 2013, 46, 15–26.
- Chicaiza, E.A.; De la Cruz, E.I.; Andaluz, V.H. Augmented Reality System for Training and Assistance in the Management of Industrial Equipment and Instruments. Lect. Notes Comput. Sci. 2018, 675–686.
- Reddy, K.P.K.; Venkitesh, B.; Varghese, A.; Narendra, N.; Chandra, G.; Balamuralidhar, P. Deformable 3D CAD models in mobile augmented reality for tele-assistance. In Proceedings of the 2015 Asia Pacific Conference on Multimedia and Broadcasting, Kuta, Indonesia, 23–25 April 2015; pp. 1–5.
- Schneider, M.; Rambach, J.; Stricker, D. Augmented reality based on edge computing using the example of remote live support. In Proceedings of the 2017 IEEE International Conference on Industrial Technology (ICIT), Toronto, ON, Canada, 22–25 March 2017; pp. 1277–1282.
- Zollmann, S.; Hoppe, C.; Kluckner, S.; Poglitsch, C.; Bischof, H.; Reitmayr, G. Augmented reality for construction site monitoring and documentation. Proc. IEEE 2014, 102, 137–154.
- Moloney, J. Augmented reality visualisation of the built environment to support design decision making. In Proceedings of the Tenth International Conference on Information Visualisation (IV’06), London, UK, 5–7 July 2006; pp. 687–692.
- Erkoyuncu, J.; Khan, S. Olfactory-Based Augmented Reality Support for Industrial Maintenance. IEEE Access 2020, 8, 30306–30321.
- Lampen, E.; Lehwald, J.; Pfeiffer, T. A Context-Aware Assistance Framework for Implicit Interaction with an Augmented Human. In Virtual, Augmented and Mixed Reality. Industrial and Everyday Life Applications, Proceedings of the 12th International Conference, VAMR 2020, Held as Part of the 22nd HCI International Conference, HCII 2020, Copenhagen, Denmark, 19–24 July 2020; Part II; Springer: Cham, Switzerland, 2020; pp. 91–110.
- Blanco-Novoa, Ó.; Fraga-Lamas, P.; Vilar-Montesinos, M.A.; Fernández-Caramés, T.M. Creating the Internet of Augmented Things: An Open-Source Framework to Make IoT Devices and Augmented and Mixed Reality Systems Talk to Each Other. Sensors 2020, 20, 3328.
- Minerva, R.; Lee, G.M.; Crespi, N. Digital Twin in the IoT Context: A Survey on Technical Features, Scenarios, and Architectural Models. Proc. IEEE 2020, 108, 1785–1824.
- Brizzi, F.; Peppoloni, L.; Graziano, A.; Di Stefano, E.; Avizzano, C.A.; Ruffaldi, E. Effects of augmented reality on the performance of teleoperated industrial assembly tasks in a robotic embodiment. IEEE Trans. Hum.-Mach. Syst. 2017, 48, 197–206.
- Zahorik, P.; Jenison, R.L. Presence as being-in-the-world. Presence 1998, 7, 78–89.
- Rosenberg, L.B. Virtual fixtures: Perceptual tools for telerobotic manipulation. In Proceedings of the IEEE Virtual Reality Annual International Symposium, Seattle, WA, USA, 18–22 September 1993; pp. 76–82.
- Gwilliam, J.C.; Mahvash, M.; Vagvolgyi, B.; Vacharat, A.; Yuh, D.D.; Okamura, A.M. Effects of haptic and graphical force feedback on teleoperated palpation. In Proceedings of the 2009 IEEE International Conference on Robotics and Automation, Kobe, Japan, 12–17 May 2009; pp. 677–682.
- Livingston, M.A.; Gabbard, J.L.; Swan, J.E.; Sibley, C.M.; Barrow, J.H. Basic perception in head-worn augmented reality displays. In Human Factors in Augmented Reality Environments; Springer: New York, NY, USA, 2013; pp. 35–65.
- Renner, R.S.; Velichkovsky, B.M.; Helmert, J.R. The perception of egocentric distances in virtual environments: A review. ACM Comput. Surv. (CSUR) 2013, 46, 1–40.
- Smith, E.; Semple, G.; Evans, D.; McRae, K.; Blackwell, P. Augmented Instructions: Analysis of Performance and Efficiency of Assembly Tasks. In Proceedings of the 12th International Conference, VAMR 2020, Held as Part of the 22nd HCI International Conference, HCII 2020, Copenhagen, Denmark, 19–24 July 2020; Lecture Notes in Computer Science; Springer: Cham, Switzerland, 2020; pp. 166–177.
- Tavares, P.; Costa, C.M.; Rocha, L.; Malaca, P.; Costa, P.; Moreira, A.P.; Sousa, A.; Veiga, G. Collaborative welding system using BIM for robotic reprogramming and spatial augmented reality. Autom. Constr. 2019, 106, 102825.
- Lee, J.; Kwon, O.; Choi, J.; Park, C. A Study on Construction Defect Management Using Augmented Reality Technology. In Proceedings of the 2012 International Conference on Information Science and Applications, Suwon, Korea, 23–25 May 2012; pp. 1–6.
- Morikawa, K.; Ando, T. Reduction of piping management person-hours through use of AR technology at shipbuilding sites. Fujitsu Sci. Tech. J. 2019, 55, 20–26.
- Olbrich, M.; Wuest, H.; Riess, P.; Bockholt, U. Augmented reality pipe layout planning in the shipbuilding industry. In Proceedings of the 2011 10th IEEE International Symposium on Mixed and Augmented Reality (ISMAR), Basel, Switzerland, 26–29 October 2011; pp. 26–29.
- Ding, J.; Zhu, Y.; Luo, M.; Zhu, M.; Fan, X.; Zhou, Z. AR Assisted Process Guidance System for Ship Block Fabrication. In Virtual, Augmented and Mixed Reality. Industrial and Everyday Life Applications; Springer International Publishing: Cham, Switzerland, 2020.
- Blanco-Novoa, O.; Fernández-Caramés, T.M.; Fraga-Lamas, P.; Vilar-Montesinos, M.A. A practical evaluation of commercial industrial augmented reality systems in an industry 4.0 shipyard. IEEE Access 2018, 6, 8201–8218.
- Atos Air Platform Official Web Page. Available online: https://atos.net/nl/nederland/augmented-interactive-reality (accessed on 5 November 2020).
- Andy 3D Official Web Page. Available online: https://www.capgemini.com/gb-en/resources/andy3d-remote-assistance-and-asset-revamping-web-platform/ (accessed on 5 November 2020).
- Reflekt Remote Official Web Page. Available online: https://www.re-flekt.com/ (accessed on 5 November 2020).
- Librestream Official Web Page. Available online: https://librestream.com/ (accessed on 5 November 2020).
- Remote Eye Official Web Page. Available online: https://remoteeye.com/ (accessed on 5 November 2020).
- ATR by Innovae Official Web Page. Available online: https://www.innovae.eu/atr-technical-remote-assistant/?lang=en (accessed on 5 November 2020).
- Sereno, M.; Wang, X.; Besancon, L.; Mcguffin, M.J.; Isenberg, T. Collaborative Work in Augmented Reality: A Survey. IEEE Trans. Vis. Comput. Graph. 2020.
- Unity Official Web Page. Available online: https://unity.com/ (accessed on 2 December 2020).
- Siemens NX Official Web Page. Available online: https://www.plm.automation.siemens.com/global/en/products/nx/ (accessed on 2 December 2020).
- Blender Official Web Page. Available online: https://www.blender.org/ (accessed on 2 December 2020).
- Microsoft Recommendations for Model Optimization. Available online: https://docs.microsoft.com/en-us/dynamics365/mixed-reality/import-tool/best-practices (accessed on 2 December 2020).
- Universal Windows Platform Official Web Page. Available online: https://visualstudio.microsoft.com/es/vs/features/universal-windows-platform/ (accessed on 2 December 2020).
- ITU. The Tactile Internet. In ITU-T Technology Report; ITU: Geneva, Switzerland, 2014; Available online: https://www.itu.int/dms_pub/itu-t/opb/gen/T-GEN-TWATCH-2014-1-PDF-E.pdf (accessed on 5 December 2020).
| Framework | Relevant Features | Limitations |
| --- | --- | --- |
| RakNet | Native C++ support, fast, multi-platform | Complex, requires a server |
| UNET | Integrated with Unity, standard | Deprecated by Unity, no replacement yet |
| Photon | Integrated with Unity, used by Microsoft (2020) | Proprietary, requires a cloud license |
| Mirror | Integrated with Unity, open source, network discovery | Hard to make multi-platform |
| Custom UDP/TCP | Lightweight, simple, tailored to the application's needs | Not standard |
| Rating | Ease of Use | Intuitiveness | Application Usefulness | HoloLens Comfort |
| --- | --- | --- | --- | --- |
| Excellent | 30% | 40% | 40% | 30% |
| Good | 60% | 50% | 40% | 70% |
| Can Be Improved | 10% | 10% | 20% | 20% |