Methodology for the Development of Augmented Reality Applications: MeDARA. Drone Flight Case Study
Abstract
1. Introduction
2. Methodology for the Development of Augmented Reality Applications Based on the EMCF
2.1. Concrete Level
2.2. Graphic Level
- Mechatronic prototype: This is the object the student will learn to manipulate and/or control. This prototype will allow students to identify its main capabilities and attributes and have a basic overview of its theoretical and practical elements.
- Scenario: Defines the physical environment or place where the student will be interacting with the mechatronic prototype. This place can be outdoors, such as a park, beach, or lake, among others, or an enclosed space, such as a classroom, laboratory, or factory.
- Commands: These are instructions given by the student through a handheld device, called a remote control, which allows the student to maneuver and adjust the mechatronic prototype.
- Actions: These describe the linear and/or angular movements performed by the mechatronic prototype in response to the commands entered by the student.
- Process: A set of successive phases of a phenomenon or complex fact. Some examples include the phases of flight of a drone, the phases of the trajectory of a car, and the phases of object manipulation by a manipulator’s arm, among others.
- Color: Describes the colors to be assigned to the mechatronic prototype, environment, commands, actions, and the process in general.
- Effects: These enrich the storyboard by describing achievements or highlighting the importance of particular model elements (a data-model sketch of the elements above follows this list).
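To make the storyboard elements concrete, the sketch below captures them as a simple C# data structure. This is purely illustrative: the type and member names are our own assumptions, not part of MeDARA.

```csharp
// Illustrative data model for the MeDARA storyboard elements.
// All names here are hypothetical, not taken from the methodology itself.
public enum ScenarioType { Outdoor, Indoor }

public class Storyboard
{
    public string MechatronicPrototype;   // e.g., "quadrotor drone"
    public ScenarioType Scenario;         // physical environment of the interaction
    public string[] Commands;             // remote-control instructions available to the student
    public string[] Actions;              // linear/angular movements the prototype performs
    public string[] ProcessPhases;        // e.g., takeoff, operational flight, landing
    public string ColorScheme;            // colors assigned to prototype, scenario, commands, etc.
    public string[] Effects;              // highlights for achievements or key elements
}
```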
2.3. Abstract Level
- User stories: These describe the system’s behavior and capture the program’s main features and the release plan. User stories are actions that the user/student can perform within the software/application or project. They are written together with the supervising teacher/advisor so that the specifications the application must contain are clear.
- Acceptance criteria: These are the requirements validated in the testing stage. Each criterion describes a requirement the system or application must meet before release. Requirements that can be considered for the development of the application include the size of the object in augmented reality, the colors of the object, button functionality, button position, and the completion of the requested documents.
- Unit tests: These are conducted for each component of the stories to verify and validate its operation against the requirements specified during planning. For example, a unit test of a remote-control command applied to the mechatronic prototype should produce one specific movement (a minimal example follows this list). These tests also ensure the prototype is scaled correctly when necessary.
- Acceptance tests: These are carried out once the program is working. Their objective is for the users/students to validate the acceptance criteria defined in the planning stage and to confirm that everything in the project works correctly. If the project is accepted, it is given the released status and can be considered ready for use as a training tool for future engineers or moved to the next iteration. The decision on whether to advance to the next iteration depends on whether more elements can be added to the application or more detail is required in the scene where the user interacts. All modules must undergo tests before integration into further iterations or releases. Tests are carried out at different stages of software development and can be documented tests or small tests of code functionality.
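The unit-test idea above can be illustrated with a short NUnit-style test, as used by Unity’s test framework. The PrototypeController class below is a hypothetical stand-in so the sketch is self-contained; it is not the paper’s actual code.

```csharp
using NUnit.Framework;
using UnityEngine;

// Hypothetical stand-in for the mechatronic prototype's controller,
// kept minimal so the test sketch below is self-contained.
public class PrototypeController : MonoBehaviour
{
    public Vector3 Position { get; private set; }
    private Vector3 velocity;

    public void MoveForward() { velocity = Vector3.forward; }  // one remote-control command
    public void Step(float dt) { Position += velocity * dt; }  // fixed-timestep update
}

public class PrototypeControllerTests
{
    // One command must produce one specific movement, as the methodology requires.
    [Test]
    public void ForwardCommand_ProducesForwardMovement()
    {
        var prototype = new GameObject("Prototype").AddComponent<PrototypeController>();

        prototype.MoveForward();
        prototype.Step(0.1f);

        Assert.Greater(prototype.Position.z, 0f);
    }
}
```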
3. Application of the MeDARA to the Drone Flight Case Study
3.1. Concrete Level
- Takeoff: the drone rises to a certain altitude;
- Operational flight: the drone can hold a stationary position in the air (hover) and perform maneuvering flight, in which mixed movements to the left, right, forward, backward, up, and down are possible;
- Landing: the drone’s landing gear makes contact with the ground (these phases are sketched as a simple state machine below).
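The three phases above form a natural state machine. The sketch below is our illustration of that structure; the state names and transition triggers are assumptions, not taken from the paper.

```csharp
// Minimal state machine for the drone flight phases listed above.
// States and transition triggers are illustrative assumptions.
public enum FlightPhase { Grounded, Takeoff, OperationalFlight, Landing }

public class FlightPhaseMachine
{
    public FlightPhase Phase { get; private set; } = FlightPhase.Grounded;

    public void StartTakeoff()        { if (Phase == FlightPhase.Grounded) Phase = FlightPhase.Takeoff; }
    public void ReachCruiseAltitude() { if (Phase == FlightPhase.Takeoff) Phase = FlightPhase.OperationalFlight; }
    public void BeginLanding()        { if (Phase == FlightPhase.OperationalFlight) Phase = FlightPhase.Landing; }
    public void TouchDown()           { if (Phase == FlightPhase.Landing) Phase = FlightPhase.Grounded; }
}
```

For example, calling StartTakeoff() and then ReachCruiseAltitude() moves the machine from Grounded to OperationalFlight, mirroring takeoff flowing into operational flight.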
3.2. Graphic Level
3.3. Abstract Level
3.3.1. Planning
- User stories: This involves filling out a template that lists every action users can perform in the AR application, such as taking off and landing the drone; moving it up, down, left, and right; and rotating it clockwise and counter-clockwise. Every action presented in the experiential storyboard has an associated user story (see Figure 14).
- Acceptance criteria: These need to be specified at the beginning of the software’s creation and include the following (a sketch of how the first criterion can be applied in Unity follows this list):
  – Drone size in augmented reality (scale X = 4.13, Y = 4.13, Z = 4.13);
  – Drone design colors (black, pink, blue);
  – Number of movements allowed for the drone (left, right, up, down);
  – Verification of the commands up, down, left, right, turns, etc.;
  – Design, position, and size of the buttons.
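As an illustration of the first criterion, the hedged Unity C# fragment below shows one way the drone’s scale could be applied and checked. The component name and tolerance are our assumptions; only the scale values come from the criteria above.

```csharp
using UnityEngine;

// Hypothetical component that applies and checks the agreed AR drone scale.
public class DroneScaleCriterion : MonoBehaviour
{
    private static readonly Vector3 RequiredScale = new Vector3(4.13f, 4.13f, 4.13f);

    private void Awake()
    {
        transform.localScale = RequiredScale;   // apply the acceptance-criterion scale
    }

    public bool MeetsCriterion()
    {
        // Tolerant comparison, since floating-point scales are rarely exact.
        return Vector3.Distance(transform.localScale, RequiredScale) < 1e-3f;
    }
}
```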
3.3.2. Design
3.3.3. Coding
- Control Pseudocode
Algorithm 1: Initialization of variables for the drone movement

```text
Input:  Velocity, VelocityH, VelocityRot, Propeller1, Propeller2, Propeller3, Propeller4
Output: Boolean with movement

Initialize the input variables
Declare Boolean variables: moverAdelante, moverAtras, moverDerecha, moverIzquierda, Start, StartControl
Start
Initialize the Boolean variables to false
```
Algorithm 2: Drone actions

```text
Input:  Velocity, VelocityH, VelocityRot, Propeller1, Propeller2, Propeller3, Propeller4
Output: Boolean with movement

if Adelante.isPressed then        // "forward" button
    moverAdelante := true
end if
if Adelante.isReleased then
    moverAdelante := false
end if
if Atras.isPressed then           // "backward" button
    moverAtras := true
end if
if Atras.isReleased then
    moverAtras := false
end if
if Izquierda.isPressed then       // "left" button
    moverIzquierda := true
end if
if Izquierda.isReleased then
    moverIzquierda := false
end if
if Derecha.isPressed then         // "right" button
    moverDerecha := true
end if
if Derecha.isReleased then
    moverDerecha := false
end if
```
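A minimal Unity C# translation of these two algorithms is sketched below. It assumes the press/release handlers are wired to the on-screen buttons’ pointer-down and pointer-up events; the class name, field names, and velocity value are our assumptions, not the project’s actual source.

```csharp
using UnityEngine;

// Illustrative Unity implementation of Algorithms 1 and 2.
public class DroneInputController : MonoBehaviour
{
    public float velocity = 2f;   // assumed horizontal speed in m/s

    // Algorithm 1: Boolean movement flags, initialized to false by default.
    private bool moverAdelante, moverAtras, moverDerecha, moverIzquierda;

    // Algorithm 2: wire these to the UI buttons' pointer-down/pointer-up events.
    public void OnAdelantePressed()   { moverAdelante = true; }
    public void OnAdelanteReleased()  { moverAdelante = false; }
    public void OnAtrasPressed()      { moverAtras = true; }
    public void OnAtrasReleased()     { moverAtras = false; }
    public void OnIzquierdaPressed()  { moverIzquierda = true; }
    public void OnIzquierdaReleased() { moverIzquierda = false; }
    public void OnDerechaPressed()    { moverDerecha = true; }
    public void OnDerechaReleased()   { moverDerecha = false; }

    private void Update()
    {
        // Combine the active flags into a movement direction, as in Algorithm 2.
        var dir = Vector3.zero;
        if (moverAdelante)  dir += Vector3.forward;
        if (moverAtras)     dir += Vector3.back;
        if (moverIzquierda) dir += Vector3.left;
        if (moverDerecha)   dir += Vector3.right;
        transform.Translate(dir * velocity * Time.deltaTime, Space.Self);
    }
}
```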
3.3.4. Testing
- It was proven that the propellers work correctly and that they do not remain suspended in the air when turning.
- Tests were carried out to configure the revolutions of the propellers so that they would not appear static and their motion would imitate that of real drones (a propeller-spin sketch follows this list).
- The application was tested to ensure its availability at any time; all functionalities were confirmed to work correctly.
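The propeller behavior targeted by the second test can be approximated with a simple rotation script such as the sketch below. The class name and RPM figure are assumptions chosen for visual effect, not values from the paper.

```csharp
using UnityEngine;

// Illustrative propeller spin, so the rotors read as moving rather than static.
public class PropellerSpin : MonoBehaviour
{
    public float rpm = 1200f;   // assumed visual revolutions per minute

    private void Update()
    {
        // Convert RPM to degrees per second, then scale by the frame time.
        transform.Rotate(Vector3.up, rpm * 360f / 60f * Time.deltaTime, Space.Self);
    }
}
```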
- Unit tests: These contribute to verifying and validating each aspect of the augmented reality model: the buttons’ functionality, the drone’s size, and the Unity design of the drone. Unit tests of the static Augmented Reality (AR) prototype were run in the Unity simulator. Figure 18 presents the complete design with the drone integrated into the platform for the specific tests. We validated the drone’s size within the application, the design colors, and the scale of the platform where it lands. We also validated the user’s view when the model was displayed on a mobile device. Several cases were applied to visualize how the application and its functionality were being integrated into the complete AR model. Some of these tests include the following:
  – Tests of the deployed static model: This test validates the illustration displayed when the model is activated (Figure 19a). It was performed with the Vuforia add-on using a mobile device preview. At this stage, the design of the buttons had not yet been added; only the position of the buttons within the image displayed on the mobile device was validated. Designers carry out this test during the development of the application.
  – Testing of the AR kinematic model: This test validates the AR kinematic prototype using a mobile device preview with the Vuforia application. The application was deployed for Android-based mobile devices. The colorimetry and the space allocated for the buttons were also validated (see Figure 19b), while Figure 19c displays the close-up model with the buttons integrated and the final colors.
  – Testing of the dynamic prototype using AR input commands: These tests show the final prototype working. Their aim is to validate each of the application’s buttons, including the final design. For this, each button was tested to check the type of movement, controls, and actions on the drone. Its operation was validated and accepted. Figure 19d shows one of the tests performed.
- Acceptance test: Once the unit tests were done, the design and the results of the programming environment were validated. Acceptance tests help determine whether changes to the design are needed. This stage includes the approval of the people involved so that the application can be declared complete. This kind of test is performed by users, who give feedback on the app’s functionality. The acceptance tests performed are described below.
  – Acceptance test for the deployment of the drone using a mobile device: This test validates and verifies the app’s compatibility on a real mobile device. The interface and interaction with the drone were finalized intuitively and successfully, keeping the cyberpunk design and unifying all the application components.
  – Testing of controls using the mobile application: The final tests considered the interaction of a user with the application. For this purpose, the complete application was run on a mobile device. Figure 20 shows a user interacting with the dynamic prototype using AR input commands on the mobile device.
  – Additional tests (a sketch of the target-found/target-lost handling exercised here follows this list):
    (1) A first test verified that the drone appears when the target is scanned.
    (2) The buttons that perform the movements to the right, left, back, and forward were tested as well (see https://n9.cl/fdt8v (accessed on 15 June 2022)).
    (3) Another test showed what happens when the mobile device is moved away from the target (see https://n9.cl/werqp (accessed on 15 June 2022)).
    (4) A further test demonstrated that the drone still works even when the mobile device stops seeing the target (see https://n9.cl/v9z6y (accessed on 15 June 2022)).
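Tests (1), (3), and (4) revolve around target-found/target-lost behavior. The sketch below shows one plausible way to keep the drone active when tracking is lost, assuming the two public methods are wired to the AR SDK’s target-found and target-lost events (for example, the UnityEvents exposed by Vuforia’s DefaultObserverEventHandler). It is an illustration, not the project’s actual code.

```csharp
using UnityEngine;

// Illustrative handler: show the drone when the image target is found, and
// optionally keep it operating when tracking is lost (behavior of test 4).
public class DroneTargetHandler : MonoBehaviour
{
    public GameObject drone;               // drone model root, assigned in the Inspector
    public bool keepAliveWhenLost = true;  // keep flying when the target leaves the camera view

    public void OnTargetFound()
    {
        drone.SetActive(true);             // test (1): drone appears when the target is scanned
    }

    public void OnTargetLost()
    {
        if (!keepAliveWhenLost)
            drone.SetActive(false);        // strict mode: hide the drone when tracking drops
    }
}
```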
4. Results
5. Discussion
- Encouragement of more technical, computerized, and specialized careers (STEM careers);
- Inclusion of business in the educational system through “philanthropy”;
- Increasing incorporation of robots into society;
- Movement of capital from the public to the private sector;
- Normalization, by the education sector, of the company discourse that this “has to be so”;
- Involvement of companies, through concrete projects, in academic life.
6. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
Abbreviations
| Abbreviation | Meaning |
|---|---|
| MeDARA | Methodology for the Development of Augmented Reality Applications |
| EMCF | Educational Mechatronics Conceptual Framework |
| AR | Augmented Reality |
| UAV | Unmanned Aerial Vehicle |
| BIM | Building Information Model |
| SCARA | Selective Compliant Articulated Robot Arm |
| AUTOC-AR | Car design and specification work safety guide based on augmented reality |
| XP | Extreme Programming |
| CAD | Computer-Aided Design |
| UML | Unified Modeling Language |
| PC | Personal Computer |
| SDK | Software Development Kit |
| JDK | Java Development Kit |
| RAM | Random-Access Memory |
| MB | Megabyte |
| GB | Gigabyte |
| ROM | Read-Only Memory |
| PNG | Portable Network Graphics |
| KB | Kilobyte |
| STEM | Science, Technology, Engineering, and Mathematics |
| FPS | Frames per Second |
| Software (PC) | Software (Mobile) | Hardware (PC) | Hardware (Mobile) |
|---|---|---|---|
| Unity Hub | Android version | Core i5 9th generation | 2 GB free RAM |
| Visual Studio 2015 or later | | 4 GB+ RAM | 16 GB ROM |
| Android Studio (SDK), Java (JDK) | | NVIDIA 512 MB (GTX 650 minimum) | Resolution |
| Windows 7 SP1+ | | 10 GB disk space | Smartphone |
| AutoCad 2019+ | | | |
| Metric | Value |
|---|---|
| Total implementation hours | 166 |
| Total methodology implementation hours | 226 |
| Overall stage size | 250 cm along the X, Y, and Z axes |
| Time spent in the app | 5–15 min per user |
| Image size | 156 KB |
| Image pixels | pixels using 32 bits of depth |
| Image quality | Ultra quality and full-response texture quality, 2× multisampling antialiasing |
| Frame rate | 60 frames per second (FPS) |
| Button response | s |
| Availability | The application is available 24 h a day, but only locally; it has yet to be released to the Android store. |
| Initial Evaluation Test | W | p-Val | Normal |
|---|---|---|---|
| Experimental | 0.963655 | 0.619153 | True |
| Control | 0.968042 | 0.713114 | True |

| | T | p-Val | Equal_var |
|---|---|---|---|
| Bartlett | 9.308328 | 0.002281 | False |

| | T | Dof | Alternative | p-Val | CI95% | Cohen-d | BF10 | Power |
|---|---|---|---|---|---|---|---|---|
| t-test | −6.1909 | 27.2369 | two-sided | 1. | [−19.37, −9.73] | 1.95773 | | 0.999976 |
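For context on the last table: Bartlett’s test rejecting equal variances together with the non-integer degrees of freedom (Dof ≈ 27.24) is consistent with a Welch-corrected two-sample t-test. As a reminder, the standard Welch statistic and its approximate degrees of freedom (textbook formulas, not quoted from the paper) are

$$ t = \frac{\bar{x}_1 - \bar{x}_2}{\sqrt{s_1^2/n_1 + s_2^2/n_2}}, \qquad \nu = \frac{\left(s_1^2/n_1 + s_2^2/n_2\right)^2}{\frac{(s_1^2/n_1)^2}{n_1 - 1} + \frac{(s_2^2/n_2)^2}{n_2 - 1}}. $$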