Product Tests in Virtual Reality: Lessons Learned during Collision Avoidance Development for Drones
Abstract
1. Introduction
1.1. Testing in VR
- absolute control: A virtual environment provides absolute control. In extreme cases, the simulation can be deterministic and all random or external influences (the noise of the real world, such as weather, natural variation in materials, etc.) can be eliminated.
- reproducibility: The elimination of any randomness leads to a high degree of reproducibility, which is unattainable in real tests (see the sketch after this list).
- shortened development time: In many test scenarios, development times can be drastically reduced. Especially in destructive tests, where the prototype is destroyed (e.g., crash tests), the time needed to build the prototypes is saved.
- higher optimization potential: By not having to build real prototypes, a limiting scaling factor of product testing is eliminated. Many more product tests can be performed in VR, and, as a consequence, many more product parameters can be optimized.
- hazard free tests: Virtual testing is risk-free—both for humans and for potentially expensive machines and equipment.
- unrealistic conditions: Although VR testing usually strives for the highest possible level of realism, unreal conditions are an advantage that does not exist in reality: in VR, the environment can be abstracted and reduced to a minimum. This reduction enables efficient debugging and testing of subsystems that would be difficult to test in isolation in real life.
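The reproducibility point above can be made concrete with a minimal sketch: a toy fixed-timestep simulation in which every source of randomness is routed through a single seeded generator. The drone state and the `wind_gust` noise term are hypothetical placeholders, not part of the project's actual simulation.

```python
# Minimal sketch (not the authors' code): a seeded, fixed-timestep simulation
# illustrating how eliminating uncontrolled randomness makes VR test runs reproducible.
import random

def run_episode(seed: int, steps: int = 1000) -> float:
    rng = random.Random(seed)            # all randomness flows through one seeded RNG
    position = 0.0
    velocity = 1.0
    dt = 1.0 / 120.0                     # fixed timestep, e.g., matching a 120 Hz loop
    for _ in range(steps):
        wind_gust = rng.gauss(0.0, 0.1)  # deterministic for a given seed
        position += (velocity + wind_gust) * dt
    return position

# Two runs with the same seed yield bit-identical trajectories:
assert run_episode(seed=42) == run_episode(seed=42)
```

With the same seed, every run produces exactly the same trajectory, which is the property that makes automated regression tests of a subsystem's behaviour feasible in the first place.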
1.2. Collision Avoidance
2. Methods
Product Test Environment
- simulation-based input device: In the initial configuration of the CAVE, navigation was controlled by a human. This task is now taken over indirectly by the drone: the output signals of the drone controller, which would otherwise drive the drone's motors, are sent to a physics simulation server, which computes the position of the drone in 3D using a physically correct real-time simulation. The data from this flight simulator then serve as the control input within the virtual world. To let the drone explore the virtual world, a corresponding network protocol was added that transmits the current position and orientation in real time (a minimal sketch of such a pose stream follows this list).
- real standstill: The advantages of a CAVE (for example, in comparison to head-mounted displays) are the perception of one's own body and the possibility to move, within the limits of the CAVE, together with other persons in order to explore virtual objects. In our test setup, the drone is placed at the center of the CAVE without the possibility to move; it explores its surroundings using its onboard cameras. The new drone uses two cameras to enlarge its field of view. As the physical position of the drone remains unchanged during the tests, the CAVE tracking system has been deactivated.
- stereoscopic rendering: A major advantage of the new collision detection algorithm is its simplicity: it requires neither 3D reconstruction nor computationally intensive object detection. The fields of view of the drone's two cameras therefore do not overlap; they merely enlarge the drone's overall field of view. The usual stereoscopic rendering in the CAVE is disabled, since the collision avoidance algorithm is designed for a monoscopic view. By not using 3D stereo via shutter technology, the full projector brightness is available for the monoscopic test setup.
- high refresh rate: The required frame rate of the new collision detection algorithm is dictated not by human perception but by the processing speed of the algorithm, which is implemented in FPGA hardware and runs at 120 Hz; the projection systems of the CAVE use the same frequency in the tests. This adjustment is necessary to prevent flickering effects.
- color space, contrast, and brightness: During the planning phase, particular attention was paid to the compatibility of the visualization in the CAVE with the camera system of the drone. Since the drone only records grayscale images, the color representation is largely irrelevant; contrast and brightness, however, are suboptimal, especially compared to real outdoor tests. The CAVE uses Digital Light Processing (DLP) projectors with a maximum brightness of 7500 ANSI lumen. Each wall screen measures 3.3 m by 2.5 m, and the square floor has an edge length of 3.3 m (see Figure 2 (right)). Consequently, each screen produces no more than 900 lux (see the estimate below); different lighting conditions are listed in Table 1 for comparison. This situation is suboptimal, but resolving it is not feasible, as it would require an expensive hardware replacement; the CAVE is only rented, and the provider refuses to make this investment.
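As a rough plausibility check of the brightness figure quoted in the last item (a back-of-the-envelope estimate that ignores projector and screen losses), spreading the full luminous flux evenly over a single screen gives

$$ E = \frac{\Phi}{A}, \qquad E_{\text{wall}} \approx \frac{7500\ \text{lm}}{3.3\ \text{m} \times 2.5\ \text{m}} \approx 910\ \text{lx}, \qquad E_{\text{floor}} \approx \frac{7500\ \text{lm}}{3.3\ \text{m} \times 3.3\ \text{m}} \approx 690\ \text{lx}, $$

i.e., of the same order as the stated upper bound of roughly 900 lux, with real-world losses pushing the actual values lower, and in any case far below outdoor daylight levels (cf. Table 1).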
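The simulation-based input device described in the first item of this list essentially boils down to streaming the simulated pose to the rendering host. The following is a minimal sketch under assumptions of our own (UDP transport, a flat binary packet of eight doubles, a placeholder host address); the project's actual network protocol and packet layout are not reproduced here.

```python
# Minimal sketch (hypothetical protocol): streaming the simulated drone pose
# (position + orientation quaternion) from the physics simulation server to the
# CAVE rendering host over UDP at the simulation rate.
import socket
import struct
import time

CAVE_HOST, CAVE_PORT = "192.0.2.10", 5005   # placeholder address of the render host
RATE_HZ = 120                               # matches the 120 Hz loop used in the tests

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)

def send_pose(t: float, pos, quat) -> None:
    # assumed packet layout: timestamp, x, y, z, qx, qy, qz, qw as little-endian doubles
    packet = struct.pack("<8d", t, *pos, *quat)
    sock.sendto(packet, (CAVE_HOST, CAVE_PORT))

# Example loop: in the real setup, the pose comes from the physics simulation,
# driven by the motor commands of the drone controller.
pos, quat = (0.0, 0.0, 1.5), (0.0, 0.0, 0.0, 1.0)
for step in range(3):
    send_pose(time.time(), pos, quat)
    time.sleep(1.0 / RATE_HZ)
```

On the receiving side, the CAVE application would unpack the same layout (e.g., `struct.unpack("<8d", packet)`) and update the virtual camera accordingly.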
3. Results and Discussion
- (−) The lighting conditions, which cannot be changed in terms of brightness and contrast, only allow a limited range of test scenarios. In particular, bright sunny days with high-contrast light and shadow configurations cannot be tested in VR, at least not in the currently used setup.
- (±) Testing in VR guarantees absolute control over the virtual environment and the test configuration. This absolute control has both positive and negative effects. On the one hand, control and reproducibility simplify debugging and automatic testing; in particular, random elements would complicate this process considerably in many cases. On the other hand, everything must also be controlled, i.e., only what is programmed in advance happens in VR. The design space of possibilities may contain variations, but will hardly open up new dimensions that were not designed beforehand (see Figure 3): if no flock of birds has been programmed, none will appear; if no pedestrians or other road users have been animated, the roads will remain empty. This is why real tests are still being carried out in the drone project, in order to test with real-world complexity.
- (+) Despite many error avoidance strategies (test-driven development, pair programming, etc.), one error remained undetected until the first test: the calculated escape vector for avoiding a collision was scaled incorrectly. As a consequence, the algorithm did not react to any obstacle; even when the drone navigated directly towards a building, it showed no reaction. In a real test, this error would inescapably (literally) have led to a crash and thus probably to the total loss of a prototype. For good reason, this first complete functional test, which is particularly critical because of the test scenario, took place in the CAVE environment (a sanity-check sketch of this kind of test follows this list).
- (+) Protecting prototypes from real crashes in this VR environment brings economic benefits: irrespective of the cost of a prototype in terms of material and personnel, the result is a more efficient test phase. In this context, we would like to point out the delays that would have occurred in real tests: a drone damaged during testing must be repaired, at an unknown cost in personnel and material, before the next test can start. A virtual drone, after all, does not need to be repaired after a crash.
- (+) A side effect of testing in VR only becomes clear in comparison with real outdoor tests: unlike real tests, no VR test had to be postponed because of fog, rain, or other bad weather. The project schedule is easier to adhere to in VR, although this cannot be generalized to every VR product test project.
- (+) Even though VR tests usually aim for the highest possible degree of realism, unreal conditions are an advantage that does not exist in reality: in VR, the surroundings can be abstracted in order to simplify debugging.
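To illustrate the kind of cheap safeguard that would have caught the escape-vector scaling error mentioned above, the following sketch shows a minimal sanity test. The function `compute_escape_vector` and its interface are hypothetical stand-ins, not the project's actual FPGA implementation.

```python
# Illustrative sanity check (assumed interface): an escape vector that is scaled
# to (nearly) zero means the drone will not react to obstacles at all -- exactly
# the failure mode observed in the first CAVE test.
import math

def compute_escape_vector(obstacle_direction, scale):
    # toy stand-in: steer away from the obstacle, scaled by the (buggy or fixed) factor
    return tuple(-scale * c for c in obstacle_direction)

def test_escape_vector_is_effective():
    head_on = (1.0, 0.0, 0.0)                      # obstacle straight ahead
    v = compute_escape_vector(head_on, scale=1.0)
    magnitude = math.sqrt(sum(c * c for c in v))
    # The reaction must be non-negligible; a wrongly scaled vector (e.g., scale ~ 1e-6)
    # fails this assertion instead of surfacing only during a live flight.
    assert magnitude > 0.1, "escape vector too small; the drone would ignore the obstacle"

test_escape_vector_is_effective()
```

A check of this kind costs a few lines in a unit test suite, whereas the equivalent discovery in a real outdoor flight would most likely have cost a prototype.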
4. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
References
Table 1. Different lighting conditions for comparison.

| Illuminance | Example |
|---|---|
| 0.0001 lux | moonless night sky |
| 0.25 lux | clear night sky with full moon |
| 400 lux | ambient sunrise on a clear day |
| 1000–2000 lux | typical overcast day at midday |
| >100,000 lux | bright sunlight |
| | Testing in VR | Testing Outdoors |
|---|---|---|
| Completed test sessions | 8 | 6 |
| Total costs | EUR 19,167 | EUR 40,377 |
| | These costs already include EUR 2000 rent and EUR 1300 for the provision of the VR models. | This item includes all costs involved, including travel expenses, travel time, etc. |
| Costs per test session | EUR 2395 | EUR 6729 |
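For clarity, the per-session figures in the last row follow directly from the totals, rounded down to whole euros:

$$ \frac{\text{EUR } 19{,}167}{8\ \text{sessions}} \approx \text{EUR } 2395, \qquad \frac{\text{EUR } 40{,}377}{6\ \text{sessions}} \approx \text{EUR } 6729. $$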