Case Report

Product Tests in Virtual Reality: Lessons Learned during Collision Avoidance Development for Drones

by
Volker Settgast
1,
Konstantinos Kostarakos
2,
Eva Eggeling
1,
Manfred Hartbauer
2 and
Torsten Ullrich
1,3,*
1
Fraunhofer Austria Research GmbH, 8010 Graz, Austria
2
Institute of Biology, University of Graz, 8010 Graz, Austria
3
Institute of Computer Graphics and Knowledge Visualization, Graz University of Technology, 8010 Graz, Austria
*
Author to whom correspondence should be addressed.
Designs 2022, 6(2), 33; https://doi.org/10.3390/designs6020033
Submission received: 28 December 2021 / Revised: 16 February 2022 / Accepted: 7 March 2022 / Published: 1 April 2022
(This article belongs to the Special Issue Unmanned Aerial System (UAS) Modeling, Simulation and Control)

Abstract

Virtual reality (VR) and real-world simulations have become important tools for product development, product design, and product tests. Product tests in VR have many advantages, such as reproducibility and shortened development time. In this paper, we investigate the virtual testing of a collision avoidance system for drones in terms of economic benefits. Our results show that virtual tests had both positive and negative effects on the development, with the positive aspects clearly predominating. In summary, the tests in VR shorten the development time and reduce risks and therefore costs. Furthermore, they offer possibilities not available in real-world tests. Nevertheless, real-world tests remain important.

1. Introduction

Virtual reality (VR) and real-world simulations have become important tools for product development, product design, and product tests. Virtual reality technology, used at various stages of product development, can enable the identification of potential ergonomic problems and, consequently, the definition of measures to solve or at least mitigate them [1,2]. Virtual reality can mimic the characteristics of real, existing, or fictional environments—at least in the relevant aspects being considered. However, it can also transcend the limitations of physical reality [3]. Virtual reality tools use computer modeling and simulation technologies that have been widely used for more than half a century in education, health, entertainment, culture, sports, engineering, the armed forces, and other sectors, but have only recently become a practical tool for the manufacturing industry [4]. While these considerations have so far focused on the human-in-the-loop scenario, i.e., mainly on ergonomics and user experience [5], the focus is now broadening to include hardware-in-the-loop.
Although simulation in virtual worlds has high initial investment costs—after all, a virtual environment must first be created, whereas the real world already exists—we demonstrate in this pilot study that the use of VR technology can already pay off within a single project. This hypothesis is the focus of this paper and was investigated in an exemplary project.
In this paper, we describe the virtual testing of a collision avoidance system in a VR environment. The context is a research project for developing a novel collision avoidance algorithm for small flying vehicles such as drones. Testing the system poses a high risk for the prototype drone since it is meant to work as a last resort before the vehicle collides with obstacles. The project therefore focused on VR testing at an early stage. This paper describes the experiences and lessons learned in this regard.

1.1. Testing in VR

Calculating and simulating physical processes is one of the first applications of scientific computing and has its origins before the development of the first computers [6,7]. With VR and the family of computer-aided technologies (CAx), such as computer-aided design, computer-aided engineering, and computer-aided manufacturing, this development has evolved into a holistic approach.
Virtual reality is an important tool for all kinds of product tests. VR environments consist of entire rooms, as in a Cave Automated Virtual Environment (CAVE), first described by Cruz-Neira et al. [8], or can be experienced wearing a head-mounted display [9]. The latter are easy to handle and inexpensive [10]. In addition, pure simulation environments—without visualization—are occasionally referred to as VR environments, since they represent a (partial) aspect of reality.
Ottosson describes the use of virtual reality in the product development process [11]. Applying virtual reality technology accelerates a product’s life cycle [12]; furthermore, VR prototypes are less costly to produce, more flexible in modifications, and offer more possibilities for demonstrations [13]. VR technology has therefore established itself in product development [14,15].
Especially for products that are designed virtually and that are also used virtually/electronically, a test in VR is a natural choice. Park et al. [16] demonstrated design evaluations of digital consumer products using virtual reality-based functional behavior simulation. As the interaction with the virtual interface does not invalidate the usability evaluation itself [17], VR tests can be used widely and have exerted a huge impact on early-stage interactive product design processes [18].
However, it is not only for virtual products that evaluation in VR offers advantages. Several studies have confirmed that VR has considerable potential for applications in the area of ergonomics [19]. In the context of urban design and building information modeling, Bauer et al. show a CAVE environment for testing public buildings such as train stations [20]. They argue that evaluating, for example, the pedestrian guidance system before the building is finished helps to find errors early on and avoid costly alterations. In city development, the use of VR technology helps to communicate new mobility concepts and test design alternatives. Schrom-Feiertag et al. present a set of tools for efficient participatory planning with virtual and augmented reality [21].
VR technology has also become popular in industrial contexts and in production [22]. In their survey about industry use of virtual reality in product design and manufacturing, Berg and Vance conclude that VR is mature, stable, and usable [23]. They write that VR is actively being used in a number of industries to support decision making and enable innovation. For the time being, the entire product lifecycle is mapped in VR [24]: from virtual engineering [25], through process simulation [26], including partial aspects such as energy consumption [27], to the end user in the form of training courses [28].
While the terminology is usually user-centered (e.g., see [29] and Figure 1), we extend the term VR by automated, simulated product tests even without a human in the loop of product design, i.e., the existence of a virtual environment is not dependent on a human observer. In our use case, it is mainly drones that explore and fly through the virtual worlds. This approach is not new per se; however, it is still rarely applied [30]. Especially in the context of the development, construction, and design of drones, VR tests have so far been reduced to the user interface [31,32,33].
In summary, the main advantages of product testing in VR are manifold [34]:
  • absolute control: A virtual environment provides absolute control. In extreme cases, the simulation can be deterministic, and all random or external influences, i.e., the noise of the real world, such as weather effects or materials with natural variations, can be eliminated.
  • reproducibility: The elimination of any randomness leads to a high degree of reproducibility, which is unattainable in real tests.
  • shortened development time: In many test scenarios, development times can be drastically reduced. Especially in destructive tests, where the prototype is destroyed (e.g., crash tests), the time for rebuilding prototypes is saved.
  • higher optimization potential: By not having to build real prototypes, a limiting scaling factor of product testing is eliminated. Many more product tests can be performed in VR, and consequently many more product parameters can be optimized.
  • hazard-free tests: Virtual testing is risk-free, both for humans and for potentially expensive machines and equipment.
  • unrealistic conditions: Although VR testing strives for the highest possible level of realism most of the time, unreal conditions are also an advantage that does not exist in reality: in VR, the environment can be abstracted and reduced to a minimal level. This reduction enables efficient debugging and testing of subsystems that would be difficult to test on their own in real life.
All of these points, especially the reduced development time and elimination of the hazard potential of developed prototypes, are the foundation of the hypothesis investigated.
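The reproducibility point can be made concrete with a short sketch (illustrative only, not project code): if the simulated disturbances are driven by a fixed seed, every virtual test run sees bit-identical conditions, which no outdoor test can guarantee.

```python
import random

def simulated_gusts(seed, n=5):
    """Generate a reproducible sequence of 'wind gust' disturbances
    for a virtual test run. Fixing the seed removes the noise of the
    real world: every replay sees exactly the same conditions."""
    rng = random.Random(seed)
    return [rng.uniform(-1.0, 1.0) for _ in range(n)]

# Runs with the same seed are bit-identical; different seeds
# explore different conditions.
assert simulated_gusts(42) == simulated_gusts(42)
assert simulated_gusts(42) != simulated_gusts(43)
```

A real test harness would thread such a seeded generator through every stochastic element of the scene (wind, lighting, sensor noise) so that a failing run can be replayed exactly.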

1.2. Collision Avoidance

Collision detection is not the focus of this paper; however, it provides the application context for product development. The aim of the corresponding project is the development of a bionic sensor system for reliable visual collision detection in unmanned flying objects. The model for this bionic approach is the reliable optical collision-detection system of gregarious locusts. The neural basis for the bionic collision detector was first described by Judge and Rind [35]. Applications of a bionic collision detector are described by Yue and Rind [36], Fu et al. [37], and Cizek and Faigl [38]. The algorithm implemented in hardware in the project is similar to that of Blanchard et al. [39], but instead of simulating artificial neurons connected via synapses, the gray values of the pixels are processed directly to estimate the collision risk.
During the development of the new drone system, the activity of the locusts' right and left collision-detecting neurons was recorded in various critical flight situations. The results from this electrophysiological approach provided the basis for the development of a bionic algorithm for visual collision detection. This innovative concept is based on simple mathematical calculations that are performed at the level of pixels and efficiently extract the collision risk and a possible avoidance vector from the visual scene. In contrast to current systems, this bionic algorithm works without distance estimation and object recognition and is similar to the collision detection algorithm described for vehicles [40]. The new algorithm has been implemented in hardware in order to perform parallel image computations for real-time collision risk estimation and the computation of an evasion vector, using Field-Programmable Gate Array (FPGA) technology installed in test drones. The sensor technology was optimized on a virtual flight controller using virtual reality technology before being tested with real drones.
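The pixel-level idea can be sketched as follows. This is a deliberately simplified illustration under our own assumptions (frame differencing of gray values, split into left and right image halves); it is not the project's FPGA implementation, and all names are hypothetical.

```python
def collision_risk(prev, curr):
    """Estimate collision risk from two consecutive grayscale frames.

    prev, curr: 2D lists of pixel gray values (rows x cols).
    Returns (risk, evade): risk is the mean absolute per-pixel change
    (a looming object produces large changes); evade is -1 (steer
    left), 0, or +1 (steer right), pointing away from the image half
    with the larger change.
    """
    rows, cols = len(curr), len(curr[0])
    half = cols // 2
    left = right = 0.0
    for r in range(rows):
        for c in range(cols):
            d = abs(curr[r][c] - prev[r][c])
            if c < half:
                left += d
            else:
                right += d
    risk = (left + right) / (rows * cols)
    if left > right:
        evade = 1    # obstacle looms on the left: steer right
    elif right > left:
        evade = -1   # obstacle looms on the right: steer left
    else:
        evade = 0
    return risk, evade
```

Note that nothing here estimates distance or recognizes objects; like the bionic algorithm described above, the sketch works purely on gray-value changes.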

2. Methods

The development of a new collision avoidance system for drones needs to be tested thoroughly. Within the project's use case, the tests are rather risky, as the avoidance system should work as a last resort before the drone crashes into an obstacle, i.e., the test scenarios consist mostly of near-collision situations testing the behavior of the new algorithm in a new drone prototype. The failure of these tests could lead to the total loss of the drone. Since the drone is also a new development and exists only as a few individually manufactured prototypes, the manufacturing costs per drone are many times higher than the market prices of even professional drones. The total loss of a drone must therefore be avoided by all means. A virtual test environment is a safe place to run many tests without any harm to the drone. In this paper, we investigate two research questions:
  • What are the benefits of testing a drone in a CAVE virtual environment?
  • What are the financial incentives to use VR technology?

Product Test Environment

Based on an existing virtual environment, we developed a test system for the drone. In the original configuration, it is possible for a human to navigate through the environment by various input devices, such as game controllers or optical human motion tracking devices.
For the product tests of the drones, the CAVE virtual environment has been adjusted and extended in various ways (see Figure 2 (left)).
  • simulation-based input device: In the initial configuration of the CAVE, control was assumed by a human. This task must now be taken over indirectly by the drone. In detail, the output signals from the drone controller, which is otherwise connected to the drone’s motors, are sent to a physics simulation server, which uses a physically correct real-time simulation to calculate the position of the drone in 3D. These data from the flight simulator are then the input data for control within the virtual world. For the drone to explore the virtual world, a corresponding network protocol has been added that handles the current position and orientation in real time.
  • real standstill: The advantages of a CAVE—for example, in comparison to head-mounted displays—are its own body perception and the possibility to move (with the limitations of the CAVE) together with other persons in order to explore virtual objects. In our test setup, the drone is placed at the center of the CAVE without the possibility to move. It explores its surroundings using its installed cameras. The new drone uses two cameras to enlarge its field of view. As the physical position of the drone remains unchanged during the tests, the CAVE tracking system has been deactivated.
  • stereoscopic rendering: A major advantage of the new collision detection algorithm is its simplicity. It does not require 3D reconstruction or computationally intensive object detection. The fields of view of the two cameras of the drone are therefore without overlap; they merely enlarge the field of view of the drone. The usual stereoscopic rendering in the CAVE is disabled since the collision avoidance algorithm is designed for a monoscopic view. By not using 3D stereo via shutter technology, the full brightness is available for the monoscopic test setup.
  • high refresh rate: The frame rate of the new collision detection algorithm is not based on human perception, but on the processing speed of the algorithm implemented in FPGA hardware. This rate is 120 Hz, and the projection systems of the CAVE use the same frequency in the tests. Matching the frequencies is necessary to prevent flickering effects.
  • color space, contrast, and brightness: During the planning phase, particular attention was paid to the compatibility of the visualization in the CAVE with the camera system of the drone. Since the drone can only record gray images, the color representation is rather irrelevant; contrast and brightness, however, are suboptimal, especially compared to real outdoor tests.
    The CAVE uses Digital Light Processing (DLP) projectors with a maximum brightness of 7500 ANSI lumen. Each wall screen measures 3.3 m by 2.5 m. The quadratic floor has an edge length of 3.3 m (see Figure 2 (right)). Consequently, each screen produces no more than roughly 900 lux. Different lighting conditions are listed in Table 1 for comparison.
    This initial situation is regrettably suboptimal. Unfortunately, solving this problem is not feasible, as it would require an expensive hardware replacement; the CAVE is only rented, and the provider is not willing to make this investment.
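The brightness figure quoted above follows from simple arithmetic: illuminance (lux) is luminous flux (lumen) divided by the illuminated area in square meters, optimistically assuming the entire projector output reaches the screen evenly.

```python
def screen_illuminance(lumens, width_m, height_m):
    """Upper bound on screen illuminance in lux: lux = lumen / m^2,
    assuming all projector flux lands evenly on the screen."""
    return lumens / (width_m * height_m)

# One wall screen: 7500 ANSI lumen spread over 3.3 m x 2.5 m,
# i.e. about 909 lux, in line with the rough 900 lux figure above.
wall_lux = screen_illuminance(7500, 3.3, 2.5)
```

Comparing this upper bound with Table 1 makes the limitation concrete: the CAVE sits far below the >100,000 lux of bright sunlight.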

3. Results and Discussion

The evaluation of the new algorithm was carried out in 14 test sessions, each lasting half a working day. In total, eight test sessions were conducted in VR, whereas six test sessions were held outdoors.
In virtual reality, the drone was able to control its virtual representation by sending position changes to the visualization. Different lighting conditions were tested within the possibilities of the projection system. The camera system of the drone was modified to work in low-light conditions and with changing brightness levels. To further investigate the collision avoidance algorithm, the realistic environment was exchanged for an abstract version in some test runs. In all sessions, there were severe problems with the collision avoidance algorithm leading to a virtual crash; in a real-world test, this would have resulted in a damaged drone.
Despite the extensive planning in advance and the numerous adjustments, circumstances arose during the product tests that had both positive and negative influences.
(−) The lighting conditions, which cannot be changed in terms of brightness and contrast, only allow a limited test scenario. In particular, bright sunny days with high-contrast light–shadow configurations cannot be reproduced in VR, at least not in the currently used setup.
(±) Testing in VR guarantees absolute control over the virtual environment and the test configuration. This absolute control has both positive and negative effects. On the one hand, control and reproducibility simplify debugging and automatic testing; in particular, random elements would complicate this process considerably in many cases. On the other hand, everything must also be controlled, i.e., only what is programmed in advance happens in VR. The design space of possibilities may contain variations, but will hardly open up new dimensions that have not been designed before (see Figure 3); e.g., if no flock of birds has been programmed, none will appear; if no pedestrians or other road users have been animated, the roads will remain empty. This is why real tests are still being carried out in the drone project to test with real-world complexity.
(+) Despite many error avoidance strategies (test-driven development, pair programming, etc.), one error remained undetected until the first test: the calculated escape vector to avoid a collision was scaled incorrectly. As a consequence, the algorithm did not react to any obstacles; even when the drone navigated directly towards a building, the algorithm showed no reaction. In a real test, this error would inevitably (and literally) have led to a crash and thus probably to the total loss of a prototype. For good reason, this very critical first complete functional test took place in the CAVE environment.
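This class of bug, a mis-scaled escape vector that silently suppresses any evasion, is also cheap to guard against with a unit test once it is known. The following sketch uses hypothetical names and a made-up threshold; it is not the project's code.

```python
import math

def scale_escape_vector(direction, risk, gain=1.0):
    """Scale a unit evasion direction by the estimated collision risk.
    With a wrong gain (e.g. 0.0 or a unit mix-up), the result is too
    small to trigger any evasive maneuver, which is the kind of error
    that surfaced in the first CAVE test."""
    return (gain * risk * direction[0], gain * risk * direction[1])

def escape_vector_is_actionable(v, threshold=0.1):
    """Regression check: the escape vector must be large enough for
    the flight controller to act on it."""
    return math.hypot(v[0], v[1]) > threshold

# Healthy gain: the vector clears the threshold.
v = scale_escape_vector((1.0, 0.0), risk=0.8)
assert escape_vector_is_actionable(v)

# The bug class: a zeroed gain yields a vector too small to act on.
bad = scale_escape_vector((1.0, 0.0), risk=0.8, gain=0.0)
assert not escape_vector_is_actionable(bad)
```

Running such a check automatically in the VR pipeline turns a potential prototype-destroying flight into a failed assertion.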
(+) Preventing real crashes of prototypes in this VR environment leads to economic benefits: irrespective of the cost of a prototype in terms of material and personnel expenses, the test phases become more efficient. In this context, we would like to point out the delays that would have occurred in real tests: a drone damaged during a test must be repaired, at unknown personnel and material costs, before the next test can start. A virtual drone, after all, does not need to be repaired after a crash.
(+) A side effect of testing in VR only becomes clear in comparison to real outdoor tests: unlike real tests, no VR tests had to be postponed due to fog, rain, or other bad weather conditions. The project schedule is easier to adhere to in VR. However, this cannot be generalized for every VR product test project.
(+) Even though VR tests usually aim for the highest possible degree of realism, unreal conditions are also an advantage that does not exist in reality. In VR, the surroundings can be made abstract in order to simplify debugging.
In short, the collision avoidance system of the drone can be tested in its entirety. The processing of the camera images is done in the same way as it is done in the real world. Furthermore, the test setup is an authentic representation of the real world, without the danger of crashing the drone into an obstacle.
The financial aspects of VR-driven product testing for drones are listed in Table 2, which quantifies the costs based on the project budget.
Within the research project, four prototypes of the new drone were built having the new algorithm hardware embedded. The accumulated costs of these four developments in personnel time and material costs amounted to EUR 32,908, i.e., EUR 8227 per drone. Since, at minimum, one total loss could be avoided through the use of VR-based testing, at least the cost of one drone can be attributed to the savings of using VR.
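The per-drone figure follows directly from the budget numbers given above:

```python
# Accumulated personnel and material costs of the four prototypes,
# taken from the project budget (EUR).
total_prototype_costs_eur = 32_908

per_drone_eur = total_prototype_costs_eur / 4   # EUR 8227 per drone

# At least one total loss was avoided through VR-based testing, so
# at least one drone's cost can be attributed to the savings.
minimum_savings_eur = per_drone_eur
```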

4. Conclusions

Based on the experience gained in this project, there is a clear case for the increased use of VR-based product testing. In virtual realities, a wide variety of scenarios can be run through reproducibly, with automatic testing even 24/7. This automatic testing under almost realistic conditions is otherwise not possible. Furthermore, the level of realism can be reduced to a minimum; such a minimalist scene with only a plane and a box is shown in Figure 4. This reduction enables the efficient debugging and testing of subsystems that would be difficult to test in reality (see Figure 3).
During the test sessions, the development team was able to identify some disadvantages with the collision avoidance algorithm, including a severe problem that would most likely have caused the total loss of a valuable prototype. This avoided crash also ensures a positive economic balance: VR testing is more cost-effective than real tests, even if not taking into account any project delays that may occur due to the repeated construction of new prototypes.
Concerning the future of virtual test environments, we dare to predict that this technology will continue to grow in importance: as the use of autonomous systems—whether based on AI methods or not—increases, more and more tests will become necessary. Whether for autonomous driving or AI-driven drones, millions of test trials can only be completed efficiently in a VR environment. How else, if not through virtually reconstructed scenarios and appropriate test procedures, can it be ensured that accidents that occur once are not repeated?

Author Contributions

Conceptualization, V.S. and K.K.; methodology, E.E., M.H. and T.U.; writing—original draft preparation, V.S. and T.U.; writing—review and editing, V.S., M.H., E.E., K.K. and T.U. All authors have read and agreed to the published version of the manuscript.

Funding

This work was supported by the Austrian Research Promotion Agency, FTI Initiative “Take Off”, via the “Entwicklung eines bionischen Detektions- und Ausweichsystems für UAVs—BioKollAvoid” project under Grant No. 874494.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Not applicable.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Caputoa, F.; Greco, A.; D’Amato, E.; Notaro, I.; Spada, S. On the use of Virtual Reality for a human-centered workplace design. Procedia Struct. Integr. 2018, 8, 297–308. [Google Scholar] [CrossRef]
  2. Pappas, M.; Karabatsou, V.; Mavrikios, D.; Chryssolouris, G. Ergonomic Evaluation Of Virtual Assembly Tasks. Digit. Enterp. Technol. 2007, 9, 511–518. [Google Scholar]
  3. Milgram, P.; Kishino, F. A Taxonomy of Mixed Reality Visual Displays. IEICE Trans. Inf. Syst. 1994, 77, 1321–1329. [Google Scholar]
  4. Zhu, W.; Fan, X.; Zhang, Y. Applications and research trends of digital human models in the manufacturing industry. Virtual Real. Intell. Hardw. 2019, 1, 558–579. [Google Scholar] [CrossRef]
  5. da Silva, A.G.; Winkler, I.; Gomes, M.M.; De Melo Pinto, U. Ergonomic Analysis supported by Virtual Reality: A Systematic Literature Review. Symp. Virtual Augment. Real. (SVR) 2020, 22, 67–73. [Google Scholar]
  6. Nash, S.G. A History of Scientific Computing; Association for Computing Machinery: New York, NY, USA, 1990. [Google Scholar]
  7. Gustafsson, B. Scientific Computing—A Historical Perspective; Springer: Heidelberg, Germany, 2018. [Google Scholar]
  8. Cruz-Neira, C.; Sandin, D.J.; DeFanti, T.A.; Kenyon, R.V.; Hart, J.C. The CAVE: Audio Visual Experience Automatic Virtual Environment. Commun. ACM 1992, 35, 64–72. [Google Scholar] [CrossRef]
  9. Borrego, A.; Latorre, J.; Alcañiz, M.; Llorens, R. Comparison of Oculus Rift and HTC Vive: Feasibility for Virtual Reality-Based Exploration, Navigation, Exergaming, and Rehabilitation. Games Health J. 2018, 7, 151–156. [Google Scholar] [CrossRef]
  10. Mestre, D.R. CAVE versus Head-Mounted Displays: Ongoing thoughts. Electron. Imaging 2017, 3, 31–35. [Google Scholar] [CrossRef]
  11. Ottosson, S. Virtual Reality in the Product Development Process. J. Eng. Des. 2002, 13, 159–172. [Google Scholar] [CrossRef]
  12. Blümel, E.; Strassburger, S.; Sturek, R.; Kimura, I. Pragmatic approach to apply virtual reality technology in accelerating a product life cycle. Innovations 2004, 2004, 199–207. [Google Scholar]
  13. Söderman, M. Virtual reality in product evaluations with potential customers: An exploratory study comparing virtual reality with conventional product representations. J. Eng. Des. 2007, 16, 311–328. [Google Scholar] [CrossRef]
  14. Falcao, C.S.; Soares, M.M. Application of Virtual Reality Technologies in Consumer Product Usability. Des. User Exp. Usability 2013, 8015, 342–351. [Google Scholar]
  15. Porcherot, C.; Delplanque, S.; Gaudreau, N.; Ischer, M.; De Marles, A.; Cayeux, I. Immersive Techniques and Virtual Reality. Methods Consum. Res. 2018, 2, 69–83. [Google Scholar]
  16. Park, H.; Son, J.S.; Lee, K.H. Design evaluation of digital consumer products using virtual reality-based functional behaviour simulation. J. Eng. Des. 2008, 19, 359–375. [Google Scholar] [CrossRef]
  17. Bruno, F.; Muzzupappa, M. Product interface design: A participatory approach based on virtual reality. Int. J. Hum.-Comput. Stud. 2010, 68, 254–269. [Google Scholar] [CrossRef]
  18. Maurya, S.; Mougenot, C.; Takeda, Y. Impact of mixed reality implementation on early-stage interactive product design process. J. Eng. Des. 2021, 32, 1–27. [Google Scholar] [CrossRef]
  19. Rebelo, F.; Duarte, E.; Noriega, P.; Soares, M. Virtual Reality in Consumer Product Design: Methods and Applications. Hum. Factors Ergon. Consum. Prod. Des. 2011, 24, 381–402. [Google Scholar]
  20. Bauer, D.; Settgast, V.; Schrom-Feiertag, H.; Millonig, A. Making the usage of guidance systems in pedestrian infrastructures measurable using the virtual environment DAVE. Transp. Res. (Part F Traffic Psychol. Behav.) 2018, 59, 298–317. [Google Scholar] [CrossRef]
  21. Schrom-Feiertag, H.; Lorenz, F.; Regal, G.; Settgast, V. Augmented and Virtual Reality Applied for Innovative, Inclusive and Efficient Participatory Planning. Transp. Res. Arena 2018, 7, 1491568. [Google Scholar]
  22. Mujber, T.S.; Szecsi, T.; Hashmi, M.S.J. Virtual reality applications in manufacturing process simulation. J. Mater. Process. Technol. 2004, 155, 1834–1838. [Google Scholar] [CrossRef]
  23. Berg, L.; Vance, J. Industry use of virtual reality in product design and manufacturing: A survey. Virtual Real. 2017, 21, 1–17. [Google Scholar] [CrossRef]
  24. Ikonomov, P.G.; Milkova, E.D. Using Virtual Reality Simulation Through Product Lifecycle. Proc. Int. Mech. Eng. Congr. Expo. 2008, 3719, 761–767. [Google Scholar]
  25. Ma, D.; Fan, X.; Gausemeier, J.; Grafe, M. Virtual Reality & Augmented Reality in Industry; Springer: Heidelberg, Germany, 2011. [Google Scholar]
  26. Banerjee, P.P. Virtual Reality and Automation. Handb. Autom. 2009, 2009, 269–278. [Google Scholar]
  27. Nabati, G.E.; Alvela Nieto, M.T.; Decker, A.; Thoben, K.D. Application of Virtual Reality Technologies for Achieving Energy Efficient Manufacturing: Literature Analysis and Findings. Adv. Prod. Manag. Syst. Path Digit. Transform. Innov. Prod. Manag. Syst. 2020, 1, 479–486. [Google Scholar]
  28. Staretu, I. From using virtual reality in the training process to virtual engineering. Glob. J. Comput. Sci. 2014, 4, 31–41. [Google Scholar]
  29. Jerald, J. The VR Book: Human-Centered Design for Virtual Reality; Association for Computing Machinery and Morgan & Claypool: Williston, ND, USA, 2015. [Google Scholar]
  30. Faithfull, P.T.; Ball, R.J.; Jones, R.P. An investigation into the use of hardware-in-the-loop simulation with a scaled physical prototype as an aid to design. J. Eng. Des. 2001, 12, 231–243. [Google Scholar] [CrossRef]
  31. Postal, G.R.; Pavan, W.; Rieder, R. A Virtual Environment for Drone Pilot Training Using VR Devices. Symp. Virtual Real. 2016, 18, 183–187. [Google Scholar]
  32. Liu, Y.; Yang, N.; Li, A.; Paterson, J.; McPherson, D.; Cheng, T.; Yang, A.Y. Usability Evaluation for Drone Mission Planning in Virtual Reality. In Proceedings of the International Conference on Virtual, Augmented and Mixed Reality, Las Vegas, FL, USA, 15–20 July 2018; Volume 10910, pp. 313–330. [Google Scholar]
  33. Nguyen, V.T.; Jung, K.; Dang, T. DroneVR: A Web Virtual Reality Simulator for Drone Operator. In Proceedings of the IEEE International Conference on Artificial Intelligence and Virtual Reality, San Diego, CA, USA, 9–11 December 2019; Volume 2, pp. 257–262. [Google Scholar]
  34. Freitas, F.; Oliveira, H.; Winkler, I.; Gomes, M. Virtual Reality on Product Usability Testing: A Systematic Literature Review. In Proceedings of the 2020 22nd Symposium on Virtual and Augmented Reality (SVR), Porto de Galinhas, Brazil, 7–10 November 2020; pp. 67–73. [Google Scholar] [CrossRef]
  35. Judge, S.; Rind, F. The locust DCMD, a movement-detecting neurone tightly tuned to collision trajectories. J. Exp. Biol. 1997, 200, 2209–2216. [Google Scholar] [CrossRef]
  36. Yue, S.; Rind, F. Collision detection in complex dynamic scenes using an LGMD-based visual neural network with feature enhancement. IEEE Trans. Neural Netw. 2006, 17, 705–716. [Google Scholar]
  37. Fu, Q.; Yue, S.; Hu, C. Bio-inspired collision detector with enhanced selectivity for ground robotic vision system. In Proceedings of the British Machine Vision Conference, York, UK, 19–22 September 2016; Volume 6, pp. 1–13. [Google Scholar]
  38. Cizek, P.; Faigl, J. Self-supervised learning of the biologically-inspired obstacle avoidance of hexapod walking robot. Bioinspir. Biomimetics 2019, 14, 046002. [Google Scholar] [CrossRef] [Green Version]
  39. Blanchard, M.; Rind, F.; Verschure, P.F.M.J. Collision avoidance using a model of the locust LGMD neuron. Robot. Auton. Syst. 2000, 30, 17–38. [Google Scholar] [CrossRef] [Green Version]
  40. Hartbauer, M. Simplified bionic solutions: A simple bio-inspired vehicle collision detection system. Bioinspir. Biomimetics 2017, 12, 026007. [Google Scholar] [CrossRef] [PubMed]
  41. Seidelmann, P.K. (Ed.) Explanatory Supplement to the Astronomical Almanac; University Science Books: Mill Valley, CA, USA, 1992. [Google Scholar]
Figure 1. The “usual” test case for a virtual reality environment is a product evaluation, i.e., inspecting and evaluating a product such as a building, a car, a yacht, etc., in order to obtain a realistic impression of the final product.
Figure 2. The Cave Automated Virtual Environment (CAVE) consists of three projected walls (left, front, right) and a projected floor. The quadratic floor has an edge length of 3.3 m and the walls are 2.5 m high (left). The test setup uses the CAVE, a flight simulator running on a separate computer, and the drone sensor module with two cameras placed on a table (right).
Figure 3. The test area for the drone consists of trees and buildings that serve as potential obstacles. The VR scene guarantees absolute control and thus reproducibility, but it will not contain any elements which have not been designed in advance.
Figure 4. An abstracted environment, e.g., consisting only of the horizon and a box, can be used to test the basic functionality of the collision avoidance algorithm.
Table 1. The comparison with different lighting conditions [41] allows us to interpret the CAVE’s conditions and gives an impression of its maximum brightness.
Illuminance       Example
0.0001 lux        moonless night sky
0.25 lux          clear night sky with full moon
400 lux           ambient light at sunrise on a clear day
1000–2000 lux     typical overcast day at midday
>100,000 lux      bright sunlight
Table 2. The project budget shows the costs for 14 tests. Please note that no test was performed twice, i.e., this is not a comparison of a VR group with a control group in the statistical sense. Nevertheless, the costs show that it is worthwhile to perform all tests in VR environments that can be performed in VR and that can replace real tests.
                          Testing in VR    Testing Outdoors
Completed test sessions   8                6
Total costs               EUR 19,167 ¹     EUR 40,377 ²
Costs per test session    EUR 2395         EUR 6729

¹ These costs already include EUR 2000 rent and EUR 1300 for the provision of the VR models.
² This item includes all costs involved, including travel expenses, travel time, etc.
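The per-session figures in Table 2 follow directly from dividing the total costs by the number of completed sessions (with the fractional euro amounts truncated). A minimal check, using the values from the table:

```python
# Per-session costs from Table 2: total costs divided by completed sessions.
vr_total, vr_sessions = 19_167, 8
outdoor_total, outdoor_sessions = 40_377, 6

# Truncate the fractional part, matching the rounded-down figures in the table.
vr_per_session = vr_total // vr_sessions            # 2395 (exactly 2395.875)
outdoor_per_session = outdoor_total // outdoor_sessions  # 6729 (exactly 6729.5)

print(vr_per_session, outdoor_per_session)  # prints "2395 6729"
```

The roughly 2.8:1 cost ratio per session is what supports the paper's conclusion that tests which can be performed in VR should be.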