Guided Next Best View for 3D Reconstruction of Large Complex Structures
Abstract
1. Introduction
- A new frontier based view generator is proposed that defines frontiers as low density cells in the volumetric map.
- A novel utility function is proposed to evaluate and select viewpoints based on information gain, distance, model density, and predictive measures (a schematic sketch of such a weighted utility follows this list).
- A set of simulations and real experiments were performed using structures of different shapes to illustrate the effectiveness of the approach.
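The exact weighting is given in the paper (see Table 1), but as a rough illustration only, the sketch below shows how information gain, travel distance, a model-density term, and a prediction measure might be folded into a single viewpoint score. The function name, weight values, and the exponential distance discount are hypothetical placeholders, not the authors' actual formulation.

```python
import math

def viewpoint_utility(info_gain, distance, density_term, prediction,
                      w_ig=0.4, w_den=0.3, w_p=0.1, w_d=0.2):
    """Hypothetical combination of the four attributes listed above.

    The weights and the exponential distance discount are illustrative
    only; the paper defines its own terms and weight values.
    """
    reward = w_ig * info_gain + w_den * density_term + w_p * prediction
    # Discount far-away candidates so the vehicle does not wander.
    return reward * math.exp(-w_d * distance)

# The next best view is simply the highest-scoring sampled candidate.
candidates = [
    {"ig": 0.8, "dist": 4.0, "den": 0.2, "pred": 0.1},
    {"ig": 0.5, "dist": 1.0, "den": 0.6, "pred": 0.7},
]
best = max(candidates, key=lambda c: viewpoint_utility(
    c["ig"], c["dist"], c["den"], c["pred"]))
print(best)
```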
2. Methods
2.1. 3D Mapping
Point Density
2.2. Profiling
2.3. Exploiting Possible Symmetry
2.4. Viewpoint Sampling
2.5. Viewpoint Evaluation
2.6. Termination
3. Experimental Results
3.1. Simulated Experiments
3.1.1. Simulation Setup
3.1.2. Simulation Results
- Aircraft Model: A 3D reconstructed model of an aircraft was generated in this experiment by exploring the bounding box of the structure. A dense point cloud map was built from the data collected at the selected viewpoints along the exploration path generated in simulation. Figure 8 presents the achieved coverage and the path generated to cover the aircraft model, and Table 2 compares the results of our proposed method with those of two other approaches in terms of coverage, distance, and entropy reduction. As shown in the table, the density and resolution of the model obtained from profiling are low (), since profiling was performed at the edges of the bounded workspace from a long distance. With the proposed Guided NBV method, the overall coverage reaches high percentages () and the model density improves. Since the aircraft model is symmetric, the utility function weights were selected to achieve high density and to exploit the available prediction measures. Using the proposed utility function with the weights listed in Table 1 allowed the UAV to travel around the structure more efficiently and therefore capture more data. The plot in Figure 9 shows the average density attribute over the iterations for each of the utility functions; the proposed utility function achieves higher density values per voxel in each iteration than the other two approaches. The plot is non-monotonic because the average density is the ratio of the point cloud size to the number of occupied voxels in the octree (a minimal sketch of this bookkeeping is given after the power plant results below): in complex curved areas, some voxels contain far fewer points than others, which causes the fluctuations in the average density. The details of the structure were captured with high coverage when evaluated with voxels of resolution 0.05 m; however, this improvement in coverage comes at the cost of an increased traveled distance.
- Power Plant Model: Similarly, the power plant structure was 3D reconstructed by exploring the bounding box of the model workspace, and the data collected at each viewpoint in simulation were used to generate a dense point cloud map. The generated paths and achieved coverage are shown in Figure 10, and the comparison with the two other approaches is given in Table 3. As shown in the table, the resolution and density of the model obtained from profiling are low (). The Guided NBV achieves higher coverage percentages than the two other approaches, and the longer path distance is compensated by a better coverage percentage and average density (proposed = 177 pt/voxel, Bircher et al. [17] = 140 pt/voxel, Isler et al. [29] = 152 pt/voxel). Moreover, the Guided NBV achieves high coverage () for the power plant structure compared to Song and Jo [21] () and Song and Jo [22] () using a similar setup and structure model. The power plant model is not symmetric, so the utility function weights were selected to emphasize density, as shown in Table 1. The plot in Figure 11 shows the average density and entropy attributes over the iterations, illustrating the effect of FVS in reducing the entropy and increasing the density in some areas (a sketch of the entropy bookkeeping follows this item). Using the AGVS sampling method together with FVS facilitated dense coverage, exceeding , mainly due to the ability to take larger steps to escape local minima.
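As noted for the aircraft experiment above, the density attribute is the ratio of the number of collected points to the number of occupied voxels in the octree. The sketch below illustrates that bookkeeping, with a plain dictionary keyed by integer voxel indices standing in for the real OctoMap; the function name and voxel hashing are assumptions made for illustration.

```python
from collections import defaultdict

def average_density(points, voxel_size=0.05):
    """Average number of points per occupied voxel.

    `points` is an iterable of (x, y, z) tuples; the dictionary keyed
    by integer voxel indices replaces the real octree for illustration.
    """
    counts = defaultdict(int)
    for x, y, z in points:
        key = (int(x // voxel_size), int(y // voxel_size), int(z // voxel_size))
        counts[key] += 1
    return sum(counts.values()) / len(counts) if counts else 0.0

# A flat patch concentrates its points in few voxels, while a curved
# area spreads the same number of points over more voxels; this is why
# the running average in Figure 9 is non-monotonic across iterations.
print(average_density([(0.01, 0.02, 0.0), (0.02, 0.03, 0.0), (0.30, 0.0, 0.0)]))
```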
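The entropy attribute plotted for both models is typically computed as the Shannon entropy accumulated over the per-voxel occupancy probabilities of the volumetric map, and the entropy reduction reported in Tables 2 and 3 then corresponds to the drop in this total over the run. A minimal sketch under that assumption:

```python
import math

def map_entropy(occupancy_probs):
    """Total Shannon entropy (in bits) of a probabilistic voxel map.

    `occupancy_probs` is a flat list of per-voxel occupancy
    probabilities; observed voxels (p near 0 or 1) contribute almost
    nothing, while unknown voxels (p = 0.5) contribute one bit each.
    """
    h = 0.0
    for p in occupancy_probs:
        if 0.0 < p < 1.0:
            h -= p * math.log2(p) + (1.0 - p) * math.log2(1.0 - p)
    return h

# Entropy reduction between two iterations is the drop in total entropy.
before = map_entropy([0.5] * 1000)                # everything unknown
after = map_entropy([0.97] * 600 + [0.5] * 400)   # 600 voxels observed
print(before - after)                             # > 0: information gained
```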
3.2. Real Experiments
3.2.1. Real Experiment Setup
3.2.2. Real Experiment Results
- Representative Model: A representative model in an indoor lab environment was used in this experiment. The proposed Guided NBV was performed starting with a profiling stage and then executing the iterative NBV process on the quadrotor platform, as described previously. The generated exploration path, shown in Figure 13, consists of 40 viewpoints with a path length of 32 m, generated in min. The obtained average density is 33 points/voxel and the entropy reduction value is . The plot in Figure 14 shows the decreasing entropy and increasing density across the iterations. A feasible exploration path was generated by our proposed Guided NBV approach for the indoor structure, as demonstrated by the 3D OctoMap (resolution of m) of the structure built while exploring the environment with the real quadrotor UAV platform and collecting data along the path. The final textured 3D reconstructed model is shown in Figure 15. As a further enhancement, trajectory planning could be performed to connect the waypoints while optimizing the path for UAV acceleration or velocity. The indoor lab experiment of Scenario 3 is illustrated in a video at [37].
- Airplane Model: In this experiment, a model of different complexity in terms of shape and texture, an airplane, was selected. The generated exploration path, shown in Figure 16, consists of 60 waypoints with a path length of m, generated in min. Figure 17 plots the increasing average density and decreasing entropy, with an average density of 45 points/voxel and an entropy reduction value of 28,949. The achieved results show the applicability of our proposed NBV exploration method to different structures, as illustrated by the generated 3D OctoMap (resolution of ) of the airplane structure. A finer OctoMap resolution was used since the airplane contains more details and edges than the previous representative model. The final textured 3D reconstructed model of the airplane is shown in Figure 18. The indoor lab experiment of Scenario 4 is illustrated in a video at [38].
4. Discussion
5. Conclusions
Author Contributions
Funding
Acknowledgments
Conflicts of Interest
References
- Faria, M.; Maza, I.; Viguria, A. Applying Frontier Cells Based Exploration and Lazy Theta* Path Planning over Single Grid-Based World Representation for Autonomous Inspection of Large 3D Structures with an UAS. J. Intell. Robot. Syst. 2019, 93, 113–133. [Google Scholar] [CrossRef]
- Hollinger, G.A.; Englot, B.; Hover, F.S.; Mitra, U.; Sukhatme, G.S. Active planning for underwater inspection and the benefit of adaptivity. Int. J. Robot. Res. 2013, 32, 3–18. [Google Scholar] [CrossRef]
- Jiang, S.; Jiang, W.; Huang, W.; Yang, L. UAV-Based Oblique Photogrammetry for Outdoor Data Acquisition and Offsite Visual Inspection of Transmission Line. Remote Sens. 2017, 9, 278. [Google Scholar] [CrossRef]
- Oleynikova, H.; Taylor, Z.; Siegwart, R.; Nieto, J. Safe Local Exploration for Replanning in Cluttered Unknown Environments for Micro-Aerial Vehicles. In Proceedings of the IEEE International Conference on Robotics and Automation (ICRA’18), Brisbane, Australia, 21–25 May 2018. [Google Scholar]
- Selin, M.; Tiger, M.; Duberg, D.; Heintz, F.; Jensfelt, P. Efficient Autonomous Exploration Planning of Large-Scale 3-D Environments. IEEE Robot. Autom. Lett. 2019, 4, 1699–1706. [Google Scholar]
- Bircher, A.; Kamel, M.; Alexis, K.; Burri, M.; Oettershagen, P.; Omari, S.; Mantel, T.; Siegwart, R. Three-dimensional coverage path planning via viewpoint resampling and tour optimization for aerial robots. Auton. Robot. 2015, 40, 1059–1078. [Google Scholar] [CrossRef]
- Almadhoun, R.; Taha, T.; Seneviratne, L.; Dias, J.; Cai, G. GPU accelerated coverage path planning optimized for accuracy in robotic inspection applications. In Proceedings of the 2016 IEEE 59th International Midwest Symposium on Circuits and Systems (MWSCAS), Abu Dhabi, UAE, 16–19 October 2016; pp. 1–4. [Google Scholar]
- Frías, E.; Díaz-Vilariño, L.; Balado, J.; Lorenzo, H. From BIM to Scan Planning and Optimization for Construction Control. Remote Sens. 2019, 11, 1963. [Google Scholar] [CrossRef]
- Martin, R.A.; Rojas, I.; Franke, K.; Hedengren, J.D. Evolutionary View Planning for Optimized UAV Terrain Modeling in a Simulated Environment. Remote Sens. 2016, 8, 26. [Google Scholar] [CrossRef]
- Vasquez-Gomez, J.I.; Sucar, L.E.; Murrieta-Cid, R.; Herrera-Lozada, J.C. Tree-based search of the next best view/state for three-dimensional object reconstruction. Int. J. Adv. Robot. Syst. 2018, 15, 1–11. [Google Scholar] [CrossRef]
- Palomeras, N.; Hurtos, N.; Carreras, M.; Ridao, P. Autonomous Mapping of Underwater 3-D Structures: From view planning to execution. In Proceedings of the IEEE International Conference on Robotics and Automation (ICRA’18), Brisbane, Australia, 21–25 May 2018; pp. 1965–1971. [Google Scholar]
- Martin, R.A.; Blackburn, L.; Pulsipher, J.; Franke, K.; Hedengren, J.D. Potential Benefits of Combining Anomaly Detection with View Planning for UAV Infrastructure Modeling. Remote Sens. 2017, 9, 434. [Google Scholar] [CrossRef]
- Shade, R.; Newman, P. Choosing where to go: Complete 3D exploration with stereo. In Proceedings of the 2011 IEEE International Conference on Robotics and Automation (ICRA), Shanghai, China, 9–13 May 2011; pp. 2806–2811. [Google Scholar]
- Heng, L.; Gotovos, A.; Krause, A.; Pollefeys, M. Efficient visual exploration and coverage with a micro aerial vehicle in unknown environments. In Proceedings of the 2015 IEEE International Conference on Robotics and Automation (ICRA), Seattle, WA, USA, 26–30 May 2015; pp. 1071–1078. [Google Scholar]
- Haner, S.; Heyden, A. Discrete Optimal View Path Planning. In Proceedings of the 10th International Conference on Computer Vision Theory and Applications (VISAPP 2015), Berlin, Germany, 11–14 March 2015; pp. 411–419. [Google Scholar]
- Cieslewski, T.; Kaufmann, E.; Scaramuzza, D. Rapid exploration with multi-rotors: A frontier selection method for high speed flight. In Proceedings of the 2017 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Vancouver, BC, Canada, 24–28 September 2017; pp. 2135–2142. [Google Scholar]
- Bircher, A.; Kamel, M.; Alexis, K.; Oleynikova, H.; Siegwart, R. Receding Horizon “Next-Best-View” Planner for 3D Exploration. In Proceedings of the 2016 IEEE International Conference on Robotics and Automation (ICRA), Stockholm, Sweden, 16–21 May 2016; pp. 1462–1468. [Google Scholar]
- Oshima, S.; Nagakura, S.; Yongjin, J.; Kawamura, A.; Iwashita, Y.; Kurazume, R. Automatic planning of laser measurements for a large-scale environment using CPS-SLAM system. In Proceedings of the 2015 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Hamburg, Germany, 28 September–3 October 2015; pp. 4437–4444. [Google Scholar]
- Palomeras, N.; Hurtos, N.; Vidal Garcia, E.; Carreras, M. Autonomous exploration of complex underwater environments using a probabilistic Next-Best-View planner. IEEE Robot. Autom. Lett. 2019. [Google Scholar] [CrossRef]
- Palazzolo, E.; Stachniss, C. Effective Exploration for MAVs Based on the Expected Information Gain. Drones 2018, 2, 9. [Google Scholar] [CrossRef]
- Song, S.; Jo, S. Surface-based Exploration for Autonomous 3D Modeling. In Proceedings of the IEEE International Conference on Robotics and Automation (ICRA’18), Brisbane, Australia, 21–25 May 2018; pp. 4319–4326. [Google Scholar]
- Song, S.; Jo, S. Online inspection path planning for autonomous 3D modeling using a micro-aerial vehicle. In Proceedings of the IEEE International Conference on Robotics and Automation, Singapore, 29 May–3 June 2017; pp. 6217–6224. [Google Scholar]
- Hepp, B.; Nießner, M.; Hilliges, O. Plan3D: Viewpoint and Trajectory Optimization for Aerial Multi-View Stereo Reconstruction. ACM Trans. Graph. 2017, 38, 4:1–4:17. [Google Scholar] [CrossRef]
- Roberts, M.; Dey, D.; Truong, A.; Sinha, S.N.; Shah, S.; Kapoor, A.; Hanrahan, P.; Joshi, N. Submodular Trajectory Optimization for Aerial 3D Scanning. In Proceedings of the 2017 IEEE International Conference on Computer Vision (ICCV), Venice, Italy, 22–29 October 2017; pp. 5334–5343. [Google Scholar]
- Hornung, A.; Wurm, K.M.; Bennewitz, M.; Stachniss, C.; Burgard, W. OctoMap: An efficient probabilistic 3D mapping framework based on octrees. Auton. Robot. 2013, 34, 189–206. [Google Scholar] [CrossRef]
- Source Implementation Package of the Proposed Algorithm. Available online: https://github.com/kucars/ma_nbv_cpp.git (accessed on 30 May 2018).
- Abduldayem, A.; Gan, D.; Seneviratne, L.; Taha, T. 3D Reconstruction of Complex Structures with Online Profiling and Adaptive Viewpoint Sampling. In Proceedings of the International Micro Air Vehicle Conference and Flight Competition, Malaga, Spain, 14–16 June 2017; pp. 278–285. [Google Scholar]
- Holz, D.; Basilico, N.; Amigoni, F.; Behnke, S. Evaluating the efficiency of frontier-based exploration strategies. In Proceedings of the ISR 2010 (41st International Symposium on Robotics) and ROBOTIK 2010 (6th German Conference on Robotics), Munich, Germany, 7–9 June 2010. [Google Scholar]
- Isler, S.; Sabzevari, R.; Delmerico, J.; Scaramuzza, D. An information gain formulation for active volumetric 3D reconstruction. In Proceedings of the 2016 IEEE International Conference on Robotics and Automation (ICRA), Stockholm, Sweden, 16–20 May 2016; pp. 3477–3484. [Google Scholar]
- Quin, P.; Paul, G.; Alempijevic, A.; Liu, D.; Dissanayake, G. Efficient neighbourhood-based information gain approach for exploration of complex 3d environments. In Proceedings of the 2013 IEEE International Conference on Robotics and Automation (ICRA), Karlsruhe, Germany, 6–10 May 2013; pp. 1343–1348. [Google Scholar]
- Paul, G.; Webb, S.; Liu, D.; Dissanayake, G. Autonomous robot manipulator-based exploration and mapping system for bridge maintenance. Robot. Auton. Syst. 2011, 59, 543–554. [Google Scholar] [CrossRef]
- Abduldayem, A.A. Intelligent Unmanned Aerial Vehicles for Inspecting Indoor Complex Aero-Structures. Master’s Thesis, Khalifa University, Abu Dhabi, UAE, 2017. [Google Scholar]
- PX4. Available online: http://px4.io/ (accessed on 16 July 2019).
- Mavros. Available online: http://wiki.ros.org/mavros (accessed on 16 July 2019).
- Rusu, R.B.; Cousins, S. 3D Is Here: Point Cloud Library (PCL). In Proceedings of the IEEE International Conference on Robotics and Automation (ICRA), Shanghai, China, 9–13 May 2011. [Google Scholar]
- Optitrack. Available online: http://www.optitrack.com/ (accessed on 25 March 2019).
- Experimentation Using Representative Model. Available online: https://youtu.be/fEtUjW0PNrA (accessed on 25 March 2019).
- Experimentation Using Airplane Model. Available online: https://youtu.be/jNX78Iyfg2Q (accessed on 25 March 2019).
Table 1. Simulation setup parameters for Scenarios 1 and 2.

| Parameter | Scenario 1 | Scenario 2 |
|---|---|---|
| Workspace dimension | m | m |
| Mesh Vertices/Faces | 17,899/31,952 | |
| Camera FOV | [] | [] |
| Camera tilt angle (pitch) | () | |
| Number of Cameras | 2 | 1 |
| Utility Weights () | | |
| Global Entropy Change | | |
Table 2. Simulation results for the aircraft model: coverage, traveled distance, and entropy reduction per utility function.

| Profiling | Utility Function | Iterations | Entropy Reduction | Distance (m) | Coverage (Res = 0.05 m) | Coverage (Res = 0.10 m) | Coverage (Res = 0.50 m) |
|---|---|---|---|---|---|---|---|
| Yes | Bircher et al. [17] | 577 | 10,504 | 1347.6 | 86.5% | 93.9% | 98.8% |
| Yes | Isler et al. [29] | 585 | 10,846 | 1500.9 | 85.1% | 92.0% | 98.5% |
| Yes | Proposed | 635 | 10,970 | 1396.5 | 94.1% | 97.2% | 99.8% |
| Profile | — | — | — | 370.0 | 15.3% | 63.9% | 87.3% |
Table 3. Simulation results for the power plant model: coverage, traveled distance, and entropy reduction per utility function.

| Profiling | Utility Function | Iterations | Entropy Reduction | Distance (m) | Coverage (Res = 0.05 m) | Coverage (Res = 0.10 m) | Coverage (Res = 0.50 m) |
|---|---|---|---|---|---|---|---|
| Yes | Bircher et al. [17] | 162 | 14,574 | 479.5 | 95.9% | 87.0% | 98.5% |
| Yes | Isler et al. [29] | 360 | 24,425 | 945.4 | 95.7% | 86.3% | 98.1% |
| Yes | Proposed | 210 | 23,983 | 469.5 | 97.5% | 93.6% | 98.9% |
| Profile | — | — | — | 512.0 | 15.1% | 69.5% | 86.3% |
Real experiment setup parameters for Scenarios 3 and 4.

| Parameter | Scenario 3 | Scenario 4 |
|---|---|---|
| Model size | m | m |
| Workspace dimension | m | m |
| Octree resolution | | |