LiDAR-Based Non-Cooperative Tumbling Spacecraft Pose Tracking by Fusing Depth Maps and Point Clouds
Abstract
1. Introduction
1.1. Related Works of LiDAR Based Relative Pose Determination
1.2. Objectives and Contributions
- The consistency between the depth map and the point cloud is explored, and a depth-map-aided point cloud registration strategy is proposed to obtain a high-accuracy relative pose.
- The roll angle variation between adjacent sensor frames is computed by detecting and matching lines in the adjacent depth maps.
- A point cloud simplification process based on the real-time relative position is designed to reduce computing time.
- To support approaching a tumbling non-cooperative target at close range, simulated sensor data are generated and numerical simulations are conducted.
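The first contribution relies on converting each sensor point cloud into a depth map (Section 2.3). As a minimal sketch of that conversion, not the authors' implementation, the following assumes a pinhole-style projection along the sensor boresight (+z) with a hypothetical focal length and image size; when several points fall in the same pixel, the nearest one wins:

```python
import numpy as np

def point_cloud_to_depth_map(points, f=256.0, width=512, height=512):
    """Project an N x 3 point cloud (sensor frame, +z toward target)
    onto a depth image. f, width, height are illustrative parameters."""
    depth = np.full((height, width), np.inf)
    z = points[:, 2]
    valid = z > 0  # keep only points in front of the sensor
    u = np.round(f * points[valid, 0] / z[valid] + width / 2).astype(int)
    v = np.round(f * points[valid, 1] / z[valid] + height / 2).astype(int)
    d = z[valid]
    inside = (u >= 0) & (u < width) & (v >= 0) & (v < height)
    for ui, vi, di in zip(u[inside], v[inside], d[inside]):
        if di < depth[vi, ui]:  # nearest point per pixel
            depth[vi, ui] = di
    depth[np.isinf(depth)] = 0.0  # empty pixels encode zero depth
    return depth
```

Line detection (e.g., the LSD detector cited in the references) can then be run directly on this image, which is the basis of the depth map/point cloud consistency exploited by the method.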
2. Proposed Relative Pose Estimation Method
2.1. Definition of Reference Frames and Relative Pose Parameters
2.2. The Framework of the Proposed Method
2.3. Sensor Point Clouds and Depth Maps
2.4. Line Detection and Matching
2.5. The Roll Angle Variation Calculation
2.6. Point Cloud Simplification
2.7. The Relative Pose Computation
Algorithm 1: The Procedure of the Proposed Pose Tracking Method
Input: the model point cloud and the current k-th sensor point cloud.
While tracking:
1. Convert the current sensor point cloud to a depth map.
2. Detect the line collection in the depth map.
3. Calculate the line descriptor for each detected line.
4. Sort the lines by line length in descending order.
5. Select the first several lines and calculate the line similarity degree matrix.
6. Calculate the roll angle variation and the corresponding transformation matrix.
7. Perform the point cloud simplification process to obtain the sparse sensor point cloud.
8. Calculate the transformation matrix by aligning the sparse sensor point cloud with the model point cloud.
9. Calculate the current k-th transformation matrix.
10. Calculate the six-DOF relative pose parameters.
11. Set k ← k + 1 and go to step 1.
end
Output: the six-DOF relative pose, including the roll angle, the pitch angle, the yaw angle, and the three relative position components.
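Steps 6, 9, and 10 of the procedure above hinge on two small operations: turning the roll angle variation into a rigid transform used to initialize registration, and recovering the six-DOF pose parameters from the accumulated 4x4 transformation matrix. A hedged sketch, assuming roll is about the boresight (z) axis and a Z-Y-X Euler convention (a common but here hypothetical choice, since the paper's exact convention is not shown in this outline):

```python
import numpy as np

def roll_transform(delta_roll):
    """Homogeneous transform for a rotation of delta_roll (rad) about the
    boresight (z) axis -- the prior obtained from depth-map line matching."""
    c, s = np.cos(delta_roll), np.sin(delta_roll)
    T = np.eye(4)
    T[:2, :2] = [[c, -s], [s, c]]
    return T

def pose_from_transform(T):
    """Recover (roll, pitch, yaw, tx, ty, tz) from a 4x4 transform,
    assuming R = Rz(roll) @ Ry(pitch) @ Rx(yaw)."""
    pitch = np.arcsin(-T[2, 0])
    roll = np.arctan2(T[1, 0], T[0, 0])   # rotation about z
    yaw = np.arctan2(T[2, 1], T[2, 2])    # rotation about x
    return roll, pitch, yaw, T[0, 3], T[1, 3], T[2, 3]
```

In step 9 the current transform would then be composed from the previous pose, the roll prior, and the registration refinement (e.g., an ICP result from the Point Cloud Library cited in the references), roughly T_k = T_align @ roll_transform(delta) @ T_{k-1}, before `pose_from_transform` extracts the six parameters in step 10.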
3. Experiments
3.1. Test Setup
3.2. Simulation Experiment 1
3.3. Simulation Experiment 2
3.4. Discussion
4. Conclusions
Author Contributions
Funding
Conflicts of Interest
References
- Guo, L.; Fang, J.C. Recent prospects on some problem of navigation guidance and sensing technology. Sci. Sin. Inf. 2017, 47, 1198–1208.
- Wang, K.; Tang, L.; Li, K.H.; Chen, S.L.; Xu, S.J. Application of relative navigation and control technology in specific space missions. Aerosp. Control Appl. 2016, 42, 7–12.
- Opromolla, R.; Fasano, G.; Rufino, G.; Grassi, M. A review of cooperative and uncooperative spacecraft pose determination techniques for close-proximity operations. Prog. Aerosp. Sci. 2017, 93, 53–72.
- Flores-Abad, A.; Ma, O.; Pham, M.; Ulrich, S. A review of space robotics technologies for on-orbit servicing. Prog. Aerosp. Sci. 2014, 68, 1–26.
- Christian, J.A.; Cryan, S. A survey of LIDAR technology and its use in spacecraft relative navigation. In Proceedings of the AIAA Guidance, Navigation, and Control Conference, Boston, MA, USA, 19–22 August 2013; pp. 1–7.
- Sharma, S.; D’Amico, S. Comparative assessment of techniques for initial pose estimation using monocular vision. Acta Astronaut. 2016, 123, 435–445.
- Song, J.Z.; Cao, C.X. Pose self-measurement of noncooperative spacecraft based on solar panel triangle structure. J. Robot. 2015, 472461, 1–6.
- D’Amico, S.; Ardaens, J.S.; Gaias, G.; Benninghoff, H.; Schlepp, B.; Jørgensen, J.L. Noncooperative rendezvous using angles-only optical navigation: System design and flight results. J. Guidance Control Dyn. 2013, 36, 1576–1595.
- Obermark, J.; Creamer, G.; Kelm, B.E.; Wagner, W.; Glen Henshaw, C. SUMO/FREND vision system for autonomous satellite grapple. In Proceedings of the SPIE Sensors and Systems for Space Applications, Orlando, FL, USA, 3 May 2007; pp. 1–11.
- English, C.; Okouneva, G.; Saint-Cyr, P.; Choudhuri, A.; Luu, T. Real-time dynamic pose estimation systems in space: Lessons learned for system design and performance evaluation. Int. J. Intell. Control Syst. 2011, 16, 79–96.
- Yin, F.; Chou, W.S.; Wu, Y.; Yang, G.; Xu, S. Sparse unorganized point cloud based relative pose estimation for uncooperative space target. Sensors 2018, 18, 1009.
- Opromolla, R.; Fasano, G.; Rufino, G.; Grassi, M. A model-based 3D template matching technique for pose acquisition of an uncooperative space object. Sensors 2015, 15, 6360–6382.
- Opromolla, R.; Fasano, G.; Rufino, G.; Grassi, M. Pose estimation for spacecraft relative navigation using model-based algorithms. IEEE Trans. Aerosp. Electron. Syst. 2017, 53, 431–447.
- Opromolla, R.; Fasano, G.; Rufino, G.; Grassi, M. Uncooperative pose estimation with a LIDAR-based system. Acta Astronaut. 2015, 110, 287–297.
- Woods, J.O.; Christian, J.A. LIDAR-based relative navigation with respect to non-cooperative objects. Acta Astronaut. 2016, 126, 298–311.
- Liu, L.J.; Zhao, G.P.; Bo, Y.M. Point cloud based relative pose estimation of a satellite in close range. Sensors 2016, 16, 824.
- Lim, T.W.; Ramos, P.F.; O’Dowd, M.C. Edge detection using point cloud data for noncooperative pose estimation. J. Spacecr. Rockets 2017, 54, 499–504.
- He, Y.; Liang, B.; He, J.; Li, S.Z. Non-cooperative spacecraft pose tracking based on point cloud feature. Acta Astronaut. 2017, 139, 213–221.
- Liang, B.; He, Y.; Zou, Y.; Yang, J. Application of Time-of-Flight camera for relative measurement of non-cooperative target in close range. J. Astronaut. 2016, 37, 1080–1088.
- Tzschichholz, T.; Ma, L.; Schilling, K. Model-based spacecraft pose estimation and motion prediction using a photonic mixer device camera. Acta Astronaut. 2011, 68, 1156–1167.
- Regoli, L.; Ravandoor, K.; Schmidt, M.; Schilling, K. On-line robust pose estimation for rendezvous and docking in space using photonic mixer devices. Acta Astronaut. 2014, 96, 159–165.
- Tzschichholz, T.; Boge, T.; Schilling, K. Relative pose estimation of satellites using PMD-/CCD-sensor data fusion. Acta Astronaut. 2015, 109, 25–33.
- Hao, G.T.; Du, X.P.; Chen, H.; Song, J.J.; Gao, T.F. Scale-unambiguous relative pose estimation of space uncooperative targets based on the fusion of three-dimensional time-of-flight camera and monocular camera. Opt. Eng. 2015, 54, 1–12.
- Maiseli, B.; Gu, Y.F.; Gao, H.J. Recent developments and trends in point set registration methods. J. Vis. Commun. Image Represent. 2017, 46, 95–106.
- Volpe, R.; Palmerini, G.B.; Sabatini, M. A passive camera based determination of a non-cooperative and unknown satellite’s pose and shape. Acta Astronaut. 2018, 151, 805–817.
- Chen, F.; Zhang, Z.X.; Wang, Y.; Liu, Y.; Huang, J.M. Application study of 3D reconstruction using image sequences in space target detection and recognition. Manned Spacefl. 2016, 22, 732–736.
- Von Gioi, R.G.; Jakubowicz, J.; Morel, J.M.; Randall, G. LSD: A fast line segment detector with a false detection control. IEEE Trans. Pattern Anal. Mach. Intell. 2010, 32, 722–732.
- Han, H.Y.; Han, X.; Sun, F.S.; Huang, C.Y. Point cloud simplification with preserved edge based on normal vector. Optik 2015, 126, 2157–2162.
- Point Cloud Library. Available online: http://pointclouds.org/ (accessed on 1 March 2017).
- NASA 3D Resources. Available online: https://nasa3d.arc.nasa.gov/ (accessed on 1 March 2017).
- Woods, J.O.; Christian, J.A. GLIDAR: An OpenGL-based, real-time, and open source 3D sensor simulator for testing computer vision algorithms. J. Imaging 2016, 2, 5.
- Lu, Y.; Liu, X.G.; Zhou, Y.; Liu, C.C. Review of detumbling technologies for active removal of uncooperative targets. Acta Aeronaut. Astronaut. Sin. 2018, 39, 1–13.
© 2018 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).
Share and Cite
Zhao, G.; Xu, S.; Bo, Y. LiDAR-Based Non-Cooperative Tumbling Spacecraft Pose Tracking by Fusing Depth Maps and Point Clouds. Sensors 2018, 18, 3432. https://doi.org/10.3390/s18103432