Probability-Based LIDAR–Camera Calibration Considering Target Positions and Parameter Evaluation Using a Data Fusion Map
Abstract
1. Introduction
2. Related Works
3. Methods
3.1. Estimation of the Camera External Parameters
3.2. Calibration Targets
3.3. Circle Detection
3.4. Design of the Likelihood Function and Parameter Sampling
1. First, we employ the radius values extracted from the 3-D circles using PCL, as described in Section 3.3, as the first sample of the radii.
2. We estimate the camera external parameters by solving the PnP problem [59] using these radii, after which we derive the error defined in Equation (7). Thereafter, the likelihood is calculated using Equation (5).
3. We sample the next radii randomly based on a priori information and derive the camera external parameters from them, following which we calculate the candidate likelihood.
4. We compare the candidate likelihood with the current one and decide whether the candidate is accepted or rejected.
5. If the candidate is accepted, the camera external parameters derived from the sampled radii are used to construct the a posteriori probability density, and the current sample is replaced with the candidate. If the candidate is rejected, new radii are sampled and the current sample remains unchanged.
6. We repeat processes 2–5 a designed number of times and obtain a posterior probability density consisting of the accepted parameters.
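The accept/reject loop above is a Metropolis-style Monte Carlo sampler in the spirit of Mosegaard and Tarantola's approach to inverse problems. The following Python sketch is illustrative only: the function and variable names (`metropolis_radius_sampling`, `likelihood`, `propose`) are hypothetical, and the toy Gaussian likelihood merely stands in for the paper's likelihood of Equations (5) and (7), which is computed from the PnP solution.

```python
import numpy as np

def metropolis_radius_sampling(r_init, likelihood, propose, n_iter=5000, seed=0):
    """Metropolis-style sampling of the calibration-target radii (sketch)."""
    rng = np.random.default_rng(seed)
    r_cur = np.asarray(r_init, dtype=float)
    l_cur = likelihood(r_cur)
    accepted = [r_cur.copy()]          # step 1: initial sample from PCL circle fits
    for _ in range(n_iter):
        r_new = propose(r_cur, rng)    # step 3: sample candidate radii a priori
        l_new = likelihood(r_new)      # steps 2-3: PnP -> error -> likelihood
        # Steps 4-5: accept an improvement outright; otherwise accept with
        # probability l_new / l_cur. A rejection leaves the current sample unchanged.
        if l_new >= l_cur or rng.uniform() < l_new / max(l_cur, 1e-300):
            r_cur, l_cur = r_new, l_new
            accepted.append(r_cur.copy())
    return np.array(accepted)          # step 6: accepted samples form the posterior

# Toy stand-in: a Gaussian likelihood centred on a "true" radius of 0.30 m,
# with small random-walk proposals around the current radius.
likelihood = lambda r: float(np.exp(-0.5 * np.sum((r - 0.30) ** 2) / 0.01 ** 2))
propose = lambda r, rng: r + rng.normal(0.0, 0.005, size=r.shape)
samples = metropolis_radius_sampling(np.array([0.28]), likelihood, propose)
```

With these stand-in functions, the accepted samples concentrate around the true radius, and their spread approximates the posterior uncertainty of the radius parameter.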
4. Experiments
4.1. Measurement System
4.2. Experimental Setups
5. Camera External Parameter Results
6. Evaluation of Data Fusion for the Objects Set at Fixed Positions
6.1. Experimental Setup
6.2. Evaluation Results
7. Evaluation of Data Fusion in 3-D Mapping
7.1. Experimental Setup
7.2. Evaluation Results
8. Discussion
9. Conclusions
- Usage of multiple targets or poses set at various distances.
- Matching the positions of the calibration targets to the object positions in data fusions, if the latter positions are determined.
- Usage of the dense LIDAR point cloud of the calibration target.
- Restoring the shapes of the calibration target to a circular shape using appropriate parameter control.
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
References
- Zhang, J.; Singh, S. LOAM: Lidar Odometry and Mapping in Real-time. Robot. Sci. Syst. 2014, 2, 1–9. [Google Scholar]
- Shan, T.; Englot, B. LeGO-LOAM: Lightweight and Ground-Optimized Lidar Odometry and Mapping on Variable Terrain. In Proceedings of the 2018 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Madrid, Spain, 1–5 October 2018. [Google Scholar]
- Shan, T.; Englot, B.; Meyers, D.; Wang, W.; Ratti, C.; Rus, D. LIO-SAM: Tightly-coupled Lidar Inertial Odometry via Smoothing and Mapping. In Proceedings of the 2020 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Las Vegas, NV, USA, 24 October 2020–24 January 2021. [Google Scholar]
- Zhang, F.; Clarke, D.; Knoll, A. Vehicle Detection Based on LiDAR and Camera Fusion. In Proceedings of the 2014 17th International IEEE Conference on Intelligent Transportation Systems (ITSC), Qingdao, China, 8–11 October 2014. [Google Scholar]
- Wu, T.; Tsai, C.; Guo, J. LiDAR/Camera Sensor Fusion Technology for Pedestrian Detection. In Proceedings of the 2017 Asia-Pacific Signal and Information Processing Association Annual Summit and Conference (APSIPA ASC), Kuala Lumpur, Malaysia, 12–15 December 2017. [Google Scholar]
- Zhao, X.; Sun, P.; Xu, Z.; Min, H.; Yu, H. Fusion of 3D LIDAR and Camera Data for Object Detection in Autonomous Vehicle Applications. IEEE Sens. J. 2020, 20, 4901–4913. [Google Scholar] [CrossRef]
- De Silva, V.; Roche, J.; Kondoz, A. Robust Fusion of LiDAR and Wide-Angle Camera Data for Autonomous Mobile Robots. Sensors 2018, 18, 2730. [Google Scholar] [CrossRef]
- Zhen, W.; Hu, Y.; Liu, J.; Scherer, S. A Joint Optimization Approach of LiDAR-Camera Fusion for Accurate Dense 3D Reconstructions. IEEE Robot. Autom. Lett. 2019, 4, 3585–3592. [Google Scholar] [CrossRef]
- Geiger, A.; Moosmann, F.; Car, Ö.; Schuster, B. Automatic camera and range sensor calibration using a single shot. In Proceedings of the 2012 IEEE International Conference on Robotics and Automation, Saint Paul, MN, USA, 14–18 May 2012. [Google Scholar]
- Beltrán, J.; Guindel, C.; de la Escalera, A.; García, F. Automatic Extrinsic Calibration Method for LiDAR and Camera Sensor Setups. IEEE Trans. Intell. Transp. Syst. 2022, 23, 17677–17689. [Google Scholar] [CrossRef]
- Zhang, Q.; Pless, R. Extrinsic Calibration of a Camera and Laser Range Finder (improves camera calibration). In Proceedings of the 2004 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Sendai, Japan, 28 September–2 October 2004. [Google Scholar]
- Ou, J.; Huang, P.; Zhou, J.; Zhao, Y.; Lin, L. Automatic Extrinsic Calibration of 3D LIDAR and Multi-Cameras Based on Graph Optimization. Sensors 2022, 22, 2221. [Google Scholar] [CrossRef] [PubMed]
- Ranganathan, A. The Levenberg-Marquardt Algorithm. Tutor. LM Algorithm 2004, 11, 101–110. [Google Scholar]
- Carlone, L. A convergence analysis for pose graph optimization via Gauss-Newton methods. In Proceedings of the 2013 IEEE International Conference on Robotics and Automation, Karlsruhe, Germany, 6–10 May 2013. [Google Scholar]
- Singandhupe, A.; La, H.M.; Ha, Q.P. Single Frame Lidar-Camera Calibration using Registration of 3D planes. In Proceedings of the 6th IEEE International Conference on Robotic Computing (IRC), Naples, Italy, 5–7 December 2022. [Google Scholar]
- Verma, S.; Berrio, J.S.; Worrall, S.; Nebot, E. Automatic extrinsic calibration between a camera and a 3D Lidar using 3D point and plane correspondences. In Proceedings of the 2019 IEEE Intelligent Transportation Systems Conference (ITSC), Auckland, New Zealand, 27–30 October 2019. [Google Scholar]
- Kim, E.; Park, S. Extrinsic Calibration between Camera and LiDAR Sensors by Matching Multiple 3D Planes. Sensors 2020, 20, 52. [Google Scholar] [CrossRef] [PubMed]
- Unnikrishnan, R.; Hebert, M. Fast Extrinsic Calibration of a Laser Rangefinder to a Camera; Technical Report CMU-RI-TR-05-09; Robotics Institute: Pittsburgh, PA, USA, 2005. [Google Scholar]
- Huang, J.; Grizzle, J.W. Improvements to target-based 3D LiDAR to Camera Calibration. IEEE Access 2020, 8, 134101–134110. [Google Scholar] [CrossRef]
- Li, Y.; Ruichek, Y.; Cappelle, C. 3D Triangulation Based Extrinsic Calibration between a Stereo Vision System and a LIDAR. In Proceedings of the 2011 14th International IEEE Conference on Intelligent Transportation Systems (ITSC), Washington, DC, USA, 5–7 October 2011. [Google Scholar]
- Vasconcelos, F.; Barreto, J.P.; Nunes, U. A Minimal Solution for the Extrinsic Calibration of a Camera and a Laser-Rangefinder. IEEE Trans. Pattern Anal. Mach. Intell. 2012, 34, 2097–2107. [Google Scholar] [CrossRef]
- Cai, H.; Pang, W.; Chen, X.; Wang, Y.; Liang, H. A Novel Calibration Board and Experiments for 3D LiDAR and Camera Calibration. Sensors 2020, 20, 1130. [Google Scholar] [CrossRef]
- Zhou, L.; Li, Z.; Kaess, M. Automatic Extrinsic Calibration of a Camera and a 3D LiDAR using Line and Plane Correspondences. In Proceedings of the 2018 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Madrid, Spain, 1–5 October 2018. [Google Scholar]
- Fu, B.; Wang, Y.; Ding, X.; Jiao, Y.; Tang, L.; Xiong, R. LiDAR Camera Calibration under Arbitrary Configurations: Observability and Methods. IEEE Trans. Instrum. Meas. 2019, 69, 3089–3102. [Google Scholar] [CrossRef]
- Matlab Lidar Camera Calibrator. Available online: https://jp.mathworks.com/help/lidar/ug/lidar-and-camera-calibration.html (accessed on 1 March 2024).
- Naroditsky, O.; Patterson, A.; Daniilidis, K. Automatic alignment of a camera with line scan LIDAR system. In Proceedings of the 2011 IEEE International Conference on Robotics and Automation, Shanghai, China, 9–13 May 2011. [Google Scholar]
- Park, Y.; Yun, S.; Won, S.C.; Cho, K.; Um, K.; Sim, S. Calibration between Color Camera and 3D LIDAR Instruments with a Polygonal Planar Board. Sensors 2014, 14, 5333–5353. [Google Scholar] [CrossRef]
- Debattisti, S.; Mazzei, L.; Panciroli, M. Automated Extrinsic Laser and Camera Inter-Calibration Using Triangular Targets. In Proceedings of the 2013 IEEE Intelligent Vehicles Symposium, Gold Coast, Australia, 23–26 June 2013. [Google Scholar]
- Pereira, M.; Silva, M.; Santos, V. Self calibration of multiple LIDARs and cameras on autonomous vehicles. Robot. Auton. Syst. 2016, 83, 326–337. [Google Scholar] [CrossRef]
- Hassanein, M.; Moussa, A. A New Automatic System Calibration of Multi-Cameras and LIDAR Sensors. In Proceedings of the International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences, Prague, Czech Republic, 12–19 July 2016. [Google Scholar]
- Pusztai, Z.; Eichhardt, I.; Hajder, L. Accurate Calibration of Multi-LiDAR-Multi-Camera Systems. Sensors 2018, 18, 2139. [Google Scholar] [CrossRef] [PubMed]
- Grammatikopoulos, L.; Papanagnou, A.; Venianakis, A.; Kalisperakis, I.; Stentoumis, C. An Effective Camera-to-Lidar Spatiotemporal Calibration Based on a Simple Calibration Target. Sensors 2022, 22, 5576. [Google Scholar] [CrossRef]
- Dhall, A.; Chelani, K.; Radhakrishnan, V.; Krishna, K.M. LIDAR-Camera Calibration using 3D-3D Point correspondences. arXiv 2017, arXiv:1705.09785. [Google Scholar] [CrossRef]
- An, P.; Ma, T.; Yu, K.; Fang, B.; Zhang, G.; Fu, W.; Ma, J. Geometric calibration for LIDAR-camera system fusing 3D-2D and 3D-3D point correspondences. Opt. Express 2020, 28, 2122–2141. [Google Scholar] [CrossRef] [PubMed]
- Florez, R.A.S.; Fremont, V.; Bonnifait, P. Extrinsic calibration between a multi-layer lidar and a camera. In Proceedings of the 2008 IEEE International Conference on Multi-Sensor Fusion and Integration for Intelligent Systems, Seoul, Republic of Korea, 20–22 August 2008. [Google Scholar]
- Alismail, H.; Baker, D.L.; Browning, B. Automatic Calibration of a Range Sensor and Camera System. In Proceedings of the 2012 Second International Conference on 3D Imaging, Modeling, Processing, Visualization & Transmission, Zurich, Switzerland, 13–15 October 2012. [Google Scholar]
- Velas, M.; Spanel, M.; Materna, Z.; Herout, A. Calibration of RGB Camera With Velodyne LiDAR. In Communication Papers Proceedings, Proceedings of the International Conference on Computer Graphics, Visualization and Computer Vision (WSCG), Plzen, Czech Republic, 2–5 June 2014; Václav Skala-UNION Agency: Plzen, Czech Republic, 2014; pp. 135–144. [Google Scholar]
- Guindel, C.; Beltrán, J.; Martin, D.; Garcia, F. Automatic Extrinsic Calibration for Lidar-Stereo Vehicle Sensor Setup. In Proceedings of the 2017 IEEE 20th International Conference on Intelligent Transportation Systems (ITSC), Yokohama, Japan, 16–19 October 2017. [Google Scholar]
- Yamada, R.; Yaguchi, Y. Evaluation of calibration methods to construct a 3-D environmental map with good color projection using both camera images and laser scanning data. Artif. Life Robot. 2020, 25, 434–439. [Google Scholar] [CrossRef]
- Scaramuzza, D.; Harati, A.; Siegwart, R. Extrinsic Self Calibration of a Camera and a 3D Laser Range Finder from Natural Scenes. In Proceedings of the 2007 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), San Diego, CA, USA, 29 October–2 November 2007. [Google Scholar]
- Pandey, G.; McBride, J.R.; Savarese, S.; Eustice, R.M. Automatic Targetless Extrinsic Calibration of a 3D Lidar and Camera by Maximizing Mutual Information. In Proceedings of the 26th AAAI Conference on Artificial Intelligence, Toronto, ON, Canada, 22–26 July 2012. [Google Scholar]
- Taylor, Z.; Nieto, J. Automatic Calibration of Lidar and Camera Images using Normalized Mutual Information. In Proceedings of the 2013 IEEE International Conference on Robotics and Automation, Karlsruhe, Germany, 6–10 May 2013. [Google Scholar]
- Pandey, G.; McBride, J.R.; Savarese, S.; Eustice, R.M. Automatic Extrinsic Calibration of Vision and Lidar by Maximizing Mutual Information. J. Field Robot. 2014, 32, 696–722. [Google Scholar] [CrossRef]
- Irie, K.; Sugiyama, M.; Tomono, M. Target-less Camera-LiDAR Extrinsic Calibration Using a Bagged Dependence Estimator. In Proceedings of the 2016 IEEE International Conference on Automation Science and Engineering (CASE), Fort Worth, TX, USA, 21–25 August 2016. [Google Scholar]
- Koide, K.; Oishi, S.; Yokozuka, M.; Banno, A. General, Single-shot, Target-less, and Automatic LiDAR-Camera Extrinsic Calibration Toolbox. In Proceedings of the 2023 IEEE International Conference on Robotics and Automation, London, UK, 29 May–2 June 2023. [Google Scholar]
- Moghadam, P.; Bosse, M.; Zlot, R. Line-based Extrinsic Calibration of Range and Image Sensors. In Proceedings of the 2013 IEEE International Conference on Robotics and Automation, Karlsruhe, Germany, 6–10 May 2013. [Google Scholar]
- Bai, Z.; Jiang, G.; Xu, A. LiDAR-Camera Calibration Using Line Correspondences. Sensors 2020, 20, 6319. [Google Scholar] [CrossRef] [PubMed]
- Ma, T.; Liu, Z.; Yan, G.; Li, Y. CRLF: Automatic Calibration and Refinement based on Line Feature for LiDAR and Camera in Road Scenes. In Proceedings of the 2021 IEEE International Conference on Intelligent Robots and Systems (IROS), Prague, Czech Republic, 29–30 September 2021. [Google Scholar]
- Wang, W.; Nobuhara, S.; Nakamura, R.; Sakurada, K. SOIC: Semantic Online Initialization and Calibration for LiDAR and Camera. arXiv 2020, arXiv:2003.04260. [Google Scholar] [CrossRef]
- Zhu, Y.; Li, C.; Zhang, Y. Online Camera-LiDAR Calibration with Sensor Semantic Information. In Proceedings of the 2020 IEEE International Conference on Robotics and Automation, Paris, France, 31 May–31 August 2020. [Google Scholar]
- Schneider, N.; Piewak, F.; Stiller, C.; Franke, U. RegNet: Multimodal Sensor Registration Using Deep Neural Networks. In Proceedings of the 2017 IEEE Intelligent Vehicles Symposium (IV), 11–14 June 2017; pp. 1803–1810. [Google Scholar]
- Iyer, G.; Ram, R.K.; Murthy, J.K.; Krishna, K.M. CalibNet: Geometrically Supervised Extrinsic Calibration using 3D Spatial Transformer Networks. In Proceedings of the 2019 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Macau, China, 3–8 November 2019. [Google Scholar]
- Lv, X.; Wang, B.; Dou, Z.; Ye, D.; Wang, S. LCCNet: LiDAR and Camera Self-Calibration using Cost Volume Network. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition Workshops, CVPR Workshops 2021, Virtual, 19–25 June 2021. [Google Scholar]
- Mosegaard, K.; Tarantola, A. Monte Carlo sampling of solutions to inverse problems. J. Geophys. Res. 1995, 100, 12431–12447. [Google Scholar] [CrossRef]
- Khan, A.; Connolly, D.A.J.; Maclennan, J.; Mosegaard, K. Joint inversion of seismic and gravity data for lunar composition and thermal state. Geophys. J. Int. 2007, 168, 243–258. [Google Scholar] [CrossRef]
- Yamada, R.; Garcia, R.F.; Lognonné, P.; Le Feuvre, M.; Calvet, M.; Gagnepain-Beyneix, J. Optimization of seismic network design: Application to a geophysical international lunar network. Planet. Space Sci. 2011, 59, 343–354. [Google Scholar] [CrossRef]
- Matsumoto, K.; Yamada, R.; Kikuchi, F.; Kamata, S.; Ishihara, Y.; Iwata, T.; Hanada, H.; Sasaki, S. Internal structure of the Moon inferred from Apollo seismic data and selenodetic data from GRAIL and LLR. Geophys. Res. Lett. 2015, 42, 7351–7358. [Google Scholar] [CrossRef]
- OpenCV Camera Calibration. Available online: https://docs.opencv.org/4.x/dc/dbb/tutorial_py_calibration.html (accessed on 1 April 2024).
- Quan, L.; Lan, Z. Linear n-point camera pose determination. IEEE Trans. Pattern Anal. Mach. Intell. 1999, 21, 774–780. [Google Scholar] [CrossRef]
- Yuen, H.K.; Princen, J.; Illingworth, J.; Kittler, J. Comparative study of Hough transform methods for circle finding. Image Vis. Comput. 1990, 8, 71–77. [Google Scholar] [CrossRef]
- Fischler, A.M.; Bolles, C.R. Random sample consensus: A paradigm for model fitting with applications to image analysis and automated cartography. Commun. ACM 1981, 24, 381–395. [Google Scholar] [CrossRef]
- Bentley, L.J. Multidimensional binary search trees used for associative searching. Commun. ACM 1975, 18, 509–517. [Google Scholar] [CrossRef]
| Case | Conditions |
| --- | --- |
| Case A | Three boards set at distances of 3.0, 1.5, and 4.5 m (Figure 5a) |
| Case B | Three boards set at a distance of 1.5 m (Figure 5b) |
| Case C | Three boards set at a distance of 3.0 m (Figure 5c) |
| Case D | Three boards set at a distance of 4.5 m (Figure 5d) |
| Case E | Three boards set at distances of 3.0, 1.5, and 4.5 m and heights of 1.0, 0.7, and 1.3 m (Figure 5e) |
| Case F | Two boards, one above the other, set at a distance of 3.0 m (Figure 5f) |
| Case G | Checkerboard taken at three poses at ~1.5, 3.0, and 4.5 m (Figure 5g) |
| Case H | Same as Case A; the radii of the boards are fixed, and parameter sampling is not performed |
© 2024 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Yamada, R.; Yaguchi, Y. Probability-Based LIDAR–Camera Calibration Considering Target Positions and Parameter Evaluation Using a Data Fusion Map. Sensors 2024, 24, 3981. https://doi.org/10.3390/s24123981