Article

DPCalib: Dual-Perspective View Network for LiDAR-Camera Joint Calibration

School of Electronic Science and Engineering, Nanjing University, Nanjing 210023, China
* Author to whom correspondence should be addressed.
Electronics 2024, 13(10), 1914; https://doi.org/10.3390/electronics13101914
Submission received: 1 April 2024 / Revised: 28 April 2024 / Accepted: 12 May 2024 / Published: 13 May 2024

Abstract

The precise calibration of a LiDAR-camera system is a crucial prerequisite for multimodal 3D information fusion in perception systems. Existing traditional offline calibration methods are inferior in accuracy and robustness to methods based on deep learning. Meanwhile, most parameter regression-based online calibration methods directly project LiDAR data onto a single plane, leading to information loss and perceptual limitations. This paper proposes DPCalib, a dual-perspective view network that mitigates these issues through a novel neural network architecture for the fusion and reuse of input information. We design a feature encoder that effectively extracts features from two orthogonal views using attention mechanisms, together with an effective decoder that aggregates the features from both views to produce accurate extrinsic parameter estimates. The experimental results demonstrate that our approach outperforms existing SOTA methods, and ablation experiments validate the rationality and effectiveness of our design.
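The core idea of representing the point cloud in two orthogonal views (rather than a single plane) can be sketched as below. This is a minimal illustrative rasterization, not the paper's implementation: the function name, grid sizes, BEV range, and depth/height encodings are assumptions made for the example.

```python
import numpy as np

def project_to_dual_views(points, K, img_hw=(64, 64), bev_hw=(64, 64),
                          bev_range=((-10.0, 10.0), (0.0, 20.0))):
    """Rasterize a LiDAR point cloud (N, 3) into two orthogonal maps:
    a camera-perspective depth map and a bird's-eye-view (BEV) height map."""
    H, W = img_hw
    persp = np.zeros((H, W), dtype=np.float32)
    # Perspective view: pinhole projection of points in front of the camera
    # using the 3x3 intrinsic matrix K; each hit pixel stores the depth z.
    front = points[points[:, 2] > 0]
    uvw = (K @ front.T).T                  # homogeneous pixel coordinates
    uv = uvw[:, :2] / uvw[:, 2:3]          # divide by depth
    u, v = uv[:, 0].astype(int), uv[:, 1].astype(int)
    ok = (u >= 0) & (u < W) & (v >= 0) & (v < H)
    persp[v[ok], u[ok]] = front[ok, 2]

    # BEV: discretize lateral x and forward z into a top-down grid;
    # each occupied cell stores the point height y.
    (x0, x1), (z0, z1) = bev_range
    Hb, Wb = bev_hw
    bev = np.zeros((Hb, Wb), dtype=np.float32)
    xi = ((points[:, 0] - x0) / (x1 - x0) * Wb).astype(int)
    zi = ((points[:, 2] - z0) / (z1 - z0) * Hb).astype(int)
    ok = (xi >= 0) & (xi < Wb) & (zi >= 0) & (zi < Hb)
    bev[zi[ok], xi[ok]] = points[ok, 1]
    return persp, bev
```

Feeding both maps to the network lets the encoder see complementary geometry: the perspective view aligns with the camera image, while the BEV preserves the metric layout that a single projection would lose.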
Keywords: LiDAR-camera calibration; multimodal; sensor fusion; deep learning

Share and Cite

MDPI and ACS Style

Cao, J.; Yang, X.; Liu, S.; Tang, T.; Li, Y.; Du, S. DPCalib: Dual-Perspective View Network for LiDAR-Camera Joint Calibration. Electronics 2024, 13, 1914. https://doi.org/10.3390/electronics13101914


Note that from the first issue of 2016, this journal uses article numbers instead of page numbers.
