DA-IRRK: Data-Adaptive Iteratively Reweighted Robust Kernel-Based Approach for Back-End Optimization in Visual SLAM
Abstract
1. Introduction
- (1) A data-adaptive iteratively reweighted robust kernel-based (DA-IRRK) method is proposed for back-end optimization in VSLAM. In this method, a robust kernel function is adopted as the objective function of the back-end optimization problem, and its robustness parameter is adaptively updated according to the reprojection errors, as summarized by their median absolute deviation (MAD). This adaptivity makes the method robust across different scenarios. The resulting back-end optimization problem is solved iteratively through a reweighted updating process.
- (2) The proposed method is implemented in different VSLAM frameworks, namely ORB-SLAM3, JORB-SLAM, and CCM-SLAM, to demonstrate its effectiveness in visual-only SLAM, multi-sensor-fusion SLAM, and collaborative VSLAM. It is evaluated on both indoor and outdoor datasets and compared with other robust kernel methods as well as the state-of-the-art MCMCC method.
- (3) The performance difference between the proposed method and the compared methods is analyzed from the perspective of reprojection-error statistics, which provides insight into the VSLAM back-end problem in the context of adaptivity.
2. Related Work
2.1. VSLAM Frameworks
2.2. Back-End Optimization
2.3. Robust Kernel Functions
2.4. Adaptive Methods
3. Back-End Optimization Based on DA-IRRK
3.1. Back-End Optimization
3.2. DA-IRRK
- (1) Calculate the current reprojection error based on the camera pose and map points from the front end;
- (2) Determine the adaptive threshold according to Equation (6) after computing the MAD of the reprojection errors using Equation (5), where med(·) denotes the median of a set of data and e_i denotes the i-th sample in the set. The MAD is preferred over the standard deviation because of its higher breakdown point (50% vs. 0%) for outlier resistance [40], which is crucial for handling outliers in SLAM. The coefficient 1.4826 in Equation (6) equals 1/Φ⁻¹(3/4), where Φ⁻¹(·) is the inverse CDF of the standard normal distribution; this scaling makes the MAD-based scale estimate consistent with the standard deviation for normally distributed data, maintaining compatibility with Gaussian kernels [41]. This adaptive mechanism automatically adjusts the robustness parameter to the characteristics of the current sensor data, which is a key advantage over traditional fixed-parameter methods such as MCMCC. These computations are performed in the tangent space of the manifold, which provides a vector-space approximation for the nonlinear optimization problem while preserving the geometric properties of the original space. It is worth noting that the MAD strategy may not be sufficiently robust for data distributions containing a very large fraction of gross outliers;
- (3) Calculate the robustness parameter of the Huber kernel function using Equation (7), where c denotes the scaling factor and is chosen to correspond to 95% confidence; the reprojection errors enter Equation (7) in vector form;
- (4) Solve the objective function iteratively through a reweighted updating process (a minimal numerical sketch of this loop is given after Algorithm 1);
- (5) Utilize the Levenberg–Marquardt (L–M) algorithm to update the camera pose and map points.
Algorithm 1 DA-IRRK-based back-end optimization.
Require: Camera poses and map points (the model parameters to be optimized in the algorithm);
Ensure: Updated camera poses and map points.
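For illustration only, the following Python sketch reproduces the flavor of steps (1)–(5) on a toy linear least-squares problem. It is not the authors' implementation: the adaptive Huber parameter is re-estimated each iteration from the 1.4826-scaled MAD of the current residuals (in the spirit of Equations (5)–(7)), and the L–M update on the SLAM graph is replaced by plain weighted normal equations. The function names (`adaptive_threshold`, `huber_weights`, `dairrk_fit`) and the constant c = 1.345 (the textbook 95%-efficiency Huber value) are illustrative assumptions, not quantities taken from the paper.

```python
# Minimal sketch of a data-adaptive iteratively reweighted Huber fit (not the authors' code).
import numpy as np

MAD_SCALE = 1.4826  # 1 / Phi^{-1}(3/4): makes the MAD consistent with the std. dev. under Gaussian noise


def adaptive_threshold(residuals: np.ndarray) -> float:
    """MAD-based scale estimate (standard form, assumed analogue of Equations (5)-(6))."""
    med = np.median(residuals)
    return MAD_SCALE * np.median(np.abs(residuals - med))


def huber_weights(residuals: np.ndarray, delta: float) -> np.ndarray:
    """IRLS weights w(e) = rho'(e)/e for the Huber kernel with parameter delta."""
    abs_r = np.abs(residuals)
    w = np.ones_like(abs_r)
    mask = abs_r > delta
    w[mask] = delta / abs_r[mask]
    return w


def dairrk_fit(A: np.ndarray, b: np.ndarray, c: float = 1.345, iters: int = 20) -> np.ndarray:
    """Iteratively reweighted least squares with a data-adaptively updated Huber parameter."""
    x = np.linalg.lstsq(A, b, rcond=None)[0]           # ordinary LS initialization
    for _ in range(iters):
        r = A @ x - b                                  # step (1): current residuals
        delta = c * adaptive_threshold(r)              # steps (2)-(3): adaptive robustness parameter
        w = huber_weights(r, max(delta, 1e-12))        # step (4): reweighting
        W = np.diag(w)
        x = np.linalg.solve(A.T @ W @ A, A.T @ W @ b)  # step (5): weighted normal equations (L-M in the paper)
    return x


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A = rng.normal(size=(200, 3))
    x_true = np.array([1.0, -2.0, 0.5])
    b = A @ x_true + 0.01 * rng.normal(size=200)
    b[:20] += 5.0                                      # 10% gross outliers
    print("Adaptive IRLS estimate:", dairrk_fit(A, b))
    print("Plain LS estimate:     ", np.linalg.lstsq(A, b, rcond=None)[0])
```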
4. Experiments
- Adaptive Edge Module: a new EdgeSE3-DAIRRK class implements the following:
  - Dynamic kernel parameter computation (Equations (5) and (6)).
  - Real-time robust kernel selection (Equation (7)).
  - Information matrix weighting mechanism (a minimal sketch follows this list).
- Solver Module: the enhanced LinearSolverEigen features the following:
  - Optimized sparse matrix storage pattern.
  - Improved marginalization strategy.
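Since the EdgeSE3-DAIRRK class itself is not reproduced in the paper, the following sketch only mimics, in plain numpy rather than g2o C++, what the information-matrix weighting mechanism above plausibly does: the Huber parameter is set from the MAD of the current edge errors, and each edge's information matrix is rescaled by its IRLS weight. The class and function names (`AdaptiveEdge`, `reweight_edges`) and the constant c = 1.345 are hypothetical, not the authors' API.

```python
# Hedged sketch of adaptive information-matrix weighting (not the g2o EdgeSE3-DAIRRK class).
import numpy as np


class AdaptiveEdge:
    """Toy stand-in for a reprojection edge in the pose graph."""

    def __init__(self, error, base_information):
        self.error = np.asarray(error, dtype=float)                   # 2-D reprojection error (u, v)
        self.base_information = np.asarray(base_information, float)   # nominal 2x2 information matrix
        self.information = self.base_information.copy()               # matrix actually used by the solver

    def chi2(self) -> float:
        """Squared Mahalanobis error with the nominal information matrix."""
        return float(self.error @ self.base_information @ self.error)


def reweight_edges(edges, c: float = 1.345) -> None:
    """Set the Huber parameter from the 1.4826-scaled MAD of the current error
    magnitudes and scale each edge's information matrix by its IRLS weight."""
    r = np.sqrt([e.chi2() for e in edges])                # robustified error magnitudes
    delta = c * 1.4826 * np.median(np.abs(r - np.median(r)))
    delta = max(delta, 1e-12)                             # avoid a degenerate threshold
    for e, ri in zip(edges, r):
        w = 1.0 if ri <= delta else delta / ri            # Huber weight rho'(e)/e
        e.information = w * e.base_information            # down-weight suspected outlier edges


# Example: three inlier-like edges and one gross outlier.
edges = [AdaptiveEdge([0.4, -0.2], np.eye(2)),
         AdaptiveEdge([0.1, 0.3], np.eye(2)),
         AdaptiveEdge([-0.3, 0.1], np.eye(2)),
         AdaptiveEdge([9.0, 7.5], np.eye(2))]
reweight_edges(edges)
print([round(float(e.information[0, 0]), 3) for e in edges])  # the outlier edge receives a much smaller weight
```

In a full back-end, the rescaled information matrices would feed directly into the weighted Hessian assembled by the linear solver, so the adaptive kernel changes only the edge weighting and leaves the solver structure untouched.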
4.1. Experimental Datasets
- Simple: MH01 (80.6 m), MH02 (73.5 m), V101 (58.6 m), V201 (36.5 m).
- Moderate: MH03 (130.9 m), V102 (75.9 m), V202 (83.2 m).
- Challenging: V103 (79.0 m), V203 (86.1 m), MH04 (91.7 m), MH05 (97.6 m).
4.2. Experimental Frameworks and Evaluation Benchmarks
4.2.1. Experimental Frameworks
4.2.2. Evaluation Metrics
4.3. Case Study
4.3.1. Visual-Only SLAM
- (a) Adaptive Robustness in DA-IRRK vs. Fixed-Parameter Huber Kernel
- (b) Noise Distribution Analysis and MAD Strategy Efficacy
4.3.2. Multi-Sensor Fusion VSLAM
4.3.3. Collaborative VSLAM
5. Conclusions
Author Contributions
Funding
Data Availability Statement
Conflicts of Interest
References
- Jia, Y.; Yan, X.; Xu, Y. A Survey of simultaneous localization and mapping for robot. In Proceedings of the 2019 IEEE 4th Advanced Information Technology, Electronic and Automation Control Conference (IAEAC), Chengdu, China, 20–22 December 2019; pp. 857–861. [Google Scholar]
- Placed, J.A.; Strader, J.; Carrillo, H.; Atanasov, N.; Indelman, V.; Carlone, L.; Castellanos, J.A. A survey on active simultaneous localization and mapping: State of the art and new frontiers. IEEE Trans. Robot. 2023, 39, 1686–1705. [Google Scholar] [CrossRef]
- Mur-Artal, R.; Tardós, J.D. Orb-slam2: An open-source slam system for monocular, stereo, and rgb-d cameras. IEEE Trans. Robot. 2017, 33, 1255–1262. [Google Scholar] [CrossRef]
- Zhang, Z. Parameter estimation techniques: A tutorial with application to conic fitting. Image Vis. Comput. 1997, 15, 59–76. [Google Scholar] [CrossRef]
- Chen, S.Y. Kalman filter for robot vision: A survey. IEEE Trans. Ind. Electron. 2011, 59, 4409–4420. [Google Scholar] [CrossRef]
- Campos, C.; Elvira, R.; Rodríguez, J.J.G.; Montiel, J.M.; Tardós, J.D. Orb-slam3: An accurate open-source library for visual, visual–inertial, and multimap slam. IEEE Trans. Robot. 2021, 37, 1874–1890. [Google Scholar] [CrossRef]
- Schmuck, P.; Chli, M. CCM-SLAM: Robust and efficient centralized collaborative monocular simultaneous localization and mapping for robotic teams. J. Field Robot. 2019, 36, 763–781. [Google Scholar] [CrossRef]
- Huber, P.J. Robust estimation of a location parameter. In Breakthroughs in Statistics: Methodology and Distribution; Springer: New York, NY, USA, 1992; pp. 492–518. [Google Scholar]
- De Schrijver, S.K.; Aghezzaf, E.-H.; Vanmaele, H. Double precision rational approximation algorithm for the inverse standard normal first order loss function. Appl. Math. Comput. 2012, 219, 1375–1382. [Google Scholar]
- Black, M.J.; Anandan, P. The robust estimation of multiple motions: Parametric and piecewise-smooth flow fields. Comput. Vis. Image Underst. 1996, 63, 75–104. [Google Scholar] [CrossRef]
- Bas, E. Robust fuzzy regression functions approaches. Inf. Sci. 2022, 613, 419–434. [Google Scholar] [CrossRef]
- Fu, S.; Wang, X.; Tang, J.; Lan, S.; Tian, Y. Generalized robust loss functions for machine learning. Neural Netw. 2024, 171, 200–214. [Google Scholar] [CrossRef]
- Wang, L.; Zheng, C.; Zhou, W.; Zhou, W.X. A new principle for tuning-free Huber regression. Stat. Sin. 2021, 31, 2153–2177. [Google Scholar] [CrossRef]
- Kargoll, B.; Omidalizarandi, M.; Loth, I.; Paffenholz, J.A.; Alkhatib, H. An iteratively reweighted least-squares approach to adaptive robust adjustment of parameters in linear regression models with autoregressive and t-distributed deviations. J. Geod. 2018, 92, 271–297. [Google Scholar] [CrossRef]
- Huang, B.; Zhao, J.; Liu, J. A survey of simultaneous localization and mapping with an envision in 6G wireless networks. arXiv 2019, arXiv:1909.05214. [Google Scholar]
- Chakraborty, K.; Deegan, M.; Kulkarni, P.; Searle, C.; Zhong, Y. Jorb-slam: A Jointly Optimized Multi-Robot Visual Slam. 2022. Available online: https://um-mobrob-t12-w19.github.io/docs/report.pdf (accessed on 1 December 2024).
- Zhao, X.; Shao, S.; Wang, T.; Fang, C.; Zhang, J.; Zhao, H. A review of multi-robot collaborative simultaneous localization and mapping. In Proceedings of the 2023 IEEE International Conference on Unmanned Systems (ICUS), Hefei, China, 13–15 October 2023; pp. 900–905. [Google Scholar]
- Wei, X.; Zhang, Y.; Li, Z.; Fu, Y.; Xue, X. Deepsfm: Structure from motion via deep bundle adjustment. In Proceedings of the Computer Vision–ECCV 2020: 16th European Conference, Glasgow, UK, 23–28 August 2020; Proceedings, Part I 16. Springer International Publishing: Cham, Switzerland, 2020; pp. 230–247. [Google Scholar]
- Teed, Z.; Deng, J. Droid-slam: Deep visual slam for monocular, stereo, and rgb-d cameras. Adv. Neural Inf. Process. Syst. 2021, 34, 16558–16569. [Google Scholar]
- Saxena, A.; Chiu, C.Y.; Shrivastava, R.; Menke, J.; Sastry, S. Simultaneous localization and mapping: Through the lens of nonlinear optimization. IEEE Robot. Autom. Lett. 2022, 7, 7148–7155. [Google Scholar] [CrossRef]
- Davison, A.J.; Reid, I.D.; Molton, N.D.; Stasse, O. MonoSLAM: Real-time single camera SLAM. IEEE Trans. Pattern Anal. Mach. Intell. 2007, 29, 1052–1067. [Google Scholar] [CrossRef]
- Zheng, B.; Zhang, Z. An Improved EKF-SLAM for Mars Surface Exploration. Int. J. Aerosp. Eng. 2019, 2019, 7637469. [Google Scholar] [CrossRef]
- Klein, G.; Murray, D. Parallel tracking and mapping for small AR workspaces. In Proceedings of the 2007 6th IEEE and ACM International Symposium on Mixed and Augmented Reality, Nara, Japan, 13–16 November 2007; pp. 225–234. [Google Scholar]
- Engel, J.; Koltun, V.; Cremers, D. Direct sparse odometry. IEEE Trans. Pattern Anal. Mach. Intell. 2017, 40, 611–625. [Google Scholar] [CrossRef]
- Kümmerle, R.; Grisetti, G.; Strasdat, H.; Konolige, K.; Burgard, W. G2o: A general framework for graph optimization. In Proceedings of the 2011 IEEE International Conference on Robotics and Automation, Shanghai, China, 9–13 May 2011; pp. 3607–3613. [Google Scholar]
- Wang, Y. Gauss–newton method. Wiley Interdiscip. Rev. Comput. Stat. 2012, 4, 415–420. [Google Scholar] [CrossRef]
- Gavin, H.P. The Levenberg-Marquardt algorithm for nonlinear least squares curve-fitting problems. Dep. Civ. Environ. Eng. Duke Univ. August 2019, 3, 1–23. [Google Scholar]
- MacTavish, K.; Barfoot, T.D. At all costs: A comparison of robust cost functions for camera correspondence outliers. In Proceedings of the 2015 12th Conference on Computer and Robot Vision, Halifax, NS, Canada, 3–5 June 2015; pp. 62–69. [Google Scholar]
- Bosse, M.; Agamennoni, G.; Gilitschenski, I. Robust estimation and applications in robotics. Found. Trends® Robot. 2016, 4, 225–269. [Google Scholar] [CrossRef]
- Babin, P.; Giguere, P.; Pomerleau, F. Analysis of robust functions for registration algorithms. In Proceedings of the 2019 International Conference on Robotics and Automation (ICRA), Montreal, QC, Canada, 20–24 May 2019; pp. 1451–1457. [Google Scholar]
- Barron, J.T. A general and adaptive robust loss function. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, Long Beach, CA, USA, 15–20 June 2019; pp. 4331–4339. [Google Scholar]
- Chebrolu, N.; Läbe, T.; Vysotska, O.; Behley, J.; Stachniss, C. Adaptive robust kernels for non-linear least squares problems. IEEE Robot. Autom. Lett. 2021, 6, 2240–2247. [Google Scholar] [CrossRef]
- Agamennoni, G.; Furgale, P.; Siegwart, R. Self-tuning M-estimators. In Proceedings of the 2015 IEEE International Conference on Robotics and Automation (ICRA), Seattle, WA, USA, 26–30 May 2015; pp. 4628–4635. [Google Scholar]
- Ye, F.; Duan, P.; Meng, L.; Sang, H.; Gao, K. An enhanced artificial bee colony algorithm with self-learning optimization mechanism for multi-objective path planning problem. Eng. Appl. Artif. Intell. 2025, 149, 110444. [Google Scholar] [CrossRef]
- Kelly, G. Adaptive choice of tuning constant for robust regression estimators. J. R. Stat. Soc. Ser. D Stat. 1996, 45, 35–40. [Google Scholar] [CrossRef]
- Sun, Q.; Zhou, W.X.; Fan, J. Adaptive huber regression. J. Am. Stat. Assoc. 2020, 115, 254–265. [Google Scholar] [CrossRef]
- Catoni, O. Challenging the empirical mean and empirical variance: A deviation study. Ann. l’IHP Probab. Stat. 2012, 48, 1148–1185. [Google Scholar] [CrossRef]
- Fan, J.; Li, Q.; Wang, Y. Estimation of high dimensional mean regression in the absence of symmetry and light tail assumptions. J. R. Stat. Soc. Ser. B Stat. Methodol. 2017, 79, 247–265. [Google Scholar] [CrossRef] [PubMed]
- Ruppert, D. Robust Statistics: The Approach Based on Influence Functions. Technometrics 1987, 29, 240–241. [Google Scholar] [CrossRef]
- Huber, P.J. Robust Statistics. In International Encyclopedia of Statistical Science; Springer: Berlin/Heidelberg, Germany, 2011; pp. 1248–1251. [Google Scholar]
- Olson, E.; Strom, J.; Morton, R.; Richardson, A.; Ranganathan, P.; Goeddel, R.; Bulic, M.; Crossman, J.; Marinier, B. Progress toward multi-robot reconnaissance and the MAGIC 2010 competition. J. Field Robot. 2012, 29, 762–792. [Google Scholar] [CrossRef]
- Cheng, L.; Wang, T.; Xu, X.; Yan, G.; Ren, M.; Zhang, Z. Nonlinear back-end optimization method for VSLAM with multi-convex combined maximum correntropy criterion. ISA Trans. 2023, 142, 731–746. [Google Scholar] [CrossRef]
- Guennebaud, G.; Jacob, B. Eigen v3. 2010. Available online: http://eigen.tuxfamily.org (accessed on 1 December 2024).
- Będkowski, J.; Pełka, M.; Majek, K.; Fitri, T.; Naruniec, J. Open source robotic 3D mapping framework with ROS—Robot Operating System, PC—Point Cloud Library and Cloud Compare. In Proceedings of the 2015 International Conference on Electrical Engineering and Informatics (ICEEI), Denpasar, Indonesia, 10–11 August 2015; pp. 644–649. [Google Scholar]
- Chen, X.; Dathathri, R.; Gill, G.; Pingali, K. Pangolin: An efficient and flexible graph mining system on CPU and GPU. Proc. VLDB Endow. 2020, 13, 1190–1205. [Google Scholar] [CrossRef]
- Pulli, K.; Baksheev, A.; Kornyakov, K.; Eruhimov, V. Real-time computer vision with OpenCV. Commun. ACM 2012, 55, 61–69. [Google Scholar] [CrossRef]
- Burri, M.; Nikolic, J.; Gohl, P.; Schneider, T.; Rehder, J.; Omari, S.; Achtelik, M.W.; Siegwart, R. The EuRoC micro aerial vehicle datasets. Int. J. Robot. Res. 2016, 35, 1157–1163. [Google Scholar] [CrossRef]
- Geiger, A.; Lenz, P.; Urtasun, R. Are we ready for autonomous driving? The KITTI vision benchmark suite. In Proceedings of the 2012 IEEE Conference on Computer Vision and Pattern Recognition, Providence, RI, USA, 16–21 June 2012; pp. 3354–3361. [Google Scholar]
- Saha, A.; Dhara, B.C.; Umer, S.; AlZubi, A.A.; Alanazi, J.M.; Yurii, K. Corb2i-slam: An adaptive collaborative visual-inertial slam for multiple robots. Electronics 2022, 11, 2814. [Google Scholar] [CrossRef]
- Schmuck, P.; Ziegler, T.; Karrer, M.; Perraudin, J.; Chli, M. Covins: Visual-inertial slam for centralized collaboration. In Proceedings of the 2021 IEEE International Symposium on Mixed and Augmented Reality Adjunct (ISMAR-Adjunct), Bari, Italy, 4–8 October 2021; pp. 171–176. [Google Scholar]
- Zhang, Z.; Scaramuzza, D. A tutorial on quantitative trajectory evaluation for visual (-inertial) odometry. In Proceedings of the 2018 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Madrid, Spain, 1–5 October 2018; pp. 7244–7251. [Google Scholar]
- Rebecq, H.; Horstschaefer, T.; Gallego, G.; Scaramuzza, D. EVO: A Geometric Approach to Event-Based 6-DOF Parallel Tracking and Mapping in Real Time. IEEE Robot. Autom. Lett. 2017, 2, 593–600. [Google Scholar] [CrossRef]
| Method | MH01 | MH02 | MH03 | MH04 | MH05 | V101 | V102 | V103 | V201 | V202 | V203 |
|---|---|---|---|---|---|---|---|---|---|---|---|
| Ours | 0.0363 | 0.0308 | 0.0405 | 0.0601 | 0.0461 | 0.0788 | 0.0603 | 0.0738 | 0.0748 | 0.0557 | 0.5490 |
| Huber | 0.0391 | 0.0333 | 0.0411 | 0.0639 | 0.0529 | 0.0826 | 0.0637 | 0.0749 | 0.0729 | 0.0582 | 0.4940 |
| MCMCC | 0.0380 | 0.0332 | 0.0373 | 0.0660 | 0.0518 | 0.0826 | 0.0641 | 0.0729 | 0.0528 | 0.0565 | 0.4150 |
| Cauchy | 0.0395 | 0.0332 | 0.0442 | 0.0814 | 0.0534 | 0.0809 | 0.0672 | 0.1160 | 0.0744 | 0.0683 | 0.8560 |
| Tukey | 0.0391 | 0.0345 | 0.0433 | 0.0892 | 0.0518 | 0.0815 | 0.0647 | 0.1140 | 0.0832 | 0.0859 | 0.8410 |
| Method | 00 | 01 | 02 | 03 | 04 | 05 | 06 | 07 | 08 | 09 | 10 |
|---|---|---|---|---|---|---|---|---|---|---|---|
| Ours | 0.9445 | 5.0316 | 5.2060 | 0.2806 | 0.1794 | 0.3591 | 0.6449 | 0.4207 | 2.7705 | 1.7737 | 0.9458 |
| Huber | 0.9454 | 5.8984 | 6.1795 | 0.2962 | 0.1990 | 0.3920 | 0.6357 | 0.4833 | 2.8597 | 1.7117 | 1.0910 |
| MCMCC | 0.9567 | 5.6801 | 5.5742 | 0.2309 | 0.1978 | 0.3708 | 0.6309 | 0.4224 | 2.9752 | 1.7946 | 0.9226 |
| Cauchy | 1.0448 | 9.1711 | 5.8003 | 0.2945 | 0.1833 | 0.3958 | 0.7584 | 0.7923 | 3.4113 | 1.7704 | 1.0993 |
| Tukey | 0.9932 | 7.9178 | 7.5824 | 0.3973 | 0.2941 | 0.4572 | 0.5805 | 0.5024 | 2.8851 | 1.8602 | 1.6109 |
| Method | MH01 | MH02 | MH03 | MH04 | MH05 | V101 | V102 | V103 | V201 | V202 | V203 |
|---|---|---|---|---|---|---|---|---|---|---|---|
| Ours | 0.0190 | 0.0154 | 0.0228 | 0.0408 | 0.0535 | 0.0326 | 0.0075 | 0.0161 | 0.0147 | 0.0102 | 0.0164 |
| Huber | 0.0231 | 0.0209 | 0.0258 | 0.0454 | 0.0560 | 0.0338 | 0.0087 | 0.0171 | 0.0135 | 0.0119 | 0.0153 |
| Impr_huber | 18.08% | 26.29% | 11.92% | 10.02% | 4.64% | 3.64% | 13.53% | 6.13% | −9.44% | 14.68% | −6.60% |
| Method | | 00 | 01 | 02 | 03 | 04 | 05 | 06 | 07 | 08 | 09 | 10 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|
| Ours | A | 0.1845 | 0.1128 | 0.0590 | 0.0664 | 0.0648 | 0.0564 | 0.0184 | 0.0785 | 1.0785 | 0.0449 | 0.0397 |
| | B | 0.1023 | 0.3891 | 0.0849 | 0.0441 | 0.0662 | 0.0512 | 0.0288 | 0.0358 | 0.4829 | 0.0643 | 0.0397 |
| Huber | A | 0.1881 | 0.1591 | 0.0828 | 0.0657 | 0.0614 | 0.0625 | 0.0226 | 0.0754 | 1.0793 | 0.0519 | 0.0442 |
| | B | 0.1093 | 0.4829 | 0.0973 | 0.0524 | 0.0645 | 0.0537 | 0.0335 | 0.0322 | 0.4822 | 0.0677 | 0.0558 |
| Impr_huber | A | 1.91% | 29.10% | 28.73% | −0.94% | −5.49% | 9.83% | 18.45% | −4.12% | 0.07% | 13.52% | 10.23% |
| | B | 6.40% | 19.42% | 12.82% | 15.84% | −2.64% | 4.65% | 13.87% | −11.12% | −0.15% | 5.02% | 28.88% |
| Method | MH01-MH02 | MH02-MH03 | MH03-MH04 | MH04-MH05 | V101-V102 | V102-V103 | V201-V202 | V202-V203 |
|---|---|---|---|---|---|---|---|---|
| Ours | MH01 0.1296 | MH02 0.1202 | MH03 0.1506 | MH04 0.1710 | V101 0.0093 | V102 0.1046 | V201 0.0290 | - |
| | MH02 0.1066 | MH03 0.1077 | MH04 0.1222 | MH05 0.1626 | V102 0.0347 | V103 0.1025 | V202 0.03576 | - |
| Huber | MH01 0.3069 | MH02 0.1862 | MH03 0.2049 | MH04 1.0253 | V101 0.1261 | V102 0.1525 | V201 0.0860 | - |
| | MH02 0.2508 | MH03 0.1397 | MH04 0.1186 | MH05 0.1620 | V102 0.1464 | V103 0.2125 | V202 0.0988 | - |
| Impr_huber | MH01 57.77% | MH02 35.45% | MH03 26.50% | MH04 83.32% | V101 92.62% | V102 31.41% | V201 66.28% | - |
| | MH02 57.50% | MH03 22.91% | MH04 −3.04% | MH05 −0.37% | V102 76.30% | V103 51.76% | V202 59.21% | - |
| | | Huber | DA-IRRK (Ours) | Huber | DA-IRRK (Ours) |
|---|---|---|---|---|---|
| Settings | Sensor | Stereo | Stereo | Mono-Inertial | Mono-Inertial |
| | Cam. Resolution | 752 × 480 | 752 × 480 | 600 × 350 | 600 × 350 |
| | Cam. fps | 20 Hz | 20 Hz | 20 Hz | 20 Hz |
| | IMU | - | - | 200 Hz | 200 Hz |
| | ORB Feat. | 1200 | 1200 | 1000 | 1000 |
| | RMSE ATE | 0.08269 | 0.06968 | 0.03729 | 0.01611 |
| Tracking | ORB extract | 14.566 | 15.259 | 6.137 | 5.941 |
| | Stereo match | 2.7792 | 2.7234 | - | - |
| | IMU integr. | - | - | 0.0478 | 0.04515 |
| | Pose pred | 1.633 | 2.243 | 0.051 | 0.044 |
| | Local Mapping (LM) Track | 2.918 | 3.703 | 4.491 | 4.399 |
| | New Key Frame (KF) dec | 0.0715 | 0.0762 | 0.05 | 0.0463 |
| | Total | 24.416 | 26.583 | - | - |
| Local Mapping (LM) | KF Insert | 3.879 | 4.064 | 6.337 | 6.317 |
| | Map Point (MP) Culling | 0.254 | 0.291 | 0.0743 | 0.0773 |
| | MP Creation | 10.051 | 10.269 | 20.797 | 20.591 |
| | Local BA (LBA) | 38.444 | 60.007 | 63.51 | 62.972 |
| | KF Culling | 3.489 | 3.379 | 17.834 | 16.637 |
| | Total | 55.655 | 77.275 | 107.93 | 107.26 |
| LBA complexity | LBA Edges | 9087.7 | 9203 | 1628.3 | 1727.6 |
| | LBA KF optimized | 21.067 | 20.584 | 5.923 | 5.833 |
| | LBA KF fixed | 37.366 | 35.174 | 1 | 1 |
| | Map Size KFs | 165 | 160 | 341 | 346 |
| | Map Size MPs | 7979 | 8348 | 11572 | 11308 |
| Full BA | Global BA (GBA) | 130.73 | 230.62 | 775.59 | 686.97 |
| | Map Update | 16.793 | 4.1786 | 57.345 | 40.312 |
| | Total | 147.52 | 234.8 | 832.94 | 727.28 |
| | BA Size KFs | 73 | 73 | 139 | 126 |
| | BA Size MPs | 5020 | 5108 | 6330 | 5406 |