A Spatial-Frequency Domain Associated Image-Optimization Method for Illumination-Robust Image Matching
Abstract
1. Introduction
2. Related Work
2.1. Illumination-Robust Feature-Based Matching
2.2. Image Optimization in the Spatial Domain
2.3. Image Optimization in the Frequency Domain
- (1) How can radiometric variations be reduced while preserving naturalness for images captured under complex illumination?
- (2) How can the approach handle inconspicuous image features caused by extreme illumination without over-enhancement and loss of naturalness?
- (3) How can the method be applied in practice to achieve robust image matching when image sequences contain both geometric and radiometric variations?
3. Contribution
- (1) An adaptive luminance equalization model based on spatial-domain analysis is proposed to equalize non-uniform illumination while preserving image naturalness.
- (2) A feature enhancement model based on frequency-domain analysis is constructed to enhance image details without over-enhancement or loss of naturalness.
- (3) A spatial-frequency domain associated image-optimization method is proposed that combines the advantages of spatial- and frequency-domain analyses to improve image matching under complex illumination (a minimal pipeline sketch follows this list). A demo of our approach is available at: https://github.com/jiashoujun/image-optimization-for-image-matching.
- (4) A comprehensive performance evaluation and analysis of the proposed method and four other state-of-the-art methods is presented on real-scenario and standard datasets.
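To make contribution (3) concrete, the following is a minimal Python sketch of a spatial-then-frequency optimization pipeline under stated assumptions. It is not the authors' implementation: the Gaussian-blur luminance estimate, the adaptive gamma, the radial band boundaries, and all function names are illustrative; the actual models are defined in Section 4 and in the linked repository.

```python
import numpy as np
from numpy.fft import fft2, ifft2, fftshift, ifftshift
from scipy.ndimage import gaussian_filter


def equalize_luminance(gray, sigma=15.0):
    """Spatial step (illustrative): estimate a smooth luminance map and apply a
    locally adaptive gamma so dark regions are lifted and bright regions are
    compressed. Expects a float image in [0, 1]."""
    luminance = gaussian_filter(gray, sigma)   # coarse non-uniform illumination estimate
    gamma = 0.5 + luminance                    # brighter neighborhoods -> larger gamma
    return np.clip(gray ** gamma, 0.0, 1.0)


def enhance_frequency(gray, low_cut=5, high_cut=15, compress=0.2, boost=1.0):
    """Frequency step (illustrative, homomorphic-filtering style): work on the
    log image, restrain the low-frequency (irradiance) band, keep the mid band,
    and heighten the high-frequency (reflectance) band, then synthesize back."""
    log_img = np.log1p(gray)
    spectrum = fftshift(fft2(log_img))
    rows, cols = gray.shape
    y, x = np.ogrid[:rows, :cols]
    radius = np.hypot(y - rows / 2.0, x - cols / 2.0)
    gain = np.ones_like(radius)
    gain[radius < low_cut] = compress          # restrained irradiance component
    gain[radius > high_cut] = 1.0 + boost      # heightened reflectance component
    filtered = np.real(ifft2(ifftshift(spectrum * gain)))
    return np.clip(np.expm1(filtered), 0.0, 1.0)


def optimize_image(gray):
    """Associated spatial-frequency optimization: equalize luminance first, then
    enhance details, before feature extraction (e.g., SIFT) and matching."""
    return enhance_frequency(equalize_luminance(gray))
```

Under this sketch, the spatial step mainly removes low-frequency illumination bias and the frequency step sharpens detail in the reflectance band; the paper's adaptive equalizing scheme and multi-interval equalization are more elaborate than these fixed rules.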
4. The Proposed Method
4.1. Adaptive Luminance Equalization in the Spatial Domain
4.1.1. Luminance Intensity Estimation
4.1.2. Luminance Equalization Model
4.1.3. Adaptive Equalizing Scheme
4.2. Frequency Domain Analysis-Based Feature Enhancement
4.2.1. Irradiance-Reflectance Component Decomposing
4.2.2. Multi-Interval Frequency Domain Equalization
4.2.3. Synthesis of Restrained Irradiance and Heightened Reflectance
5. Experimental Results and Analysis
5.1. Experimental Dataset and Implementation
5.2. Evaluation Criteria
5.3. Matching Performance
5.3.1. Performance of Visual Indicators
5.3.2. Performance of Numerical Indicators
5.4. Application to SFM and MVS
6. Discussion
6.1. Image Naturalness Assessment
6.2. Frequency Domain Division Influence
6.3. Parameter Influence
7. Conclusions
Author Contributions
Funding
Conflicts of Interest
References
Parameter | Weak Image | Normal Image | Bright Image |
---|---|---|---|
Low frequency range | 5 | 5 | 5 |
High frequency range | 15 | 10 | 20 |
Compression coefficients | 0.2 | 0.2 | 0.2 |
Enhancement coefficients | 1 | 1 | 2 |
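Read as a configuration, the table above gives one parameter set per illumination condition. The sketch below shows one hedged way such settings could be applied; it reuses the illustrative `enhance_frequency` function from the sketch after the contribution list, and the dictionary layout and the weak/normal/bright classification are assumptions, not the authors' code.

```python
# Illustrative mapping of the tabulated settings onto the parameters of the
# earlier enhance_frequency(gray, low_cut, high_cut, compress, boost) sketch.
FREQUENCY_PARAMS = {
    "weak":   {"low_cut": 5, "high_cut": 15, "compress": 0.2, "boost": 1.0},
    "normal": {"low_cut": 5, "high_cut": 10, "compress": 0.2, "boost": 1.0},
    "bright": {"low_cut": 5, "high_cut": 20, "compress": 0.2, "boost": 2.0},
}


def enhance_for_illumination(gray, illumination):
    """Apply the frequency-domain enhancement with the parameter set matching
    the estimated illumination class ('weak', 'normal', or 'bright')."""
    return enhance_frequency(gray, **FREQUENCY_PARAMS[illumination])
```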
Measure | Pair 1 (weak): Original | Pair 1 (weak): HES | Pair 1 (weak): LFA | Pair 1 (weak): LIME | Pair 1 (weak): NPEA | Pair 1 (weak): OURS | Pair 2 (normal): Original | Pair 2 (normal): HES | Pair 2 (normal): LFA | Pair 2 (normal): LIME | Pair 2 (normal): NPEA | Pair 2 (normal): OURS | Pair 3 (bright): Original | Pair 3 (bright): HES | Pair 3 (bright): LFA | Pair 3 (bright): LIME | Pair 3 (bright): NPEA | Pair 3 (bright): OURS |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
F1 | 6677 | 16,661 | 44,408 | 17,245 | 11,480 | 5930 | 1018 | 6862 | 53,830 | 6241 | 3477 | 1297 | 641 | 7073 | 26,274 | 1294 | 790 | 909 |
F2 | 7474 | 16,252 | 27,128 | 17,483 | 13,224 | 6422 | 1165 | 7365 | 55,946 | 6943 | 3793 | 1455 | 465 | 7278 | 28,232 | 2011 | 1221 | 730 |
M | 215 | 172 | 42 | 465 | 385 | 282 | 135 | 161 | 331 | 230 | 156 | 192 | 118 | 143 | 321 | 168 | 105 | 136 |
NPM | 140 | 88 | 8 | 256 | 220 | 189 | 45 | 77 | 125 | 73 | 60 | 111 | 64 | 93 | 120 | 93 | 68 | 100 |
T(s) | 22.44 | 39.09 | 103.44 | 44.39 | 35.67 | 20.73 | 15.45 | 22.28 | 252.25 | 21.72 | 16.78 | 15.11 | 15.16 | 22.54 | 71.04 | 15.78 | 14.41 | 14.28 |
RPMF | 0.021 | 0.005 | 0.000 | 0.014 | 0.019 | 0.032 | 0.044 | 0.011 | 0.002 | 0.012 | 0.017 | 0.086 | 0.138 | 0.013 | 0.005 | 0.072 | 0.086 | 0.137 |
MP | 0.651 | 0.512 | 0.190 | 0.551 | 0.571 | 0.670 | 0.333 | 0.478 | 0.378 | 0.317 | 0.385 | 0.578 | 0.542 | 0.650 | 0.374 | 0.554 | 0.648 | 0.735 |
RPMT | 6.239 | 2.251 | 0.077 | 5.767 | 6.168 | 9.117 | 2.913 | 3.456 | 0.496 | 3.361 | 3.576 | 7.346 | 4.222 | 4.126 | 1.689 | 5.894 | 4.719 | 7.003 |
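The measure definitions belong to Section 5.2, which is not reproduced in this extract. The tabulated values are at least consistent with MP being the ratio of correct (precise) matches to total matches and RPMT the number of correct matches obtained per second; for the weak image pair on the original images, for example:

$$
\mathrm{MP} = \frac{\mathrm{NPM}}{\mathrm{M}} = \frac{140}{215} \approx 0.651,
\qquad
\mathrm{RPMT} = \frac{\mathrm{NPM}}{\mathrm{T}} = \frac{140}{22.44\ \mathrm{s}} \approx 6.239\ \mathrm{s^{-1}}.
$$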
Measure | Pair 1 (weak): Original | Pair 1 (weak): HES | Pair 1 (weak): LFA | Pair 1 (weak): LIME | Pair 1 (weak): NPEA | Pair 1 (weak): OURS | Pair 2 (normal): Original | Pair 2 (normal): HES | Pair 2 (normal): LFA | Pair 2 (normal): LIME | Pair 2 (normal): NPEA | Pair 2 (normal): OURS | Pair 3 (bright): Original | Pair 3 (bright): HES | Pair 3 (bright): LFA | Pair 3 (bright): LIME | Pair 3 (bright): NPEA | Pair 3 (bright): OURS |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
F1 | 4408 | 9738 | 24,377 | 9092 | 7426 | 6115 | 1761 | 8155 | 23,750 | 5342 | 3711 | 2179 | 1134 | 8152 | 19,392 | 2220 | 1721 | 1796 |
F2 | 4568 | 11,349 | 18,486 | 10,156 | 7787 | 5604 | 2189 | 8502 | 24,274 | 5415 | 4187 | 3094 | 1375 | 7969 | 20,080 | 2887 | 2138 | 2322 |
M | 68 | 79 | 32 | 175 | 147 | 102 | 31 | 49 | 85 | 73 | 76 | 60 | 62 | 49 | 113 | 53 | 25 | 81 |
NPM | 33 | 29 | 5 | 20 | 30 | 59 | 4 | 7 | 18 | 9 | 14 | 32 | 16 | 8 | 26 | 5 | 7 | 34 |
T(s) | 16.88 | 33.78 | 96.51 | 31.91 | 25.15 | 20.15 | 11.91 | 25.49 | 118.6 | 18.94 | 15.72 | 11.87 | 10.58 | 24.3 | 82.89 | 13.31 | 12.21 | 10.51 |
RPMF | 0.007 | 0.003 | 0.000 | 0.002 | 0.004 | 0.011 | 0.002 | 0.001 | 0.001 | 0.002 | 0.004 | 0.015 | 0.014 | 0.001 | 0.001 | 0.002 | 0.004 | 0.019 |
MP | 0.485 | 0.367 | 0.156 | 0.114 | 0.204 | 0.578 | 0.129 | 0.143 | 0.212 | 0.123 | 0.184 | 0.533 | 0.258 | 0.163 | 0.230 | 0.094 | 0.280 | 0.420 |
RPMT | 1.955 | 0.858 | 0.052 | 0.627 | 1.193 | 2.928 | 0.336 | 0.275 | 0.152 | 0.475 | 0.891 | 2.696 | 1.512 | 0.329 | 0.314 | 0.376 | 0.573 | 3.235 |
Measure | Pair 4: Original | Pair 4: HES | Pair 4: LFA | Pair 4: LIME | Pair 4: NPEA | Pair 4: OURS | Pair 5: Original | Pair 5: HES | Pair 5: LFA | Pair 5: LIME | Pair 5: NPEA | Pair 5: OURS |
---|---|---|---|---|---|---|---|---|---|---|---|---|
F1 | 2701 | 6823 | 2902 | 3576 | 2720 | 4120 | 1251 | 6730 | 2755 | 3598 | 2801 | 3636 |
F2 | 1540 | 6991 | 2870 | 3979 | 3302 | 3784 | 2314 | 7163 | 2891 | 3950 | 2950 | 3719 |
M | 248 | 705 | 747 | 662 | 570 | 571 | 316 | 437 | 832 | 710 | 647 | 592 |
NPM | 209 | 519 | 493 | 508 | 485 | 511 | 282 | 373 | 547 | 594 | 539 | 557 |
T(s) | 3.47 | 9.88 | 4.27 | 5.21 | 4.24 | 5.15 | 3.81 | 9.78 | 4.18 | 5.57 | 4.26 | 5.54 |
RPMF | 0.136 | 0.076 | 0.172 | 0.142 | 0.178 | 0.135 | 0.225 | 0.055 | 0.199 | 0.165 | 0.192 | 0.153 |
MP | 0.843 | 0.736 | 0.660 | 0.767 | 0.851 | 0.895 | 0.892 | 0.854 | 0.657 | 0.837 | 0.833 | 0.941 |
RPMT | 60.231 | 52.530 | 115.457 | 97.505 | 114.387 | 99.223 | 74.016 | 38.139 | 130.861 | 106.643 | 126.526 | 100.542 |
Measure | Pair 4: Original | Pair 4: HES | Pair 4: LFA | Pair 4: LIME | Pair 4: NPEA | Pair 4: OURS | Pair 5: Original | Pair 5: HES | Pair 5: LFA | Pair 5: LIME | Pair 5: NPEA | Pair 5: OURS |
---|---|---|---|---|---|---|---|---|---|---|---|---|
F1 | 1762 | 2567 | 2071 | 2065 | 1725 | 2486 | 1483 | 2598 | 2066 | 1986 | 1783 | 2371 |
F2 | 931 | 2742 | 2170 | 2262 | 1982 | 2297 | 751 | 2706 | 2153 | 2220 | 1883 | 2161 |
M | 207 | 315 | 478 | 454 | 395 | 428 | 245 | 344 | 597 | 508 | 417 | 485 |
NPM | 114 | 213 | 223 | 227 | 208 | 308 | 137 | 229 | 331 | 323 | 246 | 337 |
T(s) | 2.76 | 5.15 | 3.98 | 4.65 | 3.61 | 2.65 | 2.57 | 5.07 | 4.03 | 3.94 | 3.69 | 2.55 |
RPMF | 0.122 | 0.083 | 0.108 | 0.110 | 0.121 | 0.134 | 0.182 | 0.088 | 0.160 | 0.163 | 0.138 | 0.156 |
MP | 0.551 | 0.676 | 0.467 | 0.500 | 0.527 | 0.720 | 0.559 | 0.666 | 0.554 | 0.636 | 0.590 | 0.695 |
RPMT | 41.304 | 41.359 | 56.030 | 48.817 | 57.618 | 116.226 | 54.582 | 45.168 | 82.134 | 81.980 | 66.667 | 132.157 |
Indicators | Original | HES | LFA | LIME | NPEA | OURS |
---|---|---|---|---|---|---|
SRIL | 69.4% | 71.2% | 90.0% | 83.4% | 78.6% | 85.4% |
RE | 0.78 | 0.77 | 1.04 | 0.99 | 0.95 | 0.75 |
Check Point | Illumination | Original | HES | LFA | LIME | NPEA | OURS |
---|---|---|---|---|---|---|---|
1 | Normal | 20 | 37 | 21 | loss | 31 | 17 |
2 | Normal | 36 | 32 | 19 | 29 | 39 | 29 |
3 | Normal | 28 | 31 | 26 | 49 | 15 | 25 |
4 | Bright | 22 | loss | 7 | 24 | 14 | 10 |
5 | Bright | 23 | loss | 39 | loss | 38 | 21 |
6 | Weak | 27 | loss | 62 | loss | 42 | 20 |
Mean | | 26 | 33 | 29 | 34 | 30 | 20 |
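The Mean row appears to average only the check points that were successfully reconstructed, with the "loss" entries excluded; for HES, for example:

$$
\mathrm{Mean}_{\mathrm{HES}} = \frac{37 + 32 + 31}{3} \approx 33.
$$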
Images | Original | HES | LFA | NPEA | LIME | OURS |
---|---|---|---|---|---|---|
Bright 1 | 0 | 131.93 | 1621.00 | 167.92 | 320.10 | 174.54 |
Bright 2 | 0 | 131.49 | 1531.00 | 210.94 | 303.74 | 194.57 |
Normal 1 | 0 | 142.22 | 1456.00 | 123.33 | 338.60 | 244.25 |
Normal 2 | 0 | 152.59 | 1358.00 | 100.62 | 318.45 | 262.50 |
Weak 1 | 0 | 158.57 | 1879.00 | 364.47 | 266.37 | 205.89 |
Weak 2 | 0 | 154.15 | 1856.00 | 468.04 | 316.28 | 221.72 |
Img1 | 0 | 546.42 | 821.06 | 193.25 | 412.88 | 266.66 |
Img2 | 0 | 523.86 | 847.37 | 291.50 | 407.24 | 320.27 |
Img3 | 0 | 550.91 | 903.15 | 555.59 | 404.51 | 401.31 |
Img4 | 0 | 545.04 | 920.05 | 650.41 | 422.71 | 406.44 |
© 2020 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).
Share and Cite
Liu, C.; Jia, S.; Wu, H.; Zeng, D.; Cheng, F.; Zhang, S. A Spatial-Frequency Domain Associated Image-Optimization Method for Illumination-Robust Image Matching. Sensors 2020, 20, 6489. https://doi.org/10.3390/s20226489