Detecting and Tracking Moving Airplanes from Space Based on Normalized Frame Difference Labeling and Improved Similarity Measures
Abstract
1. Introduction
2. Background
2.1. Moving Object Detection Algorithms
2.2. Moving Object Tracking Algorithms
3. Materials and Methods
3.1. Satellite Video Data and Preprocessing
3.2. Methods
3.2.1. Moving Airplane Detection by Normalized Frame Difference Labeling
3.2.2. Moving Airplane Tracking by Template Matching
3.2.3. Computational Complexity Analysis
3.2.4. Rotation Invariance Assessment
3.2.5. Accuracy Assessment Metrics
4. Experimental Results
4.1. Results of Moving Airplane Detection
4.2. Rotation Invariance Assessment
4.3. Results of Moving Airplane Tracking
5. Discussion
6. Conclusions
Author Contributions
Funding
Conflicts of Interest
References
- Shi, F.; Qiu, F.; Li, X.; Tang, Y.; Zhong, R.; Yang, C. A Method for Detecting and Tracking Moving Airplanes from a Satellite Video. Remote Sens. 2020, 12, 2390.
- Shao, J.; Du, B.; Wu, C.; Zhang, L. Tracking Objects from Satellite Videos: A Velocity Feature Based Correlation Filter. IEEE Trans. Geosci. Remote Sens. 2019, 57, 7860–7871.
- Yang, T.; Wang, X.; Yao, B.; Li, J.; Zhang, Y.; He, Z.; Duan, W. Small moving vehicle detection in a satellite video of an urban area. Sensors 2016, 16, 1528.
- Kopsiaftis, G.; Karantzalos, K. Vehicle detection and traffic density monitoring from very high resolution satellite video data. In Proceedings of the IEEE International Geoscience and Remote Sensing Symposium (IGARSS), Milan, Italy, 26–31 July 2015; pp. 1881–1884.
- Guo, Y.; Yang, D.; Chen, Z. Object Tracking on Satellite Videos: A Correlation Filter-Based Tracking Method with Trajectory Correction by Kalman Filter. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2019, 12, 3538–3551.
- Du, B.; Cai, S.; Wu, C.; Zhang, L.; Tao, D. Object Tracking in Satellite Videos Based on a Multi-Frame Optical Flow Tracker. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2019, 12, 3043–3055.
- Ahmadi, S.A.; Ghorbanian, A.; Mohammadzadeh, A. Moving vehicle detection, tracking and traffic parameter estimation from a satellite video: A perspective on a smarter city. Int. J. Remote Sens. 2019, 40, 8379–8394.
- Shao, J.; Du, B.; Wu, C.; Zhang, L. Can We Track Targets From Space? A Hybrid Kernel Correlation Filter Tracker for Satellite Video. IEEE Trans. Geosci. Remote Sens. 2019, 57, 8719–8731.
- Du, B.; Sun, Y.; Cai, S.; Wu, C.; Du, Q. Object Tracking in Satellite Videos by Fusing the Kernel Correlation Filter and the Three-Frame-Difference Algorithm. IEEE Geosci. Remote Sens. Lett. 2018, 15, 168–172.
- Zhang, J.; Jia, X.; Hu, J. Motion Flow Clustering for Moving Vehicle Detection from Satellite High Definition Video. In Proceedings of the 2017 International Conference on Digital Image Computing: Techniques and Applications (DICTA), Sydney, Australia, 29 November–1 December 2017; pp. 1–7.
- Mou, L.; Zhu, X.; Vakalopoulou, M.; Karantzalos, K.; Paragios, N.; Le Saux, B.; Moser, G.; Tuia, D. Multitemporal Very High Resolution from Space: Outcome of the 2016 IEEE GRSS Data Fusion Contest. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2017, 10, 3435–3447.
- Ahmadi, S.A.; Mohammadzadeh, A. A simple method for detecting and tracking vehicles and vessels from high resolution spaceborne videos. In Proceedings of the Joint Urban Remote Sensing Event (JURSE), Dubai, UAE, 6–8 March 2017; pp. 1–4.
- D’Angelo, P.; Kuschk, G.; Reinartz, P. Evaluation of skybox video and still image products. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2014, 40, 95.
- Leitloff, J.; Hinz, S.; Stilla, U. Vehicle detection in very high resolution satellite images of city areas. IEEE Trans. Geosci. Remote Sens. 2010, 48, 2795–2806.
- Liu, W.; Yamazaki, F.; Vu, T.T. Automated Vehicle Extraction and Speed Determination From QuickBird Satellite Images. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2011, 4, 75–82.
- Salehi, B.; Zhang, Y.; Zhong, M. Automatic moving vehicles information extraction from single-pass WorldView-2 imagery. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2012, 5, 135–145.
- Teutsch, M.; Kruger, W. Robust and fast detection of moving vehicles in aerial videos using sliding windows. In Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition Workshops, Boston, MA, USA, 7–12 June 2015; pp. 26–34.
- Bouwmans, T. Traditional and recent approaches in background modeling for foreground detection: An overview. Comput. Sci. Rev. 2014, 11, 31–66.
- Barnich, O.; van Droogenbroeck, M. ViBe: A universal background subtraction algorithm for video sequences. IEEE Trans. Image Process. 2010, 20, 1709–1724.
- Hu, Z.; Yang, D.; Zhang, K.; Chen, Z. Object tracking in satellite videos based on convolutional regression network with appearance and motion features. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2020, 13, 783–793.
- Piccardi, M. Background subtraction techniques: A review. In Proceedings of the IEEE International Conference on Systems, Man and Cybernetics, The Hague, The Netherlands, 10–13 October 2004; Volume 4, pp. 3099–3104.
- Elhabian, S.Y.; El-Sayed, K.M.; Ahmed, S.H. Moving object detection in spatial domain using background removal techniques-state-of-art. Recent Patents Comput. Sci. 2008, 1, 32–54.
- Cheung, S.C.S.; Kamath, C. Robust techniques for background subtraction in urban traffic video. Vis. Commun. Image Process. 2004, 5308, 881–892.
- Barnich, O.; van Droogenbroeck, M. ViBe: A powerful random technique to estimate the background in video sequences. In Proceedings of the IEEE International Conference on Acoustics, Speech and Signal Processing, Taipei, Taiwan, 19–24 April 2009; pp. 945–948.
- Comaniciu, D.; Ramesh, V.; Meer, P. Real-time tracking of non-rigid objects using mean shift. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Hilton Head Island, SC, USA, 15 June 2000; Volume 2, pp. 142–149.
- Henriques, J.F.; Caseiro, R.; Martins, P.; Batista, J. High-speed tracking with kernelized correlation filters. IEEE Trans. Pattern Anal. Mach. Intell. 2014, 37, 583–596.
- Babenko, B.; Yang, M.H.; Belongie, S. Visual tracking with semi-supervised online weighted multiple instance learning. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Miami, FL, USA, 20–25 June 2009; pp. 983–990.
- Kalal, Z.; Mikolajczyk, K.; Matas, J. Tracking-learning-detection. IEEE Trans. Pattern Anal. Mach. Intell. 2011, 34, 1409–1422.
- Bolme, D.S.; Beveridge, J.R.; Draper, B.A.; Lui, Y.M. Visual object tracking using adaptive correlation filters. In Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition, San Francisco, CA, USA, 13–18 June 2010; pp. 2544–2550.
- Ma, W.; Wen, Z.; Wu, Y.; Jiao, L.; Gong, M.; Zheng, Y.; Liu, L. Remote Sensing Image Registration with Modified SIFT and Enhanced Feature Matching. IEEE Geosci. Remote Sens. Lett. 2017, 14, 3–7.
- Jain, A.K. Data clustering: 50 years beyond K-means. Pattern Recognit. Lett. 2010, 31, 651–666.
- He, L.; Chao, Y.; Suzuki, K. A Run-based one-and-a-half-scan connected-component labeling algorithm. Int. J. Pattern Recognit. Artif. Intell. 2010, 24, 557–579.
- Wang, L.; Zhang, Y.; Feng, J. On the Euclidean distance of images. IEEE Trans. Pattern Anal. Mach. Intell. 2005, 27, 1334–1339.
- Nakhmani, A.; Tannenbaum, A. A new distance measure based on generalized Image Normalized Cross-Correlation for robust video tracking and image recognition. Pattern Recognit. Lett. 2013, 34, 315–321.
- Mahmood, A.; Khan, S. Correlation Coefficient Based Fast Template Matching Through Partial Elimination. IEEE Trans. Image Process. 2011, 21, 2099–2108.
- Briechle, K.; Hanebeck, U.D. Template matching using fast normalized cross correlation. In Optical Pattern Recognition XII; International Society for Optics and Photonics: Bellingham, WA, USA, 2001; Volume 4387, pp. 95–102.
- Schweitzer, H.; Bell, J.W.; Wu, F. Very fast template matching. In European Conference on Computer Vision; Springer: Berlin/Heidelberg, Germany, 2002; pp. 358–372.
- Goshtasby, A. Template Matching in Rotated Images. IEEE Trans. Pattern Anal. Mach. Intell. 1985, 338–344.
- Tsai, D.M.; Tsai, Y.H. Rotation-invariant pattern matching with color ring-projection. Pattern Recognit. 2002, 35, 131–141.
- Wu, Y.; Lim, J.; Yang, M.H. Online object tracking: A benchmark. In Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition, Portland, OR, USA, 23–28 June 2013; pp. 2411–2418.
- Kim, K.; Chalidabhongse, T.H.; Harwood, D.; Davis, L. Real-time foreground-background segmentation using codebook model. Real Time Imaging 2005, 11, 172–185.
- Stauffer, C.; Grimson, W.E.L. Adaptive background mixture models for real-time tracking. In Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition, Fort Collins, CO, USA, 23–25 June 1999; Volume 2, pp. 246–252.
- Ester, M.; Kriegel, H.P.; Sander, J.; Xu, X. A density-based algorithm for discovering clusters in large spatial databases with noise. In Proceedings of the Second International Conference on Knowledge Discovery and Data Mining (KDD-96), Portland, OR, USA, 2–4 August 1996; Volume 96, pp. 226–231.
| Fq | Fr | Label |
|---|---|---|
| BRT | BCK | 1 |
| BRT | DRK | 1 |
| BCK | DRK | 1 |
| BRT | BRT | 0 |
| DRK | DRK | 0 |
| BCK | BCK | 0 |
| DRK | BRT | −1 |
| DRK | BCK | −1 |
| BCK | BRT | −1 |
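Read as a lookup rule, the table above maps the labels of corresponding pixels in the two frames being differenced (Fq and Fr) to a ternary frame-difference label. Below is a minimal Python sketch of that lookup; the expansion of BRT, DRK, and BCK as bright object, dark object, and background is an assumption, not stated in the table itself.

```python
# Ternary frame-difference labeling transcribed from the table above.
# Keys are (label in frame q, label in frame r); values are the difference label.
# BRT/DRK/BCK are assumed to mean bright object, dark object, and background.
LABEL_RULE = {
    ("BRT", "BCK"):  1, ("BRT", "DRK"):  1, ("BCK", "DRK"):  1,
    ("BRT", "BRT"):  0, ("DRK", "DRK"):  0, ("BCK", "BCK"):  0,
    ("DRK", "BRT"): -1, ("DRK", "BCK"): -1, ("BCK", "BRT"): -1,
}

def difference_label(fq: str, fr: str) -> int:
    """Return the ternary label for one pixel, given its labels in frames q and r."""
    return LABEL_RULE[(fq, fr)]

# Examples taken directly from the table:
assert difference_label("BRT", "BCK") == 1    # bright object in q, background in r
assert difference_label("BCK", "BCK") == 0    # background in both frames
assert difference_label("DRK", "BCK") == -1   # dark object in q, background in r
```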
| Method | Count | Frame 1 | Frame 67 | Frame 154 | Frame 248 | Total | Metric | Frame 1 | Frame 67 | Frame 154 | Frame 248 | Average |
|---|---|---|---|---|---|---|---|---|---|---|---|---|
| GT | | 11 | 11 | 9 | 9 | 40 | | | | | | |
| ViBe | TP | 0 | 8 | 9 | 9 | 26 | Precision | 0.77 | 0.67 | 0.75 | 0.90 | 0.76 |
| IPGBBS | TP | 0 | 8 | 8 | 9 | 25 | Precision | 0.84 | 0.73 | 0.89 | 0.90 | 0.83 |
| Our | TP | 11 | 10 | 9 | 9 | 39 | Precision | 0.92 | 0.71 | 0.82 | 0.75 | 0.80 |
| ViBe | FP | 0 | 4 | 3 | 1 | 8 | Recall | 0.91 | 0.73 | 1.00 | 1.00 | 0.65 |
| IPGBBS | FP | 0 | 3 | 1 | 1 | 5 | Recall | 0.87 | 0.73 | 0.89 | 1.00 | 0.63 |
| Our | FP | 1 | 4 | 2 | 3 | 10 | Recall | 1.00 | 0.91 | 1.00 | 1.00 | 0.98 |
| ViBe | FN | 11 | 3 | 0 | 0 | 14 | F1 score | 0.84 | 0.70 | 0.86 | 0.95 | 0.70 |
| IPGBBS | FN | 11 | 3 | 1 | 0 | 15 | F1 score | 0.86 | 0.73 | 0.89 | 0.95 | 0.71 |
| Our | FN | 0 | 1 | 0 | 0 | 1 | F1 score | 0.96 | 0.80 | 0.90 | 0.86 | 0.88 |
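The metric half of the table follows the standard definitions precision = TP/(TP + FP), recall = TP/(TP + FN), and F1 = 2PR/(P + R). As a quick arithmetic check against the Total and Average columns (a plain recomputation from the counts shown, not the authors' evaluation code):

```python
# Recompute overall precision, recall, and F1 from the total TP/FP/FN counts.
def metrics(tp, fp, fn):
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    f1 = 2 * precision * recall / (precision + recall)
    return precision, recall, f1

print(metrics(39, 10, 1))   # "Our" totals  -> (0.796, 0.975, 0.876), i.e. 0.80 / 0.98 / 0.88
print(metrics(26, 8, 14))   # ViBe totals   -> (0.765, 0.650, 0.703), i.e. 0.76 / 0.65 / 0.70
print(metrics(25, 5, 15))   # IPGBBS totals -> (0.833, 0.625, 0.714), i.e. 0.83 / 0.63 / 0.71
```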
| Algorithm | TM-TSMs | TM-TSMs | TM-TSMs | TM-ISMs | TM-ISMs | TM-ISMs |
|---|---|---|---|---|---|---|
| Similarity measure | NCC | ZNCC | NSD | GWNCC | GWZNCC | GWNSD |
| AUC | 0.672 | 0.684 | 0.708 | 0.921 | 0.921 | 0.912 |
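For context, NCC and ZNCC on the traditional side are the standard normalized and zero-mean normalized cross-correlation measures used in template matching; the GW-prefixed columns are the paper's improved similarity measures and are not reproduced here, and reading TM-TSMs and TM-ISMs as template matching with traditional versus improved similarity measures is an inference from the article title. A minimal sketch of the two standard measures, assuming a template and a candidate patch of identical size:

```python
import numpy as np

def ncc(template: np.ndarray, patch: np.ndarray) -> float:
    """Standard normalized cross-correlation between same-sized arrays."""
    t = template.astype(float).ravel()
    p = patch.astype(float).ravel()
    return float(t @ p / (np.linalg.norm(t) * np.linalg.norm(p)))

def zncc(template: np.ndarray, patch: np.ndarray) -> float:
    """Zero-mean NCC: subtract each array's mean before correlating."""
    t = template.astype(float).ravel(); t -= t.mean()
    p = patch.astype(float).ravel(); p -= p.mean()
    return float(t @ p / (np.linalg.norm(t) * np.linalg.norm(p)))

# In tracking by template matching (Section 3.2.2), the candidate patch with the
# highest similarity inside the search window is taken as the new target position.
```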
Algorithm | Leave-the-Scene | AUC | FPS | RT |
---|---|---|---|---|
KCF [26] | √ | 0.950 | 53.60 | 51.10 |
P-SIFT KM [1] | × | 0.895 | 216.12 | 12.67 |
MOSSE [29] | unknown | 0.466 | 93.48 | 29.30 |
TLD [28] | unknown | 0.289 | 2.63 | 1041.44 |
MIL [27] | × | 0.913 | 7.99 | 342.80 |
Meanshift [25] | × | 0.831 | 31.23 | 87.70 |
TM-GWNSD | √ | 0.912 | 448.28 | 6.11 |
TM-GWZNCC | √ | 0.921 | 392.41 | 6.98 |
TM-GWNCC | √ | 0.921 | 470.62 | 5.82 |
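Two reading notes on this table, both inferences rather than statements from the source: AUC appears to be the area under the success plot in the sense of the OTB benchmark [64], and RT appears to be the total running time in seconds on the shared test sequence, since FPS multiplied by RT comes out essentially constant (roughly 2.7 thousand frames) for every tracker. A quick check of the latter:

```python
# FPS * RT should be roughly constant if RT is total runtime for one shared sequence.
rows = {
    "KCF": (53.60, 51.10),      "P-SIFT KM": (216.12, 12.67),
    "MOSSE": (93.48, 29.30),    "TLD": (2.63, 1041.44),
    "MIL": (7.99, 342.80),      "Meanshift": (31.23, 87.70),
    "TM-GWNSD": (448.28, 6.11), "TM-GWZNCC": (392.41, 6.98),
    "TM-GWNCC": (470.62, 5.82),
}
for name, (fps, rt) in rows.items():
    print(f"{name:10} FPS*RT = {fps * rt:.0f}")  # each product is close to ~2739 frames
```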
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
© 2020 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).