Research on Cam–Kalm Automatic Tracking Technology of Low, Slow, and Small Target Based on Gm-APD LiDAR
Abstract
1. Introduction
- Radar early warning technology:
- (1)
- The American Echo Shield 4D radar combines Ku-band BSA beamforming and dynamic waveform synthesis [14]. It offers a wireless positioning service at 15.7–16.6 GHz and a wireless navigation function at 15.4–15.7 GHz, with a tracking accuracy of 0.5° and an effective detection distance of 3 km.
- (2)
- The HARRIER DSR UAV surveillance radar, developed by the DeTect Company of the United States, adopts a solid-state Doppler detection radar system for small-RCS targets in complex clutter environments. Its detection range for UAVs flying without RF and GPS emissions is 3.2 km [15].
- (3)
- The Fraunhofer Institute for High Frequency Physics and Radar Techniques (FHR) in Germany has proposed Scanning Surveillance Radar System (SSRS) technology [16]. The FMCW radar sensor is scanned mechanically, with a scanning frequency of 8 Hz and a bandwidth of 1 GHz. The maximum range resolution is 15 cm, and the maximum detection range is 153.6 m.
- Passive optical imaging technology [17]:
- (1)
- Poland’s Advanced Protection Systems company has designed the SKYctrl anti-UAV system, which includes an ultra-precision FIELDctrl 3D MIMO radar, a PTZ day/night camera, a WiFi sensor, and an acoustic array. The minimum detectable target height is 1 m.
- (2)
- The French–German Research Institute of Saint-Louis in France has adopted two optical sensing channels; a passive color camera provides an image of the target area with a field of view of 4° × 3° and can detect a DJI UAV from 100 m to 200 m.
- (3)
- The National Defense Development Agency of Korea has proposed a UAV tracking method using KCF and adaptive thresholding. The system includes a visual camera, pan/tilt, image processing system, pan/tilt control computer, driving system, and a liquid crystal display that can detect the UAV with a target size of about 30 × 30 cm, flying speed of about 10 m/s, and distance of 1.5 km.
- (4)
- The DMA low-altitude early warning and tracking photoelectric system developed by China’s Hepu Company integrates a high-definition visible-light camera and an infrared imaging sensor. The system can detect micro-UAVs within 2 km, and the tracking angular velocity can reach 200°/s.
- Acoustic detection technology:
- (1)
- Dedrone Company of Germany has developed Drone Tracker, a UAV detection system that uses distributed acoustic and optical sensors to comprehensively detect UAVs. It can detect illegally invading UAVs 50–80 m in advance [18].
- (2)
- The Holy Cross University of Technology in Poland studied the acoustic signal of a rotary-wing UAV. Using an Olympus LS-11 digital recorder with a 44 kHz sampling rate and 16-bit resolution, the acoustic signal could be captured at distances of up to 200 m.
- The targets are small and difficult to detect: The radar cross-section (RCS) of a UAV is extremely small, making it difficult for traditional radar to capture effectively.
- Complex background interference: Detection is strongly affected by ground clutter, which increases its difficulty.
- Low noise and low infrared radiation: The sound wave and infrared characteristics generated by UAVs are weak, making them difficult to detect via sound wave or infrared detection.
- Significant environmental impact: Weather conditions such as smog and rainy days affect the performance of traditional optical and acoustic detection.
2. System Design
2.1. Design of Gm-APD LiDAR System
2.2. Detection Process of Gm-APD LiDAR Imaging System
- (1)
- Laser emission: The high-power and narrow-pulse laser is pointed to the UAV target after optical shaping.
- (2)
- Laser echo reception: Echo photons reflected by the target are converged on the Gm-APD detector through the receiving optical system, triggering the avalanche effect.
- (3)
- Signal recording: The Gm-APD detector records the echo signal and transmits it to the FPGA data acquisition and processing system.
- (4)
- Signal processing: The acquisition system processes the signal and reconstructs the intensity image and range image of the UAV target.
- (5)
- Target fitting and tracking: Threshold filtering, Gaussian fitting, and Canny edge detection methods are adopted, and the combined Cam–Kalm algorithm is utilized to realize real-time automatic target tracking.
- (6)
- Dynamic adjustment: The FPGA system calculates the angle according to the miss distance, adjusts the pitch or azimuth of the servo platform, and dynamically tracks the target.
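The closed loop from reconstruction (4) to servo correction (6) can be sketched as follows. This is an illustrative simplification, not the paper's FPGA implementation: the photon-event layout and the per-pixel field of view `ifov_deg` are assumed values.

```python
import numpy as np

def reconstruct_intensity(photon_events, shape):
    # Toy stand-in for step (4): histogram detected photon events
    # (row, col) into an intensity image. The event layout is a
    # hypothetical simplification of the FPGA acquisition output.
    intensity = np.zeros(shape)
    for row, col in photon_events:
        intensity[row, col] += 1
    return intensity

def miss_to_angles(centroid, image_center, ifov_deg=0.01):
    # Step (6): convert the pixel miss distance into azimuth/pitch
    # corrections for the servo platform, assuming a fixed per-pixel
    # field of view (ifov_deg is an illustrative value, not from the paper).
    dy = centroid[0] - image_center[0]
    dx = centroid[1] - image_center[1]
    return dx * ifov_deg, dy * ifov_deg

events = [(31, 34), (32, 33), (32, 34), (33, 34)]
img = reconstruct_intensity(events, (64, 64))
rows, cols = np.nonzero(img)
centroid = (rows.mean(), cols.mean())
azimuth, pitch = miss_to_angles(centroid, (32.0, 32.0))
```

Feeding the resulting angle corrections back to the platform keeps the target near the image center for the next frame.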
3. Description of Tracking Algorithm
3.1. Target Extraction
- (1)
- There is little difference between the spatial distribution of the target intensity image and the geometric ideal shape.
- (2)
- The actual detected target presents an irregular convex pattern relative to the background, consistent with the smooth central convex feature of the standard model.
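The two observations above motivate a fit-based extraction. A minimal sketch, using threshold filtering followed by an intensity-weighted centroid as a simple stand-in for the Gaussian-fitting step (the noise multiplier `k` is an assumed value):

```python
import numpy as np

def extract_target_roi(intensity, k=3.0):
    # Threshold filtering: keep pixels well above the background,
    # then estimate the target center by intensity-weighted centroid.
    mask = intensity > intensity.mean() + k * intensity.std()
    if not mask.any():
        return None
    ys, xs = np.nonzero(mask)
    w = intensity[ys, xs]
    cy = float((ys * w).sum() / w.sum())
    cx = float((xs * w).sum() / w.sum())
    bbox = (ys.min(), ys.max(), xs.min(), xs.max())   # ROI bounding box
    return (cy, cx), bbox

# Synthetic frame: a smooth convex (Gaussian-like) spot, matching the
# "smooth central convex feature" described above.
yy, xx = np.mgrid[0:64, 0:64]
frame = np.exp(-((yy - 20.0) ** 2 + (xx - 40.0) ** 2) / 8.0)
(cy, cx), bbox = extract_target_roi(frame)
```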
3.2. Target Tracking and State Prediction
- (1)
- Increase forecasting ability: Based on the target position prediction, velocity and acceleration variables are introduced to improve the adaptability to the rapid change of UAV position; this not only improves the system’s robustness but also reduces the iterative calculation of the Camshift algorithm and accelerates the coordination with the two-dimensional tracking platform.
- (2)
- Reduce the false alarm rate: By predicting the speed and acceleration of the target, the moving trend of the target can be judged and the influence of ghosts in LiDAR imaging can be reduced, thereby reducing the false alarm rate.
- (3)
- Adaptive adjustment of search window: Adaptive adjustment of the search window can be realized to effectively solve the problem of loss of the tracking target caused by the Camshift algorithm’s expansion of the tracking window.
- Step 1
- Initialization.
- (1)
- Automatic extraction of the first frame target. We use the fitting method in Section 3.1 to obtain the ROI of the target point and synchronously initialize the center position and size of the search window.
- (2)
- Initialize the Kalman filter. The state prediction equation and the observation equation of the tracking system are as follows:
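The equations themselves did not survive extraction; in the standard form implied by the position–velocity–acceleration state of Section 3.2 (the textbook notation below is an assumption, not the paper's exact matrices), they read:

```latex
X_k = A X_{k-1} + W_{k-1}, \qquad Z_k = H X_k + V_k
```

where X_k is the state vector (target position, velocity, and acceleration), A the state-transition matrix, Z_k the observed centroid, H the observation matrix, and W, V zero-mean Gaussian process and measurement noise.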
- Step 2
- Backproject the histogram of the LiDAR intensity image.
- (1)
- Target area calculation. The intensity image of the first frame of the Gm-APD LiDAR is read and the pixel intensity distribution of the automatically detected target area is calculated.
- (2)
- Backprojecting the histogram of the search area. Each pixel value is mapped to the corresponding probability value in the target histogram to generate a backprojection.
- Step 3
- Calculate the search window. The centroid of the LiDAR range-profile search window is calculated from its zero-order and first-order moments. Let I(x, y) denote the pixel value at (x, y) in the backprojection image, where x and y vary within the search window. The calculation steps are as follows:
- (1)
- Calculate the zero-order moment at the initial moment: M00 = Σx Σy I(x, y).
- (2)
- Calculate the first-order moments in the x and y directions: M10 = Σx Σy x·I(x, y), M01 = Σx Σy y·I(x, y).
- (3)
- Calculate the centroid coordinates of the tracking window: xc = M10/M00, yc = M01/M00.
- (4)
- Move the center of the search window to the center-of-mass position (xc, yc).
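The moment computation in Step 3 can be sketched as follows; `backproj` is the backprojection image from Step 2, and the window-bound names are hypothetical:

```python
import numpy as np

def window_centroid(backproj, win):
    # Zero- and first-order moments of the backprojection restricted
    # to the search window win = (y0, y1, x0, x1); the returned
    # centroid (x_c, y_c) becomes the new window center.
    y0, y1, x0, x1 = win
    patch = backproj[y0:y1, x0:x1]
    ys, xs = np.mgrid[y0:y1, x0:x1]
    m00 = patch.sum()                # zero-order moment
    if m00 == 0:
        return None
    m10 = (xs * patch).sum()        # first-order moment in x
    m01 = (ys * patch).sum()        # first-order moment in y
    return (m10 / m00, m01 / m00)

bp = np.zeros((8, 8))
bp[2, 3] = bp[2, 5] = 1.0           # two equal-probability pixels
xc, yc = window_centroid(bp, (0, 8, 0, 8))
```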
- Step 4
- Calculate the tracking window. The Camshift algorithm obtains the oriented target area through the second-order moments:
- (1)
- Calculate the second-order moments in the x and y directions: M20 = Σx Σy x²·I(x, y), M11 = Σx Σy xy·I(x, y), M02 = Σx Σy y²·I(x, y).
- (2)
- Calculate the new tracking window size (the length and width of the equivalent ellipse) as follows:
- (3)
- Iterate continuously until the centroid position converges. The centroid of the target point is the iterative result (xc, yc), which is used as the measurement to update the Kalman filter prediction at time k.
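Step 4 can be sketched with the canonical CamShift second-moment formulas; the paper's exact window-size expression is not reproduced in this extraction, so the textbook equivalent-ellipse form is assumed here:

```python
import numpy as np

def track_window(patch, xs, ys, xc, yc):
    # Second-order moments around the centroid (xc, yc) give the
    # equivalent ellipse: length, width, and orientation theta of the
    # new tracking window (canonical CamShift formulas).
    m00 = patch.sum()
    a = (xs ** 2 * patch).sum() / m00 - xc ** 2          # from M20
    b = 2 * ((xs * ys * patch).sum() / m00 - xc * yc)    # from M11
    c = (ys ** 2 * patch).sum() / m00 - yc ** 2          # from M02
    common = np.sqrt(b ** 2 + (a - c) ** 2)
    length = np.sqrt(((a + c) + common) / 2)
    width = np.sqrt(((a + c) - common) / 2)
    theta = 0.5 * np.arctan2(b, a - c)
    return length, width, theta

ys, xs = np.mgrid[0:9, 0:9]
patch = np.zeros((9, 9))
patch[4, 2:7] = 1.0          # a horizontal bar target
length, width, theta = track_window(patch, xs, ys, 4.0, 4.0)
```

For the horizontal bar, the ellipse is elongated along x (theta = 0), so the window stretches in the direction the target actually occupies.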
- Step 5
- Kalman filter prediction. At moment k, the state vector of UAV target motion is formed from the position, velocity, and acceleration components, and the motion observation equation of the system is updated as follows:
- Step 6
- Repeat Steps 2, 3, 4, and 5. A confidence threshold is set. When the centroid variation exceeds the threshold, the predicted value of the kth frame is set as the search window center of the (k+1)th frame search area; otherwise, the value calculated by the Camshift algorithm is set as the search window center of the (k+1)th frame search area.
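The switching rule in Step 6 can be sketched as follows. Measuring the centroid variation as the pixel distance between the CamShift result and the Kalman prediction, and the 5-pixel threshold, are illustrative assumptions rather than the paper's settings:

```python
import numpy as np

def next_window_center(camshift_c, kalman_pred, threshold=5.0):
    # Cam-Kalm decision: a large centroid variation suggests abrupt
    # motion (or a ghost/false detection), so trust the Kalman
    # prediction for frame k+1; otherwise keep the CamShift result.
    variation = np.hypot(camshift_c[0] - kalman_pred[0],
                         camshift_c[1] - kalman_pred[1])
    return kalman_pred if variation > threshold else camshift_c

# Abrupt jump: fall back on the Kalman prediction
center = next_window_center((100.0, 50.0), (112.0, 50.0))
# Smooth motion: keep the CamShift centroid
center2 = next_window_center((100.0, 50.0), (101.0, 50.0))
```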
4. Analysis of Experimental Results
4.1. Target Tracking Center Estimation Results of Gm-APD LiDAR System
4.2. Results of Tracking Algorithm for Gm-APD LiDAR System in Air and Space Background
- Strong robustness: The Cam–Kalm algorithm can smooth the target’s trajectory by introducing a Kalman filter, correcting the jitter caused by noise or incomplete imaging and making the tracking more stable. This filter plays a particularly significant role when the target is subjected to rapid acceleration and deceleration, helping to maintain high tracking stability.
- Higher tracking accuracy: During the tracking process, the algorithm’s CLE is less than 3 pixels, the average CLE over multiple frames is 0.8964, and the variance is 0.4028, showing high accuracy and stability. In comparison, the accuracy achieved with the Meanshift and Camshift algorithms is lower, indicating that the Cam–Kalm algorithm is superior in terms of tracking accuracy.
- Higher tracking efficiency: The traditional Meanshift and Camshift algorithms may need to search the whole field of view in every frame, which becomes costly when the target suddenly changes position or moves quickly, as this global search wastes many computing resources. By using the Kalman filter to predict the trajectory of the target, the Cam–Kalm algorithm can effectively limit the search to the vicinity of the predicted area and reduce invalid calculations. In this way, the algorithm can quickly find the target within a small range, improving tracking efficiency.
- Dynamically adjustable tracking strategy: By setting the confidence threshold, the Cam–Kalm algorithm can flexibly choose different tracking strategies according to the changes in the centroid of the target. When the movement of the target changes considerably, such as in the case of acceleration or sudden change of direction, the algorithm can rely more on the prediction results of the Kalman filter, which is helpful for accurately predicting and updating the target position when the target moves quickly and avoiding tracking lag or failure caused by high speeds. This is especially important for fast-changing targets such as drones, and can ensure accurate tracking when the target is flying at high speed.
- –
- If the target changes little, such as moving at a constant speed, the algorithm is more inclined to trust the Camshift results. Camshift’s adaptability can help to track and locate the target more accurately.
- –
- If the target changes considerably, such as in the case of sudden acceleration, the Kalman filter’s prediction ability can help to reduce error and lag while ensuring tracking continuity.
- Sensitivity to the centroid-change setting: The confidence threshold set in the algorithm has an important influence on the results. This threshold determines how many pixels the centroid may change by. When the centroid change exceeds this threshold, the algorithm trusts the prediction result of the Kalman filter; when the centroid change is below the threshold, the algorithm relies on the Camshift value. If the threshold is not set correctly, it may lead to tracking errors, especially when the target trajectory changes significantly, affecting the algorithm’s reliability.
- Reliance on accurate initial estimates: The Kalman filter relies on accurate initial position estimation. If the initial estimation is wrong, then the Kalman filter’s prediction may deviate from the actual target, affecting the subsequent tracking effect.
- Misjudgment: Although Camshift’s adaptive characteristics enable it to cope with changes in target size, the Kalman filter itself does not adapt to rapid changes in the target’s appearance, for example when the target is occluded. Thus, misjudgment may occur, especially in scenes where the imaging size changes significantly due to very complex backgrounds or partial occlusion.
4.3. Results of Tracking Algorithm for Gm-APD LiDAR System in Complex Background of Urban Buildings
4.4. Long-Distance Tracking Results of Gm-APD LiDAR Based on Cam–Kalm Algorithm
5. Conclusions
Author Contributions
Funding
Data Availability Statement
Conflicts of Interest
References
- Farlík, J.; Gacho, L. Researching UAV threat–new challenges. In Proceedings of the 2021 International Conference on Military Technologies (ICMT), Brno, Czech Republic, 8–11 June 2021; IEEE: Piscataway, NJ, USA, 2021; pp. 1–6. [Google Scholar]
- Lyu, C.; Zhan, R. Global analysis of active defense technologies for unmanned aerial vehicle. IEEE Aerosp. Electron. Syst. Mag. 2022, 37, 6–31. [Google Scholar] [CrossRef]
- Zhou, Y.; Rao, B.; Wang, W. UAV swarm intelligence: Recent advances and future trends. IEEE Access 2020, 8, 183856–183878. [Google Scholar] [CrossRef]
- Mohsan, S.A.H.; Khan, M.A.; Noor, F.; Ullah, I.; Alsharif, M.H. Towards the unmanned aerial vehicles (UAVs): A comprehensive review. Drones 2022, 6, 147. [Google Scholar] [CrossRef]
- Peri, D. Expanding anti-UAVs market to counter drone technology. CLAWS J. Winter 2015, 152–158. [Google Scholar]
- Kang, H.; Joung, J.; Kim, J.; Kang, J.; Cho, Y.S. Protect your sky: A survey of counter unmanned aerial vehicle systems. IEEE Access 2020, 8, 168671–168710. [Google Scholar] [CrossRef]
- Liu, S.; Wu, R.; Qu, J.; Li, Y. HDA-Net: Hybrid convolutional neural networks for small objects recognization at airports. IEEE Trans. Instrum. Meas. 2022, 71, 2521314. [Google Scholar] [CrossRef]
- Solodov, A.; Williams, A.; Al Hanaei, S.; Goddard, B. Analyzing the threat of unmanned aerial vehicles (UAV) to nuclear facilities. Secur. J. 2018, 31, 305–324. [Google Scholar] [CrossRef]
- Taneski, N.; Caminski, B.; Petrovski, A. Use of weaponized unmanned aerial vehicles (UAVs) supported by gis as a growing terrorist threat. In Science and Society Contribution of Humanities and Social Sciences: Proceedings of the International Conference on the Occasion of the Centennial Anniversary of the Faculty of Philosophy, Struga, North Macedonia, 2–5 September 2020; Faculty of Philosophy: Skopje, North Macedonia, 2021; pp. 553–567. [Google Scholar]
- Wang, J.; Liu, Y.; Song, H. Counter-unmanned aircraft system (s)(C-UAS): State of the art, challenges, and future trends. IEEE Aerosp. Electron. Syst. Mag. 2021, 36, 4–29. [Google Scholar] [CrossRef]
- Wang, L.; Luo, J.; Li, Z.; Wang, M.; Li, R.; Xu, W.; He, F.; Xu, B. Thinking about anti-drone strategies. In Proceedings of the 2024 3rd International Conference on Artificial Intelligence, Internet of Things and Cloud Computing Technology (AIoTC), Wuhan, China, 13–15 September 2024; IEEE: Piscataway, NJ, USA, 2024; pp. 53–56. [Google Scholar]
- Anil, A.; Hennemann, A.; Kimmel, H.; Mayer, C.; Müller, L.; Reischl, T. PERSEUS-Post-Emergency Response and Surveillance UAV System; Deutsche Gesellschaft für Luft-und Raumfahrt-Lilienthal-Oberth eV: Bonn, Germany, 2024. [Google Scholar]
- Bi, Z.; Chen, H.; Hu, J.; Liu, L.; Yang, J.; Bai, C. Analysis of UAV Typical War Cases and Combat Assessment Research. In Proceedings of the 2022 IEEE International Conference on Unmanned Systems (ICUS), Guangzhou, China, 28–30 October 2022; IEEE: Piscataway, NJ, USA, 2022; pp. 1449–1453. [Google Scholar]
- Gong, J.; Yan, J.; Kong, D.; Li, D. Introduction to Drone Detection Radar with Emphasis on Automatic Target Recognition (ATR) technology. arXiv 2023, arXiv:2307.10326. [Google Scholar]
- Bressler, M.S.; Bressler, L. Beware the unfriendly skies: How drones are being used as the latest weapon in cybercrime. J. Technol. Res. 2016, 7, 1–12. [Google Scholar]
- Kim, E.; Sivits, K. Blended secondary surveillance radar solutions to improve air traffic surveillance. Aerosp. Sci. Technol. 2015, 45, 203–208. [Google Scholar] [CrossRef]
- Zmysłowski, D.; Skokowski, P.; Kelner, J.M. Anti-drone sensors, effectors, and systems–a concise overview. TransNav Int. J. Mar. Navig. Saf. Sea Transp. 2023, 17, 455–461. [Google Scholar] [CrossRef]
- Herrera, G.J.; Dechant, J.A.; Green, E.; Klein, E.A. Technology Trends in Small Unmanned Aircraft Systems (sUAS) and Counter-UAS: A Five-Year Outlook; Institute for Defense Analyses: Alexandria, VA, USA, 2022. [Google Scholar]
- Lykou, G.; Moustakas, D.; Gritzalis, D. Defending airports from UAS: A survey on cyber-attacks and counter-drone sensing technologies. Sensors 2020, 20, 3537. [Google Scholar] [CrossRef] [PubMed]
- Park, S.; Kim, H.T.; Lee, S.; Joo, H.; Kim, H. Survey on anti-drone systems: Components, designs, and challenges. IEEE Access 2021, 9, 42635–42659. [Google Scholar] [CrossRef]
- McManamon, P.F. Review of ladar: A historic, yet emerging, sensor technology with rich phenomenology. Opt. Eng. 2012, 51, 060901. [Google Scholar] [CrossRef]
- Becker, W.; Bergmann, A. Multi-dimensional time-correlated single photon counting. In Reviews in Fluorescence 2005; Springer: Boston, MA, USA, 2005; pp. 77–108. [Google Scholar]
- Prochazka, I.; Hamal, K.; Sopko, B. Recent achievements in single photon detectors and their applications. J. Mod. Opt. 2004, 51, 1289–1313. [Google Scholar] [CrossRef]
- Yuan, Z.; Kardynal, B.; Sharpe, A.; Shields, A. High speed single photon detection in the near infrared. Appl. Phys. Lett. 2007, 91, 041114. [Google Scholar] [CrossRef]
- Buller, G.; Collins, R. Single-photon generation and detection. Meas. Sci. Technol. 2009, 21, 012002. [Google Scholar] [CrossRef]
- Eisaman, M.D.; Fan, J.; Migdall, A.; Polyakov, S.V. Invited review article: Single-photon sources and detectors. Rev. Sci. Instrum. 2011, 82, 071101. [Google Scholar] [CrossRef]
- Fersch, T.; Weigel, R.; Koelpin, A. Challenges in miniaturized automotive long-range lidar system design. In Proceedings of the Three-Dimensional Imaging, Visualization, and Display 2017, Anaheim, CA, USA, 9–13 April 2017; SPIE: Bellingham, WA, USA, 2017; Volume 10219, pp. 160–171. [Google Scholar]
- Raj, T.; Hanim Hashim, F.; Baseri Huddin, A.; Ibrahim, M.F.; Hussain, A. A survey on LiDAR scanning mechanisms. Electronics 2020, 9, 741. [Google Scholar] [CrossRef]
- Pfeifer, N.; Briese, C. Laser scanning–principles and applications. In Proceedings of the Geosiberia 2007—International Exhibition and Scientific Congress, Novosibirsk, Russia, 25 April 2007; European Association of Geoscientists & Engineers: Bunnik, The Netherlands, 2007; pp. cp–59. [Google Scholar]
- Kim, B.H.; Khan, D.; Bohak, C.; Choi, W.; Lee, H.J.; Kim, M.Y. V-RBNN based small drone detection in augmented datasets for 3D LADAR system. Sensors 2018, 18, 3825. [Google Scholar] [CrossRef] [PubMed]
- Chen, Z.; Liu, B.; Guo, G. Adaptive single photon detection under fluctuating background noise. Opt. Express 2020, 28, 30199–30209. [Google Scholar] [CrossRef] [PubMed]
- Pfennigbauer, M.; Möbius, B.; do Carmo, J.P. Echo digitizing imaging lidar for rendezvous and docking. In Proceedings of the Laser Radar Technology and Applications XIV, Orlando, FL, USA, 13–17 April 2009; SPIE: Bellingham, WA, USA, 2009; Volume 7323, pp. 9–17. [Google Scholar]
- McCarthy, A.; Ren, X.; Della Frera, A.; Gemmell, N.R.; Krichel, N.J.; Scarcella, C.; Ruggeri, A.; Tosi, A.; Buller, G.S. Kilometer-range depth imaging at 1550 nm wavelength using an InGaAs/InP single-photon avalanche diode detector. Opt. Express 2013, 21, 22098–22113. [Google Scholar] [CrossRef] [PubMed]
- Pawlikowska, A.M.; Halimi, A.; Lamb, R.A.; Buller, G.S. Single-photon three-dimensional imaging at up to 10 kilometers range. Opt. Express 2017, 25, 11919–11931. [Google Scholar] [CrossRef] [PubMed]
- Zhou, H.; He, Y.; You, L.; Chen, S.; Zhang, W.; Wu, J.; Wang, Z.; Xie, X. Few-photon imaging at 1550 nm using a low-timing-jitter superconducting nanowire single-photon detector. Opt. Express 2015, 23, 14603–14611. [Google Scholar] [CrossRef]
- Liu, B.; Yu, Y.; Chen, Z.; Han, W. True random coded photon counting Lidar. Opto-Electron. Adv. 2020, 3, 190044. [Google Scholar] [CrossRef]
- Li, Z.P.; Ye, J.T.; Huang, X.; Jiang, P.Y.; Cao, Y.; Hong, Y.; Yu, C.; Zhang, J.; Zhang, Q.; Peng, C.Z.; et al. Single-photon imaging over 200 km. Optica 2021, 8, 344–349. [Google Scholar] [CrossRef]
- Kirmani, A.; Venkatraman, D.; Shin, D.; Colaço, A.; Wong, F.N.; Shapiro, J.H.; Goyal, V.K. First-photon imaging. Science 2014, 343, 58–61. [Google Scholar] [CrossRef]
- Hua, K.; Liu, B.; Chen, Z.; Wang, H.; Fang, L.; Jiang, Y. Fast photon-counting imaging with low acquisition time method. IEEE Photonics J. 2021, 13, 7800312. [Google Scholar] [CrossRef]
- Chen, Z.; Liu, B.; Guo, G.; He, C. Single photon imaging with multi-scale time resolution. Opt. Express 2022, 30, 15895–15904. [Google Scholar] [CrossRef]
- Ding, Y.; Qu, Y.; Zhang, Q.; Tong, J.; Yang, X.; Sun, J. Research on UAV detection technology of Gm-APD Lidar based on YOLO model. In Proceedings of the 2021 IEEE International Conference on Unmanned Systems (ICUS), Beijing, China, 15–17 October 2021; IEEE: Piscataway, NJ, USA, 2021; pp. 105–109. [Google Scholar]
- Zhou, X.; Sun, J.; Jiang, P.; Qiu, C.; Wang, Q. Improvement of detection probability and ranging performance of Gm-APD LiDAR with spatial correlation and adaptive adjustment of the aperture diameter. Opt. Lasers Eng. 2021, 138, 106452. [Google Scholar] [CrossRef]
- Alspach, D.L. A Gaussian sum approach to the multi-target identification-tracking problem. Automatica 1975, 11, 285–296. [Google Scholar] [CrossRef]
- Simon, D. Kalman filtering. Embed. Syst. Program. 2001, 14, 72–79. [Google Scholar]
- Exner, D.; Bruns, E.; Kurz, D.; Grundhöfer, A.; Bimber, O. Fast and robust CAMShift tracking. In Proceedings of the 2010 IEEE Computer Society Conference on Computer Vision and Pattern Recognition-Workshops, San Francisco, CA, USA, 13–18 June 2010; IEEE: Piscataway, NJ, USA, 2010; pp. 9–16. [Google Scholar]
- Bi, H.; Ma, J.; Wang, F. An improved particle filter algorithm based on ensemble Kalman filter and Markov chain Monte Carlo method. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2014, 8, 447–459. [Google Scholar] [CrossRef]
- Kulkarni, M.; Wadekar, P.; Dagale, H. Block division based camshift algorithm for real-time object tracking using distributed smart cameras. In Proceedings of the 2013 IEEE International Symposium on Multimedia, Anaheim, CA, USA, 9–11 December 2013; IEEE: Piscataway, NJ, USA, 2013; pp. 292–296. [Google Scholar]
- Yang, P. Efficient particle filter algorithm for ultrasonic sensor-based 2D range-only simultaneous localisation and mapping application. IET Wirel. Sens. Syst. 2012, 2, 394–401. [Google Scholar] [CrossRef]
- Cong, D.; Shi, P.; Zhou, D. An improved camshift algorithm based on RGB histogram equalization. In Proceedings of the 2014 7th International Congress on Image and Signal Processing, Dalian, China, 14–16 October 2014; IEEE: Piscataway, NJ, USA, 2014; pp. 426–430. [Google Scholar]
- Xu, X.; Zhang, H.; Luo, M.; Tan, Z.; Zhang, M.; Yang, H.; Li, Z. Research on target echo characteristics and ranging accuracy for laser radar. Infrared Phys. Technol. 2019, 96, 330–339. [Google Scholar] [CrossRef]
- Chauve, A.; Mallet, C.; Bretar, F.; Durrieu, S.; Deseilligny, M.P.; Puech, W. Processing full-waveform lidar data: Modelling raw signals. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2007, 36, W52. [Google Scholar]
- Laurenzis, M.; Bacher, E.; Christnacher, F. Measuring laser reflection cross-sections of small unmanned aerial vehicles for laser detection, ranging and tracking. In Proceedings of the Laser Radar Technology and Applications XXII, Anaheim, CA, USA, 9–13 April 2017; SPIE: Bellingham, WA, USA, 2017; Volume 10191, pp. 74–82. [Google Scholar]
- Zhang, Y. Detection and tracking of human motion targets in video images based on camshift algorithms. IEEE Sens. J. 2019, 20, 11887–11893. [Google Scholar] [CrossRef]
- Bankar, R.; Salankar, S. Improvement of head gesture recognition using camshift based face tracking with UKF. In Proceedings of the 2019 9th International Conference on Emerging Trends in Engineering and Technology-Signal and Information Processing (ICETET-SIP-19), Nagpur, India, 1–2 November 2019; IEEE: Piscataway, NJ, USA, 2019; pp. 1–5. [Google Scholar]
- Vincent, L. Morphological area openings and closings for grey-scale images. In Shape in Picture: Mathematical Description of Shape in Grey-Level Images; Springer: Berlin/Heidelberg, Germany, 1994; pp. 197–208. [Google Scholar]
- Nishiguchi, K.I.; Kobayashi, M.; Ichikawa, A. Small target detection from image sequences using recursive max filter. In Proceedings of the Signal and Data Processing of Small Targets 1995, San Diego, CA, USA, 9–14 July 1995; SPIE: Bellingham, WA, USA, 1995; Volume 2561, pp. 153–166. [Google Scholar]
- Rong, W.; Li, Z.; Zhang, W.; Sun, L. An improved CANNY edge detection algorithm. In Proceedings of the 2014 IEEE International Conference on Mechatronics and Automation, Tianjin, China, 3–6 August 2014; IEEE: Piscataway, NJ, USA, 2014; pp. 577–582. [Google Scholar]
- Mondal, S.; Mukherjee, J. Image similarity measurement using region props, color and texture: An approach. Int. J. Comput. Appl. 2015, 121, 23–26. [Google Scholar] [CrossRef]
- Solanki, P.B.; Al-Rubaiai, M.; Tan, X. Extended Kalman filter-based active alignment control for LED optical communication. IEEE/ASME Trans. Mechatron. 2018, 23, 1501–1511. [Google Scholar] [CrossRef]
- Guo, G.; Zhao, S. 3D multi-object tracking with adaptive cubature Kalman filter for autonomous driving. IEEE Trans. Intell. Veh. 2022, 8, 512–519. [Google Scholar] [CrossRef]
- Li, Y.; Bian, C.; Chen, H. Object tracking in satellite videos: Correlation particle filter tracking method with motion estimation by Kalman filter. IEEE Trans. Geosci. Remote Sens. 2022, 60, 5630112. [Google Scholar] [CrossRef]
- Ma, C.; Yang, X.; Zhang, C.; Yang, M.H. Long-term correlation tracking. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Boston, MA, USA, 7–12 June 2015; pp. 5388–5396. [Google Scholar]
- Hodson, T.O. Root mean square error (RMSE) or mean absolute error (MAE): When to use them or not. Geosci. Model Dev. Discuss. 2022, 15, 5481–5487. [Google Scholar] [CrossRef]
- Cheng, Y. Mean shift, mode seeking, and clustering. IEEE Trans. Pattern Anal. Mach. Intell. 1995, 17, 790–799. [Google Scholar] [CrossRef]
- Comaniciu, D.; Meer, P. Mean shift analysis and applications. In Proceedings of the Seventh IEEE International Conference on Computer Vision, Kerkyra, Greece, 20–27 September 1999; IEEE: Piscataway, NJ, USA, 1999; Volume 2, pp. 1197–1203. [Google Scholar]
- Carreira-Perpinán, M.A. A review of mean-shift algorithms for clustering. arXiv 2015, arXiv:1503.00687. [Google Scholar]
- Wu, K.L.; Yang, M.S. Mean shift-based clustering. Pattern Recognit. 2007, 40, 3035–3052. [Google Scholar] [CrossRef]
- Yang, J.; Rahardja, S.; Fränti, P. Mean-shift outlier detection and filtering. Pattern Recognit. 2021, 115, 107874. [Google Scholar] [CrossRef]
| Fitting Method | MSE | R² |
|---|---|---|
| Local peak | 17.08 | −5.1084 |
| Centroid weighting | 0.70 | 0.7569 |
| Proposed in this paper | 0.25 | 0.9143 |
Guo, D.; Qu, Y.; Zhou, X.; Sun, J.; Yin, S.; Lu, J.; Liu, F. Research on Cam–Kalm Automatic Tracking Technology of Low, Slow, and Small Target Based on Gm-APD LiDAR. Remote Sens. 2025, 17, 165. https://doi.org/10.3390/rs17010165