LiDAR Technology for UAV Detection: From Fundamentals and Operational Principles to Advanced Detection and Classification Techniques
Abstract
1. Introduction
- In security and defense, timely UAV detection helps prevent dangerous activities such as espionage, smuggling, and terrorist attacks;
- In matters of privacy, drone identification protects individuals and organizations from unauthorized surveillance;
- In airspace safety, timely detection of suspicious UAVs helps prevent mid-air collisions.
2. Fundamentals of LiDAR Technology
2.1. Principles and Components
2.2. Classification and Mechanisms
3. LiDAR Data Processing Techniques
3.1. Clustering-Based Approaches for LiDAR Data Processing
3.2. DL-Based Approaches for LiDAR Data Processing
3.3. Real-World Applications of LiDAR with Deep Learning
3.4. Sensor Fusion for Enhanced UAV Detection
3.4.1. Integrating LiDAR for Robust UAV Detection in GNSS-Denied Environments
3.4.2. Challenges in Sensor Fusion-Based UAV Detection
4. Discussion
5. Conclusions and Future Works
Author Contributions
Funding
Conflicts of Interest
Abbreviations
ADAS | Advanced driving assistance systems |
APE | Absolute pose error |
AGV | Automated guided vehicle |
AI | Artificial Intelligence |
AP | Average precision |
APD | Avalanche photodiode |
API | Application Programming Interface |
ATSP | Asymmetric traveling salesman problem |
BEV | Bird’s-Eye View |
BiLSTM | Bidirectional long short-term memory |
BVLOS | Beyond Visual Line of Sight |
CBRDD | Clustering algorithm based on relative distance and density |
CLSTM | Convolutional long short-term memory |
CNN | Convolutional neural network |
CRNN | Convolutional recurrent neural network |
DBSCAN | Density-Based Spatial Clustering of Applications with Noise |
DGCNN | Dynamic graph convolutional neural network |
DNN | Deep neural network |
EDC | Euclidean Distance Clustering |
EIG | Environmental information gain |
FAA | Federal Aviation Administration |
FAEP | Fast Autonomous Exploration Planner |
FMCW | Frequency-modulated continuous wave |
FDR | False discovery rate |
FNR | False negative rate |
FoV | Field of View |
FUEL | Fast UAV Exploration |
GAN | Generative Adversarial Networks |
GCN | Graph Convolutional Network |
GNN | Graph neural network |
GNSS | Global Navigation Satellite System |
GRU | Gated Recurrent Unit |
HAR | Human activity recognition |
IoU | Intersection over Union |
IEEE | Institute of Electrical and Electronics Engineers |
IMU | Inertial Measurement Unit |
ITS | Intelligent Transport System |
LAEA | LiDAR-assisted Exploration Algorithm |
LD | Laser diode |
LiDAR | Light detection and ranging |
LoS | Line-of-Sight |
LSTM | Long short-term memory |
MLP | Multi-Layer Perceptron |
MOCAP | Motion capture |
MOT | Multi-object tracking |
MRS | Mobile robot system |
MVF | Multi-view fusion |
NIR | Near-infrared |
OPA | Optical Phased Array |
PCA | Principal component analysis |
PIC | Photonic integrated circuit |
RCS | Radar cross section |
RGB-D | Red, Green, Blue plus Depth |
R-LVIO | Resilient LiDAR-Visual-Inertial Odometry |
RMSE | Root Mean Squared Error |
ROI | Region of interest |
SECOND | Sparsely Embedded Convolutional Detection |
SNR | Signal-to-noise ratio |
SPAD | Single Photon Avalanche Diodes |
TCP/IP | Transmission Control Protocol/Internet Protocol |
TDC | Time-to-digital converter |
ToF | Time of Flight |
References
- Federal Aviation Administration (FAA). Drone Sightings Near Airports. Reported UAS Sightings (January–March 2025). Available online: https://www.faa.gov/uas/resources/public_records/uas_sightings_report (accessed on 12 April 2025).
- Seidaliyeva, U.; Ilipbayeva, L.; Taissariyeva, K.; Smailov, N.; Matson, E.T. Advances and Challenges in Drone Detection and Classification Techniques: A State-of-the-Art Review. Sensors 2024, 24, 125. [Google Scholar] [CrossRef] [PubMed]
- Drone Incident Review: First Half of 2023. Available online: https://d-fendsolutions.com/blog/drone-incident-review-first-half-2023/ (accessed on 8 August 2023).
- Yan, J.; Hu, H.; Gong, J.; Kong, D.; Li, D. Exploring Radar Micro-Doppler Signatures for Recognition of Drone Types. Drones 2023, 7, 280. [Google Scholar] [CrossRef]
- Rudys, S.; Laučys, A.; Ragulis, P.; Aleksiejūnas, R.; Stankevičius, K.; Kinka, M.; Razgūnas, M.; Bručas, D.; Udris, D.; Pomarnacki, R. Hostile UAV Detection and Neutralization Using a UAV System. Drones 2022, 6, 250. [Google Scholar] [CrossRef]
- Brighente, A.; Ciattaglia, G.; Peruzzi, G.; Pozzebon, A.; Spinsante, S. Radar-Based Autonomous Identification of Propellers Type for Malicious Drone Detection. In Proceedings of the 2024 IEEE Sensors Applications Symposium (SAS), Naples, Italy, 23–25 July 2024; pp. 1–6. [Google Scholar]
- Alam, S.S.; Chakma, A.; Rahman, M.H.; Bin Mofidul, R.; Alam, M.M.; Utama, I.B.K.Y.; Jang, Y.M. RF-Enabled Deep-Learning-Assisted Drone Detection and Identification: An End-to-End Approach. Sensors 2023, 23, 4202. [Google Scholar] [CrossRef]
- Yousaf, J.; Zia, H.; Alhalabi, M.; Yaghi, M.; Basmaji, T.; Shehhi, E.A.; Gad, A.; Alkhedher, M.; Ghazal, M. Drone and Controller Detection and Localization: Trends and Challenges. Appl. Sci. 2022, 12, 12612. [Google Scholar] [CrossRef]
- Aouladhadj, D.; Kpre, E.; Deniau, V.; Kharchouf, A.; Gransart, C.; Gaquière, C. Drone Detection and Tracking Using RF Identification Signals. Sensors 2023, 23, 7650. [Google Scholar] [CrossRef]
- Lofù, D.; Di Gennaro, P.; Tedeschi, P.; Di Noia, T.; Di Sciascio, E. URANUS: Radio Frequency Tracking, Classification and Identification of Unmanned Aircraft Vehicles. IEEE Open J. Veh. Technol. 2023, 4, 921–935. [Google Scholar] [CrossRef]
- Casabianca, P.; Zhang, Y. Acoustic-Based UAV Detection Using Late Fusion of Deep Neural Networks. Drones 2021, 5, 54. [Google Scholar] [CrossRef]
- Tejera-Berengue, D.; Zhu-Zhou, F.; Utrilla-Manso, M.; Gil-Pita, R.; Rosa-Zurera, M. Analysis of Distance and Environmental Impact on UAV Acoustic Detection. Electronics 2024, 13, 643. [Google Scholar] [CrossRef]
- Utebayeva, D.; Ilipbayeva, L.; Matson, E.T. Practical Study of Recurrent Neural Networks for Efficient Real-Time Drone Sound Detection: A Review. Drones 2023, 7, 26. [Google Scholar] [CrossRef]
- Salman, S.; Mir, J.; Farooq, M.T.; Malik, A.N.; Haleemdeen, R. Machine Learning Inspired Efficient Audio Drone Detection using Acoustic Features. In Proceedings of the 2021 International Bhurban Conference on Applied Sciences and Technologies (IBCAST), Islamabad, Pakistan, 12–16 January 2021; pp. 335–339. [Google Scholar]
- Sun, Y.; Zhi, X.; Han, H.; Jiang, S.; Shi, T.; Gong, J.; Zhang, W. Enhancing UAV Detection in Surveillance Camera Videos through Spatiotemporal Information and Optical Flow. Sensors 2023, 23, 6037. [Google Scholar] [CrossRef] [PubMed]
- Seidaliyeva, U.; Akhmetov, D.; Ilipbayeva, L.; Matson, E.T. Real-Time and Accurate Drone Detection in a Video with a Static Background. Sensors 2020, 20, 3856. [Google Scholar] [CrossRef] [PubMed]
- Samadzadegan, F.; Dadrass Javan, F.; Ashtari Mahini, F.; Gholamshahi, M. Detection and Recognition of Drones Based on a Deep Convolutional Neural Network Using Visible Imagery. Aerospace 2022, 9, 31. [Google Scholar] [CrossRef]
- Jamil, S.; Fawad; Rahman, M.; Ullah, A.; Badnava, S.; Forsat, M.; Mirjavadi, S.S. Malicious UAV Detection Using Integrated Audio and Visual Features for Public Safety Applications. Sensors 2020, 20, 3923. [Google Scholar] [CrossRef]
- Kim, J.; Lee, D.; Kim, Y.; Shin, H.; Heo, Y.; Wang, Y. Deep learning-based Malicious Drone Detection Using Acoustic and Image Data. In Proceedings of the 2022 Sixth IEEE International Conference on Robotic Computing (IRC), Naples, Italy, 5–7 December 2022; pp. 91–92. [Google Scholar]
- Aledhari, M.; Razzak, R.; Parizi, R.M.; Srivastava, G. Sensor Fusion for Drone Detection. In Proceedings of the 2021 IEEE 93rd Vehicular Technology Conference (VTC2021-Spring), Helsinki, Finland, 25–28 April 2021; pp. 1–7. [Google Scholar]
- Xie, W.; Wan, Y.; Wu, G.; Li, Y.; Zhou, F.; Wu, Q. A RF-Visual Directional Fusion Framework for Precise UAV Positioning. IEEE Internet Things J. 2024, 11, 36736–36747. [Google Scholar] [CrossRef]
- Lyu, H. Detect and avoid system based on multi sensor fusion for UAV. In Proceedings of the 2018 International Conference on Information and Communication Technology Convergence (ICTC), Jeju, Republic of Korea, 17–19 October 2018; pp. 1107–1109. [Google Scholar]
- Mehta, V.; Dadboud, F.; Bolic, M.; Mantegh, I. A Deep learning Approach for Drone Detection and Classification Using Radar and Camera Sensor Fusion. In Proceedings of the 2023 IEEE Sensors Applications Symposium (SAS), Ottawa, ON, Canada, 18–20 July 2023; pp. 1–6. [Google Scholar]
- Li, N.; Ho, C.; Xue, J.; Lim, L.; Chen, G.; Fu, Y.H.; Lee, L. A Progress Review on Solid-State LiDAR and Nanophotonics-Based LiDAR Sensors. Laser Photonics Rev. 2022, 16, 2100511. [Google Scholar] [CrossRef]
- Behroozpour, B.; Sandborn, P.; Wu, M.; Boser, B. Lidar System Architectures and Circuits. IEEE Commun. Mag. 2017, 55, 135–142. [Google Scholar] [CrossRef]
- Assunção, E.; Williams, S. Comparison of continuous wave and pulsed wave laser welding effects. Opt. Lasers Eng. 2013, 51, 674–680. [Google Scholar] [CrossRef]
- Lee, S.; Lee, D.; Choi, P.; Park, D. Accuracy–Power Controllable LiDAR Sensor System with 3D Object Recognition for Autonomous Vehicle. Sensors 2020, 20, 5706. [Google Scholar] [CrossRef]
- Chen, C.; Guo, J.; Wu, H.; Li, Y.; Shi, B. Performance Comparison of Filtering Algorithms for High-Density Airborne LiDAR Point Clouds over Complex LandScapes. Remote Sens. 2021, 13, 2663. [Google Scholar] [CrossRef]
- Feneyrou, P.; Leviandier, L.; Minet, J.; Pillet, G.; Martin, A.; Dolfi, D.; Schlotterbeck, J.P.; Rondeau, P.; Lacondemine, X.; Rieu, A.; et al. Frequency-modulated multifunction lidar for anemometry, range finding, and velocimetry—1. Theory and signal processing. Appl. Opt. 2017, 56, 9663–9675. [Google Scholar] [CrossRef] [PubMed]
- Wang, D.; Watkins, C.; Xie, H. MEMS Mirrors for LiDAR: A Review. Micromachines 2020, 11, 456. [Google Scholar] [CrossRef] [PubMed]
- Lin, C.H.; Zhang, H.S.; Lin, C.P.; Su, G.D.J. Design and Realization of Wide Field-of-View 3D MEMS LiDAR. IEEE Sens. J. 2022, 22, 115–120. [Google Scholar] [CrossRef]
- Berens, F.; Reischl, M.; Elser, S. Generation of synthetic Point Clouds for MEMS LiDAR Sensor. TechRxiv 2022. [Google Scholar] [CrossRef]
- Haider, A.; Cho, Y.; Pigniczki, M.; Köhler, M.H.; Haas, L.; Kastner, L.; Fink, M.; Schardt, M.; Cichy, Y.; Koyama, S.; et al. Performance Evaluation of MEMS-Based Automotive LiDAR Sensor and Its Simulation Model as per ASTM E3125-17 Standard. Sensors 2023, 23, 3113. [Google Scholar] [CrossRef]
- Yoo, H.W.; Druml, N.; Brunner, D.; Schwarzl, C.; Thurner, T.; Hennecke, M.; Schitter, G. MEMS-based lidar for autonomous driving. Elektrotech. Inftech. 2018, 135, 408–415. [Google Scholar] [CrossRef]
- Li, L.; Xing, K.; Zhao, M.; Wang, B.; Chen, J.; Zhuang, P. Optical–Mechanical Integration Analysis and Validation of LiDAR Integrated Systems with a Small Field of View and High Repetition Frequency. Photonics 2024, 11, 179. [Google Scholar] [CrossRef]
- Raj, T.; Hashim, F.H.; Huddin, A.B.; Ibrahim, M.F.; Hussain, A. A Survey on LiDAR Scanning Mechanisms. Electronics 2020, 9, 741. [Google Scholar] [CrossRef]
- Zheng, H.; Han, Y.; Qiu, L.; Zong, Y.; Li, J.; Zhou, Y.; He, Y.; Liu, J.; Wang, G.; Chen, H.; et al. Long-Range Imaging LiDAR with Multiple Denoising Technologies. Appl. Sci. 2024, 14, 3414. [Google Scholar] [CrossRef]
- Wang, Z.; Menenti, M. Challenges and Opportunities in Lidar Remote Sensing. Front. Remote Sens. 2021, 2, 641723. [Google Scholar] [CrossRef]
- Yi, Y.; Wu, D.; Kakdarvishi, V.; Yu, B.; Zhuang, Y.; Khalilian, A. Photonic Integrated Circuits for an Optical Phased Array. Photonics 2024, 11, 243. [Google Scholar] [CrossRef]
- Fu, Y.; Chen, B.; Yue, W.; Tao, M.; Zhao, H.; Li, Y.; Li, X.; Qu, H.; Li, X.; Hu, X.; et al. Target-adaptive optical phased array lidar. Photonics Res. 2024, 12, 904. [Google Scholar] [CrossRef]
- Tontini, A.; Gasparini, L.; Perenzoni, M. Numerical Model of SPAD-Based Direct Time-of-Flight Flash LIDAR CMOS Image Sensors. Sensors 2020, 20, 5203. [Google Scholar] [CrossRef]
- Xia, Z.Q. Flash LiDAR single photon imaging over 50 km. Int. Arch. Photogramm. Remote Sens. Spatial Inf. Sci. 2023, 2023, 1601–1606. [Google Scholar] [CrossRef]
- Royo, S.; Ballesta-Garcia, M. An Overview of Lidar Imaging Systems for Autonomous Vehicles. Appl. Sci. 2019, 9, 4093. [Google Scholar] [CrossRef]
- Buchner, A.; Hadrath, S.; Burkard, R.; Kolb, F.M.; Ruskowski, J.; Ligges, M.; Grabmaier, A. Analytical Evaluation of Signal-to-Noise Ratios for Avalanche- and Single-Photon Avalanche Diodes. Sensors 2021, 21, 2887. [Google Scholar] [CrossRef]
- Malekmohammadi, S.; Duscha, C.; Jenkins, A.D.; Kelberlau, F.; Gottschall, J.; Reuder, J. Evaluating the Performance of Pulsed and Continuous-Wave Lidar Wind Profilers with a Controlled Motion Experiment. Remote Sens. 2024, 16, 3191. [Google Scholar] [CrossRef]
- Tang, Y.; Li, J.; Xu, L.; Lee, J.-B.; Xie, H. Review of Electrothermal Micromirrors. Micromachines 2022, 13, 429. [Google Scholar] [CrossRef]
- Assaf, E.H.; von Einem, C.; Cadena, C.; Siegwart, R.; Tschopp, F. High-Precision Low-Cost Gimballing Platform for Long-Range Railway Obstacle Detection. Sensors 2022, 22, 474. [Google Scholar] [CrossRef]
- Athavale, R.; Ram, D.H.; Nair, B.B. Low cost solution for 3D mapping of environment using 1D LIDAR for autonomous navigation. IOP Conf. Ser. Mater. Sci. Eng. 2019, 561, 012104. [Google Scholar] [CrossRef]
- Dogru, S.; Marques, L. Drone Detection Using Sparse Lidar Measurements. IEEE Robot. Autom. Lett. 2022, 7, 3062–3069. [Google Scholar] [CrossRef]
- Hou, X.; Pan, Z.; Lu, L.; Wu, Y.; Hu, J.; Lyu, Y.; Zhao, C. LAEA: A 2D LiDAR-Assisted UAV Exploration Algorithm for Unknown Environments. Drones 2024, 8, 128. [Google Scholar] [CrossRef]
- Gonz, A.; Torres, F. Detection and Classification of Obstacles Using a 2D LiDAR Sensor. In Proceedings of the Fifth International Conference on Advances in Sensors, Actuators, Metering and Sensing (ALLSENSORS), Valencia, Spain, 21–25 November 2020. [Google Scholar]
- Mihálik, M.; Hruboš, M.; Vestenický, P.; Holečko, P.; Nemec, D.; Malobický, B.; Mihálik, J. A Method for Detecting Dynamic Objects Using 2D LiDAR Based on Scan Matching. Appl. Sci. 2022, 12, 5641. [Google Scholar] [CrossRef]
- Fagundes, L.A., Jr.; Caldeira, A.G.; Quemelli, M.B.; Martins, F.N.; Brandão, A.S. Analytical Formalism for Data Representation and Object Detection with 2D LiDAR: Application in Mobile Robotics. Sensors 2024, 24, 2284. [Google Scholar] [CrossRef]
- Tasnim, A.A.; Kuantama, E.; Han, R.; Dawes, J.; Mildren, R.; Nguyen, P. Towards Robust Lidar-based 3D Detection and Tracking of UAVs. In Proceedings of the DroNet ’23: Ninth Workshop on Micro Aerial Vehicle Networks, Systems, and Applications, Helsinki, Finland, 18 June 2023. [Google Scholar]
- Cho, M.; Kim, E. 3D LiDAR Multi-Object Tracking with Short-Term and Long-Term Multi-Level Associations. Remote Sens. 2023, 15, 5486. [Google Scholar] [CrossRef]
- Sun, Z.; Li, Z.; Liu, Y. An Improved Lidar Data Segmentation Algorithm Based on Euclidean Clustering. In Proceedings of the 11th International Conference on Modelling, Identification and Control (ICMIC2019), Tianjin, China, 13–15 July 2019; Lecture Notes in Electrical Engineering. Springer: Singapore, 2019; Volume 582. [Google Scholar]
- Wu, D.; Liang, Z.; Chen, G. Deep learning for LiDAR-only and LiDAR-fusion 3D perception: A survey. Intell. Robot. 2022, 2, 105–129. [Google Scholar] [CrossRef]
- Zheng, L.; Zhang, P.; Tan, J.; Li, F. The Obstacle Detection Method of UAV Based on 2D Lidar. IEEE Access 2019, 7, 163437–163448. [Google Scholar] [CrossRef]
- Xiao, J.; Pisutsin, P.; Tsao, C.W.; Feroskhan, M. Clustering-based Learning for UAV Tracking and Pose Estimation. arXiv 2024, arXiv:2405.16867. [Google Scholar]
- Dow, A.; Manduhu, M.; Dooly, G.; Trslić, P.; Blanck, B.; Knox, C.; Riordan, J. Intelligent Detection and Filtering of Swarm Noise from Drone Acquired LiDAR Data using PointPillars. In Proceedings of the OCEANS 2023, Limerick, Ireland, 5–8 June 2023; pp. 1–6. [Google Scholar]
- Bouazizi, M.; Lorite Mora, A.; Ohtsuki, T. A 2D-Lidar-Equipped Unmanned Robot-Based Approach for Indoor Human Activity Detection. Sensors 2023, 23, 2534. [Google Scholar] [CrossRef]
- Park, C.; Lee, S.; Kim, H.; Lee, D. Aerial Object Detection and Tracking based on Fusion of Vision and Lidar Sensors using Kalman Filter for UAV. Int. J. Adv. Smart Converg. 2020, 9, 232–238. [Google Scholar]
- Deng, T.; Zhou, Y.; Wu, W.; Li, M.; Huang, J.; Liu, S.; Song, Y.; Zuo, H.; Wang, Y.; Yue, Y.; et al. Multi-Modal UAV Detection, Classification and Tracking Algorithm – Technical Report for CVPR 2024 UG2 Challenge. arXiv 2024, arXiv:2405.16464. [Google Scholar]
- Sier, H.; Yu, X.; Catalano, I.; Queralta, J.P.; Zou, Z.; Westerlund, T. UAV Tracking with Lidar as a Camera Sensor in GNSS-Denied Environments. In Proceedings of the 2023 International Conference on Localization and GNSS (ICL-GNSS), Castellón, Spain, 6–8 June 2023; pp. 1–7. [Google Scholar]
- Ge, Y.; Wang, H.; Liu, G.; Chen, Q.; Tang, H. Automated Identification of Rock Discontinuities from 3D Point Clouds Using a Convolutional Neural Network. Rock Mech Rock Eng. 2025, 58, 3683–3700. [Google Scholar] [CrossRef]
- Chen, J.; Lei, B.; Song, Q.; Ying, H.; Chen, Z.; Wu, J. A Hierarchical Graph Network for 3D Object Detection on Point Clouds. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), Seattle, WA, USA, 13–19 June 2020; pp. 392–401. [Google Scholar]
- Liu, X.; Zhang, B.; Liu, N. The Graph Neural Network Detector Based on Neighbor Feature Alignment Mechanism in LIDAR Point Clouds. Machines 2023, 11, 116. [Google Scholar] [CrossRef]
- Peng, H.; Huang, D. Small Object Detection with lightweight PointNet Based on Attention Mechanisms. J. Phys. Conf. Ser. 2024, 2829, 012022. [Google Scholar] [CrossRef]
- Nong, X.; Bai, W.; Liu, G. Airborne LiDAR point cloud classification using PointNet++ network with full neighborhood features. PLoS ONE 2023, 18, e0280346. [Google Scholar] [CrossRef]
- Ye, M.; Xu, S.; Cao, T. HVNet: Hybrid Voxel Network for LiDAR Based 3D Object Detection. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), Seattle, WA, USA, 14–19 June 2020; pp. 1631–1640. [Google Scholar]
- Chen, Y.; Liu, J.; Zhang, X.; Qi, X.; Jia, J. VoxelNeXt: Fully Sparse VoxelNet for 3D Object Detection and Tracking. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), Vancouver, BC, Canada, 18–22 June 2023; pp. 21674–21683. [Google Scholar]
- Milioto, A.; Vizzo, I.; Behley, J.; Stachniss, C. RangeNet ++: Fast and Accurate LiDAR Semantic Segmentation. In Proceedings of the 2019 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Macau, China, 3–8 November 2019; pp. 4213–4220. [Google Scholar]
- Alnaggar, Y.; Afifi, M.; Amer, K.; ElHelw, M. Multi Projection Fusion for Real-Time Semantic Segmentation of 3D LiDAR Point Clouds. In Proceedings of the IEEE/CVF Winter Conference on Applications of Computer Vision (WACV), Waikoloa, HI, USA, 5–9 January 2021; pp. 1800–1809. [Google Scholar]
- Lis, K.; Kryjak, T. PointPillars Backbone Type Selection for Fast and Accurate LiDAR Object Detection. In Computer Vision and Graphics. ICCVG 2022. Lecture Notes in Networks and Systems; Chmielewski, L.J., Orłowski, A., Eds.; Springer: Cham, Switzerland, 2023; Volume 598. [Google Scholar]
- Manduhu, M.; Dow, A.; Trslic, P.; Dooly, G.; Blanck, B.; Riordan, J. Airborne Sense and Detect of Drones using LiDAR and adapted PointPillars DNN. arXiv 2023, arXiv:2310.09589. [Google Scholar]
- Ma, Z.; Yao, W.; Niu, Y.; Lin, B.; Liu, T. UAV low-altitude obstacle detection based on the fusion of LiDAR and camera. Auton. Intell. Syst. 2021, 1, 12. [Google Scholar] [CrossRef]
- Hammer, M.; Borgmann, B.; Hebel, M.; Arens, M. A multi-sensorial approach for the protection of operational vehicles by detection and classification of small flying objects. In Proceedings of the Electro-Optical Remote Sensing XIV, Online, 21–25 September 2020. [Google Scholar]
- Semenyuk, V.; Kurmashev, I.; Lupidi, A.; Alyoshin, D.; Kurmasheva, L.; Cantelli-Forti, A. Advances in UAV detection: Integrating multi-sensor systems and AI for enhanced accuracy and efficiency. Int. J. Crit. Infrastruct. Prot. 2025, 49, 100744. [Google Scholar] [CrossRef]
- Catalano, I.; Yu, X.; Queralta, J.P. Towards Robust UAV Tracking in GNSS-Denied Environments: A Multi-LiDAR Multi-UAV Dataset. In Proceedings of the 2023 IEEE International Conference on Robotics and Biomimetics (ROBIO), Koh Samui, Thailand, 4–9 December 2023; pp. 1–7. [Google Scholar]
- Zhang, B.; Shao, X.; Wang, Y.; Sun, G.; Yao, W. R-LVIO: Resilient LiDAR-Visual-Inertial Odometry for UAVs in GNSS-denied Environment. Drones 2024, 8, 487. [Google Scholar] [CrossRef]
- Ding, Z.; Sun, Y.; Xu, S.; Pan, Y.; Peng, Y.; Mao, Z. Recent Advances and Perspectives in Deep learning Techniques for 3D Point Cloud Data Processing. Robotics 2023, 12, 100. [Google Scholar] [CrossRef]
- Alaba, S.Y.; Ball, J.E. A Survey on Deep-Learning-Based LiDAR 3D Object Detection for Autonomous Driving. Sensors 2022, 22, 9577. [Google Scholar] [CrossRef] [PubMed]
- Kun, Y. Multi-sensor data fusion for autonomous flight of unmanned aerial vehicles in complex flight environments. Drone Syst. Appl. 2024, 12, 1–12. [Google Scholar]
- Smailov, N.; Uralova, F.; Kadyrova, R.; Magazov, R.; Sabibolda, A. Optimization of machine learning methods for de-anonymization in social networks. Inform. Autom. Pomiary W Gospod. I Ochr. Środowiska 2025, 15, 101–104. [Google Scholar] [CrossRef]
Sensor Type | Strengths | Limitations | Best Use Cases |
---|---|---|---|
Radar [4,5,6] | Long-range detection; differentiates UAVs using micro-Doppler signatures; resistant to adverse weather conditions | Difficulty detecting small UAVs with a low RCS; high cost; deployment complexity | Airport/military zone monitoring; inclement weather operations |
RF [7,8,9,10] | Detects control signals; long-range detection; passive | Depends on active transmissions; susceptible to encryption and jamming; unable to detect fully autonomous UAVs | Security and surveillance applications |
Acoustic [11,12,13,14] | Low-cost; energy-efficient solution for LoS-limited environments | Limited performance and detection range due to wind and background noise | Indoor, rural, low-altitude, and LoS-restricted environments |
Visual cameras [15,16,17,18,19,20,21,22,23] | Rich visual detail; low-cost and flexible | Sensitivity to lighting, weather, and line-of-sight conditions; lack of depth | Daytime detection; UAV classification |
LiDAR [24,25,26,27,28,29,30,31,32,33,34,35,36,37,38,39,40,41,42] | High-resolution 3D spatial data; 3D point cloud generation | Irregular, sparse, high-dimensional data; complicated data processing; computationally intensive | Airspace security; surveillance; autonomous navigation; GNSS-denied zones |
LiDAR Sensor Type | Description | Field of View (FoV) | Scanning Mechanism | Use Cases | Advantages | Limitations |
---|---|---|---|---|---|---|
MEMS LiDAR [30,31,32,33,34] | Uses moving micro-mirror plates to steer the laser beam in free space while the rest of the system’s components remain motionless | Moderate, depends on mirror steering angle | Quasi-solid-state scanning (a combination of solid-state LiDAR and mechanical scanning) | Autonomous vehicles; drones and robotics; medical imaging; space exploration; mobile devices | Accurate steering with minimal moving components; superior in terms of size, resolution, scanning speed, and cost | Limited range and FoV; sensitivity to vibrations and environmental factors |
Optomechanical LiDAR [35,36,37] | Uses mechanical/moving components (mirrors, prisms, or entire sensor heads) to steer the laser beam and scan the environment | Wide FoV (up to 360°) | Rotating and oscillating mirror, spinning prism | Remote sensing, self-driving cars, aerial surveying and mapping, robotics, security | Long range, high accuracy and resolution, wide FoV, fast scanning | Bulky and heavy, high cost and power consumption |
Electromechanical LiDAR [38] | Uses electrically controlled motors or actuators to move mechanical parts that allow to steer the laser beam in various directions | Wide FoV (up to 360°) | Mirror, prism, or entire sensor head | Autonomous vehicles, remote sensing, atmospheric studies, surveying, and mapping | Enhanced scanning patterns, wide FoV, moderate cost, long range, high precision, and accuracy | High power consumption, limited durability, bulky |
OPA LiDAR [39,40] | Employs optical phased arrays (OPAs) to steer the laser beam without any moving components. | Flexible FoV, electronically controlled, can be narrow or wide | Solid-state beam (non-mechanical) scanning mechanism | Autonomous vehicles; high-precision sensing; compact 3D mapping systems | No moving parts; rapid beam steering; compact size; energy efficiency | Limited steering range and beam quality; high manufacturing costs |
Flash LiDAR [41,42] | Employs a broad laser beam and a large photodetector array to gather 3D data in a single shot | Wide FoV (up to 120° horizontally, 90° vertically) | No scanning mechanism | Terrestrial and space applications; advanced driving assistance systems (ADAS) | No moving parts; instantaneous capture; real-time 3D imaging | Limited range; lower resolution; sensitive to light and weather conditions |
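All of the LiDAR variants in the table above rely on the same direct time-of-flight (ToF) ranging principle listed in the abbreviations: the sensor measures the round-trip delay of a laser pulse, and range is half that delay times the speed of light. A minimal sketch (the function name and values are illustrative, not from the article):

```python
# Direct time-of-flight (ToF) ranging: range = c * t / 2, where t is the
# measured round-trip delay of the laser pulse.
C = 299_792_458.0  # speed of light in vacuum, m/s

def tof_range_m(round_trip_s: float) -> float:
    """Target range (m) from a measured round-trip pulse delay (s)."""
    return C * round_trip_s / 2.0

# A 1-microsecond round trip corresponds to roughly 150 m of range.
print(round(tof_range_m(1e-6), 1))  # ~149.9
```

The division by two is the key step: the pulse travels to the target and back, so only half the path corresponds to range. FMCW LiDAR replaces the pulse delay with a frequency beat measurement but recovers the same quantity.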
Application | Sensors | DL Method | Evaluation Metrics | Environment |
---|---|---|---|---|
Drone self-noise filtering and denoising [60] | Airborne LiDAR | PointPillars + synthetic data | Percentage and number of removed points | Coastal railway bridge inspection (Ireland) |
Human localization and human activity recognition (HAR) [61] | 2D LiDAR | Conv-LSTM deep network | Accuracy, precision, recall, F1 score | Simulated indoor household (Unity + ROS2) |
UAV detection and 3D tracking (BVLOS) [62] | LiDAR + RGB camera | YOLOv2 + Kalman filter (DL-based fusion) | Detection rate and accuracy | Simulated UAV flight (Gazebo) |
Drone detection, UAV type classification, and 2D/3D trajectory estimation [63] | Stereo fisheye camera, conic and peripheral 3D LiDARs, 77 GHz mmWave radar | YOLOv9 + multimodal 3D pose estimation | Mean square error (pose MSE loss); UAV type classification accuracy | Real-world outdoor UAV flights (MMUAD challenge) |
Indoor UAV tracking [64] | Ouster OS0-128 (LiDAR + signal image) | Signal image fusion + DL model | Average Pose Error (APE); Root Mean Squared Error (RMSE); detection range; frame rate (FPS) | Indoor GNSS-denied area |
Rock discontinuity detection (geological engineering) [65] | Terrestrial LiDAR scanner | GoogLeNet CNN + PCA | Accuracy; average dip direction difference and dip angle difference | Tianjin Longtangou Pumped Storage Power Station (mountainous region in Jizhou District, Tianjin, China) |
3D object detection from raw point clouds [66] | RGB-D sensors | hierarchical graph network (HGNet) | mean average precision (mAP) and coefficient of variation for AP (cvAP) | Indoor 3D environments: SUN RGB-D, ScanNet-V2 datasets |
Autonomous driving [67] | 3D LiDAR | GNN based on neighbor feature alignment mechanism | 3D object detection and location performance in terms of three detection difficulty cases (easy, moderate, hard) | Outdoor road scenes from the KITTI benchmark |
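Before DL methods such as those in the table above, many of the clustering-based pipelines surveyed in Section 3.1 (e.g., Euclidean distance clustering, EDC) segment a point cloud by grouping returns that lie within a fixed distance of each other. The following is a minimal sketch of that idea, assuming an O(n²) breadth-first grouping with illustrative parameter names (`eps`, `min_pts`); production systems use k-d trees or voxel hashing for speed:

```python
from collections import deque

def euclidean_clusters(points, eps=0.5, min_pts=3):
    """Naive Euclidean distance clustering: BFS over points closer than
    `eps`; clusters with fewer than `min_pts` points are dropped as noise."""
    unvisited = set(range(len(points)))
    clusters = []
    while unvisited:
        seed = unvisited.pop()
        queue, cluster = deque([seed]), [seed]
        while queue:
            i = queue.popleft()
            # collect all unvisited points within eps of point i
            for j in list(unvisited):
                if sum((a - b) ** 2 for a, b in zip(points[i], points[j])) <= eps ** 2:
                    unvisited.remove(j)
                    queue.append(j)
                    cluster.append(j)
        if len(cluster) >= min_pts:
            clusters.append(sorted(cluster))
    return clusters

# Two well-separated groups of 3D returns plus one stray point.
pts = [(0, 0, 0), (0.2, 0, 0), (0, 0.2, 0),
       (5, 5, 2), (5.1, 5, 2), (5, 5.2, 2),
       (20, 0, 0)]
print(euclidean_clusters(pts, eps=0.5))  # two clusters; stray point dropped
```

DBSCAN (also in the abbreviations list) refines this scheme by distinguishing core, border, and noise points via a density criterion, which makes it more robust to the uneven point densities typical of UAV returns.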
DL Approach | Data Representation | Main Techniques | Strengths | Limitations | Examples |
---|---|---|---|---|---|
Point-based [68,69] | Point clouds | Directly processes point clouds and captures features using shared MLPs | Direct processing of raw point clouds; efficient for sparse data; prevents voxelization and quantization concerns | Computationally expensive due to large-scale and irregular point clouds | PointNet, PointNet++ |
Voxel-based [70,71] | Voxel grids | Converts the sparse and irregular point cloud into a volumetric 3D grid of voxels | Well-structured representation; easy to use with 3D CNNs; suitable for capturing global context | High memory usage and computational cost due to voxelization; loss of precision in 3D space due to quantization; loss of detail in sparse data regions | VoxelNet, SECOND |
Projection-based [72,73] | Plane (image), spherical, cylindrical, BEV projection | Projects the 3D point cloud onto a 2D plane | Efficient processing using 2D CNNs | Loss of spatial features due to 3D-to-2D projection | RANGENet++, BEV, PIXOR, SqueezeSeg |
Graph-based [66,67] | Adjacency matrix, feature matrices, graph Laplacian | Models point clouds as a graph, where each point is regarded as a node and edges represent the interactions between them | Effective for dealing with sparse, non-uniform point clouds; enables both local and global context-aware detection; ideal for capturing spatial relationships between points | High computational complexity due to large point clouds | GNN, HGNet |
Hybrid approach [74,75] | Combination of raw point clouds, voxels, projections, etc. | Combines several methods to improve the accuracy of 3D object detection, segmentation, and classification tasks | Improved object localization and segmentation accuracy; flexibility | High memory and computational resources; complex architecture | PointPillars, MVF |
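The voxel-based row above hinges on one preprocessing step: binning each raw point into an integer voxel index so the irregular cloud becomes a sparse, grid-structured tensor that 3D CNNs can consume. A minimal sketch of that binning, with an illustrative sparse-dictionary representation rather than any particular detector's layout:

```python
import math
from collections import defaultdict

def voxelize(points, voxel_size=0.25):
    """Bin raw LiDAR points into a sparse voxel grid: each point maps to
    the integer index of the voxel containing it, as in voxel-based
    detectors such as VoxelNet/SECOND (quantization step only)."""
    grid = defaultdict(list)
    for p in points:
        key = tuple(math.floor(c / voxel_size) for c in p)
        grid[key].append(p)
    return dict(grid)

pts = [(0.1, 0.1, 0.0), (0.2, 0.05, 0.1), (1.0, 1.0, 1.0)]
grid = voxelize(pts)
print(len(grid))  # 2 occupied voxels: two nearby points share one cell
```

The table's trade-offs are visible even here: `voxel_size` controls the quantization loss (large voxels merge fine detail), while a dense grid over the same extent would mostly hold empty cells, which is why sparse convolutions (SECOND) matter in practice. Pillar-based hybrids such as PointPillars collapse the vertical axis, binning only in x and y.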
Challenge | Description | Affected Component(s) |
---|---|---|
UAV shape/material variability [54] | UAVs differ in reflectivity, size, and dynamics. | LiDAR, radar |
Sensor calibration and alignment [62,76,80] | Misalignment between sensor outputs (e.g., spatial or temporal offsets) degrades fusion accuracy. | LiDAR, Camera, IMU |
Sparse or noisy point clouds [64,76] | Fast UAV motion or occlusions result in low-density, disordered data. | LiDAR |
Environmental vulnerability [62,64] | Rain, fog, or low-light conditions degrade sensor reliability. | Camera, LiDAR |
Data synchronization [63] | Sensor streams operate at different frame rates, causing latency. | All sensors |
High computational load [63,80] | Fusion and DL inference increase latency and resource demand. | Fusion module, DL model |
Limited datasets [79] | Few public datasets include synchronized multi-sensor UAV data. | Model training |
Sensor failure/dropout [80] | Temporary loss of sensors disrupts detection. | Any sensor node |
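The data synchronization challenge in the table above arises because, e.g., a 10 Hz LiDAR and a 30 Hz camera never produce samples at exactly the same instants, so a fusion module must associate measurements across streams. A minimal sketch of one common remedy, nearest-timestamp matching with a rejection gate (function and variable names are illustrative; real systems typically also interpolate poses between stamps):

```python
def match_nearest(ts_a, ts_b, max_dt=0.05):
    """Pair each timestamp in sorted stream A with the nearest timestamp
    in sorted stream B, discarding pairs more than `max_dt` seconds apart."""
    pairs = []
    j = 0
    for t in ts_a:
        # advance j while the next B stamp is at least as close to t
        while j + 1 < len(ts_b) and abs(ts_b[j + 1] - t) <= abs(ts_b[j] - t):
            j += 1
        if abs(ts_b[j] - t) <= max_dt:
            pairs.append((t, ts_b[j]))
    return pairs

lidar = [0.00, 0.10, 0.20, 0.30]                          # 10 Hz LiDAR stamps
cam = [0.00, 0.033, 0.066, 0.10, 0.133, 0.166, 0.20]      # 30 Hz camera stamps
print(match_nearest(lidar, cam, max_dt=0.02))             # three matched pairs
```

The gate (`max_dt`) is what turns the latency problem into a detectable sensor-dropout problem: a LiDAR scan with no camera frame within the gate (here, the stamp at 0.30 s) is simply left unfused rather than paired with stale data.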
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
© 2025 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Seidaliyeva, U.; Ilipbayeva, L.; Utebayeva, D.; Smailov, N.; Matson, E.T.; Tashtay, Y.; Turumbetov, M.; Sabibolda, A. LiDAR Technology for UAV Detection: From Fundamentals and Operational Principles to Advanced Detection and Classification Techniques. Sensors 2025, 25, 2757. https://doi.org/10.3390/s25092757