Two-Step CFAR-Based 3D Point Cloud Extraction Method for Circular Scanning Ground-Based Synthetic Aperture Radar
Abstract
1. Introduction
2. Material and Dataset
3. Method
3.1. CA-CFAR Detector
3.2. Conventional DBSCAN
3.3. Proposed Method
- Three-view extraction. To begin with, we extract the three maximum-value-projection views of the SAR 3D image, one along each dimension, and present the corresponding projection diagram in Figure 6. Because each view keeps the maximum value along its projection dimension, all target points are preserved, which minimizes the loss of target points during the subsequent point cloud extraction steps. Denoting the 3D data as $S(x,y,z)$, the projection left view is $P_{\mathrm{L}}(y,z)=\max_{x}S(x,y,z)$, where the projection is taken along the direction vector of the X axis; the projection front view is $P_{\mathrm{F}}(x,z)=\max_{y}S(x,y,z)$, taken along the direction vector of the Y axis; and the projection top view is $P_{\mathrm{T}}(x,y)=\max_{z}S(x,y,z)$, taken along the direction vector of the Z axis, as shown in Equation (7).
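The three maximum-value projections can be sketched with NumPy as follows; the volume shape, the axis order (x, y, z), and the variable names are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

# Hypothetical 3D SAR magnitude volume indexed as [x, y, z].
volume = np.random.rand(40, 50, 60)

# Maximum-value projection along each dimension yields the three views,
# so the peak amplitude of every scatterer survives in each view.
left_view = volume.max(axis=0)   # project along x -> (y, z) plane
front_view = volume.max(axis=1)  # project along y -> (x, z) plane
top_view = volume.max(axis=2)    # project along z -> (x, y) plane

print(left_view.shape, front_view.shape, top_view.shape)
```

Because the projection is a maximum rather than a sum or mean, the global peak of the volume appears unchanged in all three views.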
- The first-step CFAR. The first-step CFAR applies conventional CA-CFAR to the three views to obtain the three view masks, as shown in Equation (8): the left-view mask $M_{\mathrm{L}}(y,z)$, the front-view mask $M_{\mathrm{F}}(x,z)$, and the top-view mask $M_{\mathrm{T}}(x,y)$, computed from the left view $P_{\mathrm{L}}$, the front view $P_{\mathrm{F}}$, and the top view $P_{\mathrm{T}}$, respectively. By inverse projecting the three view masks, the volume mask of the original 3D image is obtained, which is then intersected with the SAR 3D image. This removes strong sidelobes outside the potential target region and yields the potential target area data. Figure 7 depicts the process schematically. In Figure 6 and Figure 7, red, green, and blue correspond to the projections of pixels in the left view, the front view, and the top view, respectively; the purple sphere represents the intersection of the projections in the full data. The inverse projection of the three view masks gives the 3D mask $M_{\mathrm{3D}}(x,y,z)=M_{\mathrm{L}}(y,z)\cdot M_{\mathrm{F}}(x,z)\cdot M_{\mathrm{T}}(x,y)$, as derived in Equation (9). The 3D mask is then intersected with the 3D data to obtain the potential target area data $S_{\mathrm{p}}(x,y,z)=S(x,y,z)\cdot M_{\mathrm{3D}}(x,y,z)$, as shown in Equation (10).
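The inverse projection and intersection admit a compact NumPy sketch via broadcasting; the mask contents, array shapes, and names below are stand-in assumptions (in the method the three masks come from CA-CFAR detection on the views).

```python
import numpy as np

# Hypothetical binary CFAR masks of the three views, axis order [x, y, z].
nx, ny, nz = 40, 50, 60
rng = np.random.default_rng(0)
m_left = rng.random((ny, nz)) > 0.5   # mask in the (y, z) plane
m_front = rng.random((nx, nz)) > 0.5  # mask in the (x, z) plane
m_top = rng.random((nx, ny)) > 0.5    # mask in the (x, y) plane

# Inverse projection: broadcast each 2D mask back along its projection
# axis; a voxel survives only if it is flagged in all three views.
mask_3d = (m_left[np.newaxis, :, :]
           & m_front[:, np.newaxis, :]
           & m_top[:, :, np.newaxis])

# Intersection with the SAR 3D image keeps only the potential target area.
volume = rng.random((nx, ny, nz))
potential_target = volume * mask_3d
```

The logical AND of the three broadcast masks is exactly the "intersection of projections" illustrated by the purple sphere in Figures 6 and 7.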
- The second-step CFAR. The second-step CFAR performs global two-parameter CFAR detection on the potential target area data $S_{\mathrm{p}}$, producing the coarse 3D target point cloud. The global CFAR detects over the whole data rather than using a sliding window, because the sidelobes are mostly eliminated in the previous step. In practice, it reshapes the whole extracted 3D data into a row vector and computes its mean and variance to determine the detection threshold $T$, based on the constant false alarm probability. Each pixel of the potential target area data is then compared against this threshold to detect more accurate target points. The derived formula is presented in Equation (11).
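A minimal sketch of the global two-parameter detection is given below. It assumes the common Gaussian-clutter form of the two-parameter CFAR, where the threshold is $T=\mu+k\sigma$ with $k$ chosen from the false alarm probability; the function name, the Gaussian model, and the toy data are our assumptions, not the paper's exact formulation in Equation (11).

```python
import numpy as np
from statistics import NormalDist

def global_two_param_cfar(data, pfa=1e-3):
    """Global two-parameter CFAR sketch under an assumed Gaussian model.

    Flattens the data into one vector, estimates mean and standard
    deviation, and thresholds at T = mu + k * sigma, with k set by the
    constant false alarm probability pfa.
    """
    values = np.asarray(data, dtype=float).ravel()  # row-vector form
    mu, sigma = values.mean(), values.std()
    k = NormalDist().inv_cdf(1.0 - pfa)  # inverse Gaussian CDF
    threshold = mu + k * sigma
    return data > threshold  # boolean detection mask

# Toy usage: one strong scatterer embedded in weak clutter.
rng = np.random.default_rng(1)
cube = np.abs(rng.normal(0.0, 0.1, size=(16, 16, 16)))
cube[8, 8, 8] = 5.0
detections = global_two_param_cfar(cube, pfa=1e-4)
```

Because the statistics are computed once over the whole vector instead of per sliding window, the detector is cheap and is only reliable here because the first step has already suppressed most sidelobes.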
- The modified DBSCAN. In response to the uneven cluster density of the 3D target points and their often linear structure in the vertical plane, we propose a modification to the conventional DBSCAN clustering algorithm. By imposing more constrained clustering conditions, we obtain more accurate results in such cases. Specifically, we change the shape of the clustering neighborhood from a sphere to a cylinder, as illustrated in Figure 8. This adjustment is particularly useful when point densities vary significantly both within and between clusters. In Figure 8, the blue scatter points represent the pixels in the data; the blue sphere and the blue cylinder are the clustering constraint shapes of the traditional DBSCAN method and the modified DBSCAN method, respectively. The modified DBSCAN clustering algorithm uses three key parameters to describe the compactness of the point cloud distribution: the neighborhood radius r in the x-z plane, the neighborhood distance h along the y axis, and MinPts, the density threshold that sets the minimum number of pixels required to form a cluster. These parameters should be tuned to achieve optimal clustering performance. Figure 9 is the diagram of the new cylindrical clustering constraint, in which the red points represent any two points in the data. After derivation, Equation (12) gives the distances in the different dimensions, $d_{xz}=\sqrt{(x_i-x_j)^2+(z_i-z_j)^2}$ and $d_y=\lvert y_i-y_j\rvert$, while Equation (13) specifies the constraint conditions $d_{xz}\le r$ and $d_y\le h$.
It should be noted that these parameters are currently determined empirically; future research may explore more systematic ways to identify parameter configurations that further enhance performance. Here, $d_{xz}$ is the distance in the x-z plane; $d_y$ is the distance in height; r is the radius of the cylindrical template; and h is half the height of the cylindrical template. Our algorithm starts by randomly selecting pixels in the 3D target point cloud. For each selected pixel, the algorithm searches within the cylindrical area defined by the constraints above. Any group of pixels containing at least MinPts pixels under these clustering conditions is marked and classified as a cluster, while pixels that do not satisfy the criteria are labeled as noise. This process continues until every pixel is either assigned to a cluster or labeled as noise. The modified DBSCAN algorithm is denoted by mDBSCAN in Equation (14). Through this method, we obtain the final clustering result with a precise spatial distribution of points, which suits the 3D CS-GBSAR data. After clustering, we analyze the resulting classes and filter out strong sidelobe classes based on target characteristics such as angle, shape, size, and location. This filtering step improves the accuracy of the 3D point cloud extraction.
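The clustering step above can be sketched as follows. This is a minimal mDBSCAN-style implementation with a cylindrical neighborhood (radius r in the x-z plane, half-height h along y); the parameter names follow the text, but the expansion order, label conventions, and toy geometry are our assumptions.

```python
import numpy as np

def cylinder_neighbors(points, i, r, h):
    """Indices of points inside the cylinder centred on points[i]:
    radius r in the x-z plane, half-height h along the y axis."""
    d = points - points[i]
    d_xz = np.hypot(d[:, 0], d[:, 2])  # Eq. (12): distance in x-z plane
    d_y = np.abs(d[:, 1])              # Eq. (12): distance in height
    return np.flatnonzero((d_xz <= r) & (d_y <= h))  # Eq. (13) constraints

def mdbscan(points, r, h, min_pts):
    """Minimal sketch of DBSCAN with a cylindrical neighborhood."""
    n = len(points)
    labels = np.full(n, -1)  # -1 marks noise
    visited = np.zeros(n, dtype=bool)
    cluster = 0
    for i in range(n):
        if visited[i]:
            continue
        visited[i] = True
        seeds = list(cylinder_neighbors(points, i, r, h))
        if len(seeds) < min_pts:
            continue  # not a core point; stays noise unless reached later
        labels[i] = cluster
        while seeds:
            j = seeds.pop()
            if labels[j] == -1:
                labels[j] = cluster      # border or core point joins cluster
            if not visited[j]:
                visited[j] = True
                nbrs = cylinder_neighbors(points, j, r, h)
                if len(nbrs) >= min_pts:
                    seeds.extend(nbrs)   # expand only from core points
        cluster += 1
    return labels

# Toy usage (assumed geometry): two vertical line-like targets, as in the
# CS-GBSAR data, plus one isolated noise point.
pts = np.array([[0.0, y, 0.0] for y in range(10)]
               + [[10.0, y, 0.0] for y in range(10)]
               + [[5.0, 0.0, 5.0]])
labels = mdbscan(pts, r=0.5, h=1.5, min_pts=3)
```

The cylinder lets vertically elongated, sparsely sampled structures stay connected along y (via h) while keeping a tight radius r in the horizontal plane, which is exactly the anisotropy a spherical neighborhood cannot express.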
4. Experiment
4.1. The First Step CFAR Detection Results
4.2. The Second Step CFAR Detection Results
5. Discussion
5.1. Comparison with the Results of Traditional Methods
5.2. Summary of Proposed Method
6. Conclusions
Author Contributions
Funding
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
Parameter | Numerical Value |
---|---|
Center frequency | 17∼17.5 GHz |
Signal bandwidth | 500 MHz |
Radar transmit signal | FMCW |
Synthetic aperture | 360° |
Radius of arm | m |
Parameter | Proposed Method (CS-GBSAR) | Truth Data (LiDAR) | Data Error |
---|---|---|---|
Length | m | 10 m | m |
Height | m | 28 m | m |
Shen, W.; Zhi, J.; Wang, Y.; Sun, J.; Lin, Y.; Li, Y.; Jiang, W. Two-Step CFAR-Based 3D Point Cloud Extraction Method for Circular Scanning Ground-Based Synthetic Aperture Radar. Appl. Sci. 2023, 13, 7164. https://doi.org/10.3390/app13127164